24" Widescreen CRT (FW900) From Ebay arrived,Comments.

polarizer mod.
Do you have a direct link to this sort of mod? I like mods, and even if I don't entertain the idea of doing it, knowing is half the battle.
black almost not visible
When I removed the glare screen, I marveled at how much light it blocked. I see what you are saying, and did set my brightness to 0. Contrast is 65. 8/8/12 on the RGB BIAS looks like a really nice color. Blacks are certainly more "black". I am in a well lit room, so I don't expect to be able to get a good impression until this evening during "real gaming hours". Is there a good comparison photo that will guide me better? I have a suite of larger "Test" images I have been running as a screen saver for the last 10 years. I never got into purity of color and blacks. Close counts, and I am as delicate as a sledge hammer. This monitor would probably look better WITH any sort of help in the black department.

If you knew what my monitor looked like after I destroyed the plastic anti-glare, you would understand that this is an exponential improvement. That is a story for another post however.

CustomPattern.png Stepchart_large_color2.jpg
 
That is a story for another post however.
About 3 years ago, I moved. I never had an issue with the monitor screen, and ALWAYS used utmost care to keep it defect free. I got lazy and moved it without covering it like I usually do. Either my belt buckle or a button put a healthy scratch dead center. I knew about the glare coat removal, so I wasn't too hot. I just didn't feel like putting any work into it. I decided that I would ignore everything ever posted on the internet, and try to "buff it out". Meanwhile I was hoping that this was not one of the coats that would leave adhesive behind. I really managed to ruin the screen. But because I have the common sense of a rhino, I continued to buff it. I work around metal, so I am familiar with buffing materials, etc. I figured I could just shine it all up as best I can and it would be good enough. Here is a photo from right before I removed the coat... for good laughs. It was slowly coming off, but I gave up/gave in to the no-glare film screen meme.
glare coat ruined.jpg
 
The original FW900 coating is pretty terrible for sure. It has a nasty greenish tint to it and is actually pretty bad at blocking glare or improving contrast.
Other Trinitrons I used/use have very dark gray coatings, so much better than what they put on the FW900... such a shame.

Still, any original coating has pretty terrible glare and reflections compared to a polarizer.
The procedure is simple: get film without glue (I bought one with glue - a big mistake, impossible to put on properly - I was left with some air bubbles inside - still workable though), put some water with sugar on the screen (as someone suggested on the internet) and just lay the film down. Any air bubbles will be easy to 'massage out'. It should be enough to hold the film in place for years to come. Any mistake in this case is easily correctable. Not so much with a polarizer with glue... :(

The downside to a polarizer is that it blocks a lot of the light coming from the monitor. More than half. So you'd better have a super bright tube to begin with.
I put on a linear polarizer.
Circular polarizers have this awesome effect:
circular-figure-2.png

Basically they can block most of the outside light reflected by the phosphor, giving the monitor inky blacks even with full light on in your room.
They block even more light though...
 
So you better have super bright tube to begin with.
I currently have the brightness set at 1. I think I could get away with it.

Circular polarizers
Would you recommend circular polarized or linear? The only thing I know about circular polarized light is that a former pet of mine could see this type of light. I also knew to keep my hands clear. I know that linear polarization is used in sunglasses, and screws up how ATM machine screens look. Is the benefit of the circular polarizer that it just blocks more outside light than the linear?

impossible to put it properly
I have friends who install auto glass film professionally. I like the sugar water trick with non-adhesive film however. I still would pay them a 6 pack or two of cheap American lager to install it for me. They have access to techniques to make sure there are no air bubbles, and are better at it than I am. I think knowing what is involved, it would be best to do it ASAP while the glass is still fresh and without micro-scratches.

Is there a brand/name/source that you can mention for any of the films you are talking about? I think I could manage the installation with the sugar water method with little issue, and IF there is an issue, it would be easy to clean off and try again. Flat glass is much more forgiving than curved rear windows. From what I read, the non-adhesive version would be easy to replace if/when it gets worn also.

I may try both in the very near future knowing that it is only temporary.

*Edited in my former pet that could see circular polarized light* Peacock mantis shrimp. God's way of punishing errant fingers in the oceans

mantis in the rocks.jpg
 
It's hard to catch a photo with the monitor refreshing, and my 2011 camera phone sucks. The screen still seems too bright/light, and I am 99% sure I will be trying to improve this. Resolution was at 1680x1050 @ 100Hz. Except for the game, all the images are ultra-HD desktop/screensavers. They all look very similar to what I see. It may be a help to me if I include one of these in its native format. Perhaps you can compare your screens to my pictures and let me know what you think.

I just sat here and stared at the last photo of the beach vs what it really is. The camera is not picking that one up like it really looks. The gameplay of Strider looks more blurry than it does in actuality.

20180413_231243.jpg 20180413_233129.jpg 20180413_231221.jpg 20180413_231237.jpg

Sunset on beach HD.jpg
 
I currently have the brightness set at 1. I think I could get away with it.
Brightness in this context is what CRT monitors refer to as 'Contrast' or 'GAIN', or if I remember correctly, sometimes 'drive'.
Setting 'Brightness' or 'BIAS' is basically black level.

Would you recommend circular polarized or linear?
Circular should be better.
Transmittance of the linear film is 43% and the circular is 42%, so not a massive difference for the potential gains in blocking outside light more efficiently.
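For a rough idea of why the trade-off can still be worth it (back-of-the-envelope numbers, not measurements): light from the tube only passes through the film once, while room light has to go through it twice - in, off the phosphor, and back out - so reflections get cut roughly by the square of the transmittance:

```python
# Back-of-the-envelope single-pass vs double-pass estimate for the film.
# Illustrative only; real films also reflect a bit at their surface.
t_linear = 0.43    # transmittance quoted above for the linear polarizer
t_circular = 0.42  # and for the circular one

for name, t in [("linear", t_linear), ("circular", t_circular)]:
    picture = t          # tube light: one pass through the film
    reflections = t * t  # room light bounced off the phosphor: two passes
    print(f"{name}: picture keeps {picture:.0%}, reflections cut to {reflections:.0%}")

# linear: picture keeps 43%, reflections cut to 18%
# circular: picture keeps 42%, reflections cut to 18%
# (the circular film then kills even more of the reflection via the
#  handedness flip described earlier, which this simple estimate ignores)
```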

Is the benefit of the circular polarizer that it just blocks more outside light than the linear?
It twists light in some fancy way that makes it more suitable for our purpose, as explained in the image I posted previously. Beyond that I am not knowledgeable enough about how it works to explain it any better.
One thing to note is that this effect depends on how you orient the film. Before applying it to the screen permanently, just test which orientation works best. I had two small testers, one left-handed and one right-handed. There were differences - one orientation would block more glare - but I am not sure about this effect of blocking more outside light, because those testers were only about 10cm x 10cm, too small to really tell much.
115726_YRXVOGS.jpg


I have friends who install auto glass film professionally. I like the sugar water trick with non-adhesive film however. I still would pay them a 6 pack or two of cheap American lager to install it for me. They have access to techniques to make sure there are no air bubbles, and are better at it than I am. I think knowing what is involved, it would be best to do it ASAP while the glass is still fresh and without micro-scratches.
Sounds good

Is there a brand/name/source that you can mention for any of the films you are talking about? I think I could manage the installation with the sugar water method with little issue, and IF there is an issue, it would be easy to clean off and try again. Flat glass is much more forgiving than curved rear windows. From what I read, the non-adhesive version would be easy to replace if/when it gets worn also.
https://www.3dlens.com/shop/circular-polarizer-500x1000mm.php
Left handed is probably better. It seemed that way, not sure why.

*Edited in my former pet that could see circular polarized light* Peacock mantis shrimp. God's way of punishing errant fingers in the oceans
Strange but awesome pet :dead:

Perhaps you can compare your screens to my pictures and let me know what you think.
Some of the effects, like the brightened black bars above and below a very bright image, are inherent to how CRTs work, so it is hard to judge e.g. whether you have a G2 issue.
Next time, can you post for comparison a black screen and this picture scaled to full screen: http://www.lagom.nl/lcd-test/black.php ?
 
black screen and this picture

It's odd what my camera phone picks up that is not visible to the eye.
The first is full black, the 2nd and 3rd are the test you suggested at two distances, and the 4th is another image I have used over the years.
black screen.jpg this picture close.jpg this picture far.jpg black to white.jpg

Thanks for the link. I may end up ordering it in the morning.
 
It's hard to catch a photo with the monitor refreshing, and my 2011 camera phone sucks. The screen still seems too bright/light, and I am 99% sure I will be trying to improve this. Resolution was at 1680x1050 @ 100Hz. Except for the game, all the images are ultra-HD desktop/screensavers. They all look very similar to what I see. It may be a help to me if I include one of these in its native format. Perhaps you can compare your screens to my pictures and let me know what you think.

I just sat here and stared at the last photo of the beach vs what it really is. The camera is not picking that one up like it really looks. The gameplay of Strider looks more blurry than it does in actuality.
It's completely normal that the display is too bright with the coating off. The coating blocks 34% of the light in the first place.

The very least thing to do in that situation is to perform a white point balance procedure to adapt the settings to the lack of AR film. That won't solve everything (contrast should remain worse than with the film on), but the gray balance / black / white should improve.

Setting RGB bias very low is an absurd trick: it means trying to decrease brightness by shutting down the color balance circuits (and ruining the color balance in doing so). Using a car analogy, that's decreasing your engine power to try to offset defective brakes.
 
It's completely normal that the display is too bright with the coating off. The coating blocks 34% of the light in the first place.
The polarizer blocks 57% of the light, which makes it nearly four times better at blocking light from outside.
And the polarizer gives the screen a very nice non-greenish tint. Definitely the way to go as far as picture quality and aesthetics go.
It reduces longevity by making the monitor work harder, but at least it protects the screen should anything happen.

The very least thing to do in that situation is to perform a white point balance procedure to adapt the settings to the lack of AR film. That won't solve everything (contrast should remain worse than with the film on), but the gray balance / black / white should improve.
It seems RandomNameAndNumber has a G2 issue, or the camera blew everything out of proportion completely... which seems not to be the case, because I can see the black keyboard being illuminated by the screen and it is pretty dark.

Setting RGB bias very low is an absurd trick: it means trying to decrease brightness by shutting down the color balance circuits (and ruining the color balance in doing so). Using a car analogy, that's decreasing your engine power to try to offset defective brakes.
It is an easily available trick and when it works it works.
In this case (brightness 0 and 8/8/12 RGB BIAS) it obviously doesn't work anymore.
Definitely, based on the pink and green in the last photo, this monitor could use some adjustment, so doing a full WPB is a very good idea.

As for color accuracy vs G2 voltage, I did not see any change between using G2 at its limit (so that a Brightness setting of around 0 gives the desired black level) and lowering it so that Brightness had to be very high. Not a very scientific test without a battery of measurements using a sensitive device, but if there were any differences anyone should worry about, they should have been amplified to the point of being easily visible.

I have a CRT (Dell P1110 - I had done the G2 voltage fix on it - it looked like these photos before :dead: ) and an i1 Display Pro, so I might do some measurements and this highly praised WPB thingy.

It's odd what my camera phone picks up that is not visible to the eye.
The first is full black, the 2nd and 3rd are the test you suggested at two distances, and the 4th is another image I have used over the years.
Seems you will need to do a G2 voltage reduction using WinDAS and a USB-TTL converter.
If you want to correct colors, get a sensitive colorimeter. The best one to have is the i1 Display Pro, but I'm not sure about this case specifically, especially if you do not plan to use it on LCDs. Browse this thread for recommendations or maybe some CRT-phile will chime in.

Thanks for the link. I may end up ordering it in the morning.
Be sure to also get the required hardware to do the G2 fix.
And it would be lovely if you did a nice photo session before and after the WPB at the same white level (e.g. 100 cd/m2), and took measurements of the monitor at a fairly high brightness (actually high CONTRAST) and then immediately after you put the polarizer on, at the same settings.
We lack proper measurements of this kind of mod here at [H] and it would be a great contribution on your side :cool:
 
The polarizer blocks 57% of the light, which makes it nearly four times better at blocking light from outside.
Ah, oh, well, indeed it's very useful. That'll surely reduce the risk of having the tube developing a skin cancer because of the daylight. :ROFLMAO:
 
Ah, oh, well, indeed it's very useful. That'll surely reduce the risk of having the tube developing a skin cancer because of the daylight. :ROFLMAO:
If it was only about daylight...
I found it impossible to get decent image quality from the FW900 with the coating removed, and even with it, it looks worse than most other big CRT monitors.
Today you can buy a gaming LCD with tons of IPS glow, play with a small ambient lamp in the room, and have even the parts of the screen with the biggest glow perceived as 'inky black'.
Doing the same kind of trick with a monitor with such a light screen color is nearly impossible, if possible at all. Definitely not to a satisfactory degree for me.
...or just put a polarizer on and have inky blacks with any small lamp placed pretty much anywhere that isn't causing glare (like you would do for a glossy LCD), and impress guests with why you use this bulky monitor instead of a larger LCD with a higher refresh rate, higher resolution with perfect sharpness, as much motion clarity with strobing, and which is usable for desktop work (since you can disable strobing and have a sharp image)... actually it still makes little sense, but oh well... at least if you are a CRT fan you can show off why you are a fan of this tech and not look like an idiot.

I am a big fan of this monitor, but its days of superiority have long since passed, and its inherent flaws should not be ignored/downplayed just to make it seem better than it is, especially when some of them can be successfully mitigated, giving it better image quality than it ever had.

Of course the polarizer mod stands in direct contradiction to all the bullshit many people here have said over the years, and I am not at all surprised at the backlash I get each time I mention it :dead:



 
I registered here finally after well over a decade of following this thread. I picked up a GDM-FW900 nearly 13-14 years ago. My story is similar to many others. I ordered one from flea-bay, only to have UPS destroy it, and then attempt to "drop and run" at my front porch. Luckily I was waiting for the delivery guy, and the box rattled and he tried to tell me that is how it was shipped. We found out that it was destroyed en route and they themselves (UPS) attempted to re-box it so it would be less noisy. They ended up paying me for it AND recycling it for me.
A few months later, a "Local" studio 2 hours away had 3x of these for sale @ $50 each. I should have bought all 3 at that time, but hindsight is 20/20. I got the nicest of the 3.

I just wanted to thank everyone. I was getting ready to get rid of this monitor. The screen was ruined, and the clarity was tragic. In the 13 years of ownership, I managed to scratch the screen badly. About a year ago, I replaced it with a 144hz LCD and have never been so unimpressed. I am very DIY, and though I did not want to remove the coating I did it anyway yesterday. If it took 20 minutes, I would be shocked. It peeled off, and left no residue.

The monitor is back in use, and looks better than ever. Windows allowed me to install the drivers by un-checking the "compatible hardware" box. Currently using a 10 year old Phenom II quad core PC with a new-ish GTX 960, which seems to be the last generation of GPUs with analog output. I don't know IF I will upgrade my GPU again. New games bore me, and the older games I enjoy continue to work great. Windows 7 loves this monitor, and has given me no issues using Nvidia CP to set custom refresh rates and resolutions using a DVI>5BNC cable. Nvidia still reports it as a generic analog device, and that is OK.

Everyone's advice and pictures made it possible for me to even think that tearing this thing apart was possible. I do hope to get many more years out of it, but only time will tell. I think when this monitor dies, I may simply give up on "gaming", because I have yet to find anything that compares.


Just in case you haven't come across this, be sure to go through my White point balance guide here.

And contrary to what some people say, you can get an absolutely fantastic image without any coating *if you have good lighting conditions*, and calibrate the tube properly. I've got four of these monitors, two of which have no coating, and I have absolutely no green tint on mine.
 
camera blown everything out of proportions completely
This camera-phone is a 2011 model Samsung. You are seeing photos at the max resolution. As I somewhat indicated earlier, I keep a fish tank. It is salt-water, and it is like a 36"-long HD display. IF I had to take a guess, the lighting from the 6x tubes puts out a 10,000-14,000 Kelvin color range. During peak hours, this is 250W+ of crisp white light, and when the blue actinic bulbs are on, it is IMPOSSIBLE to take a photo with this phone. I can't even describe the look, so I will include a photo. I had a newer phone that had a great camera that worked great, but I hated it for other reasons and quit using it. I think that brightness relative to conditions throws off the ability to capture accurate colors. This looks NOTHING like the actual display of what is shown.

20180311_184709.jpg
Seems you will need to do a G2 voltage reduction using WinDAS and a USB-TTL converter.

This monitor has not been adjusted at all during my time of ownership. That it even still works is surprising to me. We have had some times, her and I...and if she could talk I certainly would have ended her. However, she is an inanimate object.

I just did a quick search for USB-TTL converters, and they seem very cost effective. Is $5-10 a normal price? I have read about these devices in the past, and never once picked one up because it seemed like it should be a pricey thing. WinDAS is software that runs on the computer, so that seems free.

https://www.ebay.com/itm/USB-To-RS2...641691?hash=item5d724d85db:g:nCMAAOSw5IJWdqgZ
Would this item do what was needed? Can I run it from the pinouts in the monitor to the USB ports on the stand which is plugged to the PC?

If you want to correct colors
Seeing how much some of the calibration hardware costs, I doubt that would be something I get into. If I could get one on the cheap, great, but I would be happy with eye-ballin' it. Are there any places in the USA that offer all of these services? I can't see me needing this sort of thing more than one time, and getting a "technician" who could set the G2 voltage and has all the knowledge is something worthwhile to me. I don't know how hard hooking the TTL up would be. I can only assume there is some sort of DIY tutorial out there.
 
https://www.ebay.com/itm/USB-To-RS2...641691?hash=item5d724d85db:g:nCMAAOSw5IJWdqgZ
Would this item do what was needed? Can I run it from the pinouts in the monitor to the USB ports on the stand which is plugged to the PC?

Yes, that's the one I have. And yes, you plug it directly into a USB port (although to get it to work with WinDAS you'll need to ensure that it's on COM1, COM2, COM3, or COM4 - maybe COM5, but I don't think so). In WinDAS, in the settings, you'll see what I mean.
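If you want a quick way to see which COM number Windows actually handed the adapter (the same info is under Ports in Device Manager), here's a small sketch - assuming you have Python with the pyserial package installed:

```python
# List serial ports so you can see which COM number the USB-TTL adapter got.
# Requires: pip install pyserial
from serial.tools import list_ports

for port in list_ports.comports():
    print(port.device, "-", port.description)  # e.g. "COM3 - USB-SERIAL adapter"
```

If it lands above COM4, you can usually move it from Device Manager (the port's Properties > Port Settings > Advanced > COM Port Number).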



Seeing how much some of the calibration hardware costs, I doubt that would be something I get into.

See my guide, linked above, and read the section on choosing a measuring instrument. There are units on ebay right now for under $20 (search for Monaco Optix).
 
Dude, that is creepy...
I used to drink. A LOT. I never acted well. Giving me a microphone and allowing me to play violent video games with access to the ears of other humans was a mistake. The only witness anymore is the monitor. I also don't have a microphone anymore lol. I have an odd/sick outlook on life.

Yes, that's the one I have.
Monaco Optix

Thanks. I will be ordering asap.

I made an offer on the best looking Monaco Optix that looked working. If I can get it for $25 and use it once or twice, it will pay for itself. Thanks for the advice!

I am getting excited again. I got polarizing film and "stuff" to make this old boat anchor really shine again.
 
Before you change anything though, absolutely make sure to save the current settings of the monitor in a file with Windas.

Without film, the white point balance procedure should work well. With a polarizer that is much too dark, I'm not very confident in the result you'll get (there are brightness targets that probably can't be reached). It's better to have a backup of the settings to reload in case unexpected things happen.
 
I haven't posted much in this thread, but I wanted to drop in and ask y'all a question. I've been using an FW900 monitor for a few years and I really like it. I intend to keep using it for several more years at least.

I just had a Geforce 980 Ti die on me. It's still under warranty so I should have no trouble replacing it. What I'm wondering though is if it makes sense to upgrade to a 1080 then use the Sunix adapter people have been talking about?

I actually don't need a more powerful card but being more future-proof wouldn't be a bad thing. I'm more interested in absolute image quality and the ability to use all resolutions and refresh rates.

At this point, is there any reason to keep using the 980 Ti or would a 1080 plus Sunix adapter be a better choice overall?
 
Hi jrodefeld, I'm also using a GTX 980 Ti with the FW900. I also use multiple refresh rates and resolutions, from retro arcade emulators to modern games, and personally I don't see the need to upgrade this card yet.

This card allows me to reach resolutions and refresh rates such as 2560x1600 @ 68Hz and 1920x1200 @ 96Hz with a completely stable image and no distortion. From what I have read in this thread, current digital-to-analog converters such as the Delock and the Sunix (the best ones) still have some issues or cannot reach such high resolution/Hz levels as stably as the GTX 980 Ti can.
I think those adapters still need to mature a little more. So, in my opinion, I would prefer to wait some time and aim for something like the 1080 Ti or a later generation for a worthwhile upgrade, which also gives those adapters more time to mature.
Well, that's just my opinion as a 980 Ti + FW900 user too.



I've found that good LCDs at this point have improved over CRTs, even badass ones like these, in most categories. Size, resolution, contrast, colors, and finally even motion clarity, all depending on the specific model of course. Feel free to tell me why I'm wrong :).

I am a big fan of this monitor, but its days of superiority have long since passed



I have been waiting years for a modern monitor that can do the following without the need to use a different monitor for each feature:

-display excellent blacks in both illuminated and dark environments where the monitor is placed.

-excellent viewing angles

-excellent latency, close to 0ms

-excellent motion clarity at any refresh rate between 55-120Hz or more; I don't care about flicker as long as it flickers like a CRT monitor.

-refresh rates between 55-120Hz or more

-excellent colors.

-excellent multi-resolution capabilities without blurry scaling


-widescreen format, aspect ratio, size and resolutions like the FW900 at least (also can be 16:9 1080p).


Because so far the FW900 seems to be the only one able to achieve all that in one single monitor.

So please, repoman0, XoR_, or anyone,
correct me if I am wrong, and please name a modern monitor model that can achieve all that. I'm pretty interested in it.
 
please, repoman0, or XoR_, or anyone... I have been waiting years for a modern monitor that can do the following without the need to use a different monitor for each feature:
Those are some bold statements you've got here :ROFLMAO:

-display excellent blacks in both illuminated and dark environments where the monitor is placed.
Because the FW900's tube color is much lighter than pretty much any other CRT's, it needs some very well thought out placement or it will have terrible blacks. Forget about having any light in your room at daytime... at night using ambient light is at least very, very tricky... preferably you have some black walls and clothes... and face.
I can put a gaming LCD pretty much anywhere, and for inky blacks I only need to make sure there is some ambient light. With enough light in the room even IPS glow is not noticeable.

And what is black anyway? LCD destroys any CRT in ANSI contrast ratio. Black objects in a bright scene won't look black at all on any CRT. This is an inherent flaw of the tech. Put anything on screen and you cannot get perfect black. Even with the way the gamma response works, you cannot really calibrate your display for perfect black while displaying a black screen (what would be the purpose of that anyway...) and not have black details crushed.

-excellent viewing angles
Here the CRT shines.
There are other techs that can do it too, including LCDs, although none of the gaming monitors have 'perfect' viewing angles despite the tech for it existing (A-TW).
In my opinion gaming IPS has good enough viewing angles, as there is no gamma shift, which is the worst offender.

-excellent latency, close to 0ms
-excellent motion clarity at any refresh rate between 55-120Hz or more; I don't care about flicker as long as it flickers like a CRT monitor.
-refresh rates between 55-120Hz or more
Most gaming G-Sync monitors have imperceptible input lag and strobing.
Flickering starts at 85Hz or so. A minor inconvenience in most cases.
Motion clarity without strobing is not excellent, but imho adequate even on a 60Hz monitor as long as it is not a VA panel.
Also, using V-Sync ON to get this perfect motion clarity completely butchers input lag, so these features (low input lag and motion clarity) are mutually exclusive.
Many modern LCDs can do a wide range of refresh rates, and gaming LCDs can do it dynamically to match the game frame rate - the one feature that CRTs miss badly imho.
I'll take VRR without input lag any day (and night) over strobing and input lag.

Also for many emulators getting perfect motion is very tricky without VRR.

-excellent colors.
Many gaming IPS have better gamut and gamma response than CRTs

-excellent multi-resolution capabilities without blurry scaling
For 3D games it is a non-essential feature when you have a strong enough GPU and VRR.
In emulators you can usually use integer scaling and fancy filters to mitigate scaling issues.
For older games and emulators all you need is a 4:3 monitor, and many of those come with better image quality too, and are way cheaper and more available, especially today when you can get one for a smile and a bottle (or two) of vodka.

-widescreen format, aspect ratio, size and resolutions like the FW900 at least (also can be 16:9 1080p).
All LCDs today have a widescreen format and much better size. The 22" of the FW900 isn't exactly making it a big display, you know...
This is the only area where the FW900 beats other CRTs though...

Because so far the FW900 seems to be the only one able to achieve all that in one single monitor.
Correct me if I am wrong, and please name a modern monitor model that can achieve all that.
Going by your way of reasoning, the FW900 does not achieve many things many other monitors take for granted: an actually sharp image when displaying 2D content, good ANSI contrast, looking great in full daylight, not flickering (or at least being able not to), supporting digital inputs (which are the only ones available on modern GPUs), being thin and light, having a very big screen and resolutions (like 4K), reasonable power consumption, not needing to color calibrate it constantly or use hacky methods to fix a degrading black level, and perfect geometry without the whole "did I hit the perfect aspect ratio yet or not... it doesn't seem right so let's tweak it... no, still not yet... better use a ruler or something or I am gonna go crazy!" issue.

If you are a true perfectionist you still need multiple displays, each for a different purpose.
And personally, the most fun I've had playing games in a long time was on my 4K UHD G-Sync 60Hz gaming LCD, and I would also have a hard time choosing the FW900 over a much bigger Panasonic plasma with even better contrast for console gaming.

Your kind of reasoning was adequate for many, many years. Not only was the FW900 pretty big compared to a typical LCD, it had higher refresh rates and resolutions and strobing, while LCDs usually had 60Hz, high input lag (especially IPS monitors), no Variable Refresh Rate (despite many people thinking that is just the way all LCDs work :ROFLMAO:) and usually pretty bad colors. Today (actually 'tomorrow' :p) the choice is very simple: get a 4K 144Hz monitor with G-Sync, HDR, wide gamut and a direct LED backlighting system with 384 zones for $3000 and you have a near perfect gaming monitor until OLED gaming monitors come out... for probably double/triple that price :hungover:

The SONY GDM-FW900 is an exceptional monitor, and was the best gaming monitor for more than a decade, and it still scores some wins, but let's not get ahead of ourselves. Its times of absolute superiority are over. And for frack's sake am I happy about that!


Of course many of the flaws that CRTs have for me might not bother you enough to even point them out. Just like many flaws of modern gaming IPS do not bother me as much as they do some other CRT users. This is called personal preference and is inherently not objective.
All that said, I think the FW900 (mine anyway) is a great piece of hardware and I am glad I get to own it. I am even considering getting an XRGB for 15kHz consoles/computers for it.
 
Before you change anything though, absolutely make sure to save the current settings of the monitor in a file with Windas.

Without film, the white point balance procedure should work well. With a polarizer that is much too dark, I'm not very confident in the result you'll get (there are brightness targets that probably can't be reached). It's better to have a backup of the settings to reload in case unexpected things happen.
Good point. Saving the original settings is a must! Preferably to a floppy disk :p


What are the highest brightness targets during WPB?
 
I remember 5+ years ago I tried an LCD that ran at 60hz, and I could see the lag between moving the mouse and the crosshairs moving on the monitor. Switched back to the FW900, and lag was gone.

Would you guys say this was due to the LCD's input lag or its 60hz or both?
What would I have to look for in a new LCD that would not have this lag other than the obvious of getting a higher refresh rate?

This is important to me because I still play Unreal Tournament 2004 with a few old buddies, and input lag on a hitscan game is just a flat plain no no.
 
I remember 5+ years ago I tried an LCD that ran at 60hz, and I could see the lag between moving the mouse and the crosshairs moving on the monitor. Switched back to the FW900, and lag was gone.

Would you guys say this was due to the LCD's input lag or its 60hz or both?
What would I have to look for in a new LCD that would not have this lag other than the obvious of getting a higher refresh rate?
Higher refresh rate monitors always help reduce input lag.
Some monitors can have a lot of input lag.
A 60Hz LCD without input lag should feel pretty much the same as a 60Hz CRT... you do not, however, use the FW900 at 60Hz, do you?

It is strange you haven't used a 60Hz LCD in such a long time BTW.
They are everywhere, everyone has them.
 
If you could ‘see’ the lag then that was probably due to input lag from the LCD especially if it was not a dedicated TN gaming monitor. Display lag is the sum total of the processing delay added by the panel electronics and, with LCDs, the switching time required of the liquid crystals to realign themselves. This doesn’t take into account the inherent “lag” of having to draw a frame at a particular rate instead of having it instantaneously appear, which most displays operate under.

Most gaming monitors in the past few years will not have any appreciable amount of lag, but be aware that without a blur-reducing strobing mode and high fps, they will still look like a mess when you are looking around in-game compared to your old FW900 at even half the refresh rate. This is due to LCDs needing to ‘hold’ the image after scanning, which your brain perceives as a smearing effect owing to the way light is accumulated by your eyes.
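To put a rough number on that hold-type smear (the usual rule of thumb, not a spec of any particular monitor): the blur you perceive while tracking a moving object is roughly the panning speed multiplied by how long each frame stays lit.

```python
# Rough sample-and-hold blur estimate: blur ≈ eye-tracking speed × persistence.
# Illustrative numbers only.
def blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    return speed_px_per_s * persistence_ms / 1000.0

speed = 960  # px/s, a moderate pan that you track with your eyes

print(blur_px(speed, 16.7))  # ~16 px  - 60Hz LCD, full persistence
print(blur_px(speed, 8.3))   # ~8 px   - 120Hz LCD, still sample-and-hold
print(blur_px(speed, 1.5))   # ~1.4 px - strobed LCD or CRT-like short flash
```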
 
Higher refresh rate monitors always help reduce input lag.
Some monitors can have a lot of input lag.
A 60Hz LCD without input lag should feel pretty much the same as a 60Hz CRT... you do not, however, use the FW900 at 60Hz, do you?

It is strange you haven't used a 60Hz LCD in such a long time BTW.
They are everywhere, everyone has them.
I do have 60Hz LCDs for my everyday PCs, but there is no lag in Windows. When I did the LCD test on my gaming rig, I did not notice any lag in Windows, either.
I have my FW900 at 2304x1440 @ 80Hz.
 
Hi Everyone. Well, I guess OLED with 120 Hz interface didn't happen this year. Maybe next year I'll finally build that 4k computer...can't bring myself to do it now knowing 120 Hz is around the corner and will at least partially mitigate motion issues...

In the meantime, though I prefer dual LCDs for work stuff, e.g., MS Office and coding, for the screen real estate and in a brightly lit office, I'm still very happy with an FW900 for my home enthusiast machine.

As others have stated, you have to choose which LCD panel to get for the best contrast or color or motion, etc., and they won't be the same panel. Whereas CRT, while it may not be the optimal solution for most tasks these days, does everything at least well. And there's just no getting away from the raw image quality of the device. Especially its black levels and dynamic range (15K to 1 by one estimate). Moot, if you can't control ambient lighting, but at home I can...

And even with hundreds of zones, a CRT, though I realize it's very much a best case scenario due to internal reflections, has up to over 1.6 million zones with the resolution I use. Really, who would want to replace an emissive technology with a backlit one if they had a choice?

Truth is though, I spent plenty of money on other displays. I wish I hadn't.

BTW: The above mentioned resolution, 1600 by 1024 at 100 Hz. This is the sweet spot to my eyes on the FW900, sharp, fast, and correct aspect ratio for the display's visible area. It's a lower resolution, but still looks really good to me. I think CRT is more forgiving in this fashion. (The listed 1920 by 1200 also looks good to me, but presumably not as sharp as it is at the physical limit of the device's resolution in terms of phosphors with no margin reserved for errors.) I guess even at 1024p, higher refresh would introduce some blur? In any case, I have no need for higher.

Finally, one word of caution, if you get a fresher tube, do not remove the conductive tape. The young ones are more sparky and I damaged the film in the lower right corner. Normally, it's not that noticeable at least. (May go the polarizer route at some point...)
 
If you could ‘see’ the lag then that was probably due to input lag from the LCD especially if it was not a dedicated TN gaming monitor. Display lag is the sum total of the processing delay added by the panel electronics and, with LCDs, the switching time required of the liquid crystals to realign themselves. This doesn’t take into account the inherent “lag” of having to draw a frame at a particular rate instead of having it instantaneously appear, which most displays operate under.

Most gaming monitors in the past few years will not have any appreciable amount of lag, but be aware that without a blur-reducing strobing mode and high fps, they will still look like a mess when you are looking around in-game compared to your old FW900 at even half the refresh rate. This is due to LCDs needing to ‘hold’ the image after scanning, which your brain perceives as a smearing effect owing to the way light is accumulated by your eyes.
Is this the anti-blur some LCDs are advertising?
 
Truth is though, I spent plenty of money on other displays. I wish I hadn't. Refusing to believe that technology, on some key fundamentals, doesn't always advance cost me plenty...
Yep, at the end of the day a high-end CRT is still not replaceable for what it does. You can achieve something similar with a rotation of maybe two or three LCDs of different types and sizes, but come on.
Is this the anti-blur some LCDs are advertising?
Should be. It does add a little lag, though, but not a significant amount. Check http://www.tftcentral.co.uk/ for more info if you're going to dive into the pool of gaming LCDs.
 
As others have stated, you have to choose which LCD panel to get for the best contrast or color or motion, etc., and they won't be the same panel. Whereas CRT, while it may not be the optimal solution for most tasks these days, does everything at least well. And there's just no getting away from the raw image quality of the device. Especially its black levels and dynamic range (15K to 1 by one estimate). Moot, if you can't control ambient lighting, but at home I can...
The 15K:1 estimate is white screen vs black screen and is thus pretty meaningless, because as you add more (especially bright) objects on screen, the black level degrades, especially around those bright objects. When watching movies with black bars it is easily visible: the black bars are not even black when displaying bright scenes, definitely a worse black level than any LCD.

The FW900 does games well. Movies/videos also... and that is it. Using it for desktop work is certainly possible, but why would you even do that to your eyes? Having CRT + LCD is the best combo, and I jumped to LCD for anything but multimedia and games as soon as I could afford it, despite those old 17" LCDs having really terrible image quality.

And modern LCDs, while not perfect in some areas, can be classified as 'doing everything well'. You can even use a gaming IPS for some serious image editing tasks, and those displays are much more adequate for that than any CRT.

And even with hundreds of zones, a CRT, though I realize it's very much a best case scenario due to internal reflections, has up to over 1.6 million zones with the resolution I use. Really, who would want to replace an emissive technology with a backlit one if they had a choice?
I would exchange my FW900 for an Acer Predator X27 or ASUS ROG Swift PG27UQ in a heartbeat.
Sure, these monitors will have a 'blooming' effect, but so does any CRT with its internal glass reflections and flaring.

And FALD displays also boast some ridiculous white:black contrast ratio numbers. With enough zones the ANSI contrast ratio should be pretty good as well, as the blocks used for it are pretty large.
Where a CRT will outperform these displays is in images of e.g. stars.

BTW: The above mentioned resolution, 1600 by 1024 at 100 Hz.
It was 1680x1050
1600x1000 is also a good 16:10 (obviously :p) resolution for the FW900.
But 1920x1200@96Hz is probably the best to use, imho.
 
The problem with IPS LCDs is still the low motion resolution of sample-and-hold coupled with the slow response time of the liquid crystals. Even at 4K 144Hz (if you can push that), it's still going to be a less detailed and lifelike image once it gets into motion, since I assume strobing is out of the question in conjunction with HDR. In that case, a better tradeoff would be for size at 50"+, which would improve the overall viewing experience to a greater extent than the diminishing returns of maximizing pixel density.
 
The 15K:1 estimate is white screen vs black screen and is thus pretty meaningless, because as you add more (especially bright) objects on screen, the black level degrades, especially around those bright objects. When watching movies with black bars it is easily visible: the black bars are not even black when displaying bright scenes, definitely a worse black level than any LCD.

The FW900 does games well. Movies/videos also... and that is it. Using it for desktop work is certainly possible, but why would you even do that to your eyes? Having CRT + LCD is the best combo, and I jumped to LCD for anything but multimedia and games as soon as I could afford it, despite those old 17" LCDs having really terrible image quality.

And modern LCDs, while not perfect in some areas, can be classified as 'doing everything well'. You can even use a gaming IPS for some serious image editing tasks, and those displays are much more adequate for that than any CRT.


I would exchange my FW900 for an Acer Predator X27 or ASUS ROG Swift PG27UQ in a heartbeat.
Sure, these monitors will have a 'blooming' effect, but so does any CRT with its internal glass reflections and flaring.

And FALD displays also boast some ridiculous white:black contrast ratio numbers. With enough zones the ANSI contrast ratio should be pretty good as well, as the blocks used for it are pretty large.
Where a CRT will outperform these displays is in images of e.g. stars.


It was 1680x1050
1600x1000 is also a good 16:10 (obviously :p) resolution for the FW900.
But 1920x1200@96Hz is probably the best to use, imho.


The FW900 is NOT a 16:10 display. That was marketing nonsense. Look at the actual listed viewable dimensions in the manual. 1600 by 1024 is a correct ratio for this display. There is no reason to settle for an incorrect aspect ratio on a CRT.

And dynamic range is an absolutely critical measurement. It lets you go from a dark scene to a light scene and vice versa with something closer to the intended impact on a CRT, whereas with an LCD it will be flat in comparison. ANSI contrast is better on an LCD as you mention, due to internal CRT reflections, but dark scenes can still be ruined by the backlight shining through on an LCD. And I don't like that. At all.

I only occasionally mix CRT and LCD. The shallow image quality of an LCD is too jarring. I prefer only all LCD or all CRT and at home it's CRT. YMMV of course...

EDIT: As for Acer's and ASUS's upcoming FALD displays, I will consider them for my 4K computer to be. I was quite excited about FALD back in the day and had one of the original Samsung 40 inch versions as my computer monitor for a bit. However, someone had dropped it before I got it and I ended up returning it. Sometimes, I wish I had exchanged it instead. Maybe it was the one that got away. However, back then, I thought in any case they would only become smaller and better anyway. Instead they got bigger and/or edge lit. :(

More recently, we've been getting more proper FALD displays, but probably against OLED they are too little, too late. If only they had been more little earlier...
 
The FW900 is NOT a 16:10 display. That was marketing nonsense. Look at the actual listed viewable dimensions in the manual. 1600 by 1024 is a correct ratio for this display. There is no reason to settle for an incorrect aspect ratio on a CRT.
Makes sense

And dynamic range is an absolutely critical measurement. It lets you go from a dark scene to a light scene and vice versa with something closer to the intended impact on a CRT, whereas with an LCD it will be flat in comparison. ANSI contrast is better on an LCD as you mention, due to internal CRT reflections, but dark scenes can still be ruined by the backlight shining through on an LCD. And I don't like that. At all.
I only occasionally mix CRT and LCD. The shallow image quality of an LCD is too jarring. I prefer only all LCD or all CRT and at home it's CRT. YMMV of course...
On the desktop - browsing the web, editing text, filling Excel cells, even editing photos, etc. - a flat response makes perfect sense.
For all the text stuff, using a high resolution display like 4K makes even more sense because you get very sharp fonts like on phones. Far better on the eyes than using a CRT... to each their own.


EDIT: As for Acer's and ASUS's upcoming FALD displays, I will consider them for my 4K computer to be. I was quite excited about FALD back in the day and had one of the original Samsung 40 inch versions as my computer monitor for a bit. However, someone had dropped it before I got it and I ended up returning it. Sometimes, I wish I had exchanged it instead. Maybe it was the one that got away. However, back then, I thought in any case they would only become smaller and better anyway. Instead they got bigger and/or edge lit. :(
I will wait a little for a price drop first, because >$3000 for a 27" display is a little ridiculous :panda:

More recently, we've been getting more proper FALD displays, but probably against OLED they are too little, too late. If only they had been more little earlier...
OLED is inherently unsuitable for desktop display due to burn-in.
FALD display with enough zones should be a good compromise. Is 384 zones enough though... hopefully it is...
 
I feel like you guys are basically saying "Nothing compares to our FW900s........yet." :LOL:
If Dell made their 30" 4K OLED 120Hz, it would absolutely destroy the FW900, especially since it is already flickering at 120Hz.
 
If Dell made their 30" 4K OLED 120Hz, it would absolutely destroy the FW900, especially since it is already flickering at 120Hz.

OLEDs don't exactly flicker. Until they can produce more luminance, there is a limit to the strobe length of an OLED, so its motion resolution will be destroyed by a CRT.

If and when that happens, I think one could say that the FW900 has been dethroned (although even then, CRTs will have a couple advantages).
 
Ok I finally got my hands on the Sunix adapter. For the Delock one, they took almost 3 weeks to refund me after telling me my address was not valid. I'll order again next month, and provide a friend's address.

Anyway, I spent the past few hours testing the Sunix adapter on my GTX1080. So far this is what I can say.

Of course all the stock resolutions of the FW900 are working perfectly fine. I tried to push things further and I managed to get all the following resolutions to work fine.

- 2304x1440@80Hz (120.80kHz / 382.6944MHz)
- 2560x1600@72Hz (120.24kHz / 423.2448MHz)
- 3000x1920@61Hz (121.39kHz / 502.0691MHz)
- 3232x2020@58Hz (121.22kHz / 539.1866MHz)

This last one is touching the limits of horizontal scan and pixel clock of the monitor. I applied these resolutions using the Nvidia Control Panel, with CVT timings.
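For anyone who wants to sanity-check a custom mode against the tube's limits before trying it, the arithmetic is simple once you know the total (active + blanking) timings. A small sketch below; the 3168x1510 totals are simply what the 120.80 kHz / 382.6944 MHz figures above work out to for the first mode, and other modes need their own blanking figures:

```python
# Horizontal scan rate and pixel clock from total (active + blanking) timings:
#   h_freq = v_total * refresh,   pixel_clock = h_total * v_total * refresh
def mode_rates(h_total: int, v_total: int, refresh_hz: float):
    h_freq_khz = v_total * refresh_hz / 1000.0
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    return h_freq_khz, pixel_clock_mhz

# Totals implied by the 2304x1440@80Hz figures listed above
h_khz, pclk_mhz = mode_rates(3168, 1510, 80)
print(f"{h_khz:.2f} kHz, {pclk_mhz:.4f} MHz")  # 120.80 kHz, 382.6944 MHz

# The FW900 tops out around 121 kHz horizontal scan, which is why all the
# modes listed above sit just below that figure.
```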

I remember reading about someone who tried some odd resolutions which made the adapter crash, black screen. Well, I don't get a black screen, but the monitor is just not stable at all with certain resolutions. I don't remember exactly what these were. Not much of a problem though, at least not for me so far. But that's still something to investigate further.

Now I was hoping to try some interlaced resolutions, including 4K (3840x2400) interlaced. But I spent the last few hours trying everything I could (except reinstalling Windows 10 or trying Linux), I just can't apply one single interlaced resolution on this computer.
The Nvidia Control Panel just says my monitor won't support it, BS. With CRU, I was able to see the progressive resolutions I added when opening "List all modes" in the Windows Control Panel, but no interlaced resolutions ever showed up... Now CRU is not working anymore. It doesn't do a thing anymore. Any ideas?

I tried with the VGA cable, and the monitor EDID recognized, and with the 5 BNC cable, with a VGA adapter and no EDID, it's recognized as Synaptics VMM2300 DEMO, but the issue is the same.

I tried 3 different Nvidia drivers, the stock one I had on the computer 391.01, then I upgraded to 391.35, and then downgraded to 388.13. Nothing helped.

I should mention sometimes the adapter starts behaving weirdly. I don't know if that's due to a bad contact with the DisplayPort connector or the actual chip itself, but sometimes the image gets - how can I describe this - cut in 3 parts, and reassembled out of order. The right side of the screen is in the middle, etc... But everything is still perfectly sharp and stable. I have to move the cables a bit, and it gets back to normal.
 
Ok I finally got my hands on the Sunix adapter. For the Delock one, they took almost 3 weeks to refund me after telling me my address was not valid. I'll order again next month, and provide a friend's address.

Anyway, I spent the past few hours testing the Sunix adapter on my GTX1080. So far this is what I can say.

Of course all the stock resolutions of the FW900 are working perfectly fine. I tried to push things further and I managed to get all the following resolutions to work fine.

- 2304x1440@80Hz (120.80kHz / 382.6944MHz)
- 2560x1600@72Hz (120.24kHz / 423.2448MHz)
- 3000x1920@61Hz (121.39kHz / 502.0691MHz)
- 3232x2020@58Hz (121.22kHz / 539.1866MHz)

This last one is touching the limits of horizontal scan and pixel clock of the monitor. I applied these resolutions using the Nvidia Control Panel, with CVT timings.

I remember reading about someone who tried some odd resolutions which made the adapter crash, black screen. Well, I don't get a black screen, but the monitor is just not stable at all with certain resolutions. I don't remember exactly what these were. Not much of a problem though, at least not for me so far. But that's still something to investigate further.

Now I was hoping to try some interlaced resolutions, including 4K (3840x2400) interlaced. But I spent the last few hours trying everything I could (except reinstalling Windows 10 or trying Linux), I just can't apply one single interlaced resolution on this computer.
The Nvidia Control Panel just says my monitor won't support it, BS. With CRU, I was able to see the progressive resolutions I added when opening "List all modes" in the Windows Control Panel, but no interlaced resolutions ever showed up... Now CRU is not working anymore. It doesn't do a thing anymore. Any ideas?

I tried with the VGA cable, and the monitor EDID recognized, and with the 5 BNC cable, with a VGA adapter and no EDID, it's recognized as Synaptics VMM2300 DEMO, but the issue is the same.

I tried 3 different Nvidia drivers, the stock one I had on the computer 391.01, then I upgraded to 391.35, and then downgraded to 388.13. Nothing helped.

I should mention sometimes the adapter starts behaving weirdly. I don't know if that's due to a bad contact with the DisplayPort connector or the actual chip itself, but sometimes the image gets - how can I describe this - cut in 3 parts, and reassembled out of order. The right side of the screen is in the middle, etc... But everything is still perfectly sharp and stable. I have to move the cables a bit, and it gets back to normal.

Thanks for the detailed report! On the AMD side, we (Enhanced Interrogator and I) can get interlaced resolutions, but it's limited in Windows. In Linux, you can go crazy.
 
Well then... I guess I'll try with a Live USB of Ubuntu.

But it's sad that I can't just set the interlaced resolution I want on the OS I use daily. It's kinda weird though: the Nvidia Control Panel has the option, no idea why it won't work.
 
Well then... I guess I'll try with a Live USB of Ubuntu.

But it's sad that I can't just set the interlaced resolution I want on the OS I use daily. It's kinda weird though: the Nvidia Control Panel has the option, no idea why it won't work.

I don't know how Nvidia's user interface works, but I know on CRU it's easy to accidentally double the vertical resolution if you enter the number wrong. With interlaced checked, you have to make sure to enter the half resolution in the vertical column, and it will show the full resolution total on the right. So for example, you will have 540 lines entered, and it will give a 1080 line total on the right.

Maybe Linux wants to protect your eyes? :ROFLMAO:

I've been playing Battlefield 1 at an interlaced resolution for a year and a half and have a 71% win percentage.

Besides, to watch movies in 4K on an FW900 you need interlaced modes - 96Hz to get perfect cadence for 24fps movies.
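A rough sketch of the arithmetic behind that (the blanking figure is a ballpark guess, not a tested modeline): 96Hz is 24fps x 4, so every film frame lasts a whole number of refreshes, and interlacing is what makes 2400 lines fit under the tube's roughly 121 kHz scan limit, since each field only draws half of them.

```python
# Why 4K (3840x2400) on the FW900 has to be interlaced - rough line-rate check.
# The blanking figure is a ballpark guess, not a tested modeline.
refresh = 96          # Hz, 24 fps x 4 for even film cadence
active_lines = 2400
blank_lines = 70      # rough guess at vertical blanking

progressive_khz = (active_lines + blank_lines) * refresh / 1000
interlaced_khz = (active_lines + blank_lines) / 2 * refresh / 1000

print(progressive_khz)  # ~237 kHz - far beyond what the tube can scan
print(interlaced_khz)   # ~119 kHz - each field is half the lines, so it fits
```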
 