24" Widescreen CRT (FW900) From Ebay arrived,Comments.

One question though: I guess 340MHz is the fastest chip they have? I'll just be really bummed if we never get a DAC on par with what's in our graphics cards. With ToastyX's Pixel Clock Patcher, you can actually go above 500MHz on analog-equipped video cards.
340MHz is enough even for the FW900 to hit resolutions beyond what its dot pitch can resolve, let alone other CRT monitors
building a faster DAC does not make any sense at all


if you want one though you can always build it yourself with an FPGA and a DVI/HDMI receiver IC
the DAC itself can be built using a resistor ladder, and then you are limited only by how fast the FPGA can switch its pins, which is crazy high for new devices
it would be possible (and very easy too!) to add gamma correction and gamut remapping to such a device
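to give a feel for it, here is a rough Python sketch of the per-pixel math such a device would run (the gamut matrix and gamma values are made-up placeholders, not real calibration data):

```python
import numpy as np

# Hypothetical per-pixel pipeline an FPGA-based DAC box could apply before the
# resistor-ladder output. Matrix and gamma values below are placeholders only.
GAMUT_MATRIX = np.array([
    [ 1.02, -0.01, -0.01],   # assumed 3x3 remap from source primaries to CRT primaries
    [-0.02,  1.03, -0.01],
    [ 0.00, -0.02,  1.02],
])

def correct_pixel(rgb8, gamma_in=2.2, gamma_out=2.4):
    """Gamma-decode, gamut-remap, then re-encode one 8-bit RGB pixel."""
    linear = (rgb8 / 255.0) ** gamma_in               # decode source gamma to linear light
    remapped = np.clip(GAMUT_MATRIX @ linear, 0.0, 1.0)  # 3x3 gamut remap in linear space
    return np.round((remapped ** (1.0 / gamma_out)) * 255).astype(np.uint8)

print(correct_pixel(np.array([200, 128, 64])))  # prints the remapped, re-encoded pixel
```

in hardware this would just be a couple of lookup tables and a 3x3 multiply per pixel, which is trivial for any modern FPGA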

see? no need to wait for someone else to make it for you
do not expect anyone to make it for you
DIY =)
 
340MHz is enough even for the FW900 to hit resolutions beyond what its dot pitch can resolve, let alone other CRT monitors
building a faster DAC does not make any sense at all

My GDM-F520 has better resolution than an FW900 due to a finer grille pitch (0.22mm vs a variable 0.23-0.27mm). I can clearly see more detail at 2048x1536 than at 1920x1440. Since I can do 2048x1536 at 85Hz on the Sony F520 and the image is crystal clear, then yeah, we need a faster-than-340MHz DAC. 400MHz should be the minimum, as it has been the standard on video cards since 2000.
 
almost all CRTs are horizontal scan rate limited
the F520 has 137kHz, which translates to your 2048x1536 @ 85Hz or e.g. 1600x1200 @ 107Hz
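you can sanity-check that yourself; quick sketch assuming roughly 5% of the scan lines go to vertical blanking (real GTF/CVT timings differ a little, so the numbers come out approximate):

```python
# Max vertical refresh a given horizontal scan rate allows, assuming ~5% of the
# scan lines go to vertical blanking (a rough assumption, not exact timings).
def max_refresh_hz(h_scan_khz, active_lines, vblank_overhead=0.05):
    total_lines = active_lines * (1 + vblank_overhead)
    return h_scan_khz * 1000 / total_lines

for width, height in [(2048, 1536), (1920, 1440), (1600, 1200)]:
    print(f"{width}x{height}: ~{max_refresh_hz(137, height):.0f} Hz max at 137 kHz")
```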

would you seriously rather play games at 85Hz than at 107Hz?
 
340MHz is enough even for the FW900 to hit resolutions beyond what its dot pitch can resolve, let alone other CRT monitors
building a faster DAC does not make any sense at all

Uh, what about higher refresh rates at these resolutions?

Besides, when you're playing games, they obviously look better at higher resolutions, even if you're beyond your dot pitch. It's like analog downsampling, not to mention these CRTs don't really have limited vertical resolution.

would you seriously rather play games at 85Hz than at 107Hz?

Well, that's a gross oversimplification. It depends on the game. Is the game really new? Then you may not be able to hit 100fps consistently, so 80 or 85Hz might be more within your GPU's capability. And you always want the refresh rate to match the frame rate, whether through vsync or a frame cap, for maximum smoothness.
 
would you seriously rather play games at 85Hz than at 107Hz?

One of the many reasons I use CRT and not LCD monitors is their awesome flexibility. I want to play games from 640x480 160Hz (think StarCraft 1) all the way to 2048x1536 using any refresh I please from 60Hz to 85Hz (remember, many games are capped at 60fps, think Doom 3). A 340MHz DAC cannot do 2048x1536 85Hz, thus I will not be using my monitors to their full capacity. We need at least a 400MHz RAMDAC so stop making excuses.
 
when you lower the resolution, framerate drastically increases because games are GPU, not CPU, limited

stuttering caused by the framerate not being perfectly synced with the screen refresh rate is reduced when the refresh rate is higher, because individual frames are displayed for less time <- and that is not even taking the increase in framerate into account
input lag always decreases when you increase refresh rate
the visibility of tearing with v-sync off decreases drastically at higher refresh rates and framerates, reducing the need for v-sync and thus making input lag even better

personally I do not see high resolutions as worth it, even for games where framerate is not an issue
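just to put rough numbers on it (plain arithmetic, nothing model-specific):

```python
# How long one refresh interval lasts at various rates - this bounds both the extra
# wait v-sync can add and how long a mistimed or torn frame stays on screen.
for hz in (60, 85, 107, 120, 160):
    print(f"{hz:>3} Hz -> one refresh every {1000 / hz:5.2f} ms")
```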
 
when you lower the resolution, framerate drastically increases because games are GPU, not CPU, limited

stuttering caused by the framerate not being perfectly synced with the screen refresh rate is reduced when the refresh rate is higher, because individual frames are displayed for less time <- and that is not even taking the increase in framerate into account
input lag always decreases when you increase refresh rate
the visibility of tearing with v-sync off decreases drastically at higher refresh rates and framerates, reducing the need for v-sync and thus making input lag even better

personally I do not see high resolutions as worth it, even for games where framerate is not an issue

60Hz vsynced looks smoother than 100Hz with a variable frame rate. So unless I'm playing a twitch shooter, I'm going for consistent frame pacing, which is one area where CRTs still shine.
 
Even if a given game's capped at 60 FPS, actually running that refresh rate on a CRT is just asking for flicker, eyestrain and headaches. 80-85 Hz is about the threshold where it stops being annoying for me.

It's actually because of that that I wish there were adapters that took 60 Hz signals (mainly from consoles, so 240p, 480p, 720p) and upscanned them to 120 Hz for comfortable CRT monitor viewing. The motion clarity benefits would be lost, but at least it wouldn't be a total flickerfest.
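Conceptually such an adapter is just a frame doubler. A toy sketch of the idea (real hardware would obviously do this with a frame buffer in an FPGA or scaler chip, not Python):

```python
# Toy model of a 60 Hz -> 120 Hz "upscan" adapter: buffer each incoming frame
# and scan it out twice, so the CRT refreshes at 120 Hz while content stays 60 fps.
def upscan_double(frames_60hz):
    for frame in frames_60hz:   # one frame arrives every 1/60 s
        yield frame             # scanned out during the first 1/120 s
        yield frame             # repeated for the second 1/120 s

print(list(upscan_double(["A", "B", "C"])))  # -> ['A', 'A', 'B', 'B', 'C', 'C']
```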

Also, I'm relieved to hear that there's a DisplayPort to VGA adapter worthy of the FW900 in production now. That might give me reason to get my ol' FW900 up and running again, even if it hasn't been a priority ever since I got my FG2421 (which cost me less than a replacement FW900 D board would).
 
Even if a given game's capped at 60 FPS, actually running that refresh rate on a CRT is just asking for flicker, eyestrain and headaches. 80-85 Hz is about the threshold where it stops being annoying for me.

It's actually because of that that I wish there were adapters that took 60 Hz signals (mainly from consoles, so 240p, 480p, 720p) and upscanned them to 120 Hz for comfortable CRT monitor viewing. The motion clarity benefits would be lost, but at least it wouldn't be a total flickerfest.

Also, I'm relieved to hear that there's a DisplayPort to VGA adapter worthy of the FW900 in production now. That might give me reason to get my ol' FW900 up and running again, even if it hasn't been a priority ever since I got my FG2421 (which cost me less than a replacement FW900 D board would).

Are you speaking of the adapter that's currently being discussed?
 
One of the many reasons I use CRT and not LCD monitors is their awesome flexibility. I want to play games from 640x480 160Hz (think StarCraft 1) all the way to 2048x1536 using any refresh I please from 60Hz to 85Hz (remember, many games are capped at 60fps, think Doom 3). A 340MHz DAC cannot do 2048x1536 85Hz, thus I will not be using my monitors to their full capacity. We need at least a 400MHz RAMDAC so stop making excuses.

Feel free to design one yourself then. If you cannot or will not, then don't come on here with these "stop making excuses" posts. I also own an F520, and while 2048x1536 looks a tiny bit better than 1920x1440 at 85Hz, it's nothing to lose sleep over. The fact that 1920x1440 at 85Hz is a solid option now means I'll probably be getting one, and then I can say goodbye to being stuck with the GTX 980 Ti as my GPU ceiling. I can get whatever the hell I want now - even AMD! :D

EDIT: XoR_ beat me to it.
 
maybe if all CRT users unite then we would be strong enough to force intergalactic government to make better converter?
 
maybe if all CRT users unite then we would be strong enough to force intergalactic government to make better converter?

Honestly, if any of us had electrical engineering experience, or knew someone who did, it couldn't be that hard - for an electrical engineer, of course. :D
 
Yeah, I mean I could devote my life to learning how to make a converter for the next couple of years, but that doesn't sound too appealing. I was just kind of hoping there was some hardware, somewhere, that could compete with the DACs we have in our 980 Tis and HD 7970s.
 
Honestly, I don't know why people are attacking others for expressing a perfectly rational desire. Nobody's complaining here and saying "WAAAH, I WANT 400 MHZ, 340 SUCKS".

A 400MHz RAMDAC should be the standard for DACs, as it has been with GPUs for a long, long time. A 340MHz DAC isn't sufficient to push CRTs to their limits, and even if it's not a huge deal not being able to squeeze that last bit of Hz or resolution out, it sure would be nice to.
 
Feel free to design one yourself then. If you cannot or will not, then don't come on here with these "stop making excuses" posts.

Ok. I agree.

I also own an F520, and while 2048x1536 looks a tiny bit better than 1920x1440 at 85Hz, it's nothing to lose sleep over. The fact that 1920x1440 at 85Hz is a solid option now means I'll probably be getting one, and then I can say goodbye to being stuck with the GTX 980 Ti as my GPU ceiling. I can get whatever the hell I want now - even AMD! :D

You're missing my point. Even if 1920x1440 85Hz requires just a 341.35MHz pixel clock, it doesn't mean it will look the same on that 340MHz RAMDAC as on your GTX 980 Ti, simply because it has already reached its limits. Let me give you some examples:
On my GTX 980, with either of the two GDM-F520s I have (they're calibrated to death, focus and convergence included), the output at 2048x1536 85Hz is so sharp that I use that resolution for EVERYTHING, TEXT included!
I once had the chance to use an old PC with an XGI Volari 8300 video card, which has a 420MHz RAMDAC. Compared to the output of my GTX 980 at the same resolution and refresh, the image was simply SHARPER on white text and you could spot more detail in pictures with small, complex patterns.
Likewise, the same monitors, used on the crappy VGA connector of one of my laptops, have a blurrier output EVEN at 1600x1200 85Hz compared to the GTX 980 at 2048x1536 85Hz!

Clearly, what we are seeing here are the limits of DAC quality and not grille pitch. These tests also concur with a Dell P1130, which has even better focus than my two F520s although it has a 0.24mm grille pitch and can only do 2048x1536 80Hz (another mode not supported by a 340MHz DAC - it requires 364MHz).
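For anyone wondering where those pixel clock figures come from, it's just total pixels per frame (active plus blanking) times the refresh rate. A rough sketch; the h/v totals below are assumed GTF-style ballpark timings for illustration, not the exact modelines behind the numbers above:

```python
# Pixel clock = total horizontal pixels x total vertical lines x refresh rate.
# The h_total/v_total values below (active + blanking) are assumed ballpark figures.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

modes = [
    ("1920x1440 @ 85Hz", 2576, 1500, 85),   # assumed totals including blanking
    ("2048x1536 @ 80Hz", 2784, 1600, 80),
    ("2048x1536 @ 85Hz", 2800, 1608, 85),
]
for name, h_total, v_total, hz in modes:
    print(f"{name}: ~{pixel_clock_mhz(h_total, v_total, hz):.0f} MHz")
```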
 
You're missing my point. Even if 1920x1440 85Hz requires just a 341.35MHz pixel clock, it doesn't mean it will look the same on that 340MHz RAMDAC as on your GTX 980 Ti, simply because it has already reached its limits. Let me give you some examples:
On my GTX 980, with either of the two GDM-F520s I have (they're calibrated to death, focus and convergence included), the output at 2048x1536 85Hz is so sharp that I use that resolution for EVERYTHING, TEXT included!
I once had the chance to use an old PC with an XGI Volari 8300 video card, which has a 420MHz RAMDAC. Compared to the output of my GTX 980 at the same resolution and refresh, the image was simply SHARPER on white text and you could spot more detail in pictures with small, complex patterns.
Likewise, the same monitors, used on the crappy VGA connector of one of my laptops, have a blurrier output EVEN at 1600x1200 85Hz compared to the GTX 980 at 2048x1536 85Hz!

Clearly, what we are seeing here are the limits of DAC quality and not grille pitch. These tests also concur with a Dell P1130, which has even better focus than my two F520s although it has a 0.24mm grille pitch and can only do 2048x1536 80Hz (another mode not supported by a 340MHz DAC - it requires 364MHz).
The sharpness increase probably has nothing to do with the RAMDAC frequency limit, but rather with the quality of the DAC and the way it is implemented on the board. Nvidia cards aren't famous for this.
 
Hi,

I just bought a 1080 Ti but it only has DVI-D. Could I get an HDMI to VGA cable or a DP to VGA cable and connect it to the FW900? Would I still be able to get a refresh rate of 85Hz on the FW900? Thanks.
 
The sharpness increase probably has nothing to do with the RAMDAC frequency limit, but rather with the quality of the DAC and the way it is implemented on the board. Nvidia cards aren't famous for this.

I would suspect this is the issue more than anything. Over at CurtPalme's forum, there are users who actually design video neckboard amplifiers for CRT projectors, and there's a thread somewhere about the quality of components and how they affect the signal. Simply put, the RAMDAC rating had nothing to do with the actual quality of the image. Devices that were said to have high bandwidth didn't resolve shit. It ended up being the QUALITY of the video chain that had the biggest effect on it.

Another analogy: take an engine in a car. Mate a 200 hp engine to a manual transmission in a rear-wheel-drive car. Take the same engine and drop it into a car with an automatic transmission and all-wheel drive. Which car has more wheel horsepower? In most cases, the first one, because less power is lost through a manual with rear-wheel drive (again - in most cases).

Same thing applies to the video chain. If you have a high bandwidth RAMDAC but pair it with shit components, then you'll have shitty output, regardless of how uber-awesome your RAMDAC is. Video DAC chips are but one component in the entire chain. The rest of the components matter too.
 
Ok. I agree.
You're missing my point. Even if 1920x1440 85Hz requires just a 341.35MHz pixel clock, it doesn't mean it will look the same on that 340MHz RAMDAC as on your GTX 980 Ti, simply because it has already reached its limits.

Looking at it from that viewpoint, I understand your concern. But as I mentioned in the post above, I don't think it's as simple as just beefing up the RAMDAC. If this adapter is well-designed, then your video output at 1920x1440 85Hz should still be sharp as a tack, even when taken to its digital limit. In fact, most analog devices bottom out before their digital counterparts do.
 
I just bought a 1080 Ti but it only has DVI-D. Could I get an HDMI to VGA cable or a DP to VGA cable and connect it to the FW900? Would I still be able to get a refresh rate of 85Hz on the FW900? Thanks.

The VCOM DisplayPort adapter or the HD Fury Nano GX HDMI adapter could do 1680x1050 @ 85Hz. Not sure about 1920x1200 @ 85Hz though. Thankfully Nvidia has DSR, so you'll still be able to push your 1080 Ti to the limit at 1680x1050.
 
The VCOM DisplayPort adapter or the HD Fury Nano GX HDMI adapter could do 1680x1050 @ 85Hz. Not sure about 1920x1200 @ 85Hz though. Thankfully Nvidia has DSR, so you'll still be able to push your 1080 Ti to the limit at 1680x1050.

Thanks for the info. I was able to get 1920x1080 @ 80Hz with DP to VGA. So I'm happy.
 
Hello fellow enthusiasts. I removed the anti-glare layer due to several scratches. The result was a sharper and brighter image on my FW900. The major downside is the reflection.

Here are some pictures: http://imgur.com/a/Sp9TJ . The only thing left now is to calibrate the monitor and darken the room.
 
The VCOM DisplayPort adapter or the HD Fury Nano GX HDMI adapter could do 1680x1050 @ 85Hz. Not sure about 1920x1200 @ 85Hz though. Thankfully Nvidia has DSR, so you'll still be able to push your 1080 Ti to the limit at 1680x1050.
I was actually using the Nano GX to get 240p to my 15kHz monitor from my Pi a while back, and it worked almost as well as the Extron RGB interface I have... the Nano is a really handy device for a variety of things so long as you don't need to go too far beyond what it can do. I originally got it to hook modern consoles up to my PC CRT when I gave my old TV to my parents (an old LG 32" 1080p 3D LCD) and was shopping for a new one.
 
I was actually using the Nano GX to get 240p to my 15kHz monitor from my Pi a while back, and it worked almost as well as the Extron RGB interface I have... the Nano is a really handy device for a variety of things so long as you don't need to go too far beyond what it can do. I originally got it to hook modern consoles up to my PC CRT when I gave my old TV to my parents (an old LG 32" 1080p 3D LCD) and was shopping for a new one.

Whoa, cool. Did you have to triple or quadruple your horizontal pixels to get a higher pixel clock? Or does the Nano GX not have a minimum? I never tested it that deeply.

Thanks for the info. I was able to get 1920x1080 @ 80Hz with DP to VGA. So I'm happy.

You know the FW900 is a 16:10 monitor, right? So 1920x1200 is a more appropriate resolution.
 
So I have a chance to get one of these, and I'm really only interested if it's in good to very good condition.

The story is the guy bought four of them used six years ago and they were all looking great when he got them; they were factory calibrated by the production company right before they were sold. He then used two for pro gaming events he helped run or something for a couple of years or so, then they were sitting around in storage for years. When he turned them all on recently and did comparisons, two were clearly looking better than the other two, which looked "washed out". He tried to adjust brightness, contrast and other OSM parameters for a little bit, but they just didn't look as good as the other two no matter the OSM settings. The good two are gone now and the washed-out ones are left.

So I'm wondering if you think that is something that a good auto-calibrate and WinDAS adjustment would fix right up, or are these tubes probably just permanently worse than actually good ones?

Also, he thinks the good ones were the ones he was using for gaming events.
Maybe those good two got tuned up better somewhere along the way during gaming events and the other two just need the same?
Maybe all four were never really quite equal when he got them; it was 6 years ago, hard to remember?
Maybe for whatever reason two of them aged worse in storage, like sun hitting their screens daily?
 
So I have a chance to get one of these, and I'm really only interested if it's in good to very good condition.

The story is the guy bought four of them used six years ago and they were all looking great when he got them; they were factory calibrated by the production company right before they were sold. He then used two for pro gaming events he helped run or something for a couple of years or so, then they were sitting around in storage for years. When he turned them all on recently and did comparisons, two were clearly looking better than the other two, which looked "washed out". He tried to adjust brightness, contrast and other OSM parameters for a little bit, but they just didn't look as good as the other two no matter the OSM settings. The good two are gone now and the washed-out ones are left.

So I'm wondering if you think that is something that a good auto-calibrate and WinDAS adjustment would fix right up, or are these tubes probably just permanently worse than actually good ones?

Also, he thinks the good ones were the ones he was using for gaming events.
Maybe those good two got tuned up better somewhere along the way during gaming events and the other two just need the same?
Maybe all four were never really quite equal when he got them; it was 6 years ago, hard to remember?
Maybe for whatever reason two of them aged worse in storage, like sun hitting their screens daily?
WinDAS WPB should take care of it. These things were built to last. Some of the more knowledgeable members can give you the numbers, but I think chances are good these are salvageable.
 
So I have a chance to get one of these, and I'm really only interested if it's in good to very good condition.

The story is the guy bought four of them used six years ago and they were all looking great when he got them; they were factory calibrated by the production company right before they were sold. He then used two for pro gaming events he helped run or something for a couple of years or so, then they were sitting around in storage for years. When he turned them all on recently and did comparisons, two were clearly looking better than the other two, which looked "washed out". He tried to adjust brightness, contrast and other OSM parameters for a little bit, but they just didn't look as good as the other two no matter the OSM settings. The good two are gone now and the washed-out ones are left.

So I'm wondering if you think that is something that a good auto-calibrate and WinDAS adjustment would fix right up, or are these tubes probably just permanently worse than actually good ones?

Also, he thinks the good ones were the ones he was using for gaming events.
Maybe those good two got tuned up better somewhere along the way during gaming events and the other two just need the same?
Maybe all four were never really quite equal when he got them; it was 6 years ago, hard to remember?
Maybe for whatever reason two of them aged worse in storage, like sun hitting their screens daily?
If by "washed out" you mean too bright/the blacks being gray, don't worry, it's a common issue, the G2 value going high.

Grab a colorimeter and a USB to TTL converter, follow the white point balance procedure with WinDAS, and it'll be fixed.
 
So after 5 years my unit is intermittently losing focus on warmup every few days; I'm pretty certain the FBT is on the way out. I am really having trouble finding one as it has been NLA for years. If anyone knows where to get one, let me know. Thanks.

Part Number: 1-453-348-11
Description: FBT Assembly (Nx-4504//j1d4)
 
The problem is knowing whether the issue really comes from a faulty FBT. It could well just be a cold solder joint (very likely for intermittent problems) or any peripheral failure that has nothing to do with the FBT itself.

Uncle Vito should still have FBTs for Sony monitors, but as far as I know he sells them with the entire D board, and the price may be quite high.

edit: BTW, when you say losing focus, do you mean a brief (half a second) focus loss with the display area decreasing and a spark gap activating, then everything coming back to normal immediately, or something else?
 
The problem is knowing whether the issue really comes from a faulty FBT. It could well just be a cold solder joint (very likely for intermittent problems) or any peripheral failure that has nothing to do with the FBT itself.

Uncle Vito should still have FBTs for Sony monitors, but as far as I know he sells them with the entire D board, and the price may be quite high.

edit: BTW, when you say losing focus, do you mean a brief (half a second) focus loss with the display area decreasing and a spark gap activating, then everything coming back to normal immediately, or something else?
On warmup it will occasionally make a faint degauss noise and lose focus until the monitor is turned off and back on. If it does not act up like this, there are no real issues and it performs amazingly.

Pics below are of it running 2560x1600 @ 76Hz through some nice BNC cables.
 

Attachments: 20170803_144139.jpg, 20170803_144209.jpg
When it turns on properly the next time, is the degauss noise more "normal"? And about the defocus, is it only vertical, horizontal, or both?
 
Yeah. Everything acts normally, but early in the warmup it makes a click like you would hear on an output resolution change, and things get blurry. It will always turn on normally; it just acts up within the first 20 minutes or so, if/when it does.
 
It really looks like a bad contact problem. According to the schematics, the degauss circuitry is very simple: it's a relay on the G board that closes to send AC voltage directly to the coils.

The first thing to do is to unplug/replug the power cord to the monitor and make sure it fits well. If there are still problems after that, my thought would be to check the solder joints of the components on the G board shown in the grey area of the FW900 service manual, next to the AC input. It may also simply be a problem with the relay itself; if it remains stuck in the closed position, that might explain the abnormal degauss sound and the blur problem, plus the click.
 
It really looks like a bad contact problem. According to the schematics, the degauss circuitry is very simple: it's a relay on the G board that closes to send AC voltage directly to the coils.

The first thing to do is to unplug/replug the power cord to the monitor and make sure it fits well. If there are still problems after that, my thought would be to check the solder joints of the components on the G board shown in the grey area of the FW900 service manual, next to the AC input. It may also simply be a problem with the relay itself; if it remains stuck in the closed position, that might explain the abnormal degauss sound and the blur problem, plus the click.
Looking at diagrams now. Honestly not that informed on this stuff.
 
For now we can only guess, but at least given what you describe I'm quite sure the flyback has nothing to do with the issue. ;)

There's also one easy thing you could try that doesn't involve any disassembly and may help determine whether the issue comes from a stuck degauss relay: next time the issue happens, don't turn the screen off/on; instead use the screen's menu -> option -> degauss -> on, and see what happens.
 
For now we can only guess, but at least given what you describe I'm quite sure the flyback has nothing to do with the issue. ;)

There's also one easy thing you could try that doesn't involve any disassembly and may help determine whether the issue comes from a stuck degauss relay: next time the issue happens, don't turn the screen off/on; instead use the screen's menu -> option -> degauss -> on, and see what happens.
Will do.
 
I had this problem when I left my monitor turned off for a week. Came back home and it had obvious symptoms of a faulty FBT. Then I decided to keep it on for as long as possible - it was on 24/7 for, I guess, a month? All problems have been gone since then. Now I can leave my monitor off for a week or two and it won't show a single sign of wrong behavior.
 