24" Widescreen CRT (FW900) from eBay arrived, comments.

Sometimes, and I don't know why, Windows detects my FW900 as a generic non-PnP monitor. I restart Windows and it detects it as GDM-FW900 again, but sometimes even after restarting it keeps detecting it as generic. When that happened the first time, I lost all the resolutions in CRU which were set up for the FW900.

A workaround for this was to export all the GDM-FW900 custom resolutions from CRU to a file using the "export" button. When Windows detected the monitor as generic, I imported the file. Now it doesn't matter whether Windows detects the monitor as FW900 or generic, since I just use the same custom resolutions created for the FW900 for the "generic" monitor. Problem solved.

Very strange with the native DAC. Do you have another monitor connected to the graphics card?
I had the same problem in the past, but only with a secondary Samsung monitor connected through an adapter, never with the FW900; I thought it was an AMD driver bug.
Initially I used the same solution as you with CRU; the copy-paste button was very useful too.
Then I realized that Windows sometimes created new Non-PnP Monitor registry keys, so I had to redo the same thing in CRU for each new key.
I resolved it by exporting the EDID to an INF file and installing the Non-PnP Monitor with that.
Every time Windows detects the monitor as a new Non-PnP device, that INF is used on the fly.
I stopped using this method with the Radeon Crimson driver, because unbelievably stupid things happen with that configuration.
 
I've had the FW900 for about two and a half years, and for about a year with a GTX 980 Ti connected via a DVI-I to VGA adapter, with no other monitor connected. This issue started about six months after acquiring the 980 Ti (tested with another VGA cable and DVI to VGA adapter, but the issue persisted). I think it may not be at the OS/driver level in my case, because when the computer boots and doesn't detect the monitor as an FW900 but as generic instead, the BIOS boot logo looks stretched and the monitor OSD reports horizontal/vertical frequencies other than the usual 2304x1440 @ 80 Hz it shows during boot when the monitor is correctly detected. The interesting thing is that about a month ago I opened the monitor, cleaned the dust off the internal boards and tube, and reseated the ribbon cables as routine maintenance, and coincidentally the issue disappeared for about a month. Now it sometimes happens, but rarely, much less frequently than before I opened and cleaned the monitor.
Fortunately, with the imported CRU file trick, the issue doesn't bother me at all, since Windows will use the same resolutions for the FW900 and the "virtual generic monitor" without me needing to do anything.
 
In the pic you have posted, look at the horizontal frequency. See that 121.73 kHz?

I think I've learned from your posts. Thank you Derupter, I was so surprised to get 2560x1600 at 73 Hz with a 121.76 kHz horizontal frequency.

If it works, I guess it won't kill the FW900 in the long term.

[attached screenshot: W0KkGyU.png]
 
Wow!! That is very interesting, thanks for sharing, boski. It seems the FW900 may have even further undiscovered potential when using DACs with higher pixel clocks. The maximum I have been able to achieve at 2560x1600 on my FW900 is 68 Hz, via the GTX 980 Ti's internal 400 MHz DAC.
If I try 73 Hz it displays a very unusable, flickery plain white screen, because of the 400 MHz limit I bet.

It would be interesting to know how many Hz you can achieve at 1920x1200 with that DAC.
 
It would be interesting to know how many Hz you can achieve at 1920x1200 with that DAC.
At 1920x1200 you are horizontal-frequency limited, so a higher pixel clock doesn't make any difference.
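A quick back-of-the-envelope sketch of why that is (the ~1245-line vertical total below is an assumed, typical GTF-style figure for a 1200-line mode, not something measured in this thread):

```python
# Max refresh at a given resolution is set by the horizontal scan limit:
# vertical refresh = horizontal scan rate / total lines per frame.
# Assumption: ~1245 total lines for 1920x1200 (1200 active + GTF-ish blanking).

H_SCAN_LIMIT_KHZ = 121.7  # approximate FW900 horizontal limit

def max_refresh_hz(v_total_lines, h_limit_khz=H_SCAN_LIMIT_KHZ):
    return h_limit_khz * 1000.0 / v_total_lines

print(round(max_refresh_hz(1245), 1))  # ~97.8 Hz, i.e. ~96-97 Hz in practice
```

So at 1920x1200 the scan rate runs out around 96-97 Hz long before a 400 MHz DAC does, which is why a faster pixel clock doesn't buy you anything at that resolution.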
 
Hmmm, that seems right. When I try to set anything beyond 1920x1200 @ 96, I get an "out of scan range" message.
 
A handy, simple calculator: http://myhometheater.homestead.com/bandwidthcalculator.html

There are 3 basic limits:

- vertical refresh rate (usually max 160 Hz)
- horizontal scanning frequency (usually around 121 kHz)
- DAC pixel clock (usually around 400 MHz from your GPU, or more if you bypass it with adapters such as the one from Sunix, which has a higher limit, around 500 or 550 MHz I think)
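To put rough numbers on those three limits, here's a minimal sketch of a mode checker; the blanking fractions are assumptions (real GTF/CVT blanking varies with the mode), so treat the outputs as estimates, not exact timings:

```python
# Check a candidate video mode against the three CRT limits listed above.
# h_blank/v_blank are assumed blanking fractions, not real GTF/CVT timings.

def check_mode(h_active, v_active, refresh_hz,
               h_blank=0.25, v_blank=0.04,
               v_max=160, h_max_khz=121.7, clock_max_mhz=400):
    h_total = h_active * (1 + h_blank)          # pixels per scanline, incl. blanking
    v_total = v_active * (1 + v_blank)          # lines per frame, incl. blanking
    h_freq_khz = v_total * refresh_hz / 1000    # horizontal scanning frequency
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    return {
        "h_freq_khz": round(h_freq_khz, 1),
        "pixel_clock_mhz": round(pixel_clock_mhz, 1),
        "ok": (refresh_hz <= v_max and h_freq_khz <= h_max_khz
               and pixel_clock_mhz <= clock_max_mhz),
    }

# Example from the thread: 2560x1600 @ 73 Hz lands right at the horizontal
# limit; with real GTF blanking (a bit larger than assumed here) the pixel
# clock ends up slightly above 400 MHz, hence the need for an external DAC.
print(check_mode(2560, 1600, 73))
```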

There is no "perfect" all-around setting; everyone has different preferences and different content to display. Personally I switch settings for every game depending on what I like most. The user experience is far from the usual do-it-all attitude of 60 Hz fixed-pixel 1080p monitors :) There is a lot of tinkering involved with CRTs, so if you feel like you are doing something very experimental, rest easy, because it's probably a totally normal procedure for more seasoned users.
 
I am in a pickle myself, though.

I could swear the Sunix adapter worked fine for weeks over BNC cables, but now it doesn't. The monitor is now detected as VMM2300 DEMO and I can't go over 1280x1024 or use CRU.

I think it could be because BNC cannot supply EDID (there is one pin missing in the D-SUB end of the BNC cable)? But that would contradict the fact that I was able to run the display normally over BNC for weeks. Unless I am losing my mind, of course.

My question now would be: is there a way to supply/force the EDID manually via software/Windows? I have Win 8.1.

If I connect a regular D-SUB cable (not BNC), it works just fine. But I want to use BNC because I have another computer connected and want to switch between them using the INPUT slider on the monitor (I'd been doing this for weeks until recently).
 
I think it could be because BNC cannot supply EDID

This is exactly why. On both AMD and Nvidia, custom resolutions are limited by the max pixel clock the driver reads from the display ID, even when you change it in CRU.

There are a few things you can do. One is to overwrite the EDID on the monitor itself. I haven't tried this yet; I know PowerStrip can do it, but that's not really supported anymore, and there is a Linux program as well.

Another thing you could do is make a little adapter that runs two wires to the ID pins of another device with an EDID, while it passes RGBHV through to your BNC cables. So you could run the wires to a spoof plug or something like a Dr. HDMI.

10 or 20 pages back you can see where I used an Extron RGB interface to supply the EDID from a spoof adapter while I ran the actual RGBHV on another output.
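For anyone curious what max pixel clock the driver is actually reading, here's a small sketch that pulls the field out of a raw EDID dump (e.g. a .bin exported from CRU; the file name below is hypothetical). The offsets follow the EDID 1.3 Display Range Limits descriptor, whose byte 9 holds the max pixel clock in units of 10 MHz:

```python
# Read the max-pixel-clock field from a 128-byte base EDID block.
# The four 18-byte descriptors start at offsets 54, 72, 90 and 108;
# a Display Range Limits descriptor starts 00 00 00 FD, and its byte 9
# is the max pixel clock divided by 10 MHz.

def max_pixel_clock_mhz(edid: bytes):
    for off in (54, 72, 90, 108):
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return d[9] * 10
    return None  # no range-limits descriptor present

# usage (hypothetical file name):
# with open("fw900_edid.bin", "rb") as f:
#     print(max_pixel_clock_mhz(f.read()))
```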

There is no "perfect" all-around setting; everyone has different preferences and different content to display. Personally I switch settings for every game depending on what I like most. The user experience is far from the usual do-it-all attitude of 60 Hz fixed-pixel 1080p monitors :) There is a lot of tinkering involved with CRTs, so if you feel like you are doing something very experimental, rest easy, because it's probably a totally normal procedure for more seasoned users.

Yeah, this is one of the really cool things about our monitors. I'm currently playing Battlefield V at 2400x1800 interlaced at 90 Hz, with the frame rate capped to 45 fps. I'm doing that because my 4-core i5 isn't really keeping up with my GPU, so I capped the frame rate and increased the load on my GPU. And 90 ÷ 45 = 2, a perfectly paced two complete scans for every frame, at native resolution. And with low input lag.

That said, it's much nicer to actually run at higher, progressive rates, so a new CPU and motherboard are on the way.
 
Yeah, that's what I thought, but how come I was able to make it work before... I must have been mistaken somehow. I will try to get a new cable first and see if BNC will work on the other machine at least. That one runs WinXP, so I could theoretically install the official FW900 INF files from the Sony website (this didn't work on Win 8.1 with the Sunix).

I recommend you install RTSS 7.2.0.

It allows a new type of sync which they call "scanline sync". Vsync produces visually smooth frames but introduces input lag. If you cap your framerate to the monitor refresh rate, you get smooth frames too, but usually one (or sometimes more) very visible tear line. Scanline sync is basically capping the framerate AND doing some timing magic to offset this tear line to the top/bottom or, if you are lucky, completely out of view. The result is visually like vsync (very smooth!) and the added input lag is almost nonexistent. It's really amazing, you guys will love it!! First google some instructions on how to use it though; there is an offset variable that you will need to adjust by trial and error. Make sure to enable the FCAT indicator with 4 bars in the options in order to see/understand it better. Your FPS also cannot dip below your refresh rate, otherwise it's not pretty. RTSS is also great because it can show you the frametime chart (frametime history overlay); it's a must-have tool imo and works on WinXP too.

I also suspect the Sunix adapter messes with the frames somehow. I was playing some older games on WinXP for a few weeks, everything buttery smooth, but now when I come back to the 1080 Ti plugged into the Sunix I can feel something is not right. The best way to describe it is that when I game at 120 Hz the picture is normally very fluid, but with the Sunix it FEELS like 80 fps for some reason, like it simply discards some frames. There is no lag I think, the mouse feels okay, but the picture is not as smooth and persistent now. I don't want to say this for sure, as it could be something else in the system, but I plan to run some fair tests that could narrow it down.

On the topic of LUT calibration: I recently discovered that instead of using command-line ArgyllCMS, I can install a GUI over it called DisplayCAL, which makes it very easy to calibrate only the gamma curve, meaning you get minimal color shift from the original and bring some detail into the shadows. You need WHITE POINT, WHITE LEVEL and BLACK LEVEL set to "AS MEASURED"; otherwise it will try to do its own version of 9300K (for example), and to me that looks ugly and good only for watching carefully mastered movies, maybe. Gamma 2.4 will bring a (very) little detail into the shadows and maintain most of the pop in the picture, but gamma 2.2 is too washed out for gaming imo. By the way, older games such as the original Half-Life seem to be designed for a higher gamma (2.45?) anyway (they are very washed out by default), due to CRTs being everywhere back then.

But... I have actually learned to appreciate the default look of the WinDAS calibration. It's great for games. What you lose in shadow detail you gain in more saturated colors, which usually must be compressed by calibration. By default the picture looks very "inky", and I learned to like this look; it makes games look more premium and not washed out. Not to mention that if you load your own calibration, it discards the in-game calibration, which more often than not loses the artistic intent of the developer and usually makes the game look bland. So right now I force my calibration only for some special games where the black crush is unbearable, such as Doom 3.

Using DisplayCAL only for adjusting the gamma curve is very similar to the gamma adjustment in the graphics card settings, but much better. So I recommend doing several calibration runs at different gamma levels (from 2.4 down to 2.2, I would say) and then using a program such as Color Profile Keeper to switch between the profiles and force them into games where they otherwise do not stick (this part works really well; I have yet to find a game where it didn't). But again, I would do this only for movies or very dark games where the black crush impacts gameplay.

A word of warning, since this happened to me recently: when using DisplayCAL and NOT having your own calibration activated, it tends to use the graphics card's default profile and FORCE it into games for some reason. This overwrites the in-game profile, which is of course not desirable. Just something to keep in mind, otherwise you may end up pulling your hair out like me looking for what's wrong. Simply exit the program (via the system tray on the bottom right) and it's okay.
 
Not to mention that if you load your own calibration, it discards the in-game calibration, which more often than not loses the artistic intent of the developer and usually makes the game look bland. So right now I force my calibration only for some special games where the black crush is unbearable, such as Doom 3.
Are you saying that games load their own LUT? That doesn't make sense.
 
I am not sure if it is a LUT or a PROFILE or whatever (I mainly game, so this is a very complex topic for me), but I believe they do load something that gets discarded/overwritten if you load your own profile. It's like there is only one slot available and you need to pick either the in-game or your own settings. When your profile does not "stick", it means the game is trying really hard to force its own, hence why Color Profile Keeper is a good app. Someone else please back me up or dismiss this.
 
I recommend you install RTSS 7.2.0.

Yeah, I've been using this and it's awesome. Perfect frame pacing and no tearing or input lag. It seems you have to stay under 85-90% GPU usage, otherwise it will start tearing. So I have to run a slightly lower resolution than I normally would, but for fast-paced games the tradeoff is so worth it.
 
AFAIK drivers just reset the LUT after DirectX activates. Proper calibration should only bring you closer to the developers' vision.
 
Proper calibration should only bring you closer to the developers' vision.

Yes, that would make perfect sense, but I believe that is not what's happening. Developers probably use it as a post-production color adjustment. It's crazy, I know :)

One could do a quick test: load your default non-calibrated monitor profile ICM file into the portable Color Profile Keeper app to force it into a few games, and see if the output color is any different (the intensity will depend on the game, of course). There should not be any color change on the Windows desktop when you do that (unless you load a calibrated ICM, of course). The custom ICM will automatically unload after you exit the program, by the way; it takes about a second.

Hopefully someone more knowledgeable than me will chime in on this though.
 
I just ran UT99 side by side on two different computers, one with the Sunix adapter. I could not see the issue now. Earlier, I think it was rather a web browser in the background: when I switch resolutions too often, the browser starts to lag and it propagates into the whole system. But I noticed there is some judder/stutter when using the new s-sync in UT99. That could also have been what I saw earlier. It's readily apparent if you strafe for a while and keep looking at pronounced objects (lights) on a nearby wall (without moving the mouse), basically a moving panorama test.

Thanks for the tip about GPU usage, I will turn on its indicator and monitor it as well. So far I've just relied on "does not dip below refresh rate too often".

By the way, even if the frametime chart in RTSS flatlines perfectly, it does not mean there is no stutter/judder/whatever, so take that chart as only a basic indicator. It's funny how many SLI users think they "fixed" SLI microstutter by capping the framerate to the refresh rate, which then usually flatlines the chart. You need to use your eyes, basically, ideally on a moving panorama or something like that. The in-game menu in Mafia slides beautifully between objects on the table in the background, which I like to use as one of my eye tests. But that game is capped to 60 fps unless you install the widescreen fix, which I have yet to try (Win7+ only).
 
A higher refresh rate indeed helps with tearing, though it still tears slightly; the tears are smaller and in more places, so it's not that bad if the game is fast-paced and you need the extra input lag reduction.
 
What tearing? :confused:
With the new RTSS and scanline sync you can have your cake, eat it, and even do other things to it ;)

In the past I played Quake Live at exactly 125 Hz and somehow the tearing was always at the very top of the screen, and oh my god, was it a fast and fluid experience :cool:
RTSS adds some lag, but still much less than v-sync by a long shot. It is the preferred way to use a CRT or any fixed refresh rate display. Not so much for VRR, because G-Sync/FreeSync is still better.

Getting frame rates above the FW900's refresh rate (which is limited by its 121 kHz horizontal scan limit) at any resolution is a piece of cake for a modern GPU. Even a 980 Ti should be enough, and if not, converters like the Delock can easily do 1920x1200 @ 96 Hz, and faster GPUs can do 1200p at >96 fps.

Damn, I have my FW900 at home, and since this option appeared I am really hard-pressed to get it to where I live and use it for gaming. Fast-paced games are so much better on the FW900 than on the 60 Hz LCD I use now... :(
 
I am nearly 100% certain my flyback transformer is on its way out. Fuzzy screen on startup, followed by a "POP" sound which restores clarity. I am able to change out alternators in various cars, so I don't see this as being all that complex. I also understand how to ground out and discharge it when the time comes. However, I am having minor issues using the internet to find the model number of the part I am looking for. Has anyone done this repair already?
 
Question: playing Fortnite, I'm trying to get this thing to actually run smoothly with this game. I have the res at 1920x1200 @ 96 and can easily hold an fps above 96 frames, but in game everything stutters or jitters when I just pan left to right. I also have a 144 Hz panel and that looks great, but nothing like what I've experienced in the past on older games with this monitor. Also, the UFO motion tests look amazing on the FW900 vs the LCD panel. Any input at all?

I've tried vsync, fast vsync, you name it...
 
Question: playing Fortnite, I'm trying to get this thing to actually run smoothly with this game. I have the res at 1920x1200 @ 96 and can easily hold an fps above 96 frames, but in game everything stutters or jitters when I just pan left to right. I also have a 144 Hz panel and that looks great, but nothing like what I've experienced in the past on older games with this monitor. Also, the UFO motion tests look amazing on the FW900 vs the LCD panel. Any input at all?

I've tried vsync, fast vsync, you name it...

Have you tried "scanline sync" via the RivaTuner Statistics Server app? Do you have SLI/CF? Do you have an SSD for your system that is not nearly full? Do you have enough RAM? Are you positive some other app running in the background is not causing it? Did you close your web browser before playing? Did you disable the Steam in-game overlay?

Does it stutter in all hw/sw configurations you have tried, or is there a pattern?

Btw, some games are just shit in this regard and there is not much you can do. Even older ones, for example NFSHP2 (2002).
 
I am nearly 100% certain my flyback transformer is on its way out. Fuzzy screen on startup, followed by a "POP" sound which restores clarity. I am able to change out alternators in various cars, so I don't see this as being all that complex. I also understand how to ground out and discharge it when the time comes. However, I am having minor issues using the internet to find the model number of the part I am looking for. Has anyone done this repair already?
That flyback has been out of stock everywhere for a long time. There may still be some spare D boards for sale around (Unkle Vito, if he's still in the business?), but they're quite expensive.

IMO that kind of issue should be investigated more seriously before stating it's really coming from the flyback itself. The only thing that seems certain is that replacing the entire D board fixes the issue, but it might well be caused by some other inexpensive component in the vicinity of the flyback. For instance, there are some polyester film capacitors around it which can age in a very suspicious way. I've seen quite a bunch of them with some sort of oily yellow substance spreading inside, behind the transparent plastic layer, whereas normal ones are plain grey. Maybe polyester decomposing under the action of heat and electricity? Anyway, it's rather vicious, as I've checked some of these with a component tester and they read fine; if there are problems, they probably occur at higher voltages than a few volts, and they may be intermittent/heat-related as well.
 
Have you tried "scanline sync" via the RivaTuner Statistics Server app? Do you have SLI/CF? Do you have an SSD for your system that is not nearly full? Do you have enough RAM? Are you positive some other app running in the background is not causing it? Did you close your web browser before playing? Did you disable the Steam in-game overlay?

Does it stutter in all hw/sw configurations you have tried, or is there a pattern?

Btw, some games are just shit in this regard and there is not much you can do. Even older ones, for example NFSHP2 (2002).

I've tried scanline sync, vsync, etc. It's a single card, a GTX 980 or a 1080 Ti via a Sunix adapter. I do have an SSD, and I don't experience this issue with the 144 Hz panel. When I test, I use only a single monitor. The only pattern I can find is using the FW900.
 
Question: playing Fortnite, I'm trying to get this thing to actually run smoothly with this game. I have the res at 1920x1200 @ 96 and can easily hold an fps above 96 frames, but in game everything stutters or jitters when I just pan left to right. I also have a 144 Hz panel and that looks great, but nothing like what I've experienced in the past on older games with this monitor. Also, the UFO motion tests look amazing on the FW900 vs the LCD panel. Any input at all?

I've tried vsync, fast vsync, you name it...
Try 1920x1200@85 and 1680x1050@100 and see if there's a difference.

Is it the same with vsync off?
 
I've tried scanline sync, vsync, etc. It's a single card, a GTX 980 or a 1080 Ti via a Sunix adapter. I do have an SSD, and I don't experience this issue with the 144 Hz panel. When I test, I use only a single monitor. The only pattern I can find is using the FW900.

Bring up your monitor's OSD to see if you're actually running at the correct resolution/refresh rate. Last time I played Fortnite (a couple months ago) it had a problem where it would only run at my "native" resolution, AKA the first resolution listed when you open CRU, regardless of what resolution I actually switched to in the menu. This was a problem because in game it was just upsampling to my "native" resolution instead of actually switching to the specified resolution.

So my guess is that you have a refresh rate/frame rate mismatch going on.
 
I discovered the issue you are having early on, Enhanced, and it's irritating, but at least we know how to fix it. What I've just discovered is that after any changes made with CRU, a full system reboot is required, not just the restart program included with CRU. For some reason this disrupts any type of syncing capability. Once this was done, I was able to enable scanline sync and holy ****, G-Sync is no comparison. This would explain a lot, as I was always trying to tweak the settings. No tearing and just butter heaven, thanks everyone.

What resolution/refresh rates does everyone enjoy for gaming?
 
I discovered the issue you are having early on, Enhanced, and it's irritating, but at least we know how to fix it. What I've just discovered is that after any changes made with CRU, a full system reboot is required, not just the restart program included with CRU. For some reason this disrupts any type of syncing capability. Once this was done, I was able to enable scanline sync and holy ****, G-Sync is no comparison. This would explain a lot, as I was always trying to tweak the settings. No tearing and just butter heaven, thanks everyone.

What resolution/refresh rates does everyone enjoy for gaming?

I accidentally quoted the wrong post; I was talking about the stuttering issues you're having with Fortnite. I never had an issue as long as the game was actually running at the resolution it said it was.

But yeah, as far as other games go, I use every resolution under the sun.

Right now I'm playing Battlefield V. As of a couple weeks ago I still had the 4-thread i5 6600K; I was severely CPU-thread limited even with overclocking, so I would get drops to 45 fps. So what I did then was create a 2400x1800 @ 90 Hz interlaced mode (the highest my HD Fury Nano can go at 90 Hz interlaced), then used RTSS scanline sync x/2, which caps the frame rate at half the refresh rate. This gave me a perfectly paced 45 fps with very low input lag. I cranked up motion blur in-game to reduce the combing/ghosting you get from interlacing and half-vsync. I also had enough GPU headroom to run a bit of supersampling in-game, so it was closer to 1980i. It looked really good, very cinematic, but I generally like to play competitive FPS games at 60 fps or higher to get better motion clarity and recoil control.

So I bought an 8-core Ryzen 1700X on sale. Overclocked it to 3.8 GHz right away; now I can play with very few drops below 80 fps. So now I'm running 1792x1344 @ 80 Hz, no supersampling. This is actually the sweet spot resolution/refresh rate combo where my GPU and CPU are both running above 90% most of the time.
 
I discovered the issue you are having early on, Enhanced, and it's irritating, but at least we know how to fix it. What I've just discovered is that after any changes made with CRU, a full system reboot is required, not just the restart program included with CRU. For some reason this disrupts any type of syncing capability. Once this was done, I was able to enable scanline sync and holy ****, G-Sync is no comparison. This would explain a lot, as I was always trying to tweak the settings. No tearing and just butter heaven, thanks everyone.

What resolution/refresh rates does everyone enjoy for gaming?

Happy to hear you got it. I never had an issue with CRU that would need a PC restart. A game restart is a must, though, but that goes without saying.

I have yet to see a G-Sync monitor with my own eyes, but I would think it should be good for games like ArmA 2/3, which have very unstable FPS/frametimes and whose performance is often limited by the multiplayer server. I wasn't able to play that game smoothly on a CRT.

After a brief and unexciting BF3 reinstall episode, I am now playing BF Bad Company 2 multiplayer again at:

5200x2296 @ 100 Hz
(4x DSR, 0% smoothing; this res requires the Sunix adapter so it can go over a 400 MHz pixel clock)
I also turned off all eye candy such as anti-aliasing and HBAO, because it causes noticeable input lag (even on a 1080 Ti). The 12 Mpx is "good enough" for me not to notice the fuzzy edges and transparent textures too much when actually playing the game (otherwise I'm a big AA snob).

Regarding the "VMM2300 DEMO" issue with the Sunix adapter, I was getting it with a regular VGA cable as well, and this helped in that case:

1) Use your PC normally with the Sunix adapter and be stuck in the "VMM2300 DEMO" mode.

2) Disconnect all cables from the Sunix adapter and from your GPU and wait a few seconds. You will lose the display, of course.

3) Shut down the PC (probably by holding the power button).

4) Reconnect all cables to the Sunix.

5) Power on the PC.

Maybe this will help someone later.
 
I plug in the Sunix in this order:

1) VGA cable to Sunix
2) USB power to Sunix (this is when the Sunix powers on and copies the EDID from your monitor)
3) DisplayPort to Sunix

This is the way to get the EDID from your monitor properly passed through to your GPU.

If you want to get extra hacky, you could make a male-female coupler at the end of your VGA cable with the two EDID wires re-routed to some sort of EDID spoofer, in case your monitor's EDID is limiting the custom resolutions you can create (this is a Windows-with-DisplayPort problem).
 
The only thing that seems certain is that replacing the entire D board fixes the issue
I was not aware that they no longer make the FB transformers for this model. Are there no third-party replacements? Where would I look to find a new D board? I would hate to have this die on me, because I don't really want to use it for range practice.
 
I was not aware that they no longer make the FB transformers for this model. Are there no third-party replacements? Where would I look to find a new D board? I would hate to have this die on me, because I don't really want to use it for range practice.
FBTs are model-specific; they stopped making the one you're interested in when they stopped producing the FW900, more than 15 years ago.

Since it's model-specific, only the original part can fit, and trying to use another one, even from a close model, is a dangerous gamble IMO. There is little technical information about them and very high voltages at the output; there's no way to know what would happen, but it could be pretty bad.

BTW, I doubt many FBTs are still manufactured nowadays; it's a kind of part specific to CRTs, which are not used anymore either. There are maybe some low-end ones still used in basic analog oscilloscopes, and that's all.

About D boards, try contacting Unkle Vito ( https://hardforum.com/members/lagrunauer.149340/ ); he might still have some for sale. Someone sold one in this thread as well some time ago; I don't know if it's still available.
 
FBTs are model-specific; they stopped making the one you're interested in when they stopped producing the FW900, more than 15 years ago.
It breaks my heart to say it, but it died this afternoon. All I have on the front is the blinking orange light instead of green. It will not power on. I tried plugging and unplugging it, but it seems dead. I fiddled with it for half an hour, then moved it to storage. I'm currently using some crappy 144 Hz LCD. I did message Unkle Vito but have not gotten a response, and I don't consider that a good sign. I figure I am pretty much shit out of luck at this point. I don't even know how to get rid of something like this, but I may have to shoot it a lot and just sweep the pieces into a garbage can for the trash guy to remove.
 
Well, it depends on whether you have some decent knowledge of electronics and the will to try saving the old folk, or not. That blinking orange light probably means the safeties tripped and shut the monitor down, or one of the power lines failed. Running WinDAS and choosing the very first item in the procedure menu (Preparation for Alignment) may give a clue about where the problem is located, without even opening the monitor.
IMO it's too pessimistic to declare the flyback dead just because the monitor stops working, and give up. The culprit may be something as stupid as a dry solder joint that slowly wore out; one day all the bad solder is gone and there's no contact anymore. I've already spotted a couple of those in mine.

Anyway, if you do get rid of the screen, I'd suggest asking whether there are people in your area interested in salvaging it for parts. Even with an unknown problem, the boards carry some obsolete components that are pretty hard to obtain and may save another FW900. ;)

Uh, oh, and don't consider destroying the tube; you're likely to harm yourself. :p
 
Uh, oh, and don't consider destroying the tube; you're likely to harm yourself.
I would be 100+ yards away if I chose to destroy it. 5.56 & 30-06 would certainly help fit this thing into a standard trash can. However, I put it in dry storage in the attic for now. Just based on how it was acting, with the screen becoming very fuzzy followed by a loud pop, I would guess it is the FBT. My brother is an electronics engineer and works in the field. I don't know if I could goose him into assisting with a diagnosis, but I certainly could scare him out to the range for some TV blasting. I have not decided one way or another what to do. I have had this beast for 14 years? Perhaps a bit less. It never gave me trouble until the last year or so. I knew this was coming and have been unable to do anything about it. I will give it a few weeks to cool off before I take the shell off and poke around.
 
Someone from one of those CRT / old electronics YouTube channels might know about flyback substitution. Regular transformers can be substituted in other electronics. Also, a dead FW900 is still FW900 parts for other FW900 peeps.
 
I'm having a weird problem with my DP to VGA adapter. In my brother's computer the adapter is capable of doing 1600x1200 @ 85 Hz without any kind of problem; he's using an Nvidia GPU and all the custom resolutions are done via their own app, with GTF timings. In my computer, with an AMD GPU (RX 570), that same adapter won't pass ~175 MHz of pixel clock. I do all the adjustments with CRU, because AMD's integrated custom resolution tool doesn't allow me to input anything that surpasses that pixel clock. If I remember correctly, when exceeding the pixel clock, Windows should still allow me to select that resolution. But not now; it doesn't show anywhere.

The worst part is that yesterday it worked fine for a few hours, before I moved the GPU from one PCI-E slot to another. I've patched the AMD drivers (with no difference) and also formatted the whole system; still the same problem. Something really strange is that in a clean Windows installation, without an internet connection, the monitor shows, in the detailed resolutions section of CRU, a 1600x1200 @ 85 Hz with different timings and polarity.

Any ideas?
 
Blutrache, you might be running into the problem where the drivers detect a low max pixel clock from the physical EDID. With my Sunix adapter, as long as you plug your CRT into the Sunix before you connect the Sunix to the GPU, your monitor's EDID will be passed through. But I noticed that if you plug the Sunix into the GPU first, the GPU just reads the Sunix's own EDID, which has a super low max pixel clock.

So a couple of things might be happening for you. Either your adapter simply doesn't have EDID passthrough capability, or you're plugging it in before connecting your monitor... or something else entirely. We're in uncharted waters here.
 