24" Widescreen CRT (FW900) From Ebay arrived,Comments.

I recently got my hands on a ViewSonic G90f, which has arguably the best specs of any CRT with an 18" viewable screen.
It even came with the original manuals, driver CDs, and whatnot. Folks, this monitor is in scary good condition.
Looks beautiful! ViewSonic made some of the best monitors back in the day. I was sad when ViewSonic and Sony stopped making decent monitors a while back. I thought they would continue that trend with LCDs.
 
With that, I'm curious about some of the practices you guys do in order to maintain your CRTs and keep them as pristine as possible (or reasonable). My Hitachi SuperScan Elite 751 was in aesthetically awful condition when I got it and still is, despite textbook screen performance. So, I'm somewhat new to all around CRT maintenance.
Are there any particular practices or tricks I should keep in mind?

The service manual is here. It looks like adjusting voltages isn't straightforward - I didn't see anything about a service menu, so I'm not sure how you could do a proper hardware calibration without some knowledge of electronics. That said, you can still get some traction with the OSD and some software calibration.
 
The service manual is here. It looks like adjusting voltages isn't straightforward - I didn't see anything about a service menu, so I'm not sure how you could do a proper hardware calibration without some knowledge of electronics. That said, you can still get some traction with the OSD and some software calibration.
So, what would the advantage of hardware calibration via whatever tweaks are needed plus WinDas be over just using straight software like Displaycal?
Also, would the "video level" from the service manual be relevant?
"Allows you to change the video input signal level to match the signal coming from your computer. Press down or up to select 0.7V or 1.0V."
Only two selections- but would they be useful at all?
 
So, what would the advantage of hardware calibration via whatever tweaks are needed plus WinDas be over just using straight software like Displaycal?

My understanding is that some hardware calibrations allow the luminance output (as a function of input voltage) to be tweaked at a few points (out of a maximum of 256). I think that's essentially what's going on with WinDAS. This way, the tube will internally "map" a set of input voltages to another set of voltages that are ultimately expressed at the cathode. RGB gain and bias (along with brightness and contrast) can tweak things too, but at fewer points along the "curve".

That's my guess at what's going on - it's been a long time since I delved into this, and there were a few things I never fully understood.

I've not experimented enough with RGB gain and bias - it's possible that bias adjusts the G2 voltage for each gun. In that case, you might be able to achieve good black levels. It might be worth tinkering with those, then using DisplayCAL as a final touch (though the farther the tube's natural state is from your target, the more unique colors you will lose due to limited LUT precision).
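A tiny numeric sketch of that last point: pushing a correction through a fixed 8-bit LUT makes distinct input levels collapse onto the same output. The 1.2 exponent below is purely illustrative, not a measured tube response.

```python
# Sketch: why a big software correction costs unique colors.
# An 8-bit video LUT maps 256 input levels to 256 output levels;
# the farther the correction is from identity, the more inputs
# round to the same output code (e.g. inputs 0 and 1 both land
# on 0 here, so the total count of distinct outputs drops).
corrected = [round(255 * (i / 255) ** 1.2) for i in range(256)]
print("unique output levels:", len(set(corrected)))  # < 256
```

The bigger the mismatch between the tube's native response and the target, the steeper the LUT and the more levels merge - which is the argument for getting the hardware close first and using DisplayCAL only as a final touch.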


Also, would the "video level" from the service manual be relevant?
"Allows you to change the video input signal level to match the signal coming from your computer. Press down or up to select 0.7V or 1.0V."
Only two selections- but would they be useful at all?

This thread may help. The take-home is that modern PCs output a range between 0 and 0.7 volts, but other systems range between 0 and 1 volt. If a display expects a 0-1.0 V range but receives only 0-0.7 V, it will show a dimmer image. You'll lose dynamic range (and I don't think your blacks will be any deeper).
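The arithmetic behind that dimming is straightforward - a minimal sketch, assuming an 8-bit quantizer on the display side:

```python
# Sketch: a display expecting 0-1.0 V that is fed a 0-0.7 V signal
# only ever sees 70% of full drive, so peak white is dimmer and the
# top of the code range effectively goes unused.
source_peak_v = 0.7   # modern PC VGA output swing
display_full_v = 1.0  # range the monitor is set to expect

fraction = source_peak_v / display_full_v
print(f"peak white reaches {fraction:.0%} of full drive")
# In 8-bit terms, white lands around this code instead of 255:
print("effective top code:", round(255 * fraction))
```

Blacks stay where they are (0 V is 0 V either way), which matches the point that only the top end of the range is lost.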
 
I notice in CRU that under the timing methods there is the default 'manual choice', Automatic - CRT standard and Automatic - Old standard.

Given my 2060u, which timing mode should I use?

Windows automatically selected 1600x1200 at 85hz when I plugged in the monitor through the DPU-3000 without CRU being involved.

It also seems that my 2060u has an option to perform a 'standard picture auto size adjust' and a separate option for 'GTF auto size adjust'. This tells me that the monitor is capable of reading the more modern VGA sync standard and the old GTF standard.

Manual mode on CRU produces the following settings:

https://photos.app.goo.gl/Uwjxpz5VkLiYS54u8

CRT automatic mode produces the following settings:

https://photos.app.goo.gl/h2W6QXPASHGn6HHV8

Automatic Old standard produces the following settings:

https://photos.app.goo.gl/K74dxUgsZAyfmrFX9

My questions are:

a) Which timing mode should I use?

b) When should the automatic - old standard be used instead of the automatic - CRT standard? What are the disadvantages of using the 'old standard' on a CRT monitor that supports the more modern standard?

c) What advantages do I get in day-to-day use of the CRT monitor specifying 1600x1200 at 85hz through CRU using the 'Automatic - CRT standard' instead of simply specifying 1600x1200 at 85hz through the Windows 10 control panel?

d) How is CRU written to prove that the mathematical formula for the 'automatic CRT' calculation delivers the best results for CRTs?

I would imagine my monitor is telling my PC what the best options are for it without needing to use the CRT option? It is fully communicating its EDID and Windows knows its product name.
 
Any of you know an HDMI to female VGA adapter that works well?
After reading this post, it seems the Plugable USB-C to VGA adapter would be a decent option.
However, considering I'll only pump 230 MHz max into my ViewSonic G90f, I'd prefer an HDMI to VGA adapter so I can leave my USB-C DisplayPort output open for other things.
I just tried this Portta adapter specifically because it listed its bandwidth as 225 MHz (good enough), only to have it fail to even output to the ViewSonic, likely due to a bandwidth-related issue. A red light from the monitor's power indicator appears when I plug its fixed cable into the adapter. When I turn the monitor on, it'll output an image for half a second, then blank, image, blank, and so on. I verified this happens with other computers, too. I tried the adapter with my Hitachi SuperScan Elite 751, and an image would display properly at most settings - but there was some distortion at settings with bandwidths past 200 MHz.
When the Hitachi was my main monitor, I used a Tendak adapter that worked swimmingly. But, I didn't realize the VGA cable of the ViewSonic was fixed until I got it, so I'll need a gender changer if I wanna try the Tendak. Plus, the ViewSonic has a slightly higher max bandwidth than the Hitachi, so I might breach the Tendak's limits.
 
As an Amazon Associate, HardForum may earn from qualifying purchases.
There are ways to add USB-C. For a device with a non-detachable male USB-C cable like the Plugable USBC-VGA, you need something that adds a female port, such as the Sunix UPD2018 PCIe card (or a Thunderbolt 3 add-in card) or a Wacom Link Plus.
https://egpu.io/forums/gpu-monitor-...sb-c-to-displayport-cable/paged/4/#post-79127
I must not have been clear- my Dell XPS 9560 has a USB-C/Thunderbolt output. But, I'd rather use its HDMI output if I can, since I'm currently using Thunderbolt for throughput-demanding devices.
That said, I'll go the USB-C route if it's the only way I can get 8-bit color at 230 MHz.
 
I must not have been clear- my Dell XPS 9560 has a USB-C/Thunderbolt output. But, I'd rather use its HDMI output if I can, since I'm currently using Thunderbolt for throughput-demanding devices.
That said, I'll go the USB-C route if it's the only way I can get 8-bit color at 230 MHz.
Yes of course you want to use the existing display output port of your laptop (HDMI) if you can - it will be less expensive and saves the multipurpose USB-C/Thunderbolt ports for other purposes.

If you can't find an HDMI solution, you can still use the HDMI port with DisplayPort solutions using an HDMI 1.4 to DisplayPort adapter (it might be troublesome to chain multiple adapters together though).

Computers that already have USB-C ports can have additional USB-C ports added, just like computers that don't have USB-C ports. The Wacom Link Plus has an HDMI 1.4 input (340 MHz) so it could be used to convert HDMI directly to USB-C so you can use a USB-C solution. I've had trouble with the HDMI input of the Wacom Link Plus though.

Regarding Thunderbolt, if you're only connecting one display that uses less bandwidth than 4K 60Hz 10bpc (533 MHz) then your write speeds should not be affected much since PCIe traffic is limited to around 22 Gbps (you should test this to be sure). Read speeds should not be affected at all since Thunderbolt has separate lines for receiving and transmitting.
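A rough back-of-envelope for that bandwidth point - a sketch, where the 24 bits/pixel figure assumes 8 bpc RGB and ignores DisplayPort protocol overhead:

```python
# Compare the display stream of a ~230 MHz CRT mode against the
# ~22 Gbps PCIe traffic ceiling quoted for Thunderbolt 3 above.
pixel_clock_hz = 230e6   # max the ViewSonic G90f will be fed
bits_per_pixel = 24      # assumption: 8 bpc RGB, no DP overhead
display_gbps = pixel_clock_hz * bits_per_pixel / 1e9
pcie_ceiling_gbps = 22   # approximate TB3 PCIe limit from the post
print(f"display stream ~{display_gbps:.2f} Gbps "
      f"vs ~{pcie_ceiling_gbps} Gbps PCIe ceiling")
```

At roughly 5.5 Gbps, such a mode uses only a fraction of the link, which is consistent with the claim that write speeds shouldn't suffer much.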
 
Search for the words "Benfei" and "Rankie" in this thread.
Dang, so in order to use the only other verified-working HDMI to VGA adapters, I have to set the output to YCBCR? That's annoying.
I'm more and more thinking I might try to save up for a GTX 980 Ti external gpu rig sooner.
 
Dang, so in order to use the only other verified-working HDMI to VGA adapters, I have to set the output to YCBCR? That's annoying.
I'm more and more thinking I might try to save up for a GTX 980 Ti external gpu rig sooner.

The 980 Ti is getting long in the tooth - it's way weaker than the PS5 and Xbox Series X GPUs.

Why not get a Sunix DPU3000 or Delock/ICE's version?
 
The 980 Ti is getting long in the tooth - it's way weaker than the PS5 and Xbox Series X GPUs.

Why not get a Sunix DPU3000 or Delock/ICE's version?
Cause the 980 Ti is good on a sub 200 dollar budget and a big upgrade over my laptop's GTX 1050.
But yeah, I still might get a Sunix DPU3000 or Delock/ICE's version for upgrades down the road while I still can.
 
If you find setting the output of those HDMI to VGA adapters to YCbCr so annoying, you surely won't have the patience to deal with the issues of the Sunix DPU3000 or Delock/ICE versions (for more info about those issues, search this thread).
I have a DPU3000 with an FW900 and a GTX 1080 Ti. I had to have some patience and learn to configure custom resolution/refresh rate combos to work properly with it, but it was worth it. As of today, I'm happy with it and don't miss anything from my 980 Ti (RIP) at all. Well, that's just me.
 
If you find setting the output of those HDMI to VGA adapters to YCbCr so annoying, you surely won't have the patience to deal with the issues of the Sunix DPU3000 or Delock/ICE versions (for more info about those issues, search this thread).
I have a DPU3000 with an FW900 and a GTX 1080 Ti. I had to have some patience and learn to configure custom resolution/refresh rate combos to work properly with it, but it was worth it. As of today, I'm happy with it and don't miss anything from my 980 Ti (RIP) at all. Well, that's just me.
Well, it's more that it's annoying that the only verified-working HDMI adapters require an atypical output. It's needlessly tedious, and not a workaround inherently required for HDMI to VGA conversion. I'd rather find a high-bandwidth adapter that's plug and play.
Though, I'm on Linux, so I'm unsure if the workaround is even necessary for me. I wouldn't be too surprised if it's a Windows-only issue, or if there's some even worse issue with those specific adapters on Linux.
Setting custom resolutions with the Sunix or Delock doesn't seem too bad. Honestly, I think messing with xrandr is kinda fun.
Also, reiterating, my choice of a 980 Ti is based on budget more than anything. They have really attractive used prices right now.
 
I'm getting a female to female gender changer for my Tendak tomorrow. I'll post results.
I've already verified with my Hitachi that it can do at least 205 MHz distortion/noise free. I'll need it to do at least 225 MHz to get a crisp native picture (a custom resolution conforming to the .21 mm dot pitch at 18") out of my ViewSonic.
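For what it's worth, a figure in that ballpark can be roughed out from the dot pitch. A hedged back-of-envelope sketch - the 4:3 aspect, 75 Hz refresh, and ~30% blanking overhead are my assumptions, not numbers from any spec sheet:

```python
# Estimate the pixel clock needed to drive one pixel per phosphor
# triad on an 18"-viewable 4:3 tube with 0.21 mm dot pitch.
viewable_diag_mm = 18 * 25.4
width_mm = viewable_diag_mm * 4 / 5     # 4:3 -> width = 0.8 * diag
height_mm = viewable_diag_mm * 3 / 5
dot_pitch_mm = 0.21
h_dots = int(width_mm / dot_pitch_mm)   # ~1741 dots across
v_dots = int(height_mm / dot_pitch_mm)  # ~1306 dots down
refresh_hz = 75                         # assumed refresh
blanking_overhead = 1.30                # totals ~30% above active
pclk_hz = h_dots * v_dots * refresh_hz * blanking_overhead
print(f"~{h_dots}x{v_dots} at {refresh_hz} Hz needs "
      f"roughly {pclk_hz/1e6:.0f} MHz")
```

The estimate lands a little over 220 MHz, which is in line with the ~225 MHz target mentioned above.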
 
I'd rather find a high-bandwidth adapter that's plug and play.

I don't think such an adapter exists.

The Sunix is definitely not plug and play. It does some weird shit especially between 1536p and 1760p. It will also go randomly jittery at normal resolutions.

The simple, hassle-free adapters are usually low pixel clock.

Though that Vention USB-C adapter that's now discontinued seemed to work out well for people.
 
I notice in CRU that under the timing methods there is the default 'manual choice', Automatic - CRT standard and Automatic - Old standard.

Given my 2060u, which timing mode should I use?

Windows automatically selected 1600x1200 at 85hz when I plugged in the monitor through the DPU-3000 without CRU being involved.

It also seems that my 2060u has an option to perform a 'standard picture auto size adjust' and a separate option for 'GTF auto size adjust'. This tells me that the monitor is capable of reading the more modern VGA sync standard and the old GTF standard.

Manual mode on CRU produces the following settings:

https://photos.app.goo.gl/Uwjxpz5VkLiYS54u8

CRT automatic mode produces the following settings:

https://photos.app.goo.gl/h2W6QXPASHGn6HHV8

Automatic Old standard produces the following settings:

https://photos.app.goo.gl/K74dxUgsZAyfmrFX9

My questions are:

a) Which timing mode should I use?

b) When should the automatic - old standard be used instead of the automatic - CRT standard? What are the disadvantages of using the 'old standard' on a CRT monitor that supports the more modern standard?

c) What advantages do I get in day-to-day use of the CRT monitor specifying 1600x1200 at 85hz through CRU using the 'Automatic - CRT standard' instead of simply specifying 1600x1200 at 85hz through the Windows 10 control panel?

d) How is CRU written to prove that the mathematical formula for the 'automatic CRT' calculation delivers the best results for CRTs?

I would imagine my monitor is telling my PC what the best options are for it without needing to use the CRT option? It is fully communicating its EDID and Windows knows its product name.

CRU's CRT standard is VESA CVT and its Old standard is VESA GTF; with Manual nothing is calculated - it only serves if you want to change some timings by hand.
CRU uses those formulas, which are VESA standards; you can use whichever you want, both are fine.
If you set 1600x1200 85 Hz in the Windows panel without any custom resolution, the driver uses DMT, with timings different from both CVT and GTF, because this resolution is on the VESA DMT list with its own fixed timings.

Sometimes the resolutions present in the Windows panel are wrong for CRT monitors.
The perfect example is 1920x1080 60 Hz: if you see this resolution in Windows and haven't set it as a custom one, selecting it will give a very bad image.
This is because that resolution is on the VESA DMT list with CVT Reduced timings, which are not compatible with CRT monitors.
If a resolution in the Windows panel isn't on the DMT list, the video driver uses GTF timings with a CRT monitor.
Almost all CRT monitors are GTF compliant, which means that if you set a resolution with GTF timings the monitor will show an almost-good image, and you will only have to make small corrections with the monitor's geometry settings (in theory).

Some information about VESA timings:

https://hardforum.com/threads/24-wi...-ebay-arrived-comments.952788/post-1043533036
https://hardforum.com/threads/24-wi...-ebay-arrived-comments.952788/post-1043924287
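For what it's worth, those DMT figures are easy to sanity-check: total pixels times refresh must equal the pixel clock. A small sketch using the published VESA DMT totals for 1600x1200 @ 85 Hz (2160x1250 total at 229.5 MHz):

```python
# Verify that the DMT totals and pixel clock for 1600x1200 @ 85 Hz
# agree with the advertised scan and refresh rates.
h_total, v_total = 2160, 1250        # total pixels incl. blanking
pixel_clock_hz = 229_500_000         # DMT pixel clock for this mode

h_freq_hz = pixel_clock_hz / h_total  # horizontal scan rate
v_freq_hz = h_freq_hz / v_total       # vertical refresh
print(f"h: {h_freq_hz/1e3:.2f} kHz, v: {v_freq_hz:.1f} Hz")
# prints: h: 106.25 kHz, v: 85.0 Hz
```

The same arithmetic works for any CVT or GTF mode, which is handy when checking what CRU actually generated.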
 
If the greenish color is there regardless of the brightness level, it's an issue somewhere on the R, G, or B line from the input plug to the KR, KG, or KB on the tube's socket. It can be something increasing the G signal or something else decreasing the R/B signals.

Hi.
It's looking like the tube has a GK-G1 or GK-H short. I've seen it flash a couple of times on startup after trying to adjust the 30 IRE step in the WinDAS WPB procedure and ending up at 22 cd/m² instead of 8 cd/m² because of how strong the green is.

I've also bought new caps for the A board and did a recap. Turns out my ESR meter was right, but obviously recapping the A board has not fixed the issue.

After messing a bit with WinDAS, I figured out I was wrong about the OSD colors not having the green tint: the green tint is always there if the monitor's brightness output is high enough. The tint is there no matter which color is displayed on the screen.

BTW, following your advice to check the RGB lines, the only difference between channels I found is in the R114, R214, R314 (small smd resistors on the output of the IC403 - rgb amp).
R214 (green) measures 21.8 ohms, while R114 (R) and R314 (B) measure 22.4 ohms. That's within the 5% tolerance mentioned in the service manual, so it shouldn't be causing any of my problems.
 
Well, I'm pretty surprised about the capacitors - those are quality Rubycon caps. They do age, but the ones that suffer significantly are the decoupling ones on the power supply lines, and I don't think I've seen one drop more than 10-15%. What's your meter, out of curiosity?
Did you check that D206 is OK?
It could be an issue with IC403 as well. I've had a defective one on a screen, and considering the pile of crap I received from a Chinese seller pretending to sell "new" and "working" ones, this IC might be a bit fragile.
It's possible to somewhat test it with a multimeter: remove IC403 from the board and take a measurement in diode mode between the color outputs (pins 4, 6, 14) and the 80V input (pin 15). It should behave like a diode (conductive one way, blocking the other), with similar voltage drops on all channels. Then take a measurement in resistance mode (red probe on pin 15, black probe on a color output); you should also get similar resistances on all channels. If you don't, the IC is probably defective.
 
CRU's CRT standard is VESA CVT and its Old standard is VESA GTF; with Manual nothing is calculated - it only serves if you want to change some timings by hand.
CRU uses those formulas, which are VESA standards; you can use whichever you want, both are fine.
If you set 1600x1200 85 Hz in the Windows panel without any custom resolution, the driver uses DMT, with timings different from both CVT and GTF, because this resolution is on the VESA DMT list with its own fixed timings.

Sometimes the resolutions present in the Windows panel are wrong for CRT monitors.
The perfect example is 1920x1080 60 Hz: if you see this resolution in Windows and haven't set it as a custom one, selecting it will give a very bad image.
This is because that resolution is on the VESA DMT list with CVT Reduced timings, which are not compatible with CRT monitors.
If a resolution in the Windows panel isn't on the DMT list, the video driver uses GTF timings with a CRT monitor.
Almost all CRT monitors are GTF compliant, which means that if you set a resolution with GTF timings the monitor will show an almost-good image, and you will only have to make small corrections with the monitor's geometry settings (in theory).

Some information about VESA timings:

https://hardforum.com/threads/24-wi...-ebay-arrived-comments.952788/post-1043533036
https://hardforum.com/threads/24-wi...-ebay-arrived-comments.952788/post-1043924287

"If you set 1600x1200 85 Hz on Windows panel without any custom resolution setting, the driver uses DMT with timings different from both CVT and GTF because this resolution is on the DMT VESA list with those custom timings."

I've spent too much time watching Joe Rogan... can you explain in more basic English what exactly the 'VESA DMT list' is? Is it a list written by Microsoft into Windows 10 with default timings that stay the same regardless of what the monitor's EDID communicates to Windows?
 
"If you set 1600x1200 85 Hz on Windows panel without any custom resolution setting, the driver uses DMT with timings different from both CVT and GTF because this resolution is on the DMT VESA list with those custom timings."

I've spent too much time watching Joe Rogan... can you explain in more basic English what exactly the 'VESA DMT list' is? Is it a list written by Microsoft into Windows 10 with default timings that stay the same regardless of what the monitor's EDID communicates to Windows?

Display Monitor Timings (DMT) is a list of pre-defined timings created by VESA in 1994, when timing formulas were not yet available.
Then in 1996 they created GTF, and in 2003 CVT, which are formulas for calculating timings for any resolution and refresh rate.
VESA continued to update the list even after creating the formulas, although most of the new entries are CVT and CVT Reduced resolutions.
Operating systems and video drivers know those timings and apply them when needed.

Here's the VESA DMT document
Here's GTF
Here's CVT
Here's a short description of timings by Nvidia

When you apply a resolution in an operating system, the video driver checks that list: if the mode is present, it applies those timings; otherwise it uses the GTF formula.
Looking at the DMT document, you will see the 1600x1200 85 Hz entry in the list, marked NOT CVT COMPLIANT and with timings different from both GTF and CVT, because it was created in 1996.
With a CRT monitor you can use those timings, or GTF, or CVT - all three are good.

Clearly, what is present in the EDID under "Detailed resolutions" has priority over everything else, but see the entries under "Standard resolutions"?
In that EDID block you can set resolutions and refresh rates, but which timings are used?
Well, DMT if they are on the VESA list, otherwise GTF.
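The selection rule described above ("DMT if listed, otherwise GTF") can be sketched in a few lines. The tiny DMT table and the gtf_timings() stub below are illustrative placeholders, not the real VESA list or formula:

```python
# Toy sketch of how a driver picks timings for a standard-timing
# mode: use the DMT entry when the (w, h, hz) triple is on the
# VESA list, otherwise fall back to a GTF calculation.
DMT = {
    (1600, 1200, 85): {"pclk_mhz": 229.5, "h_total": 2160, "v_total": 1250},
    # ... the real list has dozens more entries ...
}

def gtf_timings(w, h, hz):
    # Placeholder: a real implementation applies the GTF
    # blanking-duty-cycle formula from the VESA spec.
    return {"source": "GTF", "mode": (w, h, hz)}

def pick_timings(w, h, hz):
    entry = DMT.get((w, h, hz))
    if entry:
        return {"source": "DMT", **entry}
    return gtf_timings(w, h, hz)

print(pick_timings(1600, 1200, 85)["source"])  # DMT
print(pick_timings(2304, 1440, 80)["source"])  # GTF
```

Detailed timings from the EDID would be checked before this step, per the priority order described above.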
 
"If you set 1600x1200 85 Hz on Windows panel without any custom resolution setting, the driver uses DMT with timings different from both CVT and GTF because this resolution is on the DMT VESA list with those custom timings."

I've spent too much time watching Joe Rogan... can you explain in more basic English what exactly the 'VESA DMT list' is? Is it a list written by Microsoft into Windows 10 with default timings that stay the same regardless of what the monitor's EDID communicates to Windows?
Many of the old VESA specs are free
https://vesa.org/vesa-standards/

DMT (Display Monitor Timing) spec available at
https://app.box.com/s/vcocw3z73ta09txiskj7cnk6289j356b/folder/11133572986

The edid-decode project lists and incorporates all? the EDID related specs (HDMI, CTA, VESA (DisplayID, DMT, EDID, CVT)) in its source code:
https://git.linuxtv.org/edid-decode.git/about/

A display has DMTs in its EDID. Some high level part of an OS needs to have a list of the DMT timings to use them. The timing info is then passed to the GPU.
An EDID has many ways to specify a timing - Established timings and Standard timings (some of those are DMTs while some are third party like IBM or Apple), VIC (CTA-861-G), Detailed timings, DisplayID Video Timing Modes 1 to 9 etc. The same part of the OS that knows about DMTs needs to know the other types as well.

The BIOS/firmware also needs that info to display boot screen before the OS loads (unless they use a default timing or a detailed timing that has all the info).
 
I'm getting a problem with my DP2070SB: every few hours the screen will start to fluctuate down in brightness. When this happens, knocking the side of the monitor does seem to help - it will often fix it for a second or a few minutes - and after a while it seems to fix itself. Hitting a CRT to get the picture back is a bit of a cliché, so I'm wondering if this is a common sign of something?
On one hand it seems like it could be minor, like a loose connector, but on the other hand it seems unlikely for a simple mechanical issue to come and go out of nowhere.
 
I'm getting a problem with my DP2070SB: every few hours the screen will start to fluctuate down in brightness. When this happens, knocking the side of the monitor does seem to help - it will often fix it for a second or a few minutes - and after a while it seems to fix itself. Hitting a CRT to get the picture back is a bit of a cliché, so I'm wondering if this is a common sign of something?
On one hand it seems like it could be minor, like a loose connector, but on the other hand it seems unlikely for a simple mechanical issue to come and go out of nowhere.

sounds like dry solder joints
 
Hi everyone. I've been fortunate enough to use a really nice FW900 for quite some time. However, sometime last year it started having problems: at first, it refused to turn on unless I retried the power button multiple times. It would sound like it's turning on properly, but there would be no visible screen and the indicator would blink. I'd hear the "click" sound a bunch of times. However, turning it off and then back on again about 2-3 times would ultimately cause it to switch on correctly and work properly.

Finally, one day I tried going through this routine to turn it on, and was met with a white flash of light and after that it was finally dead.

It feels to me like maybe the capacitors have died? I don't really know much about it technically, though. It's just been sitting at my desk collecting dust. I sincerely doubt I'd be able to resolve this myself. I'm just wondering, is it anywhere near plausible to find someone who could repair this, or am I probably out of luck due to how nobody really has CRTs anymore? I'm in Rotterdam, in case anyone happens to be close by and would be interested in being commissioned to repair this :)
Thank you!

edit: gonna make a video of its current state tonight.
 
Hi everyone. I've been fortunate enough to use a really nice FW900 for quite some time. However, sometime last year it started having problems: at first, it refused to turn on unless I retried the power button multiple times. It would sound like it's turning on properly, but there would be no visible screen and the indicator would blink. I'd hear the "click" sound a bunch of times. However, turning it off and then back on again about 2-3 times would ultimately cause it to switch on correctly and work properly.

Finally, one day I tried going through this routine to turn it on, and was met with a white flash of light and after that it was finally dead.

It feels to me like maybe the capacitors have died? I don't really know much about it technically, though. It's just been sitting at my desk collecting dust. I sincerely doubt I'd be able to resolve this myself. I'm just wondering, is it anywhere near plausible to find someone who could repair this, or am I probably out of luck due to how nobody really has CRTs anymore? I'm in Rotterdam, in case anyone happens to be close by and would be interested in being commissioned to repair this :)
Thank you!

edit: gonna make a video of its current state tonight.
Dang, I hope it can get fixed!
Unfortunately, there aren't many CRT repair people around, given how dangerous CRTs can be to work on and how niche the work has become.
Have you tried looking through the FW900's service manual? You might get lucky and figure out the problem is something simple- like a bad capacitor.
 
From the single symptom described, I'd have no chance of finding the problem without extensive testing, so I doubt someone with no electronics knowledge could figure it out with a look at the service manual. ;)

But it's unlikely to be a capacitor issue - something failed and is tripping a protection circuit.
 
I made a video of what's happening. It's actually a little bit different from what I remember happening back when it broke, I seem to recall it was just a flash of light and then nothing. Today when I try to switch it on, it hisses at me and constantly tries to re-initialize/degauss the screen.

Here's the video:


If anyone has any ideas following the video, or wants me to try something (not like it does anything else, though...), I'd appreciate it!

Also, thanks for the link to the service manual. I'll have a look at it. I'm also gonna call around for some repair shops, maybe I'll get lucky and find someone with experience and willingness to repair a CRT.
 
That almost sounds like the neck of the tube is broken or something. Can you take it apart and look at the neck? I've had a G70 green tube break on me, and when I tried to power on the set that's pretty much what it sounded like. :(
 
I made a video of what's happening. It's actually a little bit different from what I remember happening back when it broke, I seem to recall it was just a flash of light and then nothing. Today when I try to switch it on, it hisses at me and constantly tries to re-initialize/degauss the screen.

Here's the video:


If anyone has any ideas following the video, or wants me to try something (not like it does anything else, though...), I'd appreciate it!

Also, thanks for the link to the service manual. I'll have a look at it. I'm also gonna call around for some repair shops, maybe I'll get lucky and find someone with experience and willingness to repair a CRT.

Even better yet, it means we now have a justifiable reason (at least to the wife) to go and spend some money on a new one! ;)
 
Hello everyone, my FW900 has a greenish tint, but the OSD menu colors are normal. I have seen many Sony FW900s that seem to have the same problem, so it seems to be common.
Does anyone know how to solve this? Is it caused by capacitor failure? Would replacing the capacitors fix it?
 

Attachments: IMG_8906.JPG (282.7 KB) · IMG_8908.JPG (309.8 KB) · IMG_8911.JPG (350.7 KB)
I made a video of what's happening. It's actually a little bit different from what I remember happening back when it broke, I seem to recall it was just a flash of light and then nothing. Today when I try to switch it on, it hisses at me and constantly tries to re-initialize/degauss the screen.
Yes, in your video it looks like the screen tries to degauss and fails - the little "sparks" on the screen match every attempt, and I don't think those are normal behaviour.
Problem is, the degauss circuitry is apparently very simple: it's a relay connecting the degauss coil to AC power. The relay and its control work - it can be heard.
I'd say there are a few possibilities:
- with a lot of luck, it's just a bad solder joint preventing enough current from flowing in the degauss coil
- the degauss coil is defective (and can't be repaired, I suppose)
- or something else must be enabled simultaneously with the degauss coil, isn't, and that prevents the degauss from completing properly. In that case we're back to the start - it could be anything, including a defective tube.
 
Hello everyone, my FW900 has a greenish tint, but the OSD menu colors are normal. I have seen many Sony FW900s that seem to have the same problem, so it seems to be common.
Does anyone know how to solve this? Is it caused by capacitor failure? Would replacing the capacitors fix it?

Recalibrate your white balance yo. Follow the WPB guide.
 
So I decided to get my GDM-FW900 out since I now have space for it, but my GPU (a GTX 1060 3GB) has neither DVI-I nor VGA. I tried a VGA to HDMI cable I had, but nothing displayed on the screen. So what is the best way to use this monitor these days? My motherboard does have USB 3.1 Gen 2, which should have enough bandwidth for 2304x1440 at, say, 100 Hz. I'm presuming that's the only way to go, since Gen 1 doesn't have enough bandwidth; the GPU also has DP and DVI-D.
 
So I decided to get my GDM-FW900 out since I now have space for it, but my GPU (a GTX 1060 3GB) has neither DVI-I nor VGA. I tried a VGA to HDMI cable I had, but nothing displayed on the screen. So what is the best way to use this monitor these days? My motherboard does have USB 3.1 Gen 2, which should have enough bandwidth for 2304x1440 at, say, 100 Hz. I'm presuming that's the only way to go, since Gen 1 doesn't have enough bandwidth; the GPU also has DP and DVI-D.
Check out this post:
https://hardforum.com/threads/24-wi...ived-comments.952788/page-435#post-1044652495
Alongside what it lists, I can personally vouch for this HDMI female to VGA male adapter: https://www.amazon.com/dp/B01B7CEOVK/?ref=exp_retrorgb_dp_vv_d
It's good to at least a 235 MHz pixel clock. I might test it at higher clocks at some point, despite my CRT not supporting anything higher, just to help build a list of valid adapter choices. Unfortunately, adapters seldom list their DAC pixel clock in the specs, since they're geared towards crappy old LCD monitors and projectors more than anything. 1920x1080@60, with its puny pixel clock of ~173 MHz, is pretty much always their max.
 
So I decided to get my GDM-FW900 out since I now have space for it, but my GPU (a GTX 1060 3GB) has neither DVI-I nor VGA. I tried a VGA to HDMI cable I had, but nothing displayed on the screen. So what is the best way to use this monitor these days? My motherboard does have USB 3.1 Gen 2, which should have enough bandwidth for 2304x1440 at, say, 100 Hz. I'm presuming that's the only way to go, since Gen 1 doesn't have enough bandwidth; the GPU also has DP and DVI-D.

The USB-C outputs on motherboards are not good for video adapters: they usually don't support DisplayPort alternate mode, and when it is supported it works only with the CPU's integrated graphics, so there's no way to connect it to your GTX 1060.
As for adapters, try the StarTech DP2VGAHD20 - it has been tested by Flybye and seems to be a very good DP adapter with a 375 MHz pixel clock. If you want more, try one of the Synaptics-based ones.
 