24" Widescreen CRT (FW900) from eBay arrived, comments.

Sup dudes. Trying to get back into CRT of late.

I purchased the DP2VGAHD20, which is recommended both here and on Reddit. Everything works fine, but it does clip the very top of the range slightly.

On native outputs (tested on my Intel VGA output and an old ATI 7970), the readings are consistent between both; the monitor was calibrated (i1Display Pro) to RGB 100/100/100 at peak white. Pumped through the StarTech adapter, the peak values are 84/87/90, so a loss of 16 nits if I clamp to D65.

I know for certain it's using 0-255; the adapter IS responsive on that front. To be clear, I cannot recover those 16 nits, because the red channel is already at maximum.

Not saying this is the end of the world or anything. I'm not sure if I got a bum adapter, or if this is expected for all adapters of this sort. Thought I'd ask you experts.

Is this behavior common in other adapters? How about the Delock, does it hold full brightness?

(I'm NOT calling out anyone for recommending this adapter, and I'm NOT angry.) Just curious about the outcome :geek:
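For anyone wondering where the 16 comes from, here's the back-of-the-envelope version. This is a simplified model, assuming a D65-balanced white is limited by the weakest channel, and the numbers are just my measurements, not a spec:

```python
# Channel peaks measured through the adapter, in nits (my readings above).
channel_peaks = {"red": 84.0, "green": 87.0, "blue": 90.0}

# To hold the white point at D65, green and blue have to be pulled down to the
# red channel's ceiling, so balanced peak white equals the weakest channel.
balanced_peak = min(channel_peaks.values())
loss = 100.0 - balanced_peak  # native calibration hit 100 nits peak white
print(balanced_peak, loss)    # 84.0 16.0
```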
 
I have not done any measurements, but my StarTech was slightly dimmer than the Delock 62967.

Personally, I prefer the Delock for the FW900.
 
It makes me wonder about the people making these adapters. ARE THEY taking measurements, or just following some reference design and cranking them out?
 
We should make a "table" for these adapters, so people know the quirks.
Here you can find a summary of these adapters.
Until recently, users would only report whether the image was good or bad; both the Delock 62967 and the StarTech DP2VGAHD20 have a good image for gaming and media content, without blur or other artifacts.
Then XoR did extensive testing on the DP2VGAHD20 and discovered that over a certain pixel clock the image starts to degrade in quality in comparison with the 62967 and a native DAC.
This is probably due to the fact that the 62967 has a 270 MHz DAC on its spec sheet, while the DP2VGAHD20's is 200 MHz.
The DACs in these chipsets can do more than that, but better low-pass filtering or better DAC precision at high speed is probably an advantage for the 62967.
The DP2VGAHD20, however, reaches a higher pixel clock, and it works with all video cards, while the 62967 may have problems with some of them, requiring a replacement of the DisplayPort cable.
 
Thx for the deets derupter. Do you happen to know why the adapter would impact brightness? Is it underpowered, or is it perhaps just a lack of oversight in design/testing?

I've seen other adapters of this type use their own external power supply.
 
Is that loss constant across resolutions, or does it change as you go up in pixel clock?
It would be interesting to know if it also happens with other samples, and what happens with the 62967 or the Synaptics-based adapters.
 
Yes, the loss is constant.
 
The Cgmha converter finally works. After some tinkering with the UEFI and Nvidia control panel settings, the "Vision D" mainboard passthrough does what a DP-to-USB cable won't do. It did give me some headaches, though.

At first the new adapter seems to produce a sharper signal than the DP2VGAHD20, but that's just my subjective impression, based on readability. That speaks for the highly capable chipset. As the manual doesn't give any useful info, I'm a bit skeptical.

However, the maximum bandwidth in practice is ca. 360 MHz: 2304x1440 at 75 Hz. I'm also able to display 480p at 160 Hz. It seems to be limited by some configuration on my side; the DP2VGAHD20 wasn't able to go above this spec either. 2304x1440 at 79 Hz should be possible, I vaguely remember. This most likely rules out the unspecified DP cable I use as a point of failure.
I tried another monitor to max out the bandwidth, but that failed as well. So back to tinkering. BTW, can somebody give me the horizontal frequency data for the FW900 in CRU (1440p)?
Man, I miss my old RAMDAC card…
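For rough numbers while waiting on exact CRU data: horizontal frequency is just total scan lines × refresh rate. A sketch with assumed, GTF-ish blanking fractions, not the exact values CRU would generate:

```python
# Rough CRT timing math. The blanking parameters are assumptions; real CRU
# modes will differ, but the horizontal frequency depends only on vertical
# total x refresh, so it is fairly insensitive to the horizontal guess.
def estimate_timings(h_active, v_active, refresh_hz,
                     h_blank_frac=0.25, v_blank_lines=40):
    v_total = v_active + v_blank_lines            # assumed vertical blanking
    h_total = round(h_active * (1 + h_blank_frac))  # assumed horiz. blanking
    h_freq_khz = v_total * refresh_hz / 1000.0    # scan lines per second
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    return h_freq_khz, pixel_clock_mhz

hf, pc = estimate_timings(2304, 1440, 75)
print(f"{hf:.1f} kHz, {pc:.0f} MHz")  # 111.0 kHz, 320 MHz
```

If memory serves, the FW900's horizontal scan tops out around 121 kHz, so a 1440p mode needs vertical total × refresh to stay under that.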
 
So it wasn't a driver limitation? You were able to create the resolution in CRU, but when you actually try to use it, the adapter goes black?
 
I am able to do 2304x1440@80Hz on the StarTech, with tighter blanking intervals for sure, but the image still looked correct.
377 MHz, if I remember correctly, is the maximum pixel clock on my unit.

BTW, to anyone using digital-to-analog converters and Nvidia cards: do not forget to enable dithering if you also move the gamma slider or change gamma in games, because these adapters are 8 bits per channel only and will show banding without dithering.

To check it you can use http://www.lagom.nl/lcd-test/gradient.php
It should always look smooth with gamma set to 1.0 in the control panel, but as soon as it is increased or decreased there can be visible banding.

To fix this, enable temporal dithering at 8-bit; it brings quality on par with AMD cards and effectively on par with 10 bpc RAMDACs. It is quite bothersome, as it requires tinkering with the Windows registry. More info at https://forums.guru3d.com/threads/nvidia-and-dithering-controls-how-to-enable.436621/
It is even more bothersome because these settings like to reset from time to time, e.g. when installing a new driver version. Best to check for banding on the LAGOM LCD page (or something similar) from time to time and re-apply the tweak as needed.
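To see why the banding appears at all, here is a minimal sketch of what an 8-bit pipeline does when you move the gamma slider: the curve is applied to 8-bit values and the result is requantized to 8 bits, merging adjacent levels. Temporal dithering hides the merged steps by alternating between neighbouring levels over time.

```python
# Apply a gamma curve to 8-bit values and requantize back to 8 bits.
def apply_gamma_8bit(levels, gamma):
    return [round(255 * (v / 255) ** gamma) for v in levels]

ramp = list(range(256))  # a full 8-bit gradient, like the LAGOM test
print(len(set(apply_gamma_8bit(ramp, 1.0))))  # 256 distinct levels: smooth
print(len(set(apply_gamma_8bit(ramp, 1.2))))  # fewer levels: visible banding
```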
 
I've tried this before and couldn't get it working right; it kept undoing itself randomly.

Does the color profile need to be redone with dithering enabled? Does it have a different pixel response?

When I plug and unplug, it creates 3 different display-database entries?


Never mind, just shotgun the damn entries; dithering is definitely enabled, "for now". I will recalibrate once the monitor stabilizes in 2 hours and let you guys know if the color response is different.
 
Dithering does not survive sleep, but it will survive a reboot; no problems there. Thus far it's stayed enabled.

I am using the 8-bit temporal setting.

It does impact the color tables; the system is able to use some more intermediate values. In dispcal, it was reporting 7-bit on the uncalibrated display (this is an estimate); now it's reporting 11-bit. Again, an estimate; it's not actually 11-bit.

This definitely means you have to run the calibration from scratch.

I added a fan to my setup recently, so I was going to recalibrate anyway, as the temperature change makes a large dip in its response curve (primarily it darkens).
 
Did 2 calibration runs, 1 desktop and 1 madVR; enabling the dither bit has lowered the average dE. Not by a significant amount, but it's there.

Looking over some stuff, temporal dithering introduces a bit of softness to high-frequency edges/textures.

It kind of makes sense that it's disabled by default: someone "outside" the hobby might just accuse Nvidia's GPU image of looking "blurry", and that'd be bad press.
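For scale on those dE numbers, here is the simplest delta-E formula (CIE76) run on a hypothetical patch. Calibration suites usually report dE2000 instead, which weights the terms differently, but CIE76 shows what the number measures: straight-line distance in Lab space.

```python
import math

# CIE76 delta-E: Euclidean distance between two Lab colors.
def delta_e_76(lab1, lab2):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical measured-vs-target patch (L*, a*, b* values made up here).
print(round(delta_e_76((52.0, 1.5, -2.0), (50.0, 0.0, 0.0)), 2))  # 3.2
```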
 
With the prices that many CRTs are going for on eBay, as I'm sure many here know, it's like these things are becoming collector items, especially the FW900, which I've seen go for many times what it was worth new! Maybe one day I will try out an FW900, but not for $5,000. The shipping being very expensive I understand, as it can be very difficult to ensure it arrives in one piece.

However, I am happy to have the 2060u and hope to use it as long as I can as an occasional-use monitor. I was able to recap the power supply and main board with no issues, although I haven't done the video board yet; the idea is to extend the useful life while reducing possible issues. The capacitors I used were United Chemi-Con, Nichicon, and Panasonic, all higher-hour, some rated for higher temperature (85-105°C). No issues noticed yet after going on a year. In case I need parts, maybe I can find a junk 2060u on eBay or somewhere.
 
Add a filtered intake fan or an unfiltered exhaust fan. It'll be a lot more stable/safe, and it will last much longer.

You might also consider new thermal paste on the VRM/MOSFET sections/heatsinks. Be careful taking them off; they could be stiff. Warm up the monitor before the attempt.

Double-check whether they used thermal epoxy, which isn't meant to be replaced. It can still wear out over time, but it's trickier to touch up.
 
With the unveiling of the first OLED monitor geared towards gamers, is anyone else losing interest in CRTs? Personally, I loved using my 21" Diamondtron CRTs for a few months, but the resolution and screen size left me wanting more. Considering that this upcoming OLED monitor will probably be around $2,500, I think I have dropped my dreams of getting an FW900 and will take this as an upgrade instead. At this point, it appears that the only thing CRTs have on OLEDs is motion clarity, yet OLED leaves me satisfied enough. Thoughts?
 
OLEDs don't have BFI at arbitrary refresh rates, and they lose too much brightness with the 60 Hz strobe. So those are two things they need to fix for me to switch.
 
Native support for all resolutions comes to mind, and the way pixels on a CRT blend into one another makes a 1920x1080 image look about as smooth as 1800p does on a 4K LCD/OLED (anti-aliasing also looks a lot better on a CRT), which means you don't need as powerful a computer to get a good image. A brand-new computer could have its lifespan doubled because of this.

Clear motion also comes at every resolution, and since they're all native they all look good, which means retro games can all be played the way they were meant to be, on what they were built to be played on. There is something unique about the glowy, luminous pixels of a CRT, and that is something yet to be replaced in my eyes.

A controversial point: if you settle on an FW900, you also step out of the game of chasing new monitor tech and spending more money when new models come out.
 
I decided yesterday that, like with my last FW900, I will remove the film from the screen, since it doesn't let the screen be as glowy as it can be... although I removed it from my last one because it had a big scratch, and I remember the phosphor trail was very apparent on it, despite the benefits of the glow (horror games were unplayable).

Is the extended phosphor trail an effect of removing the film?

I remember scrolling on this forum would make the whites of the text trail for around half a second. Right now I can see the trail, but it seems to fade away a lot quicker and is less bothersome in general.

My last FW900 also had a voltage problem and was running at too high a voltage; maybe that affected it...

Let me know what you think.

Film or no film?
 
I'd leave the film on. I only removed mine because it was damaged. It was kind of cool without it, but not worth it, as the screen lost the ability to resolve black: any ambient light, maybe even its own light reflected back, just ruined the blacks. So I ended up having to find a new filter; I still use this same one from Kantek, with which I remain pleased. (It only fits if the display is debezeled, though.)

And black was really the main reason I returned to, and for the most part stayed with, CRTs.
 
I remember that in a dark room blacks still looked black without the filter on, so blacks are OK for me.

The light reflecting/glowing on the glass is something that bothered me too, though; thanks for reminding me.
 
Definitely keep the film on.
It is there for good reasons, and it was designed specifically to improve the picture of the tube it is stuck on (better contrast, fewer reflections and so on). It only makes sense to remove it when it is heavily damaged.
 
Definitively keep the film on.
It is there for good reasons, and it's been designed specifically to improve the picture of the tube it is stuck on (better contrast, less reflections and so on). It only makes sense to remove it when it is heavily damaged.
Thanks for your reply.
I decided to keep it on after seeing just how much it helps when the room isn't dark.
 
What do you guys recommend for contrast on the FW900?

I'm leaving it at 80 right now, like my last one's stock contrast.

sRGB mode bumps it up to 97, with brightness at 15. In normal mode I keep the brightness at 28, which is the best middle ground I've found between retaining detail and deep blacks.

Is sRGB mode the more correct one when it comes to natural colours?
 
sRGB mode, if I recall correctly, is supposed to lock the monitor into a mode where the white point is 6500K. I think it locks brightness and contrast; at least it did on the CR1 monitors.

I wouldn't use it. It doesn't do anything to change the primaries of the monitor, and as far as I'm aware these screens have no color management built into them. They are, and will always be, limited to the SMPTE-C colorspace (Rec. 601?).

If you want accurate colors and modes, do the WPB adjustment with the proper equipment and/or adjust gain/bias to your desired target.
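As a starting point for a probe-driven WPB adjustment, the target chromaticity has to be turned into XYZ so it can be compared with probe readings. This is just the standard CIE xyY → XYZ relation, shown with the commonly quoted D65 values:

```python
# Convert a CIE xyY color (chromaticity x, y plus luminance Y) to XYZ.
def xyY_to_XYZ(x, y, Y):
    X = x * Y / y
    Z = (1 - x - y) * Y / y
    return X, Y, Z

# D65 white (x, y commonly quoted as 0.3127, 0.3290) at 100 nits.
X, Y, Z = xyY_to_XYZ(0.3127, 0.3290, 100.0)
print(round(X, 1), round(Y, 1), round(Z, 1))  # 95.0 100.0 108.9
```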
 
Yes, I plan to do that somewhere in the future.

Thanks for your reply.
 
Hello everybody.
I'm trying to adjust my FW900 again after a few years, but I'm running into a weird issue: I cannot connect to my monitor, or something else weird is going on when (if) I manage to connect to it at all.
I get the following message(s):
"ECS syntax error NG! NG! NG!"
or
"REG write is NG -> can't access, check condition!"

I use an old laptop with Windows XP, the date set to 2005, compatibility mode set to Win95, and port COM3 (when I change the port I don't get this NG! error and instead get a 'can't connect' warning). Mode is also set to manual.
Everything is patched and was working properly before. I plug the 2 cables only into the 2 bottom pins on my FW900; I'm pretty sure that was the setup last time. There are 2 LEDs on the adapter and both appear to be doing something: red is always on, and blue blinks when I send signals/actions through the software.

What else can I try? Any ideas?
Oh boy. How sure are you that your adapter works? Any way to verify? Other than that, I would double-check that your adapter is connected properly. It could have failed in the last two years.
 
Gonna sell mine, runs great. What's the best/safest way to sell it? How much should I ask? The plastic film has been removed... thanks.
 
Ship it only if the buyer has hired a logistics company to come pick it up in person, and the company offers a blanket to wrap it up and a way to hold it in a safe place. Most of them offer blankets, and a lot of them have straps to hold it in place, or a spot where it won't budge or where they can keep an eye on it (like the front seat).
 
I just imagined an FW900 wrapped in a nice blanket, sitting on the front seat of a truck with a cup of hot coffee. I couldn't stop laughing. :ROFLMAO:
 
Is there anything that can be done about marks like this on the FW900's screen filter?

P_20220114_045757.jpg
 