24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Was checking out a blog and saw a GIF captured at ~6k FPS of a Zowie XL2566X+ monitor (400Hz TN display) that surprisingly has a CRT effect built-in, as part of the monitor's 'DyAc2' motion blur reduction feature.

View attachment 701389

I'd heard of DyAc before but wasn't aware this was what it was simulating (edit: the v2 at least, seems v1 just used regular BFI). Interesting to see it implemented at the monitor level.
Zowie is for Counter-Strike players, they know their stuff, GOAT flat panels for hardcore PC gamers, no debate <3
 
Was checking out a blog and saw a GIF captured at ~6k FPS of a Zowie XL2566X+ monitor (400Hz TN display) that surprisingly has a CRT effect built-in, as part of the monitor's 'DyAc2' motion blur reduction feature.

View attachment 701389

I'd heard of DyAc before but wasn't aware this was what it was simulating (edit: the v2 at least, seems v1 just used regular BFI). Interesting to see it implemented at the monitor level.
Yes, scanning backlights and edgelights are a thing! BenQ has done an excellent job with DyAc over the years as one of the best strobe backlights competing with NVIDIA ULMB2.

However, NEITHER can simulate a 60Hz CRT the way my XG2431 or my CRT simulator can. They have a DRM/firmware-locked minimum Hz that is fairly high (~100), below which they stop strobing.

And you can't "Bring Your Own Algorithm" like you can with generic Hz.

BTW, I just released Version 1.01 of my Blur Busters Open Source Display Initiative (CRT simulation, plasma simulation, etc), posted on both BlueSky and X concurrently.

It's great when it's in a firmware, but I'm trying to expand things -- having a hard time convincing many display makers -- so I'm now going commando on BYOA (Bring Your Own Algorithm).

It's essentially the "I am not going to wait for display manufacturers, I'm just going ahead" BYOA era of Blur Busters. Display manufacturers can take the superior BYOA algorithms and implement them in their firmware if they want.

Many of the display simulation algorithms I plan to release over the course of 2025 will be free, permissively licensed open source -- you can sell it yourself, use it yourself, implement it yourself; the CRT simulator is already permissive open source on github. One just has to supply generic Hz and the shader algorithms work -- whether box-in-the-middle, display-side, driver-side, or software-side. It's part of the larger Blur Busters plan.

The algorithm is free; some will implement it independently (Retroarch did) and others may contract me (Retrotink 4K did). It can go either way with any of the algorithms coming out throughout 2025-2026+.
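For anyone wondering what these BYOA shaders are doing conceptually, here is a minimal numpy sketch of the rolling-scan idea (my own illustration, not Blur Busters' actual GitHub shader): one 60Hz source frame is split across N faster refreshes, each lighting a horizontal band plus a phosphor-decay tail, normalized so the time average still equals the original frame.

```python
import numpy as np

def crt_subframes(frame, n_sub=4, decay=0.15):
    """frame: HxWx3 array in [0, 1], linear light.
    Returns n_sub frames whose average reproduces `frame`, with each row lit
    mostly while the simulated beam passes it (rolling band + phosphor tail)."""
    h = frame.shape[0]
    rows = np.arange(h) / h                        # vertical position of each row, 0..1
    weights = np.zeros((n_sub, h))
    for k in range(n_sub):
        beam = (k + 0.5) / n_sub                   # simulated beam position this subframe
        age = (beam - rows) % 1.0                  # time since the beam passed each row
        weights[k] = np.exp(-age / decay)          # exponential phosphor decay
    weights /= weights.sum(axis=0, keepdims=True)  # normalize: subframes sum back to frame
    # The n_sub boost keeps average brightness constant; whatever clips at 1.0
    # is the usual strobe-brightness tradeoff.
    return [frame * w[:, None, None] * n_sub for w in weights]
```

A real implementation also has to handle gamma and peak-brightness limits, which this sketch ignores.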
 
hello broskis, i came across an interesting comment on reddit that suggested a potentially universally compatible way to do interlaced scan on linux. if any of you is on linux or has already tested this method, could you tell me if it's real interlacing and if it works?

"If you want interlaced video to work correctly with xrandr you are going to have to manually interlace it yourself since the interlace option is broken with modern gpus. Set the width of the resolution mode to what you want the full width of the output resolution to be. Set the height to be half of whatever desired height that you want for the resolution. Then use the --scale 1x2 option when outputting that resolution mode (this doubles the internal height of the image). It will look very blurry when nothing is moving on your screen vertically; however, in motion everything will look just fine and will in effect achieve the full resolution that you desire."

it kinda sounds too good to be true, i love interlaced scan and i love the artifacts it produces at lower resolutions and refresh rates. will that linux method create real interlaced resolutions?
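In case it helps anyone test it, here is roughly how that recipe translates into commands, wrapped in a small Python sketch. The output name "VGA-1" and the 1920x1440@75 target are placeholders I made up; the modeline itself comes from cvt.

```python
import re
import subprocess

OUTPUT = "VGA-1"                               # placeholder: check `xrandr` for your output name
WIDTH, FULL_HEIGHT, REFRESH = 1920, 1440, 75   # placeholder target mode
HALF_HEIGHT = FULL_HEIGHT // 2

# Ask cvt for a modeline for the HALF-height mode; it prints a line like:
#   Modeline "1920x720_75.00"  <clock>  <h timings>  <v timings> -hsync +vsync
cvt_out = subprocess.run(["cvt", str(WIDTH), str(HALF_HEIGHT), str(REFRESH)],
                         capture_output=True, text=True, check=True).stdout
name, timings = re.search(r'Modeline\s+"(\S+)"\s+(.+)', cvt_out).groups()

# Create the half-height mode, attach it, and enable it with a 1x2 vertical
# scale so the desktop stays full height while the signal on the wire is the
# half-height mode.
subprocess.run(["xrandr", "--newmode", name, *timings.split()], check=True)
subprocess.run(["xrandr", "--addmode", OUTPUT, name], check=True)
subprocess.run(["xrandr", "--output", OUTPUT, "--mode", name, "--scale", "1x2"],
               check=True)
```

Worth noting: the mode created this way carries no interlace flag, so whether it behaves as real interlacing on the tube or just as a half-vertical-resolution progressive signal is exactly the thing to verify.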
 
What could these faint lines be?
-They move when I drag the window
-I see them with 2 different PCs on both inputs and VGA connected with no adapter
-Degaussing and changing color temps in OSD doesn’t remove them

What is very interesting is that in Paint 3D I drew a horizontally wide box, and those faint lines to the right of the box disappear as I rotate the box. It seems this faint white ghost is only there when there is a long horizontal bar/edge.
 

Attachment: IMG_6096.jpeg (245.5 KB)
What could these faint lines be?
-They move when I drag the window
-I see them with 2 different PCs on both inputs and VGA connected with no adapter
-Degaussing and changing color temps in OSD doesn’t remove them
It doesn't look bad on your photo. Cannot really see anything to be honest.

This is something some CRTs develop over time as I guess capacitors dry up.
In your case it doesn't seem like anything to worry about.
 
It doesn't look bad on your photo. Cannot really see anything to be honest.

This is something some CRTs develop over time as I guess capacitors dry up.
In your case it doesn't seem like anything to worry about.
It's barely noticeable. I've noticed it only happens against certain colors, and only happens with very long horizontal edges. Shrinking the width of a box lessens it lol. I guess I see it ever since I noticed it. Good thing is games rarely have long perfectly horizontal edges so I haven't even seen it in-game. The cap thing...there is a TV shop in town who has been around since the late 70s which still advertises repairing CRTs and an electronics store I've used before which I trust. What do you think is a fair price range for someone to test/replace all the caps?

Ha, I found a video of severe smear, and this guy did solve it by replacing caps. At this point I feel like what I'm seeing is just eventually going to get this bad.

View: https://www.youtube.com/watch?v=ebOmWYqlOIY
 
It's barely noticeable. I've noticed it only happens against certain colors, and only happens with very long horizontal edges. Shrinking the width of a box lessens it lol. I guess I see it ever since I noticed it. Good thing is games rarely have long perfectly horizontal edges so I haven't even seen it in-game. The cap thing...there is a TV shop in town who has been around since the late 70s which still advertises repairing CRTs and an electronics store I've used before which I trust. What do you think is a fair price range for someone to test/replace all the caps?
I would do it myself - but not with the issue at this level. Your photo looks like any ordinary CRT, not a broken CRT.
It is, like you say, something that once you notice you will keep seeing. I see it on pretty much all CRTs to some small degree when displaying killer sample images. CRTs had this issue even when brand new in their year of manufacture. Cheap CRTs had it more - more than what your photo shows 🙃

What I wonder is whether this can be improved on an "up to spec" monitor. I never tried to improve such things, but in theory someone with a better understanding of how CRT electronics work could perhaps improve them - including making them less blurry / improving bandwidth, etc.

Then again, like you say, it is not affecting normal images...
If you have ever used plasma panels you would know they have a similar (not the same, but similar) issue, and it almost never affects them in normal usage except in some cases, e.g. specific game HUDs.

Ha, I found a video of severe smear, and this guy did solve it by replacing caps. At this point I feel like what I'm seeing is just eventually going to get this bad.
Yes, but how big an issue it will become, and how fast, really depends on whether what you are experiencing is part of the monitor's specification or a developing issue that becomes visibly worse over time.

Like I said, all CRTs have this effect to a very small degree.
In your photo I don't really see it. I am pretty sure it is visible with a better example, which is why the topic was raised, but if it was actually bad it would look bad in your photo. It does not.

One more thing: keep in mind this is 20+ year old hardware that has most probably already experienced some mechanical stress from being transported. Tinkering with its electronics is risky because it involves mechanical stress on the PCBs. If e.g. a trace breaks it won't be very visible, and if that happens an easy cap replacement can turn into a hellish repair that takes many hours of studying the board and its schematics to find the broken trace.

tl;dr
I would recommend not bothering to repair this because it probably ain't broken to begin with. The risk of breaking the monitor is not very high, but it does exist, which makes such 'repairs' not recommended.
 
The login screen where you enter a password is where it is most noticeable. I don’t remember how noticeable this was 20+ years ago lol. I think maybe I will leave it for now considering what you mentioned. TY
 

Attachment: IMG_6102.jpeg (98.4 KB)
One more thing: keep in mind this is 20+ year old hardware that has most probably already experienced some mechanical stress from being transported. Tinkering with its electronics is risky because it involves mechanical stress on the PCBs. If e.g. a trace breaks it won't be very visible, and if that happens an easy cap replacement can turn into a hellish repair that takes many hours of studying the board and its schematics to find the broken trace.
Traces won't break like that on their own. When it happens, it's mostly because of careless soldering: part of the track doesn't stick to the PCB anymore with excessive heating and may eventually break during further operations.

Keep in mind we're talking about "old" tech with thick traces and 2 layers/PCB at most, it's not some cheap massively multilayered modern PCB where the main goal is to save as much copper as possible.
 
Traces won't break like that on their own. When it happens, it's mostly because of careless soldering: part of the track doesn't stick to the PCB anymore with excessive heating and may eventually break during further operations.

Keep in mind we're talking about "old" tech with thick traces and 2 layers/PCB at most, it's not some cheap massively multilayered modern PCB where the main goal is to save as much copper as possible.
I agree 100% - unlikely scenario. Still these things can and do happen.

The login screen where you enter a password is where it is most noticeable. I don’t remember how noticeable this was 20+ years ago lol. I think maybe I will leave it for now considering what you mentioned. TY
Here it is actually visible, and it does indeed look like the caps have dried up too much.
 
...Here it is actually visible, and it does indeed look like the caps have dried up too much.
Exactly. For now it's only really visible under certain circumstances - circumstances where nothing exciting is happening on screen.
 
What could these faint lines be?
-They move when I drag the window
-I see them with 2 different PCs on both inputs and VGA connected with no adapter
-Degaussing and changing color temps in OSD doesn’t remove them

What is very interesting is that in Paint 3D I drew a horizontally wide box, and those faint lines to the right of the box disappear as I rotate the box. It seems this faint white ghost is only there when there is a long horizontal bar/edge.
I knew someone who had an F520 with a similar issue. It was caused by oxidation at the cable connections in the neckboard. You can try spraying some contact cleaner to see if it alleviates the issue.
 
I knew someone who had an F520 with a similar issue. It was caused by oxidation at the cable connections in the neckboard. You can try spraying some contact cleaner to see if it alleviates the issue.
I believe it. I've had a BNC connector just needing to be tapped slightly to regain contact. Since then I've put electrical gel on them.
 
In this repair guide it's called "streaking".

I think I've also seen it fixed with a CRT rejuvenator.


https://swharden.com/misc/crt-repair/
Nice resource:


I disagree with the author here though:

"My gut feeling is that the Dells use a Mitsubishi tube, and that the quality control is not up to Sony's. It is just a feeling, I have not done any research on this."

The Dell Trinitrons are rebranded Sony tubes - one strong piece of evidence for this is that WinDAS works with the Dells.
 
"My gut feeling is that the Dells use a Mitsubishi tube, and that the quality control is not up to Sony's. It is just a feeling, I have not done any research on this."

The Dell Trinitrons are rebranded Sony tubes - one strong piece of evidence for this is that WinDAS works with the Dells.
Exactly. There might be some variations though, other than the different casing. I noticed the P1130 "made in Mexico" uses a simple Chinese yoke, whereas the "made in Japan" ones feature a more complex yoke also made in Japan.
 
I found an amazing tool - the ability to correct the gamut using an ICC profile:
AMD: https://github.com/dantmnf/AMDColorTweaks
NVIDIA: https://github.com/ledoge/novideo_srgb

I only tested the AMD version.
AMD has supported this trick for a long time, but the EDID had to have correct values with an unchanged white point - because correcting the white point doesn't work that well. The FW900 has a different white point, so whatever its EDID values are worth, they are unusable. Also, the precision of the chromaticity coordinates in the EDID is very limited - encoded as 10-bit integer values, if I remember correctly. Not to mention the biggest issue: you need an EDID emulator.
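For reference on how coarse the EDID route is: the base EDID block stores each chromaticity coordinate as a 10-bit fraction, so roughly 0.001 resolution at best. A small decoder sketch, assuming a standard 128-byte EDID 1.3/1.4 base block (the /sys path in the comment is just an example Linux location):

```python
def edid_chromaticities(edid: bytes):
    """Decode the 10-bit xy chromaticities from a 128-byte EDID base block.
    Bytes 0x19-0x22 hold red/green/blue/white x,y as 10-bit fractions of 1.0."""
    lo_rg, lo_bw = edid[0x19], edid[0x1A]       # packed 2-bit least significant bits
    hi = edid[0x1B:0x23]                        # 8-bit MSBs: Rx Ry Gx Gy Bx By Wx Wy
    lsb = [(lo_rg >> s) & 3 for s in (6, 4, 2, 0)] + \
          [(lo_bw >> s) & 3 for s in (6, 4, 2, 0)]
    names = ["Rx", "Ry", "Gx", "Gy", "Bx", "By", "Wx", "Wy"]
    return {n: ((h << 2) | l) / 1024.0 for n, h, l in zip(names, hi, lsb)}

# Example (Linux): edid_chromaticities(open("/sys/class/drm/card0-DP-1/edid", "rb").read())
# Each coordinate is quantized to 1/1024 - far coarser than a measured ICC profile,
# which is another reason the ICC-based approach is preferable.
```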
Update:
Tested the Nvidia app and... wow!
It is even better than the AMD version, as it can auto-start and you can enable dithering with this app, so the headache called "how to enable dithering on Nvidia cards" is finally gone.

So basically what it allows us to do, and why I mention it in this topic, is to use an ICC profile (or manually typed xy values for the RGB primaries) to remap the gamut from the assumed sRGB/Rec.709 to whatever the display's native gamut is. The FW900 is similar to Rec.709, but its SMPTE-C phosphors are actually closer to Rec.601, so not quite the same colors.
By correcting the gamut we get a smaller overall color space, but everything in the intersection of the Rec.709/sRGB and FW900 gamuts will be displayed correctly. Also, sRGB colors which the FW900 cannot display will at least have the correct hue, even if the saturation is off. This works wonders for things like videos, photos, etc. Less so for games, because games don't have realistic colors (for the most part anyway), but at least the colors are more correct.
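For anyone curious what the remap actually is under the hood: just a 3x3 matrix in linear light, built from the two sets of primaries. A numpy sketch using the standard sRGB primaries and the nominal SMPTE-C phosphor coordinates as a stand-in for the FW900 (measured values from your own tube would be better):

```python
import numpy as np

def rgb_to_xyz_matrix(xy_r, xy_g, xy_b, xy_w):
    """Standard derivation: RGB->XYZ matrix from xy primaries and white point."""
    def xyz(xy):
        x, y = xy
        return np.array([x / y, 1.0, (1 - x - y) / y])
    prim = np.column_stack([xyz(xy_r), xyz(xy_g), xyz(xy_b)])
    scale = np.linalg.solve(prim, xyz(xy_w))   # scale each primary to hit the white point
    return prim * scale

D65 = (0.3127, 0.3290)
# sRGB/Rec.709 primaries vs nominal SMPTE-C phosphors (stand-in for the FW900).
M_srgb   = rgb_to_xyz_matrix((0.640, 0.330), (0.300, 0.600), (0.150, 0.060), D65)
M_smptec = rgb_to_xyz_matrix((0.630, 0.340), (0.310, 0.595), (0.155, 0.070), D65)

# Linear-light sRGB -> linear-light native RGB. Gamma must be removed before
# and re-applied after this multiply, which is what the ICC/LUT machinery does.
M_remap = np.linalg.inv(M_smptec) @ M_srgb
print(np.round(M_remap, 4))
```

After such a remap, sRGB colors outside the destination gamut come out with negative components; clipping those is what produces the "right hue, reduced saturation" behaviour described above.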

Another important and cool feature is enabling dithering - especially since the FW900, being a CRT, has quite high gamma which we need to correct. It is possible to get gamma 2.4 or even 2.2 by increasing the bias/brightness values, but this raises the black level, so it is not ideal. Adjusting gamma via LUTs is a much better solution, but it has its own issues: if you have an 8-bit DAC and try to correct gamma from an 8-bit source, you get nasty banding. Dithering helps with that.
Of course, 8-bit is the issue because the DACs we use today are 8-bit. Unless someone uses a GTX 980 Ti or earlier card, or there is a DAC I don't know about which can do 10-bit, dithering is the safest bet. Not to mention 10-bit by itself has worse linearity than even 6-bit with dithering, let alone 8- or even 10-bit with dithering.
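To illustrate why the LUT route wants dithering (my own toy numbers, not what novideo_srgb does internally): pushing an 8-bit ramp through a gamma-correction curve and re-quantizing it to 8 bits merges codes, which is the banding; adding sub-LSB temporal noise before quantizing recovers the in-between levels on average.

```python
import numpy as np

crt_gamma, target_gamma = 2.5, 2.2       # illustrative: tame a high-gamma CRT to ~2.2
ramp = np.arange(256) / 255.0            # clean 8-bit input gradient

# Pre-distort so that (lut ** crt_gamma) behaves like ramp ** target_gamma.
lut = ramp ** (target_gamma / crt_gamma)

# Plain 8-bit re-quantization merges neighbouring codes -> visible steps.
q8 = np.round(lut * 255).astype(np.uint8)
print("distinct levels without dithering:", len(np.unique(q8)), "/ 256")

# Temporal dithering: add <1 LSB of noise per frame before quantizing, so the
# average of a few frames lands on fractional levels 8 bits cannot hold.
rng = np.random.default_rng(0)
frames = [np.clip(np.round(lut * 255 + rng.uniform(-0.5, 0.5, 256)), 0, 255)
          for _ in range(32)]
print("distinct time-averaged levels with dithering:",
      len(np.unique(np.mean(frames, axis=0))), "/ 256")
```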

It is also possible to change gamma in this app, which means it is no longer necessary to load ICC profiles in Windows and deal with applications overriding the LUTs. Truly the ultimate solution to gamma and banding woes on Nvidia cards - a 'set and forget' type of application.

p.s. Dithering can be enabled at 6, 8 and 10 bits and should dither everything rendered at a higher bit depth, including any applications rendering at 10+ bits. Those should get dithered regardless of this setting being enabled - or at least tests I did in the past on Windows 7 indicated that is the case on 8-bit displays - so we mostly need dithering for correcting gamma via LUTs, but if for some reason that automatic dithering did not happen, this should fix that too.
The 6-bit dithering mode is not as useless as it might sound. DACs can have limited precision and do their own dithering which might be of worse quality. Alternatively there could be linearity issues within the DAC. I would recommend visually inspecting how smooth gradients look with 8-bit vs 6-bit dithering.

Whether the DAC (by DAC I of course mean something like a DP to VGA converter) does its own dithering or not is not so obvious, simply because 6-bit + dithering vs 8-bit won't be very noticeable if the dithering is somewhat decent. It would however be better in that case to use 6-bit dithering on the GPU, because the GPU will dither using pixels in both the X and Y directions, while DACs will only use the X direction.
I mention all this because I get the impression my Delock does not really have full 8-bit precision. Maybe it is just an impression and something else causes it. I need to do more investigation.
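If anyone wants to do that gradient check on their converter, generating a clean ramp is trivial (assumes numpy and Pillow are installed; file names are my own). Display it fullscreen at native resolution and compare the GPU's 8-bit vs 6-bit dithering settings.

```python
# Generate grayscale ramps with no pre-existing banding, for eyeballing DAC
# linearity and GPU dithering settings.
import numpy as np
from PIL import Image

W, H = 1920, 1080
ramp = np.linspace(0, 255, W)                     # smooth left-to-right gradient
img = np.tile(ramp, (H, 1)).astype(np.uint8)      # same ramp on every row
Image.fromarray(img, mode="L").save("ramp_gray.png")

# A dark-end-only ramp (0..64 stretched across the width) makes near-black
# banding and linearity problems much easier to spot on a CRT.
dark = np.tile(np.linspace(0, 64, W), (H, 1)).astype(np.uint8)
Image.fromarray(dark, mode="L").save("ramp_dark.png")
```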
 
I used to be able to get 10-bit on the Sunix DPU3000, up to some really high resolutions.

But something must have changed in AMD drivers because now when I try to switch to 10-bit, even at low resolutions, it won't let me.

In fact, it does this weird thing where it shows 10-bit as available, I switch to it, and the display goes dark like it's switching. Then when the display comes back, it's still on 8-bit and the 10-bit option has disappeared.

That almost seems like a hardware issue but I tried lots of stuff like switching DP cables and nothing works so far.

Only other difference is I'm now on a 6800xt whereas previously (when I had tried 10-bit in the past) I was on a 5700xt
 
I used to be able to get 10-bit on the Sunix DPU3000, up to some really high resolutions.
OK, but was it true 10-bit, or did the mode merely work? Bad 10-bit is worse than good 8-bit when you have dithering, which Radeons have enabled by default.
Even good 10-bit without dithering is worse than 6-bit with temporal dithering when it comes to handling a changed gamma - tested on my IPS monitor.
On the same IPS, 10-bit with dithering vs 10-bit without dithering: there is slight banding visible on gradients when changing gamma.
10-bit with dithering vs 6-bit with dithering: zero visible difference, and in both cases I see no banding.
Otherwise, even at 6-bit, the least significant bit changes the level so little that dithering it doesn't cause any visible noise.

On an OLED running in HDR the differences are much more noticeable, and 6-bit temporal adds just barely visible noise.
And I made this setting permanent, meaning I run my OLED in HDR at 6-bit.
Pioneer Kuro LX5090 noise levels it is not, unfortunately, but 6-bit temporal dithering in HDR adds a very subtle plasma-like dithering touch to the image which I like.

Just using 8-bit + dithering on DACs is probably best.
I mentioned 6-bit dithering to check whether it maybe looks better, as these DACs might not be made up to spec and could have linearity issues.
Otherwise, seeing that 6-bit looks the same as 8-bit does give peace of mind that 10-bit is totally unnecessary.
 
Update:
Tested the Nvidia app and... wow!
It is even better than the AMD version, as it can auto-start and you can enable dithering with this app, so the headache called "how to enable dithering on Nvidia cards" is finally gone.

So basically what it allows us to do, and why I mention it in this topic, is to use an ICC profile (or manually typed xy values for the RGB primaries) to remap the gamut from the assumed sRGB/Rec.709 to whatever the display's native gamut is. The FW900 is similar to Rec.709, but its SMPTE-C phosphors are actually closer to Rec.601, so not quite the same colors.
By correcting the gamut we get a smaller overall color space, but everything in the intersection of the Rec.709/sRGB and FW900 gamuts will be displayed correctly. Also, sRGB colors which the FW900 cannot display will at least have the correct hue, even if the saturation is off. This works wonders for things like videos, photos, etc. Less so for games, because games don't have realistic colors (for the most part anyway), but at least the colors are more correct.

Another important and cool feature is enabling dithering - especially since the FW900, being a CRT, has quite high gamma which we need to correct. It is possible to get gamma 2.4 or even 2.2 by increasing the bias/brightness values, but this raises the black level, so it is not ideal. Adjusting gamma via LUTs is a much better solution, but it has its own issues: if you have an 8-bit DAC and try to correct gamma from an 8-bit source, you get nasty banding. Dithering helps with that.
Of course, 8-bit is the issue because the DACs we use today are 8-bit. Unless someone uses a GTX 980 Ti or earlier card, or there is a DAC I don't know about which can do 10-bit, dithering is the safest bet. Not to mention 10-bit by itself has worse linearity than even 6-bit with dithering, let alone 8- or even 10-bit with dithering.

It is also possible to change gamma in this app, which means it is no longer necessary to load ICC profiles in Windows and deal with applications overriding the LUTs. Truly the ultimate solution to gamma and banding woes on Nvidia cards - a 'set and forget' type of application.

p.s. Dithering can be enabled at 6, 8 and 10 bits and should dither everything rendered at a higher bit depth, including any applications rendering at 10+ bits. Those should get dithered regardless of this setting being enabled - or at least tests I did in the past on Windows 7 indicated that is the case on 8-bit displays - so we mostly need dithering for correcting gamma via LUTs, but if for some reason that automatic dithering did not happen, this should fix that too.
The 6-bit dithering mode is not as useless as it might sound. DACs can have limited precision and do their own dithering which might be of worse quality. Alternatively there could be linearity issues within the DAC. I would recommend visually inspecting how smooth gradients look with 8-bit vs 6-bit dithering.

Whether the DAC (by DAC I of course mean something like a DP to VGA converter) does its own dithering or not is not so obvious, simply because 6-bit + dithering vs 8-bit won't be very noticeable if the dithering is somewhat decent. It would however be better in that case to use 6-bit dithering on the GPU, because the GPU will dither using pixels in both the X and Y directions, while DACs will only use the X direction.
I mention all this because I get the impression my Delock does not really have full 8-bit precision. Maybe it is just an impression and something else causes it. I need to do more investigation.
That's amazing and will definitely improve IQ on early-2000s CRTs and analog LCDs. It's insane how auto-dithering has been featured in AMD's drivers since 1998/1999 while Nvidia only came to have it in 2025.

10-bit Temporal Dithering is the end game for getting rid of every trace of banding. NVCP even allows up to 12-bit Dithering.
 
So what are the chances Synaptics would provide the datasheet and other technical documentation for the VMM2322 now that it's discontinued?

Because there are still tons of chips on Alibaba, not to mention the chips in useless laptop docks that could theoretically be repurposed.

I imagine using the chip strictly for a VGA DAC could be fairly simple if you have the datasheet.
 
That's amazing and will definitely improve IQ on early-2000s CRTs and analog LCDs.
Analog vs digital doesn't make a difference for LCDs. For 6-bit panels with FRC (as opposed to A-FRC) you should get image quality improvements by using 6-bit temporal dithering on the GPU.
For CRTs we only need dithering if we touch gamma. Most DACs should support 8-bit - it is just not enough to allow changing gamma without introducing banding.

It's insane how auto-dithering has been featured in AMD's drivers since 1998/1999 while Nvidia only came to have it in 2025.
It has been possible to enable temporal dithering on Nvidia hardware for years.
It required adding/changing some registry keys, and at some point (depending on driver and/or Windows version) dithering started disabling itself when putting the computer to sleep/hibernation.

10-bit Temporal Dithering is the end game for getting rid of every trace of banding. NVCP even allows up to 12-bit Dithering.
Yup.
Normally Nvidia only uses dithering for "Limited" range, so if the monitor supported limited range it could be used to force dithering. It was not that great because it introduced dithering where it wasn't needed. The normal Windows desktop runs at 8-bit, and an 8-bit connection at full range is enough to avoid banding - at least if the gamma LUTs are not changed.

It also means that enabling dithering has no downsides. For an 8-bit desktop where gamma is not changed, we can safely use 8- or 10-bit with temporal dithering and never get any dithering noise.
 
Is this possible?

VGA DAC Silverstone EP16
with a pixel clock of 412 MHz?

https://www.silverstonetek.com/en/product/info/accessories/EP16/

https://www.ebay.com/str/aerocooler

https://www.ebay.com/itm/254113997682?_trkparms=amclksrc=ITM&aid=111001&algo=REC.SEED&ao=1&asc=279998&meid=d2a7431860744ed5ab6ac2e820385009&pid=102022&rk=2&rkt=20&sd=387244466712&itm=254113997682&pmt=1&noa=1&pg=2332490&brand=Silverstone&_trksid=p2332490.c102022.m5276

Model No.: SST-EP16C
Color: Charcoal
Material: Aluminum + Plastic
Chipset: VIA VL100, ITE IT6562FN
Signal input: USB 3.1 Type-C Male
Power requirement: USB bus power
Signal output: VGA connector x1, HDMI connector x1
Operation temperature: 0°C ~ 45°C
HDMI port resolution support: 4096 x 2160 @30Hz (max)
VGA port resolution support: 1920 x 1200 @120Hz (max)

ep16-edm-en.jpg
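For a rough sanity check on that 412 MHz question: the pixel clock is just total pixels per frame times refresh rate. The blanking fractions below are my own assumptions (roughly CVT reduced blanking vs CRT-style blanking), not anything from Silverstone's spec sheet.

```python
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank_frac, v_blank_frac):
    """clock = htotal * vtotal * refresh; blanking given as a fraction of the total."""
    h_total = h_active / (1 - h_blank_frac)
    v_total = v_active / (1 - v_blank_frac)
    return h_total * v_total * refresh_hz / 1e6

# The adapter's rated VGA maximum, 1920x1200 @ 120Hz, under two blanking guesses:
print("reduced blanking  :", round(pixel_clock_mhz(1920, 1200, 120, 0.08, 0.03)), "MHz")
print("CRT-style blanking:", round(pixel_clock_mhz(1920, 1200, 120, 0.25, 0.05)), "MHz")
# Compare those figures against the 412 MHz mode you have in mind.
```

Whatever blanking you assume, comparing those numbers against 412 MHz gives a first idea of how much headroom the rated 1920x1200@120 ceiling actually implies.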
 