24" Widescreen CRT (FW900) From Ebay arrived, Comments.

You need all three channels when performing the VMC and HMC adjustments, so cyan and yellow simply won't give you the information you need to perform these adjustments. jbl's info in his guide is incorrect about what the VMC and HMC sliders do.

I don't have it with me (I'll have to dig up my document) but what was wrong about it exactly? I can revise it, if need be.

If I recall correctly, HMC adjusts the magenta beam (red and blue concurrently) with respect to Green on the horizontal axis (so if you have a crosshatch pattern and you move the slider around, green and magenta vertical bars separate). VMC does the same thing but on the vertical axis. Is that not the case?

EDIT: I reread your original explanation. I think I know what the problem is. My document probably suggested that the green gun was also moved using those sliders when in fact green is stationary during their adjustment.
 
EDIT: I reread your original explanation. I think I know what the problem is. My document probably suggested that the green gun was also moved using those sliders when in fact green is stationary during their adjustment.

Yep, this. It's important to have the green line displayed in the pattern, since it acts as a reference to judge the symmetry of the red and blue lines (when adjusting the HMC and VMC).

This evening, if I have time, I'm gonna try and map out which regions of the screen are affected by adjustments in each of the 135 WinDAS zones. Need to figure out how much the regions overlap from zone to zone. Once that's known, it's easier to choose which regions to optimize in each zone (if you don't do this, then you end up undoing a lot of the work from a previous zone when working on a subsequent zone).

It's also much easier to adjust convergence for horizontal lines, so I need to practice more with vertical lines. I think that creating a test pattern that uses a color where each of the R, G, B channels has the same luminance will make this easier (similar to what flod suggested).

If I figure this out, I'll create a test pattern that demarcates which region to optimize for each of the zones.
 
Yep, this. It's important to have the green line displayed in the pattern, since it acts as a reference to judge the symmetry of the red and blue lines (when adjusting the HMC and VMC).

This evening, if I have time, I'm gonna try and map out which regions of the screen are affected by adjustments in each of the 135 WinDAS zones. Need to figure out how much the regions overlap from zone to zone. Once that's known, it's easier to choose which regions to optimize in each zone (if you don't do this, then you end up undoing a lot of the work from a previous zone when working on a subsequent zone).

It's also much easier to adjust convergence for horizontal lines, so I need to practice more with vertical lines. I think that creating a test pattern that uses a color where each of the R, G, B channels has the same luminance will make this easier (similar to what flod suggested).

If I figure this out, I'll create a test pattern that demarcates which region to optimize for each of the zones.

Gotcha. Feel free to edit that verbiage then. :)

I'm a little confused though. What are you getting by using White instead of Yellow for the HMC and VMC adjustments? Green and Red are both visible, and while you cannot see blue, HMC and VMC adjust both blue and red in the same way simultaneously, so it's a non-issue. Or is that not what you're seeing with your monitor?
 
Gotcha. Feel free to edit that verbiage then. :)

I'm a little confused though. What are you getting by using White instead of Yellow for the HMC and VMC adjustments? Green and Red are both visible, and while you cannot see blue, HMC and VMC adjust both blue and red in the same way simultaneously, so it's a non-issue. Or is that not what you're seeing with your monitor?

In image below, I've attempted to show what the HAMP and HMC sliders do. Adjusting HAMP will transform A into B, and adjusting HMC will transform C into D. So what I do is to first adjust HAMP until the lines are maximally separated. Then I adjust HMC until the lines look like they do in A. Then I adjust HAMP until the lines come together. If you don't center the lines using HMC, and try to bring them together using HAMP, they'll never meet in the middle.

And you need all three channels to be visible in order to make sure the lines are centered when making HMC adjustments.
 

Attachment: DynamicConvergenceSliders.png
In image below, I've attempted to show what the HAMP and HMC sliders do. Adjusting HAMP will transform A into B, and adjusting HMC will transform C into D. So what I do is to first adjust HAMP until the lines are maximally separated. Then I adjust HMC until the lines look like they do in A. Then I adjust HAMP until the lines come together. If you don't center the lines using HMC, and try to bring them together using HAMP, they'll never meet in the middle.

And you need all three channels to be visible in order to make sure the lines are centered when making HMC adjustments.

Fair enough, but I guess I'm still a little confused. Wouldn't doing the magenta using the "AMP"s and then the "MC"s afterward achieve the same purpose? Or is that what you're trying to find out? I'm not trying to shoot holes in what you're trying to do. It's just that when I did this process for hours and hours and hours I found that first converging magenta and then converging yellow ended with the tightest results. My 900 was the best converged set I had, and I had an F-520 from Unkle Vito and an Artisan - both with excellent convergence.

If you do manage to get it even tighter then that's awesome, I'm just bringing this up because I'd hate for you to spend more hours on this and have it be for naught.
 
Fair enough, but I guess I'm still a little confused. Wouldn't doing the magenta using the "AMP"s and then the "MC"s afterward achieve the same purpose?

Ah, I see your logic, I never thought about that approach.

So first, get them to meet using AMP, then use MC to bring them to the middle.

My bad jbl, you were right all along (minus the part about your description of what the MC slider does).

And please continue to shoot holes :)
 
Ah, I see your logic, I never thought about that approach.

So first, get them to meet using AMP, then use MC to bring them to the middle.

My bad jbl, you were right all along (minus the part about your description of what the MC slider does).

And please continue to shoot holes :)

No problem. Glad we came to an understanding of what I was talking about. :) Yeah, it's been a long time since I've done this but I remember having the "Eureka!" moment of figuring out the two-pass method of converging the monitor. It ended up being quicker AND tighter. Tighter because you could adjust each set of sliders in isolation, and quicker because your work was much more focused (and you don't have to go back as much to do rework and corrections).
 
Could anyone help out and tell me, without spending 2 weeks going through 420 pages of information, what is the easiest way to get my FW900 working again with my Founders Edition RTX 2080? I finally gave in a few years back and replaced it after I built a new computer and my old GTX 780 was the last card that could run VGA.
Would like to get it to run at native 2560x1440 at at least 85Hz. Is there an adapter or a converter that would allow me to run it with the RTX 2080 without doing anything crazy?
 
My FW900 has an issue. It becomes very blurry within the first hour or so after I turn it on, sometimes instantly and sometimes slowly. To get it sharp again I need to power cycle the monitor. Degaussing doesn't help. It usually takes one time, sometimes 2 and rarely more. After that it works fine for hours unless I turn it off again.
It doesn't make any pop sounds. It works perfectly otherwise: good geometry, very good sharpness and convergence, picture quality is great at 30 brightness / 90 contrast.
The issue itself doesn't bother me that much, but I'm worried that it can grow into something bigger with time.
Any ideas? Could it be just a bad cap or some solder issue? Or is it flyback / tube dying?
 
Could anyone help out and tell me, without spending 2 weeks going through 420 pages of information, what is the easiest way to get my FW900 working again with my Founders Edition RTX 2080? I finally gave in a few years back and replaced it after I built a new computer and my old GTX 780 was the last card that could run VGA.
Would like to get it to run at native 2560x1440 at at least 85Hz. Is there an adapter or a converter that would allow me to run it with the RTX 2080 without doing anything crazy?

The Vention USB C to VGA Adapter looks like a good bet.
 
My FW900 has an issue. It becomes very blurry within the first hour or so after I turn it on, sometimes instantly and sometimes slowly. To get it sharp again I need to power cycle the monitor. Degaussing doesn't help. It usually takes one time, sometimes 2 and rarely more. After that it works fine for hours unless I turn it off again.
It doesn't make any pop sounds. It works perfectly otherwise: good geometry, very good sharpness and convergence, picture quality is great at 30 brightness / 90 contrast.
The issue itself doesn't bother me that much, but I'm worried that it can grow into something bigger with time.
Any ideas? Could it be just a bad cap or some solder issue? Or is it flyback / tube dying?

Mine does that from time to time. In the past, it would resolve itself with a pop. Nowadays, for the last year or so, it resolves itself if I turn monitor on and off. Sometimes I'll go for a month without it happening, sometimes it happens every day for a few days. Not a big issue for me. Not sure what causes it.
 
Just did some more experimentation with dynamic convergence. Each zone of adjustment (there are 135 of them) affects an area that seems to be shaped like an ellipse, with a horizontal axis of about 17 cm, and a vertical axis of 7cm. The exception is for the zones near the edges and corners. The zone location appears to represent the center of the ellipse (though I didn't verify this), so if you're near an edge, the affected region will be half of the ellipse, and if you're near a corner, the affected region will be a quarter of the full ellipse.

So now the question is this: When adjusting each zone, do you optimize convergence for the center of the ellipse, or for the edges? I think the answer to this depends on how much overlap there is between the zones. With no overlap, you'd optimize for center. But they definitely overlap.

Will take more measurements next time I have time.
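The measurements above can be turned into a rough overlap map. Below is a minimal Python sketch of that idea; the 15x9 grid layout, the ~47x30 cm visible area, and the even zone spacing are all my assumptions (only the 135-zone count and the 17 cm x 7 cm ellipse come from the post):

```python
# Hypothetical sketch of mapping WinDAS convergence zones to screen regions.
# ASSUMPTIONS (not confirmed): a 15x9 grid of 135 zones spread evenly over a
# ~47x30 cm visible area, each zone influencing a 17 cm x 7 cm ellipse
# centered on the zone location.

SCREEN_W, SCREEN_H = 47.0, 30.0     # cm, approximate visible area (assumed)
COLS, ROWS = 15, 9                  # 135 zones; grid shape is an assumption
SEMI_A, SEMI_B = 17.0 / 2, 7.0 / 2  # semi-axes of the influence ellipse

def zone_center(col, row):
    """Center of a zone, spacing the grid evenly across the screen."""
    x = (col + 0.5) * SCREEN_W / COLS
    y = (row + 0.5) * SCREEN_H / ROWS
    return x, y

def zones_affecting(px, py):
    """Return grid indices of every zone whose ellipse covers point (px, py)."""
    hits = []
    for row in range(ROWS):
        for col in range(COLS):
            cx, cy = zone_center(col, row)
            if ((px - cx) / SEMI_A) ** 2 + ((py - cy) / SEMI_B) ** 2 <= 1.0:
                hits.append((col, row))
    return hits
```

Under these assumptions a point at screen center falls inside several zones' ellipses at once, which is exactly the overlap that makes "optimize for center vs. edges" a real question.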
 
Where on the product page does it mention strobing? Don't see any mention of ULMB

Not sure about where on the product page, but it has a mode called MBR (motion blur reduction). Unlike other monitors, it allows you to adjust the pulse width of the strobe, which in turn allows you to dial in the brightness. Unlike my Samsung monitor, whose pulse is fixed in that mode.
 
Not sure about where on the product page, but it has a mode called MBR (motion blur reduction). Unlike other monitors, it allows you to adjust the pulse width of the strobe, which in turn allows you to dial in the brightness. Unlike my Samsung monitor, whose pulse is fixed in that mode.

Can you make it pulse at 85hz, if you want to play a game at 85fps?
 
Can you make it pulse at 85hz, if you want to play a game at 85fps?

I'm testing it right now and yes - it appears you can! I can definitely see it flicker a little and I can easily tell the difference between MBR mode being on and off at 85hz. Thanks for asking! I never would have guessed it would support this mode at that speed. I tried 72hz and it didn't work, so I'm not sure what the limit is.
 
I'm testing it right now and yes - it appears you can! I can definitely see it flicker a little and I can easily tell the difference between MBR mode being on and off at 85hz. Thanks for asking! I never would have guessed it would support this mode at that speed. I tried 72hz and it didn't work, so I'm not sure what the limit is.

It's 85hz for a lot of ULMB monitors. Not helpful for 60hz locked games but it's something.
 
Where on the product page does it mention strobing? Don't see any mention of ULMB

It's 1 ms of MPRT (moving picture response time)

MPRT is basically how long a pixel is visible for if you were to instruct the display to flash that pixel as briefly as possible (I think another way of saying this is that it's the width of the temporal impulse response function).

It combines both the time due to sample and hold (i.e. how long the display is "energizing" that pixel for), and the time that is required for the pixel to decay* once the display stops "energizing" the pixel.

The only way you'd be able to achieve 1 ms of MPRT with a sample and hold display (i.e. a non strobed display), is if the display were running at 1000 hz, and had an infinitely fast decay time (if you had a display that ran at 2000 hz, and had a 0.5 ms decay time, you'd also have a 1 ms MPRT).

The higher the MPRT, the more motion blur you get (i.e. the wider the blur trail, at any given pixel velocity).

*I'm ignoring pixel rise time, since it overlaps with the sample and hold time.

There's a fairly technical discussion here.
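The arithmetic in that explanation is simple enough to sketch. This is my own illustration of the reasoning above, not a formal definition: persistence is hold time plus decay time, and blur trail width scales with persistence.

```python
# Rough sketch of the MPRT arithmetic described above (an illustration,
# not a formal measurement standard).

def mprt_ms(refresh_hz, decay_ms=0.0, strobe_pulse_ms=None):
    """Approximate MPRT in milliseconds.

    For a sample-and-hold display the pixel is lit for the whole frame
    (1000 / refresh_hz ms); for a strobed display it is lit only for the
    strobe pulse. Decay time is added on top in both cases.
    """
    hold = strobe_pulse_ms if strobe_pulse_ms is not None else 1000.0 / refresh_hz
    return hold + decay_ms

def blur_width_px(mprt_ms_value, pixel_speed_px_per_s):
    """Blur trail width: how far the image moves while a pixel is visible."""
    return pixel_speed_px_per_s * (mprt_ms_value / 1000.0)
```

This reproduces the two cases in the post: a 1000 Hz sample-and-hold display with instant decay gives 1 ms, and a 2000 Hz display with a 0.5 ms decay also gives 1 ms.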
 
Has anyone tried using nvidia optimus or similar tech on desktop instead?

this way you can output video game through onboard intel dsub vga instead of using an external dac, no?
 
Mine does that from time to time. In the past, it would resolve itself with a pop. Nowadays, for the last year or so, it resolves itself if I turn monitor on and off. Sometimes I'll go for a month without it happening, sometimes it happens every day for a few days. Not a big issue for me. Not sure what causes it.

That's reassuring. Hopefully mine lasts at least a couple of years too.

Yes, it's strobed. It's roughly *similar* to a Trinitron with medium-persistence phosphor. BUT - it's still not as good. Close, but CRT is still better.

On a scale of 1 to 10, how close is it?

Has anyone tried using nvidia optimus or similar tech on desktop instead?

this way you can output video game through onboard intel dsub vga instead of using an external dac, no?

I've tried Lucid Virtu MVP. It only worked well in one game, and it's no longer relevant. Wasn't able to find another working solution.
 
On a scale of 1 to 10, how close is it?

Let me think about this for a bit. 1-10 scale is a little difficult to answer but I can say with certainty that it's close enough not to be an issue. In full disclosure, if you have a FW-900 or other high-end CRT, there's really no reason to get rid of it unless it becomes non-functional (or some other variable changes that you have to get rid of it). No gaming LCD can touch it in overall quality.

EDIT: I realize by saying this I'm probably helping to drive up the cost of used ones but seriously, until OLED becomes more affordable, and rolling scan is implemented, I don't see the FW-900 being dethroned anytime soon.

Double-Edit: I dug up my post when I first got my Samsung monitor:

"Yeah, I personally saw no flicker with my monitor at 100hz. Definitely not 120hz or 144hz. The VA panel's slow transition times made 144hz a waste though. Honestly, for high-res gaming, if I could give CRT a 10/10, then the Samsung was like a 7.5/10 without ULMB enabled, and a 8.5~9.0 with it enabled. If the backlight was adjustable so that I could get some good contrast, then bump it up to 9.5. It was pretty darn close to CRT."

I would rate the AOC a 9.0. It comes very close to CRT and unlike the Samsung, CAN adjust the backlight. Ultimately, the CRT's still have slightly better image quality and have better motion clarity. So my assessment still stands. CRT are still the kings of gaming. But for every other use this monitor is superior.
 
jbl, do you have a CRT hooked up? If so, would be interesting to do an experiment.

If I'm not mistaken, when a strobed backlight is used, the "pixel decay" is instantaneous, which means that the blur trail would have a sharp edge.

With a CRT, the pixel decays smoothly, which means that the blur trail has a soft edge. You can see this on a CRT if you move your mouse back and forth over a black background (especially if you have deep blacks). The trail looks very natural and smooth.

Can you report on what the blur trail looks like on your AOC, and compare it to what it looks like on your CRT? If you don't have a CRT, just tell us what it looks like on the AOC.
 
I don't see how LCD can compete with CRT and other emissive tech without some kind of FALD or such. I had a VA panel for a bit. They've probably gotten better, but in a darker room its version of black was still so bright.

Then an F520 became available to me and pulled me back in...
 
I don't see how LCD can compete with CRT and other emissive tech without some kind of FALD or such. I had a VA panel for a bit. They've probably gotten better, but in a darker room its version of black was still so bright.

Then an F520 became available to me and pulled me back in...

I didn't say it can compete, if you're referring to my assessment. :) I think it's a decent enough substitute that comes close enough. I'll always prefer CRT until OLED makes its way to being more affordable, but it is what it is.
 
jbl, do you have a CRT hooked up? If so, would be interesting to do an experiment.

If I'm not mistaken, when a strobed backlight is used, the "pixel decay" is instantaneous, which means that the blur trail would have a sharp edge.

With a CRT, the pixel decays smoothly, which means that the blur trail has a soft edge. You can see this on a CRT if you move your mouse back and forth over a black background (especially if you have deep blacks). The trail looks very natural and smooth.

Can you report on what the blur trail looks like on your AOC, and compare it to what it looks like on your CRT? If you don't have a CRT, just tell us what it looks like on the AOC.

I no longer have PC CRT monitors (I have a couple of PVM's though). But yes, the LCD ghosting is a lot "harder" than the CRT ghost trail. CRT definitely looks better.
 
I would rate the AOC a 9.0. It comes very close to CRT and unlike the Samsung, CAN adjust the backlight. Ultimately, the CRT's still have slightly better image quality and have better motion clarity. So my assessment still stands. CRT are still the kings of gaming. But for every other use this monitor is superior.

Thanks for the input!
 
Thanks for the input!

No problem. I'm hoping to get some more calibration work done on it tomorrow afternoon/evening. I just found another weird setting that alters the way the image outputs so I want to dial it in to achieve even greater perfection. But so far, so good.
 
I tested the Delock 87685 this weekend. I can confirm it has exactly the same issues as the Sunix DPU3000 alternative: unstable around 2048x1536 (my Dell P1130 and F520s go into standby mode when switching to that resolution more often than not); and also screen displacement and jittering especially at higher resolutions.
The future looks bleak for CRTs ...and I have 2 GDM-F520s in perfect condition (both the antiglare and electronics).
 
I tested the Delock 87685 this weekend. I can confirm it has exactly the same issues as the Sunix DPU3000 alternative: unstable around 2048x1536 (my Dell P1130 and F520s go into standby mode when switching to that resolution more often than not); and also screen displacement and jittering especially at higher resolutions.
The future looks bleak for CRTs ...and I have 2 GDM-F520s in perfect condition (both the antiglare and electronics).

The sunix isn't the only contender out there. The Vention looks like a great option too, though this requires usb-c
 
What is the highest resolution / refresh rate that it can handle without problems?
"Without problems" is relative. 1920x1440 doesn't shut down the monitor but it still has occasional jittering. The image trembles. Besides that, I noticed a decrease in brightness and sharpness on my two F520s that wasn't apparent on my Dell P1130. At the current state and price, both the Sunix and Delock versions are rip-offs.
I'm sticking with my GTX 980 for as long as I can...
 
What is the highest resolution / refresh rate that it can handle without problems?

Very high resolutions are usually fine. Like 2304x1728 @75hz doesn't give me any problems. But 2048x1536 and resolutions just above it are usually buggy.

And then lower resolutions can get jittery at certain refresh rates. Like 75hz can get shaky at resolutions below 1536p, but not all of them.

I bet if we did enough testing, we could discover a pattern. But hopefully the Sunix is only a holdover until the HD Fury 5 comes out and gives us the functionality we need
 
Can the Sunix handle higher refresh rates like 1600x1000 @ 110Hz, or 1280x800 @ 130Hz? Where can I learn more about HD Fury 5? Can't find much info about it.
 
Can the Sunix handle higher refresh rates like 1600x1000 @ 110Hz, or 1280x800 @ 130Hz

Yes to both, easily. The limit I've hit is around a 550 MHz pixel clock, so you can do the math from there.
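For anyone doing that math, here is a back-of-the-envelope estimate. The ~32% blanking overhead is an assumption (roughly GTF-style timings); the exact total depends on the modeline actually used.

```python
# Back-of-the-envelope pixel clock estimate for a given mode.
# ASSUMPTION: ~32% blanking overhead (roughly GTF-style timings); the
# real overhead depends on the specific modeline.

BLANKING_OVERHEAD = 1.32  # assumed ratio of total pixels to active pixels

def pixel_clock_mhz(width, height, refresh_hz, overhead=BLANKING_OVERHEAD):
    """Estimated pixel clock in MHz for a given resolution and refresh rate."""
    return width * height * refresh_hz * overhead / 1e6
```

Under that assumption, 1600x1000 @ 110Hz comes out around 230 MHz and 1280x800 @ 130Hz around 175 MHz, both comfortably under a ~550 MHz limit.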

Where can I learn more about HD Fury 5? Can't find much info about it.

They haven't really talked about it except to occasionally say "yes, we're still working on it". I don't think it is a high priority for them since they make more money from their home theater products.
 