24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Yes, that seems to be true. Out of curiosity, I popped open some of my HDMI to VGA converters to see what was inside. The overachieving Gembird converter is the one in the first photo and, as expected, it does not use the CH7101 chipset as advertised. Does anybody know what this "LK7112" chip is? I couldn't find any info with some cursory Googling, but it is really good. It's better than both my Delock 62967 and Startech DP2VGAHD20 - as mentioned before, it will go to 395 MHz before green pixels start to appear.

The second pic is from a Walmart-brand HDMI converter that I've had for a while. This one really does use the CH7101, and performs about as well as Petrasescu_Lucian's unit. It tops out a little under 270 MHz, and the most it can reliably display is 1920x1440 at 60Hz.



2048x1536 at 86Hz is the max resolution/refresh rate for the 2070SB, as specified in the user manual. That's why I typically use it as a benchmark. In practice, though, I run it at 1440x1080 most of the time so everything on the desktop is the same size as on my computer's main LCD.
Very interesting!
Can you contact Gembird about this? It would be great if they could replace all their HDMI-to-VGA converters with this LK7112 chip!
 
Add to that the side-swapping and occasional trembling/jitter at any resolution, and it's a complete no-go for me. These adapters are crap; I don't care if they can go "higher" when they are unstable as hell. Enough said.
Yeah, that stuff is annoying, but for 60fps-locked games I like to run really high resolutions to take advantage of the GPU overhead. So 3200x1800 @ 60Hz for 16:9 games, 2880x2160 @ 60Hz for 4:3 games.

And I also have other adapters to fall back on if I'm trying to play at some lower resolution that is getting jittery on me. I still haven't found a good adapter that runs the side-swapping resolutions (which I believe happen in the 2048-2560 horizontal pixel range, or maybe it's more like 1536-1800 vertical lines).
 
The second pic is from a Walmart-brand HDMI converter that I've had for a while. This one really does use the CH7101, and performs about as well as Petrasescu_Lucian's unit. It tops out a little under 270 MHz, and the most it can reliably display is 1920x1440 at 60Hz.
I said in my previous posts that the limit is 191 MHz, not 270 MHz.
I opened up my Gembird A-HDMI-VGA-04 and guess what? :)
Not even a lowly CH7101. The chipset is CS5210.
What a sham of a company!
 

Attachment: 20210925_195450.jpg
My question got buried in the comments, so I'll repeat it: does the Startech adapter output to HDMI 2.0 and VGA at the same time?

And I also have other adapters to fall back on if I'm trying to play at some lower resolution that is getting jittery on me. I still haven't found a good adapter that runs the side-swapping resolutions (which I believe happen in the 2048-2560 horizontal pixel range, or maybe it's more like 1536-1800 vertical lines).
I would assume it has more to do with pixel clock than with line count or refresh rate.
Like some frequencies not working correctly due to clock issues, stray signals and such.
If I had such an adapter, I would see if adding shielding over the components helps. This might also be an issue with the input rather than the VGA output stage. Was this verified somehow?

I said in my previous posts that the limit is 191 MHz, not 270 MHz.
I opened up my Gembird A-HDMI-VGA-04 and guess what? :)
Not even a lowly CH7101. The chipset is CS5210.
What a sham of a company!
It is one of the cheapest converters, and the chipsets used allow for the advertised 1080p60.
The fact that someone got a unit with a different, better chipset doesn't make the company a sham.

2048x1536 at 86Hz is the max resolution/refresh rate for the 2070SB, as specified in the user manual. That's why I typically use it as a benchmark. In practice, though, I run it at 1440x1080 most of the time so everything on the desktop is the same size as on my computer's main LCD.
1440x1080 is a great resolution to use on a 4:3 monitor.
For the FW900 I usually end up using 1920x1200@97Hz or even 1920x1080@107Hz, and that is it.

In the past I used lower resolutions. When I got the monitor I had a GTX 460 and played mostly at 1280x800 or 720p, at least in then-new games. 1440x810@142Hz (or 141Hz) was one of the resolutions I used for quite some time. It is a fun resolution because it has very slightly visible scanlines, giving the image a retro style. It also flickers much less than higher modes.
 
The fact that someone got a unit with a different, better chipset doesn't make the company a sham.
So far there are at least two Gembird A-HDMI-VGA-04 converters that use chipsets different from the one advertised on the box (CH7101), so I stand by my words.
 
I would assume it has more to do with pixel clock than with line count or refresh rate.
Like some frequencies not working correctly due to clock issues, stray signals and such.
If I had such an adapter, I would see if adding shielding over the components helps. This might also be an issue with the input rather than the VGA output stage. Was this verified somehow?

The side-switching on the Sunix adapters seems to have nothing to do with pixel clock. It never happens at 2024x1518 at any refresh rate, but the second you go over 2048x1536 it happens all the time, regardless of refresh rate. It starts to go away once you get to really high resolutions, like 1800p.

The other issues, I don't know. They seem like digital glitches rather than analog interference, though. You'd have to own a Sunix to know what we're talking about.
 
I said in my previous posts that the limit is 191 MHz, not 270 MHz.
I opened up my Gembird A-HDMI-VGA-04 and guess what? :)
Not even a lowly CH7101. The chipset is CS5210.
What a sham of a company!

It's not even a different chipset, it's also a different PCB every time, what a mess. I bet they just grab any adapter they can put their hands on, manufactured by other companies, and sell it under the same reference with their brand on it. :ROFLMAO:
 
As long as the chipset used is at least as good as the one advertised, it is fine...

Supported resolutions up to 1920p
LOL
I really do not think they actually meant 1920 lines.
Though maybe they meant 1080x1920p60, that should be doable 🤣

Anyway, can anyone with the Startech confirm that it outputs VGA and HDMI at the same time?
It would be super cool to be able to run the GDM-FW900 and an LCD at the same time, at the same resolutions and refresh rates. Also to measure input lag.
 
It could have something to do with aging capacitors or something like that. It looks as if some voltage needs to stabilize first, and then it works just fine. It's only during a cold startup after hours (= overnight) of being turned off.

Yeah, it's all bonkers. Not just the blacks: it's all a bit washed out and in overdrive. I have tried on-target G2 and also faking G2 higher and lower than on-target. I have performed WPB many times before with success (on the FW900) but always had this problem on F520s - I have two of them and both have the same WPB issue (and also the flashing at startup, just with a different gun on each).

I was not paying attention throughout the whole thread, but is there actually someone here who has an F520 and did the WPB with success?
jka I spent several hours today going through two separate WPB procedures for my F520. In both I had no problems meeting the targets during the calibration process, and both, after finalizing, are plagued by a hazy, strongly green-tinged picture and a noticeably blurry OSD. In the case of this monitor, I received it with a very blown-out, green-tinged picture that couldn't be controlled after a .dat file text-editor G2 tweak or OSD expert-mode RGB brightness/contrast adjustment.

Glad I found your posts, but I'm also stumped as to how to proceed, as doing yet another WPB procedure doesn't seem very appealing!
 
Anyway, can anyone with the Startech confirm that it outputs VGA and HDMI at the same time?
It would be super cool to be able to run the GDM-FW900 and an LCD at the same time, at the same resolutions and refresh rates. Also to measure input lag.
This adapter uses the IT6224 chipset, which is like the IT6564 of the Startech DP2VGAHD20 but with native USB-C input (information obtained from ITE and Delock).
In the description you'll see: "Two monitors can be operated simultaneously at the outputs, the same image is displayed on all ports".
So it "should" be the same thing with the DP2VGAHD20.
 
jka I spent several hours today going through two separate WPB procedures for my F520. In both I had no problems meeting the targets during the calibration process, and both, after finalizing, are plagued by a hazy, strongly green-tinged picture and a noticeably blurry OSD. In the case of this monitor, I received it with a very blown-out, green-tinged picture that couldn't be controlled after a .dat file text-editor G2 tweak or OSD expert-mode RGB brightness/contrast adjustment.

Glad I found your posts, but I'm also stumped as to how to proceed, as doing yet another WPB procedure doesn't seem very appealing!
Were you using a colorimeter?

Also, what approach did you use to set the G2 levels during those first two early steps? (one where there is black, and one with a greenish black)
 
Were you using a colorimeter?

Also, what approach did you use to set the G2 levels during those first two early steps? (one where there is black, and one with a greenish black)
Yes, I used an X-Rite ColorMunki Display.

I used HCFR's 1% and 2% moving black bars for the first G2 adjustment: I drop G2 until I can't see the 1% bar and can just barely make out the 2% bar. I'm not sure what the other G2 adjustment you are referring to is?
 
Yes, I used an X-Rite ColorMunki Display.

I used HCFR's 1% and 2% moving black bars for the first G2 adjustment: I drop G2 until I can't see the 1% bar and can just barely make out the 2% bar. I'm not sure what the other G2 adjustment you are referring to is?
On the FW900, there is a second step where you do the same thing, but I believe only the green gun is active. Perhaps this step doesn't exist on the F520. My thinking was that the poor results you're experiencing may have been due to the two steps being adjusted differently relative to each other.

Edit: just re-read your original post. If I'm understanding correctly, the blown-out greenish picture was there before the WPB adjustment, and the adjustment didn't fix it, right? I originally thought the WPB caused the issue.
 
On the FW900, there is a second step where you do the same thing, but I believe only the green gun is active. Perhaps this step doesn't exist on the F520. My thinking was that the poor results you're experiencing may have been due to the two steps being adjusted differently relative to each other.

Edit: just re-read your original post. If I'm understanding correctly, the blown-out greenish picture was there before the WPB adjustment, and the adjustment didn't fix it, right? I originally thought the WPB caused the issue.
The greenish tinge was there before the adjustment and afterwards. I did another WPB yesterday and got rid of the green tinge by using IRE 0 for the Cutoff Max adjustment; it seems like that may have been the setting you were pointing out. I had been using the 1 target on the HCFR near-black scale (it reads as RGB 2/2/2), since WinDAS asks for 6.5 cd/m2, which felt very far from black to me, but that looks to have been a mistake on my part.

However, now I have another issue: despite being able to hit every adjustment target and calibrating G2 for deep blacks, my picture after finalizing the adjustments is still very far from being able to achieve both deep blacks and the full white luminance achieved during the adjustments. Is it misguided of me to believe this should be possible? I've tried flipping through all 3 colour temps and none of them come close to the blacks and full white luminance achieved during calibration!

For example, my IRE 100 during adjustment: [screenshot]

And my IRE 100 after finalizing the WPB procedure and tweaking the OSD brightness for good blacks and as high a luminance as possible: [screenshot]

This reading was taken using the dynamic mode, which appears to be the brightest of the 3, and the IRE 100 is still off by 28 cd/m2!

Now this reading may not seem so bad, but something else rather odd has been going on. The G2 was originally set to 162 by the previous owner. After a number of adjustments via .dat text editing that seemed to do nothing for the green tinge/blown-out picture, I adjusted it during the WPB procedure for deep blacks: the first time to 92, then during the next WPB to 91, then lastly to 85, followed by a final post-WPB .dat text-editing adjustment down to a very low 75, which is the setting at which I seem to achieve the best balance between blacks and full white luminance, and at which I took the above screenshot... But 75 seems such an incredibly low setting, and I keep wondering: shouldn't the monitor already be very well adjusted and not in need of further G2 tweaking post-WPB? What the heck is going on??
 
This adapter uses the IT6224 chipset, which is like the IT6564 of the Startech DP2VGAHD20 but with native USB-C input (information obtained from ITE and Delock).
In the description you'll see: "Two monitors can be operated simultaneously at the outputs, the same image is displayed on all ports".
So it "should" be the same thing with the DP2VGAHD20.
Thanks a lot, Derupter

The Startech DP2VGAHD20 looks like a good upgrade over the Delock 62967, as it should allow running the FW900's native resolution of 2304x1440@80Hz. If it can also output to HDMI at the same time, then it can also be used for some testing: input lag, trying to run VRR on a CRT, and seeing what happens in HDR mode, among other silly ideas.

In any case, I ordered one and will report any interesting findings.

Why did you opt for a CRT screen? Was it cheap compared to others, or was it something else?
When I got my FW900, the best LCD monitors for games were 24" TNs with 120Hz and LightBoost 2, and such a monitor was 3-4x more expensive than what I got my CRT for. The quality of the FW900 was leaps and bounds ahead of such an LCD. At that time the FW900 was still the best gaming monitor in the world 🤩

The FW900 is no longer the best gaming monitor in the world, or at least I do not think it is. It is, however, still a hella good way to play games. Especially when running new games that do not have any aliasing, the impressions on the FW900 are incredible. The main issue of CRT for me is the lack of VRR. It really makes gaming on modern displays much less hassle. When not running a game at a perfectly synced frame and refresh rate, the main advantage of strobed displays like CRT is lost.

In any case, CRTs are great for their unique organic look. In the past, when I took a break from CRT and came back, I was amazed at how good the image quality was. Today it is not so much in that direction, but that is due to using very high quality IPS monitors. Even though I am at first disappointed that the colors are not as vibrant, the CRT look always grows on me, and it does so quickly, and I switch to considering the CRT to have the best colors.

If you do not have a CRT and are interested, then please get one, especially if you have never used a VGA CRT monitor. Not necessarily an FW900, as those are hard to get. There are a lot of 4:3 monitors that can be grabbed even for free, and those will show you what CRTs are about. You might also need a digital-to-analog converter like the Startech I ordered today. If you fall in love with what you see, you should consider an FW900, but like I said, do not expect high availability.

BTW, for retro consoles and retro computers, CRTs are the best option. Something like SNES or Amiga games can be played using a RetroTINK 5X or OSSC, but you really want a CRT for such games, because they have smooth scrolling, which on a CRT is perfectly sharp, and the graphics just look better.
[attached photo]


This is maybe an extreme example, but this really is CRT magic, and it looks even better in person, as the screen draws the image into your brain by illuminating and flashing what is pretty much a single pixel at a time. For 3D games less so, but when I played, for example, Cyberpunk 2077 on the FW900, I was amazed at how detailed the game is; I then ran the same game on an LCD and was amazed at how imperfect the rendering in this game really is. CRTs simply make games look better than they actually are 🤯

BTW, a similar graphics improvement can be found, for different reasons and even to a greater degree, on PDPs, meaning plasma display panels. For console gaming they are a really great cheap option. Not without their flaws, but it is really amazing what some of yesterday's tech can do.
 
Thanks a lot, Derupter

The Startech DP2VGAHD20 looks like a good upgrade over the Delock 62967, as it should allow running the FW900's native resolution of 2304x1440@80Hz. If it can also output to HDMI at the same time, then it can also be used for some testing: input lag, trying to run VRR on a CRT, and seeing what happens in HDR mode, among other silly ideas.

In any case, I ordered one and will report any interesting findings.


When I got my FW900, the best LCD monitors for games were 24" TNs with 120Hz and LightBoost 2, and such a monitor was 3-4x more expensive than what I got my CRT for. The quality of the FW900 was leaps and bounds ahead of such an LCD. At that time the FW900 was still the best gaming monitor in the world 🤩

The FW900 is no longer the best gaming monitor in the world, or at least I do not think it is. It is, however, still a hella good way to play games. Especially when running new games that do not have any aliasing, the impressions on the FW900 are incredible. The main issue of CRT for me is the lack of VRR. It really makes gaming on modern displays much less hassle. When not running a game at a perfectly synced frame and refresh rate, the main advantage of strobed displays like CRT is lost.

In any case, CRTs are great for their unique organic look. In the past, when I took a break from CRT and came back, I was amazed at how good the image quality was. Today it is not so much in that direction, but that is due to using very high quality IPS monitors. Even though I am at first disappointed that the colors are not as vibrant, the CRT look always grows on me, and it does so quickly, and I switch to considering the CRT to have the best colors.

If you do not have a CRT and are interested, then please get one, especially if you have never used a VGA CRT monitor. Not necessarily an FW900, as those are hard to get. There are a lot of 4:3 monitors that can be grabbed even for free, and those will show you what CRTs are about. You might also need a digital-to-analog converter like the Startech I ordered today. If you fall in love with what you see, you should consider an FW900, but like I said, do not expect high availability.

BTW, for retro consoles and retro computers, CRTs are the best option. Something like SNES or Amiga games can be played using a RetroTINK 5X or OSSC, but you really want a CRT for such games, because they have smooth scrolling, which on a CRT is perfectly sharp, and the graphics just look better.
[attached photo]

This is maybe an extreme example, but this really is CRT magic, and it looks even better in person, as the screen draws the image into your brain by illuminating and flashing what is pretty much a single pixel at a time. For 3D games less so, but when I played, for example, Cyberpunk 2077 on the FW900, I was amazed at how detailed the game is; I then ran the same game on an LCD and was amazed at how imperfect the rendering in this game really is. CRTs simply make games look better than they actually are 🤯

BTW, a similar graphics improvement can be found, for different reasons and even to a greater degree, on PDPs, meaning plasma display panels. For console gaming they are a really great cheap option. Not without their flaws, but it is really amazing what some of yesterday's tech can do.

That organic look really is amazing and photorealistic. I recently played Escape From Tarkov on my FW900, and it was just stunning, with a realistic appearance that even OLED can't replicate.
 
Hi everyone. What are the current best HDMI to VGA adapter(s) for Series X/S and PS5 to FW900 (as of October 2021)? It would be nice if any suggested adapters were available in stock.
 
The Startech DP2VGAHD20 looks like a good upgrade over the Delock 62967, as it should allow running the FW900's native resolution of 2304x1440@80Hz.
I can confirm the Startech DOES NOT support 2304x1440@80Hz. You get white noise. That timing is above the 375 MHz pixel clock limit of the converter. It can only saturate 21" 4:3 monitors with 131 kHz horizontal bandwidth, which you typically use at a maximum of 2048x1536@80Hz, like my Dell P1130.
The GDM-F520, for instance, can go up to 85Hz at that resolution, but the converter can only do 82Hz.
If only we could find converters with that LK7112 chipset... 🥺
 
Hi everyone. What are the current best HDMI to VGA adapter(s) for Series X/S and PS5 to FW900 (as of October 2021)? It would be nice if any suggested adapters were available in stock.
The Xbox Series X can output 1440p, so you might get lucky if you find the right adapter that can support 2560x1440.

The PS5 is a challenge. Maybe there are HDMI adapters that could support 4K at 4:2:0? We've yet to find one.

Otherwise, you're looking at 1080p for both consoles, and just about every adapter can do 1080p just fine. I actually need to sell an adapter that can go a little above 1080p (but not close to 1440p), if you're interested: https://www.ventioncable.com/product/hdmi-to-hdmivga-converter/

$12 shipped, and it has an HDMI out for screen mirroring.
 
I can confirm the Startech DOES NOT support 2304x1440@80Hz. You get white noise. That timing is above the 375 MHz pixel clock limit of the converter.
LOL, I took the wrong claimed MHz number when checking whether it would support 2304x1440@80Hz 😅
Still, imho this resolution should be possible with a 375 MHz converter, just not with the EDID/GTF timings. Those timings are quite conservative, and since not many pixels/lines have to be removed to fit into 375 MHz, I do not think the geometry will be bad with such a mode. This, however, needs actual testing to be sure.
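
A quick way to sanity-check whether a trimmed mode fits under a converter's limit is to just multiply the totals out. A minimal sketch; the timing totals below are illustrative stand-ins, not actual GTF output:

```python
# Sanity-check a custom mode against a converter's pixel clock limit.
# The timing totals below are illustrative, not actual GTF output.

def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock in MHz: total pixels per frame times frames per second."""
    return h_total * v_total * refresh_hz / 1e6

LIMIT_MHZ = 375.0  # the converter limit discussed above

# Hypothetical GTF-style totals for 2304x1440@80Hz:
print(pixel_clock_mhz(3168, 1496, 80))  # ~379.1 MHz -> over the limit
# The same mode with 64 pixels shaved off the horizontal blanking:
print(pixel_clock_mhz(3104, 1496, 80))  # ~371.5 MHz -> fits
```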

If only we could find converters with that LK7112 chipset... 🥺
I think the situation with DP/HDMI to VGA converters is pretty good anyway, considering that when VGA was first removed from graphics cards we didn't really have anything confirmed to do above 225 MHz, and today we can easily get 340 MHz or 375 MHz, and with some issues even past 400 MHz. Not bad at all imho.

Hi everyone. What are the current best HDMI to VGA adapter(s) for Series X/S and PS5 to FW900 (as of October 2021)? It would be nice if any suggested adapters were available in stock.
For all HDMI-only devices that have 1080p, you can use pretty much any active HDMI to VGA adapter, because all of them support enough bandwidth for 1920x1080p at 60Hz.

The only issue might be HDTV timings. 1080p does not have enough horizontal blanking for a CRT, so you can get some kind of distortion. Trinitrons usually show the picture more or less correctly, with distortion visible on the left side and generally somewhat wonky geometry. It is usable, just not perfect. On some monitors 1080p might not work at all or be way too distorted (like both sides being cut off with a rolled image).

What should pretty much always work is 720p. There is enough blanking in 720p to avoid any distortion.
I do not know if 1440p from the Xbox works on a CRT, but in theory it should. You will probably have geometry issues for the same reason as with 1080p, if not more severe ones.
2160p, aka UHD 4K, is out of the question on a CRT.

Personally, I used 720p and 1080p on the PS3 and PS4. On the PS3, since most games run at 720p, it is all fine. On the PS4, 720p is useful for games which render below 1080p, to avoid upscaling; it might very well be that 720p looks better than 1080p, especially considering there are no geometry issues in this mode vs 1080p. I have not tried the PS5 with a CRT yet, but the screen mode timings are always the same, so it should work the same on all devices outputting 1080p. Also, since most games run at 60fps, with most of them rendering at higher resolution, the effect should be very good: no pesky aliasing or FXAA/TAA blur, and extremely sharp, fluid motion.
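
To put numbers on the blanking difference (the horizontal totals here are the standard CEA-861 ones, so this part is not guesswork):

```python
# Horizontal blanking share of the fixed CEA-861 HDTV modes, compared with the
# roughly 25-30% of the line that GTF-style CRT modes typically reserve.

modes = {
    # name: (h_active, h_total) per CEA-861
    "1080p60": (1920, 2200),
    "720p60":  (1280, 1650),
}

for name, (h_active, h_total) in modes.items():
    h_blank = h_total - h_active
    print(f"{name}: {h_blank} px blanking = {100 * h_blank / h_total:.1f}% of the line")

# 1080p60: 280 px = 12.7% -> too little for many CRTs, hence the distortion
# 720p60:  370 px = 22.4% -> comfortable, which is why 720p "just works"
```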
 
2160p, aka UHD 4K, is out of the question on a CRT.

I've sent 4K from a PS4 Pro to my 140kHz Diamondtron.

It was through that Benefei adapter, and I was running 4:2:0 YCbCr. The colors were messed up, because the adapter wasn't designed for 4:2:0, but it was a genuine 2160p signal and my monitor displayed it.

So theoretically, if somebody like HD Fury put out an HDMI 2.0-spec VGA adapter, some of us could play PS5 at 4K 60Hz on our CRTs. Not many of us, but a few. If it had a 2160p>1440p downscale function, then a lot more of us could.
 
Hello! This is pertaining to a Diamond Pro 2070SB; I hope that is okay, as I felt this thread was a good place to bring my question. I have used my Diamond Pro as my primary monitor with a GTX 1080 Ti for a couple of years, using a DisplayPort to VGA adapter, the Delock 62967, and it has worked great. After upgrading to the new card I received today, an MSI RTX 3080 Ti, and plugging in the same adapter, I'm finding that periodically, and particularly when the GPU is under load, the entire monitor starts flickering like this [screenshot] for an instant and coming back, or, less frequently, momentarily losing signal entirely. This never happened at all prior to this upgrade, and I'm wondering if anyone knows what could be the cause and how to resolve it? Thank you.
 
Thanks a lot, Derupter

The Startech DP2VGAHD20 looks like a good upgrade over the Delock 62967, as it should allow running the FW900's native resolution of 2304x1440@80Hz. If it can also output to HDMI at the same time, then it can also be used for some testing: input lag, trying to run VRR on a CRT, and seeing what happens in HDR mode, among other silly ideas.

In any case, I ordered one and will report any interesting findings.

I remember that during my old tests I recovered 12-13 MHz of pixel clock by reducing the blanking, without problems and without touching the vertical timings. So yes, with a 375 MHz adapter, 2304x1440@80Hz is easy, and 2048x1536@85Hz with monsters like the Diamond Pro 2070SB or F520 is likely too; my tests on the LaCie electron22blueIV have shown that it tolerates reduced blanking even better than the FW900.
Anyway, the GTF-CVT horizontal timing formula is practically unchanged from 1996, when CRT monitors had lower performance compared to later models; the only thing that worried me is whether doing this can stress the electronics even more.

Hello! This is pertaining to a Diamond Pro 2070SB; I hope that is okay, as I felt this thread was a good place to bring my question. I have used my Diamond Pro as my primary monitor with a GTX 1080 Ti for a couple of years, using a DisplayPort to VGA adapter, the Delock 62967, and it has worked great. After upgrading to the new card I received today, an MSI RTX 3080 Ti, and plugging in the same adapter, I'm finding that periodically, and particularly when the GPU is under load, the entire monitor starts flickering like this
It's a known issue of the Delock 62967: it works well with some video cards and badly with others.
It is caused by the bad cable, PCB design, etc.; practically, it can't handle HBR2 mode correctly, so it is unstable over 180 MHz.
Probably your new video card has a weaker DisplayPort signal than the old one. The solutions are:
- replace the DisplayPort cable: you need to open the adapter, desolder the old cable and replace it with a new one; many users have solved it this way.
- buy a Startech DP2VGAHD20: no reports of compatibility problems with video cards, and better performance than the 62967; the only issues are with monitor standby and dual monitor configurations (when you disable one of the two).
Try a resolution under 180 MHz of pixel clock and see if it's stable; if yes, you have this issue.

You can also try to clean the male plug connector of the adapter and the female port of the graphics card with some isopropyl alcohol spray, or even better, WD-40 contact cleaner.
Spray some inside the DisplayPort male plug of the adapter, plug it in and out of the graphics card a few times, and wait for the solution to evaporate.
Please do this with the computer off (I know this seems like advice for idiots, but better safe than sorry).
 
I remember that during my old tests I recovered 12-13 MHz of pixel clock by reducing the blanking, without problems and without touching the vertical timings. So yes, with a 375 MHz adapter, 2304x1440@80Hz is easy, and 2048x1536@85Hz with monsters like the Diamond Pro 2070SB or F520 is likely too; my tests on the LaCie electron22blueIV have shown that it tolerates reduced blanking even better than the FW900.
Anyway, the GTF-CVT horizontal timing formula is practically unchanged from 1996, when CRT monitors had lower performance compared to later models; the only thing that worried me is whether doing this can stress the electronics even more.


It's a known issue of the Delock 62967: it works well with some video cards and badly with others.
It is caused by the bad cable, PCB design, etc.; practically, it can't handle HBR2 mode correctly, so it is unstable over 180 MHz.
Probably your new video card has a weaker DisplayPort signal than the old one. The solutions are:
- replace the DisplayPort cable: you need to open the adapter, desolder the old cable and replace it with a new one; many users have solved it this way.
- buy a Startech DP2VGAHD20: no reports of compatibility problems with video cards, and better performance than the 62967; the only issues are with monitor standby and dual monitor configurations (when you disable one of the two).
Try a resolution under 180 MHz of pixel clock and see if it's stable; if yes, you have this issue.

You can also try to clean the male plug connector of the adapter and the female port of the graphics card with some isopropyl alcohol spray, or even better, WD-40 contact cleaner.
Spray some inside the DisplayPort male plug of the adapter, plug it in and out of the graphics card a few times, and wait for the solution to evaporate.
Please do this with the computer off (I know this seems like advice for idiots, but better safe than sorry).
Okay, that's good to know. You are right, lower-bandwidth resolutions are more stable. I will try cleaning it and see if it has any effect. I do have another adapter on the way, however; I went with the ICY BOX IB-SPL1031. Are you aware if that suffers any similar issues?
 
Okay, that's good to know. You are right, lower-bandwidth resolutions are more stable. I will try cleaning it and see if it has any effect. I do have another adapter on the way, however; I went with the ICY BOX IB-SPL1031. Are you aware if that suffers any similar issues?
Here you can find a list of the adapters and their respective issues.
 
do have another adapter on the way, however; I went with the ICY BOX IB-SPL1031. Are you aware if that suffers any similar issues?

I have the Sunix version. It works well at most resolutions, but it has some idiosyncrasies that you will see once in a while.

For instance, above 2048x1536 and below 2304x1728 or so, resolutions will frequently glitch out and swap the far left and right sides: any extra vertical lines on the right and left beyond the 2048 get swapped. For some reason this goes away at higher resolutions; I can run 2560x1920 or 2880x2160 all day without side-swapping, but 2240x1680 is unusable.

And then there is the "jittering" issue, where some resolutions just aren't stable; the pixels are constantly vibrating side to side. Some people say it's common on refresh rates that end in "5", but I've played tons of games at 75Hz without the issue, and I run my desktop at 1200p 85Hz and very rarely see it there (and I can fix it by switching to another resolution and switching right back).

So I don't really know what the trigger for that bug is. But on occasion I'll make a custom resolution that gets the bug, and I'll just have to make a different one. Pretty rare though.

And last, a minor thing: 480p60Hz doesn't work so well on the Sunix. So it might be worth keeping a cheap HDMI adapter around for playing old PC games that use really low resolutions like that.

That might look like a lot when it's written out like this, but in reality you'll have a great time with it. Just once in a while you'll find a custom resolution you can't use, so make a different one that's about 40 pixels larger or smaller and you'll be fine.

Also, check a couple of pages back: I posted an older firmware that allows 2880x2160 and 3200x1800. For some reason the firmware that comes on the IcyBox won't let it go that high.
 
So far, happy to report that the Startech DP2VGAHD20 arrived quickly and is working great; I don't even seem to be encountering any of the mentioned issues so far. I will still test the other adapter to see if I can push things a little further; my max with this one is 2048x1536@82Hz.

Edit: nvm, I am encountering the dual monitor issue described :(
 
So far, happy to report that the Startech DP2VGAHD20 arrived quickly and is working great; I don't even seem to be encountering any of the mentioned issues so far. I will still test the other adapter to see if I can push things a little further; my max with this one is 2048x1536@82Hz.

Edit: nvm, I am encountering the dual monitor issue described :(
Does a restart of the video driver fix the issue?
Device Manager - Display adapters - disable/enable your graphics card,
or use the CRU restart utility, which does the same thing but also restarts the graphics panel.
 
Does a restart of the video driver fix the issue?
Device Manager - Display adapters - disable/enable your graphics card,
or use the CRU restart utility, which does the same thing but also restarts the graphics panel.
Yes, that does appear to help!
I have the Sunix version. It works well at most resolutions, but it has some idiosyncrasies that you will see once in a while.

For instance, above 2048x1536 and below 2304x1728 or so, resolutions will frequently glitch out and swap the far left and right sides: any extra vertical lines on the right and left beyond the 2048 get swapped. For some reason this goes away at higher resolutions; I can run 2560x1920 or 2880x2160 all day without side-swapping, but 2240x1680 is unusable.

And then there is the "jittering" issue, where some resolutions just aren't stable; the pixels are constantly vibrating side to side. Some people say it's common on refresh rates that end in "5", but I've played tons of games at 75Hz without the issue, and I run my desktop at 1200p 85Hz and very rarely see it there (and I can fix it by switching to another resolution and switching right back).

So I don't really know what the trigger for that bug is. But on occasion I'll make a custom resolution that gets the bug, and I'll just have to make a different one. Pretty rare though.

And last, a minor thing: 480p60Hz doesn't work so well on the Sunix. So it might be worth keeping a cheap HDMI adapter around for playing old PC games that use really low resolutions like that.

That might look like a lot when it's written out like this, but in reality you'll have a great time with it. Just once in a while you'll find a custom resolution you can't use, so make a different one that's about 40 pixels larger or smaller and you'll be fine.

Also, check a couple of pages back: I posted an older firmware that allows 2880x2160 and 3200x1800. For some reason the firmware that comes on the IcyBox won't let it go that high.
I just had the ICY BOX arrive, and it does seem to be giving me some strange issues. I can't seem to get the resolutions I put in CRU to come up no matter what I do. I downloaded the firmware you mentioned, but I'm not sure how to go about flashing it.
 
I just had the ICY BOX arrive, and it does seem to be giving me some strange issues. I can't seem to get the resolutions I put in CRU to come up no matter what I do. I downloaded the firmware you mentioned, but I'm not sure how to go about flashing it.
The firmware isn't going to make CRU work.

Are you running an AMD card? CRU doesn't work with MST hubs (like the Sunix) on AMD cards, for some reason (make sure you submit a bug report through the Adrenalin software). You can still make custom resolutions with the AMD tool, but you might have problems with an artificially low pixel clock limit set by AMD.

If that happens, you have to go nuclear and buy an external EDID minder, like Extron's EDID 101V. That's what I have, with my "native" resolution set at 720x540 @ 160Hz and my max resolution in the detailed modes at 2880x2160 @ 60Hz. I flashed the 101V by making an EDID with CRU, then writing it to a HD Fury Dr. HDMI, which I then hooked up on the "display" side of the 101V through passive HDMI>DVI then DVI>VGA adapters. At that point the 101V saves the EDID, and then you just put the 101V between your GPU and CRT.
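
For the curious, each custom mode in such an EDID ends up as an 18-byte detailed timing descriptor. A sketch of how one is packed; the 2880x2160@60Hz blanking totals here are illustrative reduced-blanking guesses, not the exact timings used:

```python
# Pack an 18-byte EDID detailed timing descriptor (DTD).
# The 2880x2160@60Hz blanking below is an illustrative guess, not measured.

def make_dtd(clock_khz, ha, hb, va, vb, hso, hsw, vso, vsw, flags=0x1E):
    """ha/hb: horizontal active/blanking; hso/hsw: h-sync offset/width;
    v* likewise. flags=0x1E means separate sync, positive h/v polarity."""
    clk = clock_khz // 10               # stored in units of 10 kHz
    return bytes([
        clk & 0xFF, clk >> 8,           # pixel clock, little-endian
        ha & 0xFF, hb & 0xFF, ((ha >> 8) << 4) | (hb >> 8),
        va & 0xFF, vb & 0xFF, ((va >> 8) << 4) | (vb >> 8),
        hso & 0xFF, hsw & 0xFF,
        ((vso & 0xF) << 4) | (vsw & 0xF),
        ((hso >> 8) << 6) | ((hsw >> 8) << 4) | ((vso >> 4) << 2) | (vsw >> 4),
        0, 0, 0,                        # physical image size in mm (zeroed here)
        0, 0,                           # horizontal/vertical border pixels
        flags,
    ])

# 2880x2160@60Hz with trimmed blanking: 3200x2208 totals -> ~424 MHz
print(make_dtd(423_936, 2880, 320, 2160, 48, 96, 192, 3, 10).hex())
```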

... there might be less complicated ways to do that, but that was all stuff I was able to score on eBay for cheap.

What's weird is that you can still switch which resolution is the "native" one via CRU, if a game requires you to do that. But it has to be a resolution that you either made through AMD's tool or that is on the physical EDID chip.

Back to the firmware: you use Synaptics' "vmmUpdater". There's also a "vmmDPTool" which lets you back up the original firmware, though I'm sure yours is the same as the IcyBox firmware I posted. The older Sunix firmware doesn't really fix anything except unlocking a slightly higher pixel clock for stuff like 2880x2160.
 
If that happens, you have to go nuclear and buy an external EDID minder, like Extron's EDID 101V. That's what I have, with my "native" resolution set at 720x540 @ 160Hz and my max resolution in the detailed modes at 2880x2160 @ 60Hz. I flashed the 101V by making an EDID with CRU, then writing it to a HD Fury Dr. HDMI, which I then hooked up on the "display" side of the 101V through passive HDMI>DVI then DVI>VGA adapters. At that point the 101V saves the EDID, and then you just put the 101V between your GPU and CRT.
For the more technically minded, it is possible to create an EDID emulator using an I2C EEPROM.
In the past I created a DVI-A to VGA adapter with EDID emulation; it was programmed using an Arduino.

I didn't use it for custom resolutions as much as to change the gamut information and use AMD's ability to correct the FW900's gamut.
I really miss this feature when using the FW900. Colors were slightly more accurate with it - greens especially, but also reds. The maximum saturation of pure green and red was less than sRGB and less than the native gamut, but otherwise all colors inside the FW900's coverage were mapped more or less correctly. More or less, because EDID doesn't have terrific numerical precision. Still, measurements and examination by eye showed that the colors were vastly more accurate.
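
That limited precision is because EDID stores each chromaticity coordinate as a 10-bit fraction packed into bytes 25-34, and any edit also has to fix the checksum in byte 127. A minimal sketch of patching those fields; the coordinates below are the sRGB primaries and D65 as an example, not measured FW900 values:

```python
# Patch the chromaticity block of a 128-byte EDID and fix its checksum.
# Example coordinates are the sRGB primaries/D65, not measured FW900 values.

def set_chromaticity(edid: bytearray, rx, ry, gx, gy, bx, by, wx, wy):
    q = lambda v: round(v * 1024) & 0x3FF      # 10-bit fixed point
    c = [q(v) for v in (rx, ry, gx, gy, bx, by, wx, wy)]
    # Bytes 25-26 hold the 2 low bits of each coordinate, bytes 27-34 the high 8.
    edid[25] = ((c[0] & 3) << 6) | ((c[1] & 3) << 4) | ((c[2] & 3) << 2) | (c[3] & 3)
    edid[26] = ((c[4] & 3) << 6) | ((c[5] & 3) << 4) | ((c[6] & 3) << 2) | (c[7] & 3)
    edid[27:35] = bytes(v >> 2 for v in c)
    edid[127] = (-sum(edid[:127])) & 0xFF      # all 128 bytes must sum to 0 mod 256

edid = bytearray(128)  # in practice, start from a dump of the monitor's real EDID
edid[:8] = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])  # EDID header
set_chromaticity(edid, 0.640, 0.330, 0.300, 0.600, 0.150, 0.060, 0.3127, 0.3290)
assert sum(edid) % 256 == 0
```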

What all GPU companies should do is simply add this functionality to their devices and drivers, with an option in the GPU control panel to enable correction based on the profile in use, falling back to the EDID if no profile is present. It is such an obvious feature that it is absolutely ridiculous it is not a thing, and that no one cares for it being a thing. It would solve most of the issues most people have with inaccurate colors. Even the cheapest monitors have drivers with color profiles carrying mostly correct gamut information (and they also include this in the EDID - though for some reason the white point might not be set to D65, a value which for all intents and purposes should not be used for any gamut transformation anyway, and because AMD does use it, it can prevent some monitors from using gamut remapping), so even without external calibration tools this would solve gamut issues for most people.
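
For what it's worth, the remap itself is just one 3x3 matrix: build RGB-to-XYZ matrices for sRGB and for the display's EDID primaries, then chain them (applied in linear light, before the gamma curve). A numpy sketch; the "monitor" primaries below are made-up placeholders, not FW900 measurements:

```python
# Derive the 3x3 matrix that remaps linear sRGB into a display's native
# primaries -- the correction a driver could compute from EDID gamut data.
import numpy as np

def rgb_to_xyz(prims, white):
    """prims: [(x, y)] for R, G, B; white: (x, y). Returns the RGB->XYZ matrix."""
    def col(x, y):                      # xy chromaticity -> XYZ column with Y = 1
        return np.array([x / y, 1.0, (1 - x - y) / y])
    P = np.column_stack([col(x, y) for x, y in prims])
    S = np.linalg.solve(P, col(*white)) # scale primaries so R=G=B=1 hits white
    return P * S

srgb = rgb_to_xyz([(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)], (0.3127, 0.3290))
# Placeholder "monitor" primaries, e.g. as read back from its EDID:
mon = rgb_to_xyz([(0.625, 0.340), (0.280, 0.595), (0.155, 0.070)], (0.3127, 0.3290))

correction = np.linalg.inv(mon) @ srgb  # linear sRGB -> linear monitor RGB
print(correction.round(4))
```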
 
The Startech DP2VGAHD20 arrived.
HDMI + VGA mode works just fine; the HDMI output is recognized as a separate monitor with default HD resolutions, which can be overridden via CRU.
The adapter seems to be limited to exactly 377 MHz: that works, while 377.34 MHz shows a lot of artifacts.

When connecting only the CRT, the same resolutions I added using the Delock were used, so they are pretty interchangeable.

So far so good, but what happened next shocked me:
it seems I am able to run 3840x2160@120Hz via this adapter 🤯
I cannot do this over HDMI from my GPU directly, as I only have an HDMI 2.0 GPU (RTX 2070).
It is in YCbCr420, apparently, but it is still pretty cool that it is doable at all.
The issue I see is that Nvidia locked out all G-Sync functionality; it is not visible in the Control Panel at all.
But hey, at least 120Hz works, so this adapter is actually pretty decent. Not that I will be using this functionality a lot. Read: "at all" 🙃

The remaining thing to check is the monitor sleep issue. If only I hadn't disabled monitor sleep completely...

EDIT://
Tested monitor sleep, and it works strangely: it turns the display off for a few seconds, and then the last selected mode comes back on with a blanked screen. Blanking is not a terrible way to prevent burn-in, but it would be better if it didn't turn the display off for those few seconds. In any case, I have this function disabled in Windows.

Computer sleep works OK, hibernation also.

BTW, I checked, and I actually can also do 4K@120Hz with 4:2:0 at 8-bit using my GPU's native HDMI ports; it is within the capabilities of HDMI 2.0. In any case, the Startech adapter doesn't seem to have any issues with maxing out the HDMI transfer, which is good.
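
The arithmetic behind why that fits: with 4:2:0, two pixels share one TMDS character, so the character rate is half the pixel clock, and HDMI 2.0 tops out at 600 MHz. A quick check using the standard CTA-861 totals:

```python
# Why 4K@120Hz 4:2:0 8-bit squeezes into HDMI 2.0 (600 MHz max TMDS clock).

H_TOTAL, V_TOTAL, HZ = 4400, 2250, 120   # CTA-861 totals for 3840x2160@120Hz
pixel_clock = H_TOTAL * V_TOTAL * HZ     # 1,188,000,000 Hz = 1188 MHz

tmds_444 = pixel_clock                   # 4:4:4 -> one TMDS character per pixel
tmds_420 = pixel_clock / 2               # 4:2:0 -> two pixels per character

print(f"4:4:4 needs {tmds_444 / 1e6:.0f} MHz -> far beyond HDMI 2.0")
print(f"4:2:0 needs {tmds_420 / 1e6:.0f} MHz -> fits under 600 MHz")
```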

Also, I tested whether there is any difference between using VGA alone vs with an HDMI display attached, and the limit is 377 MHz in both cases.

Now what is left to do is some input lag testing that having such an adapter should allow me to do :)
By which I of course mean testing the input lag of HDMI devices using the CRT as reference, not testing the input lag of the adapter, which I know is 0ms
 
The Startech DP2VGAHD20 arrived.
HDMI + VGA mode works just fine; the HDMI output is recognized as a separate monitor with default HD resolutions, which can be overridden via CRU.
The adapter seems to be limited to exactly 377 MHz: that works, while 377.34 MHz shows a lot of artifacts.

When connecting only the CRT, the same resolutions I added using the Delock were used, so they are pretty interchangeable.

So far so good, but what happened next shocked me:
it seems I am able to run 3840x2160@120Hz via this adapter 🤯
I cannot do this over HDMI from my GPU directly, as I only have an HDMI 2.0 GPU (RTX 2070).
It is in YCbCr420, apparently, but it is still pretty cool that it is doable at all.
The issue I see is that Nvidia locked out all G-Sync functionality; it is not visible in the Control Panel at all.
But hey, at least 120Hz works, so this adapter is actually pretty decent. Not that I will be using this functionality a lot. Read: "at all" 🙃

The remaining thing to check is the monitor sleep issue. If only I hadn't disabled monitor sleep completely...

EDIT://
Tested monitor sleep, and it works strangely: it turns the display off for a few seconds, and then the last selected mode comes back on with a blanked screen. Blanking is not a terrible way to prevent burn-in, but it would be better if it didn't turn the display off for those few seconds. In any case, I have this function disabled in Windows.

Computer sleep works OK, hibernation also.

BTW, I checked, and I actually can also do 4K@120Hz with 4:2:0 at 8-bit using my GPU's native HDMI ports; it is within the capabilities of HDMI 2.0. In any case, the Startech adapter doesn't seem to have any issues with maxing out the HDMI transfer, which is good.

Also, I tested whether there is any difference between using VGA alone vs with an HDMI display attached, and the limit is 377 MHz in both cases.

Now what is left to do is some input lag testing that having such an adapter should allow me to do :)
By which I of course mean testing the input lag of HDMI devices using the CRT as reference, not testing the input lag of the adapter, which I know is 0ms
Did you see any differences in image quality compared to the Delock?
Can you use different resolutions with different timings on VGA and HDMI, or only the same?
 
Did you see any differences in image quality compared to the Delock?
Unfortunately, yes. The Delock 62967 is definitely better.
On http://www.lagom.nl/lcd-test/gradient.php the Startech exhibits vertical lines at some transitions. There are exactly 15 such lines: on the upper gradient the lines are brighter, and on the lower one they are darker. The effect seems pixel-clock dependent and is quite visible at 320 MHz. It also disappears when the image is rotated 90 degrees, and when the gradient image is zoomed in, the effect is visible only at the edges of the affected transitions. The image also seems slightly softer than on the Delock. The voltage levels also seem different, as if black had a slightly higher voltage and white a slightly lower one than on the Delock.

Otherwise, in the normal content I have tested it on so far, everything looks pretty normal after adjusting for the different contrast/brightness.

Can you use different resolutions with different timings on VGA and HDMI, or only the same?
It is a "dumb" converter + DAC: it has one source and can only convert the signal.

Interestingly enough, when trying to do the input lag tests, I realized that doing this correctly using a scanning camera is pretty much impossible. The result as to which display is faster depended on the orientation of the monitors relative to the camera; I could easily make the LCD appear faster by putting it slightly below the CRT or by rotating the camera. I will probably need an analogue camera for a proper test 🤣

------
All in all, I think it is a cool device, just not perfect. If the Startech's image quality were up to the Delock's standard, there would be no need for the latter, but as it stands, the Delock will still be my converter of choice. It does what I need of it and does it well.
I will keep the Startech just in case I need it.

BTW, 2304x1440@80Hz was easily achievable on the Startech without any image distortion; it just didn't work natively... which is also an issue, because the Startech did present 2304x1440@80Hz as a valid resolution which it then did not display. The Delock has programmed limits and doesn't show this resolution at all.
 
Thank you. What was the model number of the Delock again? (I have the Startech, but have not looked at gradients for some time.)

BTW... I wouldn't run a blank screen saver on a CRT. I heard some warnings in the past about that causing some kind of unwanted build-up.
 
Thank you. What was the model number of the Delock again? (I have the Startech, but have not looked at gradients for some time.)

BTW... I wouldn't run a blank screen saver on a CRT. I heard some warnings in the past about that causing some kind of unwanted build-up.
The Delock 62967?

Maybe a black screen is not a good way to save a CRT, but Trinitrons are a bitch when it comes to proper methods, because after they switch off they need to warm up again.
Maybe a dark gray screen saver would be best?
 
I use the Mystify screensaver; I do like that it's mostly black, and I use it pretty aggressively: if I'm not sitting at the computer, I'll have it on. For any extended period away I'll let the monitor go into sleep mode or have the display physically turned off. (I try to be as gentle as possible with the on/off switch and with the input switch.)
 
Hi everyone. What are the current best HDMI to VGA adapter(s) for Series X/S and PS5 to FW900 (as of October 2021)? It would be nice if any suggested adapters were available in stock.
The best currently available, as far as I know, is the "Delock DisplayPort 1.2 Splitter 1 x mini DisplayPort in > 1 x DisplayPort + 1 x HDMI + 1 x VGA out". It is much faster than the Delock 62967 and, as far as I know, the fastest on the market. I have both it and the Delock 62967. For reference, the max resolution I could run with the Delock 62967 was 2400x1800 at 59Hz; with the Splitter, I can run it at 65Hz.

Also, here's a little secret. Since a CRT monitor's limit is its horizontal scan rate, the same refresh rate can be achieved as long as the number of lines stays the same; only the pixel clock goes up (the quick arithmetic below shows why). For example, this means that I can run 3200x1800 at 65Hz as well, and 2560x1440 at the same refresh rate as 1920x1440, which is 86Hz on my Sony G500. I tried 1942p as well, but I don't think there is a lot more to squeeze from 3200x1800. Maybe up to 1900p at 60Hz max.
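
A sketch of that arithmetic; the blanking totals are illustrative, not measured:

```python
# A CRT's refresh ceiling is set by horizontal scan rate (lines per second),
# so widening a mode at the same line count keeps the same maximum refresh;
# only the pixel clock (the adapter's problem) goes up. Totals are illustrative.

def mode_stats(h_total, v_total, refresh_hz):
    h_freq_khz = v_total * refresh_hz / 1e3            # what the CRT cares about
    clock_mhz = h_total * v_total * refresh_hz / 1e6   # what the adapter cares about
    return h_freq_khz, clock_mhz

for name, (ht, vt) in {"2400x1800": (3200, 1870), "3200x1800": (4240, 1870)}.items():
    hf, clk = mode_stats(ht, vt, 65)
    print(f"{name}@65Hz: {hf:.1f} kHz horizontal, {clk:.1f} MHz pixel clock")
# Both modes print the same horizontal rate; only the pixel clock differs
# (~389 vs ~515 MHz).
```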

Here is the link: https://www.lets-sell.de/detail/index/sArticle/46886?sPartner=delock
 