24" Widescreen CRT (FW900) From Ebay arrived,Comments.

I have a multimeter with which I can test the VGA output voltage from my DPU3000. If you think that could be helpful for what you're wondering about, just let me know 😉

Nice. I'll see if I can dig something up for you to test. I've written to the LUT directly using Psychtoolbox with MATLAB before, but there may be a simpler way - I think flod may have written a small app; I'll see what I can find.
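In the meantime, if you just want to poke at the LUT yourself without MATLAB, here's a rough Python sketch that writes a gamma ramp on Windows through GDI's SetDeviceGammaRamp (not the Psychtoolbox route mentioned above; the 0.9 gain is just an arbitrary example value):

```python
import ctypes

GAIN = 0.9  # arbitrary example scale; 1.0 would be an identity (linear) LUT

# The GDI gamma ramp is a 3 x 256 table of 16-bit values (R, G, B channels).
ramp = ((ctypes.c_ushort * 256) * 3)()
for channel in range(3):
    for i in range(256):
        ramp[channel][i] = min(65535, int(i / 255 * 65535 * GAIN))

hdc = ctypes.windll.user32.GetDC(None)  # device context for the whole screen
ok = ctypes.windll.gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
ctypes.windll.user32.ReleaseDC(None, hdc)
print("LUT written" if ok else "SetDeviceGammaRamp refused the ramp")
```

Some drivers reject ramps that deviate too far from linear, so treat this only as a quick way to experiment, not a calibration tool.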
 
Those HD Fury adapters seem very expensive, even much more than the Synaptics-based ones. If that HDF1 Nano GX costs $100 with a max of just 165 MHz for VGA output (discontinued, by the way; they suggest the X3 or X4, $179 and $289 respectively, and just 225 MHz each 😱), I can't imagine how brutally expensive something from them around 400 MHz would be :hungover:

I *think*, but am not sure, that one of HD Fury's selling points was removal of Blu-ray encryption, but I may be wrong.
 
What kind of luminance are you guys getting with your calibrated 6500K screens?

If memory serves, I consistently get 87 nits in 6500K mode fresh after a WinDAS calibration where I hit all the requested targets (excepting the G2 which I tend to set lower than WinDAS recommends).

Can't recall if I've tested that with full field white as well as partial field (full field can be lower luminance)
 
Huh... this is interesting. This post from 2003 claims their calibrated 6500K 2070SB, at 255, 255, 255, was only putting out 72 cd/m², I'm assuming in the non-SuperBright modes.
Please clarify whether you're using the service menu or not.

If you are using the service menu - is one of your gains at 6500K maxed out at the FF value? If so, those 72 cd/m² really are the maximum the tube can put out.


1384x1038@120Hz, 255 MHz - blurs at over 60 contrast.
1152x864@145Hz, 216 MHz - blurs a bit at 100 contrast, almost perfect picture.
Hmm. Video amp overheat?

Replacing thermal paste and adding two extra screws to fix the contact between the A board casing and RGB amp radiator fixed the issue.
I also tweaked the pots on the flyback a bit.

BTW, is it normal for the F520 to shrink the picture size I set up after a power off/power on cycle? It makes the distance from the edge of the picture to the bezel go from the 2 mm I set to ~8 mm on the sides, and ~4 mm on the top and bottom.
 
I think so?
 
RTSS has a new feature called "scanline sync" that basically gives the user control over the vsync-off tearing line, so they can put it at the edge of the screen, or all the way off-screen in the blanking interval. So this is vsync with vsync-off input lag.

But there are ways to minimize lag with traditional vsync as well, like capping your framerate to 0.002 Hz below your refresh rate, which will cut out the frame buffering, with the side effect of a ~16 ms stutter once every few minutes.
I'm curious, I want to try this, but are there any cons to using it on a CRT monitor? Like, does forcing the tearing line into a certain area create a sample-and-hold effect on a CRT, or am I just overthinking things?
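On the framerate-cap trick: for anyone wondering where the "once every few minutes" comes from, the capped output only drifts a full frame behind the refresh at the offset rate. A quick back-of-the-envelope sketch in Python (the 85 Hz refresh below is just an example figure, not anyone's actual setup):

```python
refresh_hz = 85.0   # example refresh rate, not taken from the posts above
offset_hz = 0.002   # how far below the refresh rate the framerate cap sits
cap_hz = refresh_hz - offset_hz

# The capped output slips one whole frame behind the refresh every 1/offset
# seconds, which is when the single ~one-frame stutter becomes visible.
seconds_between_stutters = 1.0 / offset_hz
print(f"cap at {cap_hz:.3f} fps -> one repeated frame roughly every "
      f"{seconds_between_stutters / 60:.1f} minutes")
```

With a 0.002 Hz offset that works out to one stutter roughly every 8 minutes, which matches the "once every few minutes" description.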
 
Please clarify whether you're using the service menu or not.

If you are using the service menu - is one of your gains at 6500K maxed out at the FF value? If so, those 72 cd/m² really are the maximum the tube can put out.



Hmm. Video amp overheat?

Replacing thermal paste and adding two extra screws to fix the contact between the A board casing and RGB amp radiator fixed the issue.
I also tweaked the pots on the flyback a bit.

BTW, is it normal for the F520 to shrink the picture size I set up after a power off/power on cycle? It makes the distance from the edge of the picture to the bezel go from the 2 mm I set to ~8 mm on the sides, and ~4 mm on the top and bottom.
Well, no offense but I think you fixed your problem just by chance. Even if the contact and thermal paste application weren't very good, having a perfect contact with the shielding probably didn't change much regarding heat dissipation. I bet the shielding is barely warm next to the heatsink contact. ;)
Your improvement may rather have had something to do with cleaning some of the incredible layer of dust OR removing and better reseating the video board. The connection between this board and the D board is really dodgy on these screens.
 
Yeah...that video board seating on the F520. An odd problem for what is otherwise arguably Sony's computer CRT masterpiece.

Is one of the reasons I prefer debezeled.
 
It would be ballin if someone could figure out the reason behind this. I fear it's the RGB amplifier IC though and that's not going to be fixable.
 
Well, no offense but I think you fixed your problem just by chance. Even if the contact and thermal paste application weren't very good, having a perfect contact with the shielding probably didn't change much regarding heat dissipation. I bet the shielding is barely warm next to the heatsink contact. ;)
Your improvement may rather have had something to do with cleaning some of the incredible layer of dust OR removing and better reseating the video board. The connection between this board and the D board is really dodgy on these screens.
I hadn't thought about that; there was indeed a huge amount of dust inside. As for the temps, I'd say they are around 45-50°C after warm-up with a bright picture shown on the screen now.
 
What luminance (cd/m²) are you guys measuring with an FW900 at 9300K, with contrast and brightness both at 100%?
 
You should not have the GDM-FW900 set at 100% brightness and 100% contrast... Now, for what/which purpose and why would you want to set/measure luminance at 9.3K with those settings?

UV!
 
You probably only want someone who has done the WinDAS process to answer? At 90/31 (contrast/brightness) it should be less than 115.

Luminance depends on the emission "juice" of the tube. Low-emission tubes will demand adjustments over 200 during the second pass in all WinDAS white background adjustments, and the higher the adjustment the tube requires, the lower the emission is and the more "washed out" the tube is. Now, in extreme cases, near-death tubes will not even allow the adjustment levels to move forward even if you keep increasing the level to 255!

At 9.3K and WinDAS brightness/contrast settings of 90/31, the luminance target should be 115 or more (I usually do not go over 117). Now, if the unit cannot achieve at least 115 even though the level keeps increasing, then the tube's emission is low and the tube may be done! Check the emission level with a Sencore CR70 or the Sencore CR7000 Beam-Rite, and then confirm the levels before discarding it. If the tube's emission checks out, then check the instrument for reading accuracy.

Hope this helps!

Sincerely,

Unkle Vito!
 
OK, so let me ask you this: you have it set at 9300K, with your typical REC brightness test pattern bars displayed. What luminance (cd/m²) will you arrive at with the correctly calibrated brightness level? Also, what brightness percentage are you then on?

Obviously, if you have a fully calibrated monitor and set it to 100% contrast and 100% brightness, the test pattern will be well over-brightened, with the typical brightness fog/ghosting effect.
 
I already answered the question... Let me ask you: have you ever performed a WinDAS WPB adjustment?
 
Look I have an IQ of 3, can you just humor me for the hell of it and set your perfectly calibrated monitor to 100% contrast and 100% brightness at 9300k and tell me what 'middle screen spot' luminance value your display calibrator reads?
 
Next time I do a WPB I can get that info for you (both with full-screen white and a 10% window).
 
Look I have an IQ of 3, can you just humor me for the hell of it and set your perfectly calibrated monitor to 100% contrast and 100% brightness at 9300k and tell me what 'middle screen spot' luminance value your display calibrator reads?


I am so sorry but I will not do that as cranking up the brightness of a perfectly calibrated and adjusted monitor to 100% is not only moronic but strongly NOT RECOMMENDED!

Here is some wise advice for you... When it comes to high grade/high end electronics... You play, you PAY!

And lastly, I take things seriously in this forum, and I do not joke around...

Hope this helps...

UV!
 
These ones are "old tech" compared to the FW900, the 4 years between both launches mean pretty noticeable differences in the ways electronics are built. This, and the differences between Intergraph and Sony regarding engineering.

Unless a service manual or schematics of the 28hd96 can still be found (and it would be really surprising if it could), anyone is pretty much restricted to try servicing them with guesses. I wouldn't put much faith in people trying to repair them, I bet at least half of them will be bold retards that'll actually finish blowing them off not having a clue about what they're doing. :unsure:
 
Vito, did you see where the guy found six 28" Intergraph monitors? https://www.reddit.com/r/crtgaming/comments/ifmr4a/thinking_about_programming_quake_would_these_be/

He said only one works perfectly, the other five have problems. I recommended that he contact you if he needed help on restoration. They are Panasonic tubes apparently, but I figured some of your FW900 experience might translate. I don't know what this guy's plan is though.


Please do! Tell him to PM me on this forum...

Sincerely,

UV!
 
These ones are "old tech" compared to the FW900, the 4 years between both launches mean pretty noticeable differences in the ways electronics are built. This, and the differences between Intergraph and Sony regarding engineering.

Unless a service manual or schematics of the 28hd96 can still be found (and it would be really surprising if it could), anyone is pretty much restricted to try servicing them with guesses. I wouldn't put much faith in people trying to repair them, I bet at least half of them will be bold retards that'll actually finish blowing them off not having a clue about what they're doing. :unsure:

So honestly... With CRT monitors there are only a handful of things that really go wrong with them. If you're an experienced CRT tech it shouldn't be too difficult to figure out. All of them more or less work the same, and a simple examination of the boards should help you figure out what does what. Bonus points if the board actually describes itself through the text printed on it. :D What I'm saying is - I think they're fixable. BUT - and this is the huge but - the person needs to have experience with CRTs and know what they're doing.
 
Oh, and regarding the Intergraphs' picture quality. I remember seeing their specs and thinking to myself that the FW900 will probably walk all over them. My two cents. They'd be great game room monitors though.
 
So honestly... With CRT monitors there are only a handful of things that really go wrong with them. If you're an experienced CRT tech it shouldn't be too difficult to figure out. All of them more or less work the same, and a simple examination of the boards should help you figure out what does what. Bonus points if the board actually describes itself through the text printed on it. :D What I'm saying is - I think they're fixable. BUT - and this is the huge but - the person needs to have experience with CRTs and know what they're doing.
Well, I think that's a bit optimistic. I mean, without schematics, even if you know somewhat similar devices well, you just can't easily work out the purpose of every component except a few very specific ones. Boards are just too complex for that. You're also lacking information about some of the odd, unexpected parts of the design, the expected voltages or waveforms at key points of the circuitry for checking purposes, or the way this specific device can be adjusted. Oh, sure, the task may not be entirely impossible, but it'll be unnecessarily painful and time consuming.
Things can quickly become a nightmare even with a service manual, if it's poorly written. For example, completely off topic, I'm currently trying to repair a malfunctioning HP 3325A function generator (35 years old, huge boards with only through-hole components). The service manual is 400 pages. Yet the schematics are hand-drawn, characters aren't always readable because of crappy scan quality, there are errors all over the place in the manual, most of the update notes are missing, and there are even newer board revisions in my unit that don't match the service manual. Believe me, I've lost a few hairs over that one.

Oh, and regarding the Intergraphs' picture quality. I remember seeing their specs and thinking to myself that the FW900 will probably walk all over them. My two cents. They'd be great game room monitors though.
Exactly my thoughts. They're similar to the W900 regarding specifications actually, but with a bigger tube.
 
My optimism comes from the fact that the person has one working one. In this case it should be straightforward to troubleshoot.

Does the unit power on? Check power board. Do output values match? Yes? Move on to the B+ hold down circuit, etc. He can literally check each portion against a known working unit. Assuming no changes in revision.
 
Alright, I just tested the Startech DP2VGAHD20.

The connector may not have a lock, but it's a normal one, matching the usual dimensional standards (unlike the cheapo custom garbage on the Delock 62967). The DP cable is very thin, no surprise. The screen was recognised straight away with the correct resolutions (as if it were plugged directly into a VGA output), there's absolutely no weird compatibility behaviour with my AMD 380X, and CRU lets you do pretty much whatever you like.
The display is sharp, nothing weird regarding the display quality as far as I could see.
I could get a picture up to 2304x1440@78Hz (374 MHz pixel clock), but I also noticed some jitter on vertical lines at 2160x1350@85Hz (366 MHz pixel clock). So that adapter should probably not be pushed beyond 350-360 MHz, to be determined with more extensive tests.

Oh, and input lag tests have been done. FW900 on the left with an analog connection, P1130 on the right with the adapter connection:

800x600@160Hz (the top of the display is missing; my fault, I should probably have adjusted the blanking)

1600x1200@85Hz

I'll probably do them again later to confirm, but it appears that, surprisingly, the adapter might be very slightly more responsive than the Radeon's embedded DAC. At least both are comparable; there's no input lag.

Definitely (as long as the price doesn't inflate, as seems to have happened since it was mentioned in this thread), this adapter deserves a seal of approval.

 
Thank you. Ordered. :)
 
Nice test!

About the input lag: a difference of 1-2 ms is normal, because the outputs sometimes are not perfectly synchronized in clone mode and one starts a little earlier than the other.
In fact, in some of my tests the 62967 was faster than the native DAC.
So, same performance as Flybye's sample; that problem at 366 MHz needs to be investigated.
Are you sure it was 366 MHz with that resolution?
CRU gives me 359.21 MHz which is very close to the transition from 2 to 4 Displayport HBR2 lanes.
 
That's because I used the blanking settings of the standard 2304x1440@80Hz resolution to set that resolution in CRU.
Then I replaced the front porch/sync width/back porch with the ones from the standard 1920x1200@85Hz resolution described in the FW900's service manual. The pixel clock dropped to about 345 MHz, still displaying 2160x1350@85Hz, and the jitter was gone.
Also, note that I'm not sure the jitter problem isn't present at 2304x1440@78Hz; I just didn't notice it when briefly trying that resolution, and I didn't bother to try again to check (lazy me :ROFLMAO: )
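For anyone following the blanking numbers: the pixel clock is just total pixels per line × total lines × refresh rate, which is why trimming the porches drops it. A rough Python sketch (the porch values below are made-up placeholders, not the actual CRU timings from the posts above); the last line also shows why ~360 MHz is the 2-lane HBR2 boundary Derupter mentioned:

```python
def pixel_clock_mhz(h_active, h_front, h_sync, h_back,
                    v_active, v_front, v_sync, v_back, refresh_hz):
    """Pixel clock = total pixels per line * total lines * refresh rate."""
    h_total = h_active + h_front + h_sync + h_back
    v_total = v_active + v_front + v_sync + v_back
    return h_total * v_total * refresh_hz / 1e6

# 2160x1350@85Hz with roomy vs. tighter blanking (illustrative porch values):
print(pixel_clock_mhz(2160, 176, 232, 408, 1350, 3, 5, 48, 85))  # wider blanking
print(pixel_clock_mhz(2160, 112, 200, 312, 1350, 1, 3, 36, 85))  # tighter blanking

# Two DisplayPort HBR2 lanes carry 2 * 5.4 Gbit/s * 0.8 (8b/10b encoding),
# i.e. 8.64 Gbit/s of payload, which at 24 bits per pixel is 360 Mpixel/s.
print(2 * 5.4e9 * 0.8 / 24 / 1e6)  # 360.0 MHz
```

So a mode that lands just under 360 MHz can still sit right at the point where the link has to switch from 2 to 4 lanes, which is consistent with the jitter showing up around that region.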
 
I got my DP2VGAHD20 yesterday too; it's capable of 375 MHz just like Flybye's sample. No problems at all, even at 2048x1536@82Hz.
Great!

Really the big thanks goes to Derupter for mentioning it. I had just purchased the Delock, but I wanted another converter as a backup since things DO break in time. This has been the little gold nugget of adapters.
 
Hi everyone. Fellow FW900 user here. I am curious what the ideal HDMI-to-VGA converters would be for connecting the next-gen consoles (Xbox Series X and PlayStation 5) to the FW900. Both consoles will feature an HDMI 2.1 port for 4K/8K resolutions. Here are the limitations:

HDMI 2.1 TV or monitor:

4K 120 Hz
1440p 120 Hz
1080p 120 Hz

HDMI 2.0 TV or monitor:

4K 60 Hz
1440p 120 Hz
1080p 120 Hz

Credit: (user kasakka)

It would be nice to put together a list of HDMI-to-VGA adapters, ranked by value and performance, to help fellow FW900 and CRT owners out with the next-gen consoles.

Personally, I'm using a generic HDMI-to-VGA adapter between my XB1 and FW900 that I got off eBay for around $10 USD, and it looks and works amazing at 1080p in a dark room. I'd like to connect a Series X and/or PS5 to the FW900 in the same manner, at the highest resolution and performance we can possibly get out of the FW900, without breaking the bank.

Any feedback in this regard would be appreciated! LongLiveCRT
 
Hello, I've had a GDM-FW900 for some years now, but sadly I haven't given the monitor the love it deserves, so I'd like to change that :D

So, what attributes do I have to look for when buying a BNC cable? Are cables around 20 dollars OK? Sadly, the link provided at the start of the thread is dead.

I tried a Rankie VGA to DP converter but it failed at around a 315 MHz pixel clock. Is going the HDMI route the better option, since some posts stated the HDMI version should at least handle that? I am going to replace my Titan X Maxwell soon.

Thanks and regards
 
The problem with the HDMI to VGA adapters is that they are old, with old chipsets.
Another problem is that the HDMI receivers are declared compliant with HDMI 1.3 or 1.4 without further indications, which doesn't mean they can go up to 340 MHz, because most of those capabilities are optional. So, for example, if they only connect 2 of the 3 data lanes, you are stuck with a 225 MHz input limit (as with the ITE IT6892 or Lontium LT8511).
In other cases all 3 data lanes are connected, but they artificially limit the clock to under 340 MHz because the receiver isn't very good.
From what I've seen, one of the most used chipsets recently is the Algoltek AG6200, which I think is used in the cheap Benfei and Rankie adapters on Amazon.
If I had to choose a new HDMI to VGA converter now, I would choose the Vention AFVHB with the Lontium LT8612SX, which (looking at the datasheet) seems to be a better, more modern chipset than the others; the adapter also looks well made.
Then there is the new Lontium LT8612UX, a monster capable of a 600 MHz input, probably with a 10-bit DAC. An adapter like the AFVHB with this chip would be perfect, but until that happens you are limited to a 330-340 MHz max pixel clock.

I don't know how it works with that console, but can you set a custom resolution or a specific refresh rate, or are you limited to fixed outputs like 1080p 60 Hz, 1080p 120 Hz and so on?
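Purely as an illustration of why those 225 MHz vs. 340 MHz receiver limits matter for a console, here's a tiny Python check using the standard CTA-861 pixel clocks for the usual fixed console output modes (these are the standard timing figures; whether a given console actually exposes these exact modes over an adapter is a separate question):

```python
# Standard CTA-861 pixel clocks (MHz) for common fixed console output modes:
modes = {
    "1080p60":  148.5,  # 2200 x 1125 total pixels at 60 Hz
    "1080p120": 297.0,  # same timing at twice the rate
    "2160p60":  594.0,  # 4400 x 2250 total pixels at 60 Hz
}

# A 225 MHz-limited receiver vs. a full 340 MHz HDMI 1.4 receiver:
for limit_mhz in (225, 340):
    usable = [name for name, clock in modes.items() if clock <= limit_mhz]
    print(f"{limit_mhz} MHz adapter can accept: {usable}")
```

In other words, a lane-limited 225 MHz chipset tops out at 1080p60, while a proper 340 MHz receiver is what you need for 1080p120 from a console.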
 
Here, I'll repost this:

Update to Derupter's cohesive adapter summary:

Adapters capable or potentially able to go over 360 MHz pixel clock:

With Synaptics chipset (DP to VGA)
ICY BOX IB-SPL1031 (tested)
Delock 87685 (tested)
Sunix DPU3000 (tested)
Problems with some resolutions at specific refresh rates with all three models

With Lontium chipset (USB-C to VGA maybe with 10 bit DAC)
Vention CGMHA and CMFHB (not tested)

Adapters capable or potentially able to go over 180 and up to 360 MHz pixel clock:

USB-C to VGA
Delock 62994 with Realtek RTD2169U (not tested)
Delock 62796 with ANX9847 (not tested)
Plugable USBC-VGA with ANX9847 (tested up to 330-335 MHz no issue)
Sunix C2VC7A0 with ANX9847 (not tested)
Delock 63923 with Chrontel CH7212 (not tested)
Delock 63924-63925 with ITE IT6562 (not tested 10 bit DAC)
J5 Create USBC-VGA (tested up to 240 MHz no issue)


DP to VGA
Delock 62967 with ANX9847 (tested)
With some cards it can't handle HBR2 mode, so no more than 180 MHz; changing the cable should solve the problem. When it works, it's perfect up to 340-350 MHz with any resolution and refresh rate.

Cheap adapters from Amazon (HDMI to VGA)
Benfei and Rankie HDMI to VGA: we don't know the model code or the chipset. They should handle up to 330 MHz, but it is necessary to set the output to YCbCr 4:4:4, and this can cause problems for some users because with some drivers that option is not present.
Tendak Female HDMI to Male VGA ~ https://www.amazon.com/dp/B01B7CEOVK/?ref=exp_retrorgb_dp_vv_d ~ (verified to work with no issues up to 210 MHz. I could test it more extensively next time I have access to my SuperScan Elite 751 (the adapter isn't compatible with my G90f due to the G90f's fixed cable). Also, there's no need to set the output to YCbCr 4:4:4.)
 
I made a video of what's happening. It's actually a little bit different from what I remember happening back when it broke; I seem to recall it was just a flash of light and then nothing. Today, when I try to switch it on, it hisses at me and constantly tries to re-initialize/degauss the screen.

Here's the video:


If anyone has any ideas following the video, or wants me to try something (not like it does anything else, though...), I'd appreciate it!

Also, thanks for the link to the service manual. I'll have a look at it. I'm also gonna call around for some repair shops, maybe I'll get lucky and find someone with experience and willingness to repair a CRT.

Out of curiosity, has anything new happened with your problem? Did you find a repair shop to look at it?
 