24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Is this a fix for the nicked coating?
If you mean a fix to replace a scratched or damaged coating, yes, it was a fix for me. However, personally I don't recommend removing the original if it is in good shape, or if it only has slight damage the owner can tolerate.
 
Duly noted. It would also seem, reading back, that no one has found any suitable adapter that can do more than 1920x1200?
 
The Sunix DPU3000 can, tested with my own, but I doubt you can find one new, maybe second hand. There is the already mentioned ICY BOX IB-SPL1031, and also, it seems, the Delock 87685; all of those appear to have the same chipset as the Sunix, according to user Derupter:

https://hardforum.com/threads/24-wi...-ebay-arrived-comments.952788/post-1044652495

At the moment of posting this, the Delock seems to be "available" for preorder or something from the following link:
https://www.amazon.de/-/en/Delock-87685-Displayport-Splitter-black/dp/B075MZBXQ8

You may also find a direct purchase link on the Delock web page for the Delock 87685; I don't know if it's available or not, since clicking "buy now" redirects to partners that may or may not have the adapter: https://www.delock.com/produkt/87685/merkmale.html

For more info about adapters in this forum, search user Derupter's posts.
 
So I recently borrowed an AMD Radeon RX 5700 non-XT, ASRock version (already returned), to check for myself what the deal is with the issues between AMD cards, the Sunix DPU3000 and CRTs that I see being discussed, and which don't seem to happen with Nvidia cards, at least not with the GTX 1080 Ti ASUS ROG version I have tested.


I did not have much time with the AMD card, so I tried to test the resolutions I remember being mentioned as having the most issues. I tested on a fresh Windows 10 Pro 64-bit installation with all updates installed, using the latest AMD drivers I could find at the time of testing, which were "Preview Driver May 2022" and Adrenalin 22.5.2 Optional (WHQL).

I only used the AMD control panel to create the tested custom resolutions, like 2048x1536 and 2160x1350, which were reported to have issues on an AMD RX 5700 XT. I didn't touch the CRU utility, since creating resolutions with it for the GTX 1080 Ti and Sunix gave me issues with the Sunix and I got better results using the Nvidia control panel, so in this case I only used the AMD control panel.


So I was able to create and temporarily use 2048x1536 at 60Hz using the GTF standard in the 5700's AMD control panel. There were no swapping images or the like; the picture was normal, without issues, but after about 20 minutes of use I suddenly got a "no input signal" message on the monitor, was stuck there, and had to reboot the PC. I retested after connecting the Sunix power input to an original 5V 3A smartphone charger, checked that DisplayPort and all cables were properly attached to the AMD card, the Sunix and the monitor, and tested with the different AMD drivers already mentioned, but the "no input signal" message showed up again after around the same amount of time at 2048x1536 60Hz. This issue never happened with the Nvidia card. I had already done this test a while ago with the Nvidia card, but tested again, and after about 2 hours of using 2048x1536 60Hz created via the Nvidia control panel custom resolution tool with the GTF standard, there were no issues and no "no input signal" at all, so I stopped the test, since I thought 2 hours was enough to conclude this issue only happens with the AMD card, at least RX 5700 models, and the Sunix DPU3000.
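As a quick sanity check (rough numbers only, assuming approximately GTF timing totals for that mode), the pixel clock is nowhere near what the DPU3000 is known to handle, so bandwidth alone shouldn't explain the dropouts:

```python
# Rough pixel-clock check for 2048x1536 @ 60Hz with GTF timings. The totals
# below are approximate GTF values, just to show the mode sits far below the
# DPU3000's bandwidth ceiling, so the signal dropouts are unlikely to be a
# bandwidth problem.
H_TOTAL = 2800   # approx. GTF horizontal total for 2048 active pixels
V_TOTAL = 1589   # approx. GTF vertical total for 1536 active lines
REFRESH = 60     # Hz

pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH / 1e6
print(f"{pixel_clock_mhz:.1f} MHz")   # ~267 MHz, roughly half of the ~540 MHz the DPU3000 can do
```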

I also tested 2160x1350 at 60Hz and 86Hz. I was able to switch at 60Hz; at 86Hz I was unable to, and got the same issues as reported when testing this resolution with the Nvidia cards here, but this time, even after several attempts, I was not able to switch to that resolution without those issues happening.
At 60Hz, 2160x1350 seemed to work fine, without image swapping or other typical Sunix issues, but I didn't test long enough, since I didn't have much time with the AMD card and had to return it quickly.

So I concluded that the Sunix DPU3000 has more issues with AMD cards, at least the 5700 non-XT and XT models, than with Nvidia ones, at least the GTX 1080 Ti.

All tests were done with a Sunix DPU3000 adapter as-is; I never attempted to update its firmware, since it seems that does not fix those issues on AMD cards, as I see Enhanced Interrogator has reported.

Fortunately, the flexibility of a CRT to use whatever custom resolutions and refresh rates the user wants, within the monitor's scan and refresh rate limits, without worrying about "native" resolutions or about certain faulty resolutions on the Sunix adapter and equivalents, makes this issue with the Sunix DPU3000 (and equivalents, I guess) not a big deal after all.
 
Interesting, wonder why the issues happen more frequently on Radeon. Either way, you're right that the flexibility of the CRT makes up for it.

And now with cool techniques like FSR 2.0 and temporal up/down sampling, I can always just do a slight downsample to 1920x1440 or a slight upsample to 2304x1728 if I'm trying to use one of those tricky in-between resolutions.

I actually recently played Kena: Bridge of Spirits running at 1728x1296 upscaled to 2304x1728 with Unreal Engine 4's TAA Upsampling and it looked amazing, way better than just running 1728x1296 natively. Sort of like how DLSS works
 

The game you mentioned looks so beautiful, it left me curious to research more about it. Recently I was playing an upgraded version of the N64's Super Mario 64 on PC, with higher quality models, textures and a smooth 60fps, truly beautiful on a CRT.
 
Ordered one of these - https://aliexpress.com/item/2251801839677795.html - will let you folks know how it goes. For $18 (at least that's the sale price that shows up for me in the USA) you really can't go wrong!

Even if it doesn’t end up driving particularly high resolutions, you’ve still got a 1:2 HDMI to HDMI & VGA splitter, two digital audio outputs and one analog audio.

I’ve also picked up a Vention AFVHB. I know someone here said they couldn’t drive it past ~250MHz, but I’m gonna mess around with it some more. It notably doesn’t convert YCbCr to RGB, or rather the LT8612SX doesn’t.
 

Attachments: 4 photos.

Love me some OSSC! :)

Is there any consensus on whether newer Nvidia or AMD cards are better for use with the FW900 or CRTs in general? If only they still had native analog out lol... My trusty 980 Ti finally died on me for good, and I'm not sure who to buy from in the current market now that the prices have finally come down.
 
I'm sure they all work the same. I do remember, though, seeing that AMD cards could force the sRGB color space. That could be an advantage potentially? Or maybe not. CRT monitors are SMPTE-C/Rec 601. The sRGB clamp in AMD cards is really only helpful for wide gamut screens. So yeah, I'm guessing there is no real advantage. I do remember when I had CRT monitors, I appreciated the ability of Nvidia's resolution tool to use GTF timings.
 
Damage to the anti-glare coating. I don't use my PC CRTs with much or any ambient light. Wanted to hear from anyone that has removed the AG or has one with it removed. How are the blacks in a light-controlled room? My main concern over removal is blacks becoming grey even in a dark environment... I can live with the damage to the AG if that is the case. Appreciate any replies/info. I've already spent some time searching this long thread. Just curious if anyone can speak directly to their experience. Thanks!
20220718_075207.jpg


20220718_075151.jpg
 
If it’s not bad then leave it. Biggest regret of my FW-900 besides getting rid of it.

EDIT - looking at your pictures on a larger screen than my phone. :) I can't even see where it's damaged. Leave it.
 
Agreed, better to keep the original coating if that damage does not bother you. I removed mine and replaced it with polarizer car film; I wrote about it on the previous page, or if you want more info search my user name and words like polarizer or coating, etc. As a quick answer: without any coating, the monitor's blacks were only perceptible in a dark environment with no light source in it. I didn't like that and ended up installing the polarizing car film, and I'm happy with it; I prefer the monitor with this than without any coating, but of course I would prefer to have the original.

By the way DarthVadetBater, were you able to fix your HP A7217A issue described here? If so, what did you do?
https://hardforum.com/threads/24-wi...-ebay-arrived-comments.952788/post-1044900243
 
I removed mine due to some EXCESSIVE scratching on it, and I think it's still rather okay, as I do use it in a lower-light controlled room. That being said, I don't have another one running with the coat still on to compare, but the black levels seem pretty great still to me if you did feel inclined to remove it. As others have said though, if it doesn't bother you much, then I'd leave it since it's a huge hassle to remove.
 
Regarding the HP A7217A, I just got a P1110/G500 flyback that I'll be using as a donor. Not sure when I'll do it, but I know another user posted here that they swapped it successfully.... would be awesome to get it back up & running. It holds sentimental value to me since I drove across the country to get it at the end of 2019, & it's what started me down the rabbit hole with this crazy hobby.....now I have 22 CRTs.... including PC monitors, 240p PVMs, a 480p 38" arcade CRT, component consumer sets, etc.....
 
Congrats on your big CRT collection, as a fellow CRT collector myself. :D Over the past couple of years a friend and I have reached almost the same combined number of CRTs (mostly 16 Trinitron/Diamondtron monitors, 5 PVMs, 2 or 3 smaller lower-end monitors and a few SD TVs), though with the added hassle of having to ship every single CRT monitor from Europe via maritime freight overseas to where we live (at least for the monitors and a couple of PVMs; the TVs were local, but there are no HD CRT TVs at all here). It also all started with my FW900 back in 2016. I'm probably gonna have to get my hands on one of those flybacks as well at some point, just for the peace of mind in case mine dies one day. It already scared me a few times in the past (same for my F520 at the moment, sparks inside once in a while).
Now in the next few years I have to figure out a way to get my hands on the KD-34XBR960 (and maybe KV-xxFS310) for my friend and I, considering I most likely have to organise shipping for those all the way from the US. That's gonna be tricky, but I'm gonna figure it out somehow...

On another subject, I ordered a bunch of adapters I was able to get at a discounted price from a sale. Some USB-C, a couple HDMI. When it all arrives and I get to test them, I'll report here.

I also spent quite some time trying to figure out all the solutions we have to output interlaced resolutions. The best solution I came up with so far to game with recent hardware is of course yet another compromise, but at least it worked acceptably enough. Let me explain from the beginning.

I have an old 2004 PC I built a few years ago for some retro gaming, with an ATI X850 card. I thought, well, I don't remember ever testing interlaced resolutions on that PC, let's try that out! I plugged in the Iiyama Vision Master Pro 514, thinking that if there is one monitor that would have crazy potential with interlaced resolutions, it has to be this one, since it has an unlimited vertical refresh rate... Well... I was not wrong. Imagine my surprise when I was able to output 1920x1080 (16:9 for a lower line count, and higher refresh rate) at an absolute whopping 230Hz.

PXL_20220705_183118058.jpg


Close to equivalent CRU settings, for clarity.

CRU_1080i230.png


That was followed by 1280x720 at an even crazier 340Hz. I had to do a lot of geometry adjustments in the menus, as the picture had a lot of pincushion distortion at those crazy refresh rates. But after fiddling with the VGA timings and the menu on the CRT it all adjusted almost perfectly.

PXL_20220705_190328101.jpg


PXL_20220705_190341057.jpg


Again, close to equivalent CRU settings, for clarity.

CRU_720i340.png


Pretty crazy I have to admit! I don't know of any other CRT that can go that far, apart from the 19" Iiyama 454 which also has unlimited vertical refresh rate, if I remember correctly, but slightly lower specs overall. If anyone else here knows about another monitor with unlimited vertical refresh rates or some other crazy specs, I'd be very curious to know!

So obviously, that was just for testing, no way I could do anything useful with such an old configuration running crazy high refresh rates and/or resolutions. But I have to say, the absolute simplicity of just using PowerStrip on Windows XP for on the fly adjustments of VGA timings was pretty damn nice. It's sad we can't do that with ease anymore.

So going from there, I started testing everything I had around, to see what could work and what wouldn't, to try to output interlaced resolutions on recent hardware and enjoy good framerates in recent games running at high resolutions. At least for the fun of it, since the monitor can do it! I'm not saying using interlaced resolutions is the best or anything, but I have to admit, being able to max out absolutely everything, for example with the FW900 running 2304x1440 interlaced at its limit of 121.9kHz horizontal and 160Hz vertical, that's pretty cool! At such high resolutions and refresh rates, in some games the interlacing lines are barely visible and really not that annoying.
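To put numbers on that (a quick sketch; only the 121.9 kHz and 160 Hz figures come from my settings, the blanking line count is an assumption):

```python
# Why interlacing doubles the refresh rate reachable at a fixed horizontal
# frequency: each field only scans half of the frame's lines. Figures use the
# 2304x1440 @ 160 Hz example; the vertical blanking count is an assumption.

ACTIVE_LINES = 1440    # active lines per full frame
BLANK_LINES  = 84      # assumed vertical blanking lines per frame (~5.5%)
RATE         = 160     # Hz: frame rate if progressive, field rate if interlaced

total_lines = ACTIVE_LINES + BLANK_LINES            # 1524 lines per frame

hfreq_progressive = total_lines * RATE              # all lines scanned every frame
hfreq_interlaced  = (total_lines / 2) * RATE        # half the lines per field

print(f"progressive: {hfreq_progressive / 1e3:.1f} kHz")  # ~243.8 kHz, impossible on the FW900
print(f"interlaced:  {hfreq_interlaced / 1e3:.1f} kHz")   # ~121.9 kHz, right at the FW900 limit
```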

So, we know the Nvidia GTX 1000 series, the last generation before RTX, marked the end of support for interlaced resolutions in the Nvidia drivers. The GTX 900 series are the last cards with a built-in analog output, leaving only HDMI (DVI is the same as HDMI) and DisplayPort for VGA adapters.
- I first tried outputting interlaced over HDMI with the couple of really bad adapters I have, and I was able to confirm it indeed works with my 1080, but not with my 3080. I was not able to do too much testing, as my adapters are quite limited (I have a couple more on the way).
- I read that DisplayPort had an incomplete implementation of interlaced resolutions, but Linux drivers allow us to output interlaced over DisplayPort anyway (via the use of the "AllowDPInterlaced" setting). No luck there. I was able to output something interlaced, but none of my DisplayPort adapters were able to produce any useable picture. Only the very top part was showing up, and that only at very low resolutions. I even tried DisplayPort to HDMI, followed by an HDMI to VGA adapter I had successfully used for interlaced resolutions, but I ended up with the exact same weird picture on the CRT. Which might mean the GPU really can't output proper interlaced resolutions from the DisplayPort interface.
- That previous experiment led me to an idea: what about using an HDMI to DisplayPort adapter, followed by a good DisplayPort to VGA adapter? It probably won't work, but at least I know the GPU itself does output a proper interlaced resolution via HDMI. It's then up to the adapters to do the conversion properly. I don't think that will be a success, but I have a couple of HDMI to DisplayPort adapters on the way (one that does 4K30 and one 4K60). I'll be able to do further testing when I get them.

That was a good start, but so far I had no real success for high resolutions. I don't have a GTX 980 Ti to try out, but this one and the Titan X Maxwell would be the most powerful cards with a built-in RAMDAC up to 400MHz (and maybe a little higher; I was able to go close to 450MHz with my GTX 680). These are also not the best solution, as we are limited in performance and those cards are getting older and older. Still an interesting option, though, to drive monitors natively without adapters.

Next in the list of things to try out, older cards used for their outputs only, keeping a recent card for rendering games.

My main GPU is an RTX 3080. I do have a GT 730, GTX 680 and GTX 285, but I can't use any of those for interlaced output as a secondary card, as I have to use one single Nvidia driver for both the recent and the old card. The problem is that recent drivers supporting the RTX cards just won't allow interlaced resolutions to be set, even with older hardware that supports it! If I'm not mistaken, the latest Nvidia driver that allows interlaced resolutions to be set is the 411.70 WHQL game ready driver (supports up to the RTX 2080 Ti and TITAN V). With this driver, resolutions can be tested in the Nvidia Control Panel, but a trick has to be used to set the interlaced resolution permanently; see this guy's YouTube video here. I, however, just go to the Task Manager and kill the Nvidia Control Panel task right after testing the resolution, before it has the chance to revert back. That said, using both a recent and an old Nvidia card together for VGA output looks to be a pretty bad idea.

I could have tried an AMD card for the recent GPU as I have a Radeon RX 5700 XT on another machine, but considering how Nvidia drivers were problematic anyway for interlaced resolutions and also potential compatibility of older drivers with Windows 11, I just skipped that test and went a different route instead.

I stumbled upon an old ATI Radeon HD 4550 I had lying around. I picked it up thinking there was absolutely no way I would be able to install that 2008 (14 years old!) card on my Windows 11 machine and expect it to be compatible... The latest legacy driver AMD released that supports the old TeraScale GPUs seems to be Catalyst 13.4 (amd_catalyst_13.4_legacy_beta_vista_win7_win8). That driver is supposed to be compatible up to Windows 8, but I was able to install it just fine on Windows 11! Surprising. Unfortunately, when I tried to run Catalyst Control Center, it was not showing the GPU. I almost gave up right there, but I thought I'd try ToastyX CRU and go to the Windows display settings to check for my interlaced resolutions...

Guess what, it worked. :woot:

285255507_611230989936440_971815525410416690_n.png


I never thought that would work, but it did. Now, this might not be the best card as a secondary card for VGA output, but since then a friend and I managed to order a few more old ATI cards, a Sapphire Radeon HD 4850 X2 (dual GPU with 4 DVI-I ports!), Sapphire Radeon HD 5870 Vapor X, Radeon HD 6870. I'll be able to test a few cards from different generations when I get them.

Now how does this work in game? Well, first of all, we need to rely on Windows 11's ability to select the recent card as the rendering card for each game we're running. To start off, I had to have a monitor (or at least a dummy HDMI plug or something) also plugged in to the recent card, otherwise Windows 11 will assign the old ATI card as both the performance GPU and the power saving one. That causes a lot of issues; for example, trying to launch a recent Call of Duty, it kept throwing an error saying it could not find any compatible GPU. After plugging a monitor into the RTX card, that was not a problem anymore.

So I went on and tried a few Call of Duty titles (Cold War, Black Ops 4, Warzone, ...). As far as I remember, they all have an in-game option to select the rendering GPU, so that's really nice. Unfortunately I was not able to select a display from another GPU, though, and I had to use windowed fullscreen mode and move the game to the CRT using a Windows keyboard shortcut (Shift + Win + Left/Right arrow). Then, I have to admit, some of the Call of Duty games didn't look like they were running as smoothly as the reported framerate suggested. Overall it was OK, but not the absolute best.

Moving on to World of Warcraft, this game lets you select the rendering GPU and also any monitor connected to either GPU! So I was able to run it in fullscreen on the CRT and it runs perfectly smoothly. This one is a win, but it is for sure not the best game to profit from the higher refresh rates that interlaced modes allow.

I don't play Valorant, but I downloaded it quickly for a test, never forgetting to go to the Windows graphics settings and select the RTX card for rendering for each game's executable. Alright, so Valorant also had to run in windowed fullscreen mode, but the game was running really smoothly, no issue whatsoever here!

Then I tried Apex Legends, and well that one was a real struggle. I had to create a fake resolution on my LCD on the RTX card, the same 4:3 resolution as the CRT (I was using the F520 for the following tests). Then I was able to launch the game on the LCD, go to windowed full screen and then move the window using Shift + Win + Left arrow. Then it was running really smoothly, but what a hassle!

I also tried CS:GO, which was running more or less fine as well, but felt less smooth than it should. I don't remember what I had to do for this one, but I guess it was also windowed fullscreen. Not perfect for this one.

On to GTA V. As far as I remember, I was able to select the display and it was running really smoothly!

Halo Infinite was also running absolutely perfectly.

I think that's all for the tests so far. I was thinking about mirroring the LCD and CRT in the Windows display settings, so maybe that could have solved some issues. But the mirroring option was just not available for the CRT. I think that's because the ATI driver is too old and the Nvidia driver too recent. Also, I have no way of measuring system latency, at least not at the moment. Maybe I can do something in the future, I'll see.

So basically, it's more or less fine for a lot of games, but some end up being a bit less smooth than they should, some are a real struggle to get running on the secondary GPU, and for some games it might not be worth the hassle, as it can be pretty complicated to get running correctly. Unless the game itself has the option to select the output monitor (even if that monitor is connected to a different GPU than the one used for rendering), it has to run in windowed mode or, better, windowed fullscreen mode. But doing that adds slightly more latency to the mix, I guess (I never felt the difference, tbh).

So for now, overall I'm pretty happy with the results! Sure it's not perfect, but at least I was able to run a bunch of games really smoothly. It can require some tricks and compromises, but at least this method works. It gives one solution for running recent games at high settings on recent hardware and still use interlaced resolutions to get higher refresh rates at high resolutions on a CRT.

That's something that still impresses me. I mean seeing 2304x1440 at actually 160Hz on a 2001 CRT is pretty insane, even if it comes with the slight annoyance of interlacing lines that can be seen in some gaming conditions. Note that I'm talking about high resolutions only here! The interlacing lines become very clearly visible and annoying at lower resolutions. Once again, I'm not saying I will game on interlaced resolutions all the time, but I would really appreciate having the possibility to do so when I want to, considering the hardware allows it.

Now the absolute best solution for that would be a digital adapter that takes a progressive resolution and turns it into an equivalent interlaced resolution, dropping odd/even lines. The latest revisions of HDMI and DisplayPort have crazy high bandwidths, so imagine we could just ask Windows to simply output progressive, for example 2304x1440 at 160Hz, and the adapter processes that, dropping the necessary lines and adjusting the timings as needed to output 2304x1440 interlaced at 160Hz. It only has to be digital: DisplayPort or HDMI in, DisplayPort or HDMI out. Then a VGA adapter can be plugged in on the output to convert that already interlaced signal to VGA (if the adapter supports it, of course). But that's something I haven't found yet. HDFury does not have a recent digital adapter that does interlacing at high resolutions, nor does Extron as far as I know. Maybe one day.
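Just to illustrate the line-dropping half of the idea, here is a minimal sketch (it only shows the field extraction, not the timing changes such an adapter would also have to handle):

```python
# Minimal sketch of "progressive in, fields out": keep the even lines of one
# frame and the odd lines of the next, so successive progressive frames become
# alternating interlaced fields. Purely an illustration of the line dropping.
import numpy as np

def to_fields(frames):
    """frames: iterable of (height, width, 3) arrays; yields half-height fields."""
    for i, frame in enumerate(frames):
        parity = i % 2                  # 0 -> even field, 1 -> odd field
        yield frame[parity::2, :, :]    # drop every other line

# Example: two dummy 2304x1440 progressive frames become two 720-line fields.
frames = [np.zeros((1440, 2304, 3), dtype=np.uint8) for _ in range(2)]
for field in to_fields(frames):
    print(field.shape)                  # (720, 2304, 3)
```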
 
Something else: I tested my Sunix DPU3000 to its limits, and I was able to display 3840x2160 at 50Hz (for testing only, of course, as it gets really flickery) on the F520 and FW900.

I had to manually lower the VGA timings to get a perfect picture on the CRT at 4K and still not go above the 540MHz limit of the DPU3000. I can confirm it cuts out right at 540MHz, and works flawlessly slightly below that limit.
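The blanking budget is easy to work out from that limit (a rough sketch of the method only; the exact totals I ended up with are in the screenshot below):

```python
# How much room is left for blanking in a 3840x2160 @ 50 Hz mode under the
# DPU3000's ~540 MHz cutoff: total pixels per frame = pixel clock / refresh.
CLOCK_CAP = 540e6          # Hz, DPU3000 cutoff
REFRESH   = 50             # Hz
ACTIVE    = 3840 * 2160    # active pixels per frame

budget   = CLOCK_CAP / REFRESH     # total pixels per frame that fit under the cap
overhead = budget / ACTIVE - 1     # fraction of that left over for blanking

print(f"{budget/1e6:.1f} Mpx per frame, ~{overhead*100:.0f}% blanking budget")
# ~10.8 Mpx and ~30% total blanking: typical GTF/CVT timings for a CRT want
# about that much or more, which is why the timings had to be tightened by hand.
```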

These are the timings I ended up with.

CRU_4k50_CRT.png


And we can see it working here. I was not able to display true 16:9 on the F520, the height was at 0% here, and it was still too high and I had no room in the VGA timings to add margin.
But that was just for a test; I wanted to see how actual 4K progressive looked on these monitors, and I was happy with the results. :ROFLMAO:

PXL_20220711_153041726.jpg


If there is an adapter that can go way past 540MHz, I might be able to do almost 4K60, because the horizontal and vertical frequencies are far from maxed out... But I'm curious how far the CRT's RGB cathode drive circuits can actually go. 540MHz is already pretty far from the maximum 400MHz that GPUs of the time could output!
The picture was absolutely crystal clear, though with the flicker I didn't look too long for issues :LOL:

EDIT: Quite frankly I'm really impressed! Those are things that were impossible to do back when CRTs were the norm. As far as I'm aware, there were no graphics cards even remotely capable of going that high (and it would have been useless anyway, I guess). That means that, 20 years later, we have hardware that can actually push those old CRTs past what they were able to do back then. Not that it's really useful to go that high in resolution, it's way past what the aperture grille can resolve, but it's just so impressive to see a 20 year old monitor actually capable of handling such a crazy high resolution and displaying it flawlessly... Absolute champions!
 
Some new updates on the interlaced resolutions subject. Today I received two more ATI GPUs:
  • ATI Radeon HD 4890, TeraScale 1 (DX10) architecture (the same generation of GPU I already tested before with my HD 4550)
  • ATI Radeon HD 5870, TeraScale 2 (DX11, pre-GCN) architecture
_DSC3407.JPG


The HD 5870 happens to have old drivers that are installed automatically by Windows Update, unlike the previous generation of ATI cards, which required the manual installation of technically unsupported drivers that, fortunately, happen to still work fine with Windows 11.
The 5870 is also supported by the Amernime Zone modified drivers by NimeZ, which allow the installation of recent drivers for old legacy AMD/ATI hardware. TeraScale 1 support is marked as "work in progress", so maybe those drivers will be compatible with the older cards in the future.

That said, I was able to go through some interlaced resolutions testing with both cards. Here are the results.

The transition between TeraScale 1 and TeraScale 2 introduced some limitations for interlaced resolutions. I was hitting a wall trying to go past certain values on the 5870 where both the 4890 and 4550 cards were doing just fine.
When trying to set progressive resolutions, it is of course possible to hit the hardware limit of 400MHz pixel clock, but when creating interlaced resolutions, the absolute limit is 272.72MHz. Above that, the resolution won't show up after restarting with CRU. I don't know where that value comes from, but I found it by experimentation.
The other limitation is the horizontal pixel limit of 2560. Once again, trying to go above that limit, the resolution won't show up. I found no limit for the line count; I was able to create (but didn't try to test...) interlaced resolutions above 8000 lines with the resolution still showing up.

So the highest resolutions possible on the HD 5000 and up GPUs are supposedly the following (I don't remember what I was able to test or not test; these are just calculated according to the limitations mentioned above, and a rough way to reproduce the estimates is sketched after the list):
  • 2560x1920i (4:3) at 78Hz (CVT)
  • 2560x1600i (16:10) at 93Hz (CVT)
  • 1920x1440i (4:3) at 133Hz (slightly modified CVT)
  • 1920x1200i (16:10) at 157Hz (CVT, can probably hit 160 after some adjustments of the timings)
While the first two are not really interesting, since we really feel the pixel clock limitation at those high resolutions, the last two are really good despite those limits.
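This is roughly how those numbers can be estimated (a back-of-the-envelope sketch; the blanking fractions are assumptions, so the results land a few Hz away from the actual CVT figures above):

```python
# Estimate of the highest interlaced refresh that fits under a pixel-clock cap.
# For interlaced modes the clock only has to cover half the frame's lines per
# field. Blanking fractions are rough CVT-like guesses, not exact CRU values.

def max_interlaced_refresh(h_active, v_active, clock_cap_hz,
                           h_blank=0.35, v_blank=0.08):
    h_total = h_active * (1 + h_blank)              # pixels per scanline incl. blanking
    v_total_field = v_active * (1 + v_blank) / 2    # lines per field incl. blanking
    return clock_cap_hz / (h_total * v_total_field)

CAP = 272.72e6   # interlaced pixel-clock limit found on the HD 5000 series
for w, h in [(2560, 1920), (2560, 1600), (1920, 1440), (1920, 1200)]:
    print(f"{w}x{h}i: ~{max_interlaced_refresh(w, h, CAP):.0f} Hz")
```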

Those limitations are exactly the same when using the latest 2022 drivers available with Amernime Zone, except that this time, instead of using CRU, interlaced resolutions can be created via the AMD software, with no "restart" necessary unlike with CRU.
That said, the AMD user interface is really clunky and fine tuning is a hassle, but I believe we can add way more custom resolutions compared to CRU. I still much prefer using CRU though.

Now to compare with the HD 4890 (and also HD 4550), with the original legacy 13.4 drivers from AMD (amd_catalyst_13.4_legacy_beta_vista_win7_win8).
Here we can go all the way up to the full 400MHz pixel clock when setting interlaced resolutions.
The maximum horizontal pixel count I was able to set was 3840, which is perfect for 4K content. I tried setting resolutions slightly above that, with 3850 and 3880; those didn't show up. I also tried crazy high vertical resolutions with 8000+ lines, and there was no limitation here either.

So I was able to reach the following interlaced resolutions with the HD 4000 series cards:
  • 2880x2160i (4:3) at 90Hz (slightly modified CVT)
  • 3840x2160i (16:9) at 68Hz (CVT)
  • 3840x2400i (16:10) at 60Hz (CVT)
  • 3200x2048i (25:16, for the FW900) at 85Hz (CVT)
So here we have it.

I'll have a few more cards to test later, with an HD 6870 (which should perform exactly the same as the 5000 series, as it's TeraScale 2) and a more interesting Sapphire Radeon HD 4850 X2 2GB, which is a CrossFire card with 4 analog outputs. It is also compatible with the legacy 13.4 drivers on Windows 11 (the card is listed, I checked). So hopefully it should perform similarly to the 4000 series I just tested, but with more outputs for CRTs!
I'm going to try a triple monitor setup with that card, but I really wouldn't be surprised if, running a 2x CrossFire card in a single PCIe x8 slot and rendering games with another GPU, I end up hitting the limit of the PCIe x8 bandwidth. But it will sure be a fun experiment! Does anyone know how to calculate that?
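A crude way to estimate it (a sketch with approximate PCIe figures; real copies add overhead on top of this):

```python
# Every frame rendered on the main GPU has to be copied over PCIe to the card
# driving the CRTs, so a rough upper bound on the traffic is simply
# width x height x bytes-per-pixel x fps per displayed monitor.
def frame_traffic_gb_s(width, height, fps, monitors=1, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * fps * monitors / 1e9

PCIE2_X8 = 4.0   # ~GB/s theoretical for PCIe 2.0 x8 (500 MB/s per lane)

for label, need in [
    ("1920x1200 @ 150 Hz, 1 CRT ", frame_traffic_gb_s(1920, 1200, 150)),
    ("2304x1440 @ 160 Hz, 1 CRT ", frame_traffic_gb_s(2304, 1440, 160)),
    ("1920x1200 @ 150 Hz, 3 CRTs", frame_traffic_gb_s(1920, 1200, 150, monitors=3)),
]:
    verdict = "fits under" if need < PCIE2_X8 else "is right at / above"
    print(f"{label}: {need:.2f} GB/s, {verdict} PCIe 2.0 x8")
```

So a single CRT at high refresh is comfortably fine, but a triple-CRT setup at those rates does start to brush against the x8 ceiling, on top of whatever the rendering GPU itself needs.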

_DSC3386.JPG
 
Another fun experiment could be attempting FreeSync on your CRTs -- if you do that, please post here.

(It's possible -- it works on some multisync CRTs, requires forcing the feature via ToastyX on an AMD card, and requires unbuffered HDMI->VGA adaptors that convert timings 1:1 verbatim. That being said, FreeSync support did not begin until the Radeon RX 200 series.)
 
That would be something really interesting indeed to try with my RX 5700 XT maybe! It will give me an excuse to do some more testing with recent AMD cards as well.
I will probably get my hands on a 280X / 380X at some point, since these are the last AMD cards with analog output. So if I'm unsuccessful testing with the RX 5700 XT, I will try later with one of those.

Which adapters in particular would be best for testing FreeSync?

At the moment I have those at hand (plus a few more on order):
  • Delock 62967 "rewired", DP (Analogix ANX9847)
  • Sunix DPU3000, DP (Synaptics VMM2322)
  • Vention CGKHA, USB-C (ITE IT6562FN)
  • Cheap "Anbear" DP adapter (Analogix ANX9833)
  • Cheap no brand HDMI adapter (Explore EP94Z1E)
  • HDFury 3, HDMI
 
I guess you haven't heard of Toasty X's pixel clock patcher?

https://www.monitortests.com/forum/Thread-AMD-ATI-Pixel-Clock-Patcher

That will let you blow way past 500MHz. I'm curious how it changes the interlaced limit on the HD 5xxx series.
Oh I completely missed that indeed! I didn't really know ATI/AMD hardware before, I mainly had Nvidia cards for the past 13 years.

So I tried it with the Radeon HD 5870, and wow indeed it pushes the pixel clock limit to exactly 545.45MHz for progressive resolutions and the maximum horizontal pixel limit to the same 3840 as the 4000-series cards! :woot:
So that makes it better than my DPU3000 adapter which has a 539.99MHz limit (almost the same), but most importantly, no instability or ghosting at all here. So that's really good!

697967_1658321234258.png


Now the issue: the limit has not been patched for interlaced resolutions, and remains exactly 272.72MHz (exactly half of the progressive limit, actually, hmmm)... :cry: Considering the limit was the same for progressive and interlaced on the previous generation, at 400MHz, I would assume it was just ignored by the patch?
I'm going to ask ToastyX about that, just in case. If he just didn't patch that, but it's something that can be done, that would be absolutely crazy to be able to go to the full 545.45MHz for interlaced resolutions! Even if 400MHz is the maximum possible for interlaced for some reason, it's still really good as well. And that would make the 5000 and up generations of ATI/AMD cards the most interesting models for me, as a secondary card in my main PC.

One more thing: those cards can run more recent drivers, as I mentioned before, with the Amernime Zone modded drivers project, but the patch does not support those drivers (at least not at this time). So I did all the testing with the default drivers installed by Windows Update, version 15.201.1151.1008 (04/11/2015).
 
Short follow up. I got a reply from ToastyX; he mentioned that he indeed vaguely remembers that 545.45MHz limit with the 5000 series, but he doesn't know where it comes from. The 272.72MHz limit for interlaced resolutions seems to be linked, considering it's exactly half the maximum frequency of progressive resolutions. So no solution here.
He doesn't know whether or not this limit also existed on the 6000 series cards, so when my friend's card arrives, I'll try that out. But the 6000 series being also either TeraScale 2 (like the 5000) or TeraScale 3 for the 69xx models, I wouldn't be surprised to also see the limit on these models.

The good news is, he told me the HD 7000 series (GCN 1.0 architecture) might not have that limit, at least from his memory he doesn't remember seeing that limit on this generation of cards.
The Radeon R9 280X being the exact same model as the Radeon HD 7970, I decided to purchase an Asus Radeon R9 280X DirectCU II TOP.

I also got a Sapphire Nitro R9 380X which is the last AMD card with a VGA port, and of the GCN 3.0 architecture. I'll see how this one works as well.
I read some guy on the Microsoft forums mentioning interlaced resolutions working on that card with some specific drivers, quoted below. So at least I know interlacing should work on it if I use the correct drivers.

The default download for Windows 10 and my R9 380x card does allow interlaced resolutions, but it is a very old driver. It is Crimson 16.6, which is almost a year old now. It is missing support for a lot of games that have released in the past year. So this really isn't a valid solution for me.

I did however go back through the AMD's beta drivers, and found that 17.4.2 is where interlaced resolutions were broken. I can install 17.4.1 and anything before that, and interlaced resolutions will work.

So I believe this is something Microsoft engineers will have to work on with AMD directly, because there is no good solution to fix it on the user-side. And my memory is a little fuzzy, but I think I had 17.4.2 installed prior to the 1703 Creator's update, and interlaced resolutions worked. So I think the bug is from a combination of changes introduced in the Crimson 17.4.2 driver, and the 1703 creator's update.

That said, both of these cards support FreeSync, as pointed out by Chief Blur Buster, so it will be the next thing for me to try out! Then I'll have to find a way to measure latency properly; I might have some ideas for that.
 
Very interesting stuff. I have a rig with an RTX 3090 Ti and a (Pascal) Titan Xp and have gotten 1440i @ 144Hz to work with an Alfais HDMI adapter (caps at ~350 MHz) I got from someone in the CRT Discord (I render on the 3090 Ti, output HDMI over the Titan Xp). I also have an AMD R9 380X, and now I'm wondering if using that would be a better option than the Titan Xp. Do you think using the old AMD drivers would present any issues with new games, assuming that I'm using the RTX card for rendering and the AMD card just for analog output?
 
I would say that depending on the game it might be hit or miss. But from my testing a few posts above, most games I tried did run well using the old 2015 drivers with the 4890. There are a few tricks needed sometimes, though; for example, I had to have a monitor plugged in to the 3080 for Call of Duty to even launch, otherwise I would get an error saying there was no compatible GPU found. But that was a really old card with ancient drivers. Now, for the 380X it might be different. I'm really interested in seeing what the 380X can do with patched drivers.

Also I forgot to mention, I pointed out to ToastyX that the NimeZ drivers are not compatible with his patch, and he thinks it might be possible to apply the patch to the modded drivers. If that's indeed the case, it should be more optimised for recent games.
 
I'm really new to this whole custom/tweaked drivers thing. How exactly do you guys recommend getting started or attempting to learn about this stuff? Is installation generally as easy as just using DDU to uninstall the official one(s) and installing a different one you find pre-assembled on some message board somewhere (like this one)?
 
Kinda curious if anyone has had any luck with 1440p@60hz over various VGA converters out there, especially now that the PS5 apparently supports it.
 
Where did you get the VX1120 flyback? I need one for my VX1120.
 
Alright, I just tested both the Sunix DPU3000 and the Delock 62967 with interlaced resolutions. They both display interlaced resolutions perfectly fine! :D I have to say I'm really impressed!

So I just received a few adapters I ordered (all at a really good discount of -50 to -70%, which is why I got all these to try out). I tried them all out quickly today and then took them all apart to check what chips are inside. Here is a short summary of the results.

Nedis CCBP64850AT02 (USB-C to VGA) -> NXP PTN3355
The only computer I was able to use for the time being to try the USB-C adapters is my laptop with a 2060 Max-Q. The maximum bandwidth I was able to get was 238.80MHz.

StarTech.com CDPVDHDMDP2G (USB-C to HDMI/DVI/MiniDP/VGA) -> Parade PS8617 (I was not able to see the chips on the other side of the board)
Well... I got the exact same result with this one, unfortunately, which made me wonder if something else was the limiting factor. So I went on and tried again the following adapter, which I have had for some time.

Vention CGKHA (USB-C to HDMI/VGA) -> VIA/VLI VL100 + ITE IT6562FN
I was able to hit around 398MHz with this one just fine. Not above that.

StarTech.com CDP2VGAFC (USB-C to VGA) -> Realtek RTD2169U
That one turned out to be pretty good, with a maximum pixel clock of 360.50MHz, which is not bad at all!

Lastly, the only HDMI adapter of the bunch I tested today.

StarTech.com HD2DPVGADVI (HDMI to DVI/DisplayPort/VGA) -> Lontium LT8511A (HDMI to VGA) + STDP2600 (HDMI 1.4b to DisplayPort 1.2a) + Lontium LT86104SX (HDMI 1.4/DVI 1.0 4-port splitter)
So I was really curious about this one since HDMI allows for proper interlaced resolutions, and not only does it have a VGA output to try out directly, but also DisplayPort to daisy chain other adapters...
For the VGA output, unfortunately there is nothing to write home about, as I was only able to reach a maximum bandwidth of 177.50MHz, which is incredibly low! Going above that, the display starts showing black horizontal lines moving everywhere across the screen, getting worse as the bandwidth goes up, until only the top of the display is visible and then nothing at all. Really bad. That said, interlaced resolutions work perfectly, within the same limits.

Next up was trying the DP adapters I have on the DP output, to see if DisplayPort can even be used for interlaced resolutions one way or another. I had already tried that on a few different GPUs, AMD and Nvidia, both in Linux and in Windows ranging from 7 to 11. I never had any success at all, but the results led me to believe that the adapters were not the issue, only the GPUs not being able to output proper interlaced signals. But here, I was able to output clean interlaced signals over HDMI, and it would be up to the adapter to convert that to interlaced over DisplayPort, if that's even possible to do.

First, trying with the Sunix DPU3000. The StarTech adapter is advertised as having a maximum of 4K at 30Hz on the DisplayPort output, which would definitely be limiting the DPU3000. I was able to output progressive resolutions up to a pixel clock of 324.25MHz. After that, the picture starts shivering more and more going up in bandwidth, until it just stops displaying completely. Unfortunately some resolutions just won't register at all, but I remember that also being an issue with the DPU3000 on its own, maybe? Not sure, I have to run more tests. After going a little higher or lower in frequency, it works again. So it works, at least.

Moving on to the interlaced resolutions, the really impressive part is that it also works absolutely flawlessly!!! Same limitations as well: some resolutions won't work, but when it does, it does so perfectly well. That was on my GTX 1080 and Nvidia drivers 411.70 (setting the resolution in the Nvidia Control Panel, and then going to the Windows 11 modes list to select the interlaced resolution).

I then tried my Delock 62967, also working flawlessly in interlaced resolutions through HDMI to DisplayPort!

So I was able to do, for example, 1920x1200 interlaced at 150Hz on the FW900 without any issue for a few hours, using the following chain:
Nvidia GTX 1080 -[HDMI]-> StarTech.com HD2DPVGADVI -[DP]-> Delock 62967 / Sunix DPU3000 -[VGA]-> Sony GDM-FW900

So now I know for sure that at least those two really good DP to VGA adapters can actually do interlaced modes without any issue, provided that we use an HDMI to DP adapter that also supports it.
Now I need to get my hands on a better 4K60 HDMI to DP adapter, and hope that it will also do interlaced modes. I already ordered the Club 3D CAC-1331, and I'll see what it's capable of when I get my hands on it.

I also got a DisplayPort to Type-C adapter, the Sunix UPD2018 PCIe card (actually the Dell version), which can also be used externally without a PCIe port. That way I can try out the Type-C adapters I have and see whether or not these can also do interlaced.
I also got a direct HDMI to Type-C adapter, the Gofanco HDMIUSBC, to see how that one performs.

The one thing I have never heard of, and that would completely solve the interlacing issue, would be a DisplayPort (or HDMI) to DisplayPort (or HDMI) device that interlaces the signal on the output, dropping odd/even lines on each frame.

Is there any news on more recent adapters that might have hit the market in late 2021 or 2022? Any news about a potential Lontium LT8612UX based adapter?
I'm curious to see if we have anything other than the DPU3000 (and equivalents) that goes at least to or above 500-540MHz and that might be better in quality. My DPU3000 outputs a pretty good quality picture, but with some definitely noticeable ghosting.
 
My DPU3000 outputs a pretty good quality picture, but with some definitely noticeable ghosting.

Mine doesn't really have ghosting, it just is a bit blurrier than a typical VGA output at really high frequencies, like when I'm running 1920x1440 @ 90hz.

I could have sold you my Gofanco HDMI>USBC adapter, haha. I ordered it for the same reason you did. I couldn't get it to go to very high pixel clocks, but I had it running through like 2 or 3 other adapters so I started to lose track of what the real problem might be.

I am looking forward to how you do with a Club3D>DPU3000 setup, since it seems later Nvidia cards don't have maximum interlaced resolutions (if I recall correctly)
 

Thank you!

I've recently started looking for a better adaptor for some weird cases like these -- for some CRT-vs-(digital flat panel) experiments, in Blur Busters' endeavour to check ever more of the CRT-replacement checklist over the coming years/decade(s) -- via multiple routes (hardware strobing, software-based CRT electron beam simulators, etc).

There is still tons of 1080i material, plus interlacing is an easy avenue for getting extra CRT Hz on some tubes.

As more analog-output graphics cards become thrown into the closet because of OS/software incompatibility issues, solutions like these are good!
 
Mine doesn't really have ghosting, it just is a bit blurrier than a typical VGA output at really high frequencies, like when I'm running 1920x1440 @ 90hz.
Interesting, I can't really say it looks blurry, but I didn't do a side by side comparison with a direct GPU output. I will do that when I can. The ghosting is only noticeable in certain conditions, so it's not too annoying.

I could have sold you my Gofanco HDMI>USBC adapter, haha. I ordered it for the same reason you did. I couldn't get it to go to very high pixel clocks, but I had it running through like 2 or 3 other adapters so I started to lose track of what the real problem might be.

I am looking forward to how you do with a Club3D>DPU3000 setup, since it seems later Nvidia cards don't have maximum interlaced resolutions (if I recall correctly)
Haha damn, I went too fast.

That said I'm interested in getting the following USB-C adapters (if I made no mistake), I'll probably order some of them soon to try out.
  • Delock 87776 -> ITE IT6224
  • Delock 63924 or 63925 -> ITE IT6562 (EDIT: I already have the Vention CGKHA with that chipset)
  • Delock 64002 -> Algoltek AG9300
  • Delock 63923 -> Chrontel CH7212
  • j5create JCA111 -> ITE IT6516BFN?
And a little unrelated: tomorrow I'm gonna get an 8K60 bidirectional DP <> USB-C cable. I just want a cable I can use both ways, as I have both Type-C and DP monitors and both Type-C and DP outputs on GPUs. I will of course also test it with CRT adapters to see how it performs. :ROFLMAO:
https://www.unitek-products.com/products/8k-usb-c-to-displayport-1-4-cable

Totally agree. I definitely will continue doing more testing! Really impressive work you're doing!
 
Alright, I just tested both the Sunnix DPU3000 and the Delock 62967 with interlaced resolutions. They both display interlaced resolutions perfectly fine! :D I have to say I'm really impressed!

So I just received a few adapters I ordered (all at a really good discount of -50 to -70%, which is why I got all these to try out). I tried them all out quickly today and then took them all apart to check what are the chips inside. Here is a short summary of the results.

Nedis CCBP64850AT02 (USB-C to VGA) -> NXP PTN3355
Only computer I was able to use for the time being to try the USB-C adapters is my laptop with a 2060 Max-Q. The maximum bandwidth I was able to get was 238.80MHz.

StarTech.com CDPVDHDMDP2G (USB-C to HDMI/DVI/MiniDP/VGA) -> Parade PS8617 (I was not able to see the chips on the other side of the board)
Well... I got the exact same result with this one unfortunately. Which made me wonder if there was not something else limiting. So I went on and tried again the following adapter I have for some time.

Vention CGKHA (USB-C to HDMI/VGA) -> VIA/VLI VL100 + ITE IT6562FN
I was able to hit around 398MHz with this one just fine. Not above that.

StarTech.com CDP2VGAFC (USB-C to VGA) -> Realtek RTD2169U
That one turned out to be pretty good, with a maximum pixel clock of 360.50MHz, which is not bad at all!

Lastly, the only HDMI adapter of the bunch I tested today.

StarTech.com HD2DPVGADVI (HDMI to DVI/DisplayPort/VGA) -> Lontium LT8511A (HDMI to VGA) + STDP2600 (HDMI 1.4b to DisplayPort 1.2a) + Lontium LT86104SX (HDMI 1.4/DVI 1.0 4-port splitter)
So I was really curious about this one since HDMI allows for proper interlaced resolutions, and not only does it have a VGA output to try out directly, but also DisplayPort to daisy chain other adapters...
For the VGA output, unfortunately nothing to write home about as I was only able to reach a maximum bandwidth of 177.50MHz which is incredibly low! Going above that, the display starts showing black horizontal lines moving everywhere across the screen, increasingly going up in bandwidth until only the top of the display is visible then nothing at all. Really bad. That said, interlaced resolutions work perfectly, within the same limits.

Next up was trying the DP adapters I have on the DP output, see if DisplayPort even can be used for interlaced resolutions one way or another. I tried that already on a few different GPUs, AMD and Nvidia both in Linux and Windows ranging from 7 to 11. I never had any success at all, but the results lead me to believe that the adapters were not the issue, only the GPUs not being able to output proper interlaced signals. But here, I was able to output clean interlaced signals over HDMI, and it would be up to the adapter to convert that to interlaced over DisplayPort, if that's even possible to do.

First trying with the Sunnix DPU3000. The StarTech adapter is advertised as having a maximum of 4K at 30Hz on the DisplayPort output, which would definitely be limiting the DPU3000. I was able to output progressive resolutions up to a pixel clock of 324.25MHz. After that, the picture starts shivering more and more going up in bandwidth until it just stops displaying completely. Unfortunately some resolutions just won't register at all, but I remember that being also an issue with the DPU3000 on its own maybe? Not sure, I have to run more tests. After going a little higher or lower in frequency, it works again. So it works at least.

Moving on to the interlaced resolutions, the really impressive part is that it also works absolutely flawlessly!!! Same limlitations as well, some resolutions won't work, but when it does it does so perfectly well. That was on my GTX 1080 and Nvidia drivers 411.70 (setting the resolution in the Nvidia Control Panel, and then going to the Windows 11 modes list to select the interlaced resolution).

I then tried my Delock 62967, also working flawlessly in interlaced resolutions through HDMI to DisplayPort!

So I was able to do for example 1920x1200 interlaced at 150Hz on the FW900 without any issue for a few hours using the following chain.


So now I know for sure that at least those two really good DP to VGA adapters can actually do interlaced modes without any issue, provided we use an HDMI to DP adapter that also supports it.
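To put rough numbers on why interlacing makes this possible at all (back-of-the-envelope figures with assumed blanking overheads, not the exact timings I used):

```python
# Back-of-the-envelope: why 1920x1200 interlaced at a 150Hz field rate fits
# where the progressive equivalent would not. Blanking overheads below are
# assumed round figures (roughly GTF-like), not actual timings.

def approx_pixel_clock_mhz(h_active, v_active, field_rate_hz, interlaced=False):
    h_total = h_active * 1.35            # assumed horizontal blanking overhead
    v_total = v_active * 1.04            # assumed vertical blanking overhead
    if interlaced:
        v_total /= 2                     # each field only scans half the lines
    return h_total * v_total * field_rate_hz / 1e6

print("1920x1200 progressive @ 150 Hz:", round(approx_pixel_clock_mhz(1920, 1200, 150), 1), "MHz")
print("1920x1200 interlaced  @ 150 Hz:", round(approx_pixel_clock_mhz(1920, 1200, 150, True), 1), "MHz")
```

The interlaced mode lands at roughly half the pixel clock (and half the horizontal scan rate), which is what keeps it both under the ~324MHz limit of this HDMI to DP path and within the FW900's horizontal frequency range.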
Now I need to get my hands on a better 4K60 HDMI to DP adapter and hope that it will also do interlaced modes. I already ordered the Club 3D CAC-1331; I'll see what it's capable of when I get my hands on it.

I also got a DisplayPort to Type-C adapter, the Sunix UPD2018 PCIe card (actually the Dell version), which can also be used externally without a PCIe port. That way I can try out the Type-C adapters I have and see whether or not they can also do interlaced.
I also got a direct HDMI to Type-C adapter, the Gofanco HDMIUSBC, to see how that one performs.

The one thing I've never heard of, and that would completely solve the interlacing issue, would be a DisplayPort (or HDMI) to DisplayPort (or HDMI) device that interlaces the signal on its output, dropping the odd or even lines on each frame.
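Just to sketch what such a box would have to do (purely hypothetical, I'm not aware of any product that does this): take the incoming progressive frames at the field rate and keep only the even lines on one frame and the odd lines on the next, so each output field carries half the lines.

```python
import numpy as np

# Hypothetical progressive-to-interlaced conversion: keep even lines on even
# frames and odd lines on odd frames, so each output field carries half the lines.
def progressive_to_fields(frames):
    """frames: iterable of (height, width, 3) arrays arriving at the field rate."""
    for i, frame in enumerate(frames):
        yield frame[i % 2::2]            # alternate even/odd line sets per frame

# Example: three dummy 1200-line frames become 600-line fields.
frames = (np.zeros((1200, 1920, 3), dtype=np.uint8) for _ in range(3))
for field in progressive_to_fields(frames):
    print(field.shape)                   # (600, 1920, 3)
```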

Is there any news on more recent adapters that might have hit the market in late 2021 or 2022? Any news about a potential Lontium LT8612UX based adapter?
I'm curious to see if we have anything other than the DPU3000 (and equivalents) that goes at least to or above 500-540MHz and that might be better in quality. My DPU3000 outputs a pretty good quality picture, but with some definitely noticeable ghosting.
Thanks for testing all these adapters and also for the other things.

About the chipset:
PTN3355 should stop at 180 MHz with 8bpc and 240 MHz with 6bpc; maybe beyond a certain bandwidth it drops the input to 6bpc?
Same thing for the PS8617.
Vention CGKHA with IT6562 should stop at 360 MHz; I don't know how it can handle 398 MHz with four HBR lanes (quick bandwidth math below). This explains how Digital Foundry reached 2304x1440 80Hz.
Also, the IT6562 from the specs has a 10-bit DAC; are you able to set 10-bit on the digital output?
Realtek RTD2169U, nice to see it can handle the full bandwidth at 360 MHz.
Lontium LT8511A is a very old chip; not all data lanes are available, so the low performance is not a surprise.
I don't think you can do interlaced resolutions with the Sunix UPD2018, since the signal is DisplayPort from start to finish, but at least you can use your USB-C adapters with any video card.
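Quick sanity check on the link bandwidth behind the IT6562 question above (assuming four HBR lanes and the standard 8b/10b encoding overhead):

```python
# Four HBR lanes: 2.7 Gbit/s per lane raw, 8b/10b encoding leaves 80% for payload.
lanes = 4
hbr_gbps_per_lane = 2.7
payload_gbps = lanes * hbr_gbps_per_lane * 8 / 10     # 8.64 Gbit/s

for bpc in (8, 6):                                    # bits per color channel
    bits_per_pixel = bpc * 3
    max_pclk_mhz = payload_gbps * 1000 / bits_per_pixel
    print(f"{bpc} bpc -> max ~{max_pclk_mhz:.0f} MHz pixel clock")
```

360 MHz is exactly the 8bpc ceiling, so anything measured above that over four HBR lanes would suggest the input is dropping to 6bpc (which tops out at 480 MHz).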

About the adapters you are interested in:
Delock 87776, IT6224 is like IT6564 with USB-C, so the performance should be the same.
Delock 64002, well this is interesting: AG9300 should handle up to 720 MHz on input, but the DAC from the specs is the usual 1920x1200 60Hz.
Delock 63923, another interesting chip: CH7212 can handle up to 360 MHz on input, and the DAC is a 240 MHz 9bpc one, which from the specs is the fastest after the ANX9847.
j5create JCA111, if I remember correctly it has been tested up to 240 MHz; it should be like the PS8617 and PTN3355.

It would be nice to know the difference in output quality between the different DACs.
For example, someone noticed that the 62967 with ANX9847 is better than the DP2VGAHD20 with IT6564.

About the LT8612UX: last time I talked with an R&D guy at Lontium, I asked for an adapter with this chipset and he told me he could send a sample.
Someone in the past tested a different Lontium chipset and the DAC's performance was not good; they usually reuse the same circuit across their parts, so it is probably bad on the LT8612UX as well, but who knows.
If someone is interested, I can PM that guy's contact, so maybe he can sell a sample to test.
 
My DPU3000 outputs a pretty good quality picture, but with some definitely noticeable ghosting.
Mine doesn't really have ghosting; it's just a bit blurrier than a typical VGA output at really high frequencies, like when I'm running 1920x1440 @ 90Hz.

My DPU3000 doesn't either; to my eyes it looks the same as using the native analog output of a video card with a VGA port, at every resolution and refresh rate I have tested. I was not able to test 1920x1440 90Hz since it's outside the FW900's horizontal frequency range, and my other CRT, a Compaq 7550, doesn't support it either. However, I have seen ghosting when the VGA cable is defective or poor quality. etienne51, are you sure that's not the case with your VGA cable? By the way, there is a way to connect the DPU3000 directly to the monitor's VGA port without a VGA cable, using a male-to-male VGA coupler, as someone reported some time ago, like this:
[Photos: the DPU3000 plugged straight into the monitor's VGA input through a male-to-male VGA coupler]
 
Totally agree. I definitely will continue doing more testing! Really impressive work you're doing!
Thank you!

I wish the pandemic hadn't happened, because I had some other companies about to add 60 Hz single strobe. Some of these projects have resumed already, but it did add two-year delays, making some people think I'm playing favorites with companies. Ah well :) (The Programme is open to all manufacturers that express interest)

Baby steps. Sometimes I have to be friendly to companies to be invited to board rooms. Then I have to start screaming (the right amount, but not throwing chairs around like Steve Jobs) at the companies in actual person, inside their actual headquarters, just to shame them into adding 60 Hz single strobe. About 50% like it and 50% hate it -- it's such a polarizing feature. I've been collecting hundreds of rave reviews of 60Hz single strobe despite the people who hate it. I know, it's not as good as CRT -- but it's closer to CRT than LCD has ever been. Not everyone is flicker sensitive (especially if you follow best-practices and only run it with retro 60fps content, not at the Windows desktop) -- and I've got plenty (hundreds) of messages, comments, and YouTube comments spread across about 50 sites saying it's the best simulation of CRT motion clarity they've ever seen. Some of the "Big Companies Are Evil" people don't know how much work it takes to get invited into boardrooms in order to convince companies to add features they aren't adding, and they don't realize that I'm willing to WALK AWAY from contracts. And because of that, some features got added to displays on the market.

Regardless, achievement unlocked. While some people aren't that big a fan of me having to ally with companies (e.g. the Blur Busters Approved 2.0 programme), you don't know how many arguments I've already had inside company headquarters just to convince stakeholders to add a feature. I can't argue with them via Twitter; I have to argue inside actual board rooms at headquarters, with a massive TestUFO PowerPoint presentation. People who want perfect blacks, people who want the same flicker-comfort, people who want the same CRT texture, etc. It's hard to check ALL the checkboxes.

But we actually managed to beat CRT motion blur while simultaneously having zero* strobe crosstalk (*only after QFT + Strobe Utility recalibration), with less motion blur because there is no phosphor decay, even if we couldn't check the other boxes (e.g. flicker, blacks, etc.). Some CRT fans are only worried about the motion blur, while other CRT fans need the other checkboxes.
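For context on the QFT part, this is my rough way of looking at it: at a fixed refresh rate, a larger vertical total delivers the active lines in a smaller slice of the refresh period, leaving more dark time for a clean strobe flash (illustrative round numbers, not any specific monitor's timings):

```python
# At a fixed refresh rate, the active scanout time shrinks as the vertical
# total grows, which is the basic idea behind QFT-style large vertical totals.
def active_scanout_ms(v_active, v_total, refresh_hz):
    return v_active / v_total / refresh_hz * 1000.0

print("Normal timings:", round(active_scanout_ms(1080, 1125, 120), 2), "ms of scanout per 8.33 ms refresh")
print("QFT timings:   ", round(active_scanout_ms(1080, 2250, 120), 2), "ms of scanout per 8.33 ms refresh")
```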

Baby steps, baby.

Incrementally, future work will keep getting better and better.

Tomorrow: CRT electron beam simulators (via brute Hz), essentially the 10-year plan, but there will be lots of incremental BFI improvements (blacks / comfort / brightness / etc.) ticking more and more of the checkboxes.

Note: Ideally I want the CRT electron beam simulator to be an open source project (such as a Windows Indirect Display Driver) that's display-independent; that's the ultimate goal. If any of youz know indirect display drivers -- PM me -- since within 1-2 years I'm thinking of beginning to incubate an open source CRT electron beam simulator on GitHub. It's a low-priority "20% bonus time" project at the moment because it's a distant-future incubation. The 240Hz OLEDs are hitting the market, and that's the bare minimum for "Better Looking Than 60Hz BFI", even if it's not enough Hz to simulate a CRT accurately enough (1/240sec worth of CRT electron beam simulation in a GPU shader is not quite retina; we need about 1000-4000Hz to retina the simulation out). Long term, a 1920 Hz display would allow 32 digital refresh cycles to accurately simulate 1 analog CRT refresh cycle temporally (by rendering 1/32 of the CRT electron beam refresh cycle per digital frame, with everything factored in, including shadow/slot mask simulation, accurate zero blur (same as the original tube), and an accurate, adjustable phosphor decay match). But even 240Hz will make it usable (in the sense that it will be superior to more-flickery 60Hz BFI).
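If you're curious what "simulating the electron beam" boils down to per refresh cycle, here's a toy numpy sketch of the rolling-scan idea (my own simplified illustration for this post, not shader code from any shipping product):

```python
import numpy as np

# Toy rolling-scan CRT simulation: one 60 Hz source frame is split into N
# sub-frames for an N*60 Hz display. Each sub-frame lights a horizontal band
# (the current "beam" position) while previously lit lines decay exponentially,
# roughly mimicking phosphor persistence. Illustrative only.
def crt_subframes(frame, n_subframes=4, persistence=0.2):
    h = frame.shape[0]
    band = h // n_subframes
    brightness = np.zeros(h, dtype=np.float32)     # per-line persistence state
    for i in range(n_subframes):
        brightness *= persistence                  # fraction retained since last sub-frame
        brightness[i * band:(i + 1) * band] = 1.0  # lines the beam just swept
        yield (frame * brightness[:, None, None]).astype(frame.dtype)

# Example: a 240 Hz panel shows 4 sub-frames per 60 Hz source frame.
src = np.full((1200, 1920, 3), 200, dtype=np.uint8)
for sub in crt_subframes(src):
    print(round(float(sub.mean()), 1))
```

At 240Hz that's a coarse 4-band sweep; the more refresh cycles per simulated CRT refresh, the finer the band and the closer you get to the real beam.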
 
especially if you follow best-practices and only run it with retro 60fps content
I think you're selling the feature short here. There are lots of modern 60fps-locked games too, like fighting games. Street Fighter 5 looks fantastic on my CRT.

And then you'll have people using consoles on their monitors, where they default to 60hz output and a good chunk of the games are 60fps.

So really, outside of the 100% sweaty esports people, a majority of people will benefit from 60Hz single strobe.

Even though I play mostly on PC, I play lots of single-player games at 60Hz because I'd rather crank the resolution and settings than the frame rate, Control being a recent example.
 
Something else: I tested my Sunix DPU3000 to its limits, and I was able to display 3840x2160 at 50Hz (for testing only, of course, as it gets really flickery) on the F520 and FW900.

I had to manually tighten the VGA timings to get a perfect picture on the CRT at 4K while still staying under the 540MHz limit of the DPU3000. I can confirm it cuts out right at 540MHz, and works flawlessly slightly below that limit.
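To illustrate the kind of headroom involved (rough round numbers for illustration, not my exact timings, those are in the attachment below):

```python
# Pixel clock = horizontal total * vertical total * refresh rate.
# With GTF-like blanking, 3840x2160 @ 50 Hz sits above the DPU3000's 540 MHz cap;
# trimming the blanking (smaller porches/sync) pulls it back under.
# The totals below are made-up round numbers, not the actual timings used.

def pclk_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

print("GTF-like blanking: ", round(pclk_mhz(5280, 2222, 50), 1), "MHz")   # too high
print("Tightened blanking:", round(pclk_mhz(4720, 2280, 50), 1), "MHz")   # just under 540
```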

These are the timings I ended up with.

[Attachment: screenshot of the custom 3840x2160 50Hz timings used]

👍
Very interesting experiments, also for the record.
I wanted to test this myself, so I tried it on my DPU3000 and FW900, but I was not able to get it to work from CRU using your exact settings. Even after restarting the PC, that custom resolution never showed up in the main Windows resolution list, the advanced display list, or the Nvidia Control Panel PC resolution list. I tried with the latest CRU 1.5.1, and also put the custom resolution in the "detailed resolutions" list and set it first as the "native" resolution.

But by creating that resolution and refresh rate with your same settings, only through the Nvidia Control Panel custom resolution dialog instead, I was finally able to create and use it. However, part of the horizontal image area fell slightly outside the screen, and not even with the monitor's OSD size and position controls could I fit the entire image inside the visible area.

Who knows whether the Windows version has something to do with it; it seems you used Windows 11, while I tested on Windows 10 Pro 64-bit and Windows 7 Ultimate 64-bit, both with the latest updates and latest GPU drivers. I was not able to test on Windows 11 since it refuses to install on my system, I guess because of my unsupported processor (i7 4770K) and the lack of the TPM that Windows 11 requires, from what I read.

By the way, I tested with the following parameters from the Nvidia Control Panel and was able to achieve 3840x2160 54Hz, but again with a lot of the horizontal image area outside the visible screen area, this time even more than at 50Hz. Maybe you could try it on your system to check whether you get better results?

[Screenshot: FW900 at 3840x2160 54Hz via the Sunix DPU3000, CRU and NVCP settings, with part of the image falling outside the viewable screen area]
 