24" Widescreen CRT (FW900) From Ebay arrived, Comments.

Sad to hear you're completely getting out JBL, but I'm glad you gave us a lot of good posts and insights over the years.

I gotta say though, if I had to get rid of all my tubes and only keep one, it would definitely be a 15kHz set, whether a PVM or even a plain old TV. I would miss my PC monitors, but life would go on. It would be tough, though, not being able to play Mario Bros 3 or Super Metroid on a proper 240p CRT.
 
Sad to hear you're completely getting out JBL, but I'm glad you gave us a lot of good posts and insights over the years.

I gotta say though, if I had to get rid of all my tubes and only keep one, it would definitely be a 15kHz set, whether a PVM or even a plain old TV. I would miss my PC monitors, but life would go on. It would be tough, though, not being able to play Mario Bros 3 or Super Metroid on a proper 240p CRT.
My wife hates it but I just can't let it go, 240p on the CRT can't be beat.

1716292486189.png
 
lol just a quick update: about 3 weeks ago there was a blackout in my hood and it killed my motherboard (ASRock Z590 Steel Legend). Up until that point I was using a 980 Ti with a passive DVI-I to VGA adapter and usually ran my desktop at 1920x1200 interlaced 144Hz on my Samsung SyncMaster 997MB. After it broke I sold the GPU and CPU (i3-10100F) and had to use my living room computer for a couple of weeks. The living room PC is an Athlon 3000G with an iGPU, and when I hooked it to my 997MB CRT the drivers were total garbage. The motherboard had a VGA port, so I connected the CRT there, and I couldn't get any resolution besides 1920x1080p 60Hz to work; they all blackscreened. I think 1440x900 60Hz also worked, actually. Either way... it was shit.

Well, yesterday I finally went and picked up my replacement PC for the time being: a Gigabyte Z390 UD + i3-9100, limited to the iGPU for now. This motherboard ONLY has an HDMI port, and I didn't have any HDMI adapters left, so I literally went out and bought a super cheap generic $7 HDMI to VGA adapter in the streets, just 12 blocks from my home, and I LUCKED OUT lmaooo. This thing can do 1920x1200i 90Hz (haven't tested further) with perfect blacks and a picture just as sharp as the DVI-I to VGA adapter. This is so ridiculous hahahahah, I thought I was gonna be limited to 1080i 60Hz or some shit. I am soooo happyyyy. Interlaced resolutions work EVEN BETTER on the Intel iGPU than on the 980 Ti with native VGA on 368.81 drivers rofl.
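For anyone wondering why that cheap adapter copes: interlacing is doing the heavy lifting, since an interlaced mode only scans half the lines per field, so the pixel clock is roughly half of the equivalent progressive mode. A quick back-of-the-envelope sketch in Python (the blanking figures below are rough CVT-RB-style assumptions, not the actual modeline):

```python
# Rough pixel-clock estimate: why 1920x1200i @ 90 Hz fits through a cheap
# HDMI-to-VGA adapter while the progressive version likely would not.
# Blanking figures are approximate CVT-RB values, for illustration only.

def pixel_clock_mhz(hres, vres, refresh_hz, interlaced=False):
    htotal = hres + 160          # approx. CVT-RB horizontal blanking
    vtotal = vres + 35           # approx. CVT-RB vertical blanking
    clock_hz = htotal * vtotal * refresh_hz
    if interlaced:
        # Each field scans half the lines, so the pixel clock for the
        # same nominal refresh rate is roughly halved.
        clock_hz /= 2
    return clock_hz / 1e6

print(f"1920x1200p @ 90 Hz ≈ {pixel_clock_mhz(1920, 1200, 90):.0f} MHz")
print(f"1920x1200i @ 90 Hz ≈ {pixel_clock_mhz(1920, 1200, 90, interlaced=True):.0f} MHz")
```

So the interlaced mode lands around 116MHz, comfortably inside what a generic single-link adapter can handle, while the progressive version would need around 231MHz.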
 
Nothing in the conversation suggests they may come back with a solution for users to fix a series they know has been faulty from the start, when a simple firmware update shouldn't be difficult. I think my contact just dropped the line. A real shame of a company.
Just a little update to set things right. I actually got another message when I didn't expect any more. I'm told the firmware can't be updated on those adapters. Whether that's true or not, the situation doesn't change: either you contact them within the warranty period and they may replace the faulty adapters, or they won't stand behind their product.
 
lol just a quick update: about 3 weeks ago there was a blackout in my hood and it killed my motherboard (ASRock Z590 Steel Legend). Up until that point I was using a 980 Ti with a passive DVI-I to VGA adapter and usually ran my desktop at 1920x1200 interlaced 144Hz on my Samsung SyncMaster 997MB. After it broke I sold the GPU and CPU (i3-10100F) and had to use my living room computer for a couple of weeks. The living room PC is an Athlon 3000G with an iGPU, and when I hooked it to my 997MB CRT the drivers were total garbage. The motherboard had a VGA port, so I connected the CRT there, and I couldn't get any resolution besides 1920x1080p 60Hz to work; they all blackscreened. I think 1440x900 60Hz also worked, actually. Either way... it was shit.

Well, yesterday I finally went and picked up my replacement PC for the time being: a Gigabyte Z390 UD + i3-9100, limited to the iGPU for now. This motherboard ONLY has an HDMI port, and I didn't have any HDMI adapters left, so I literally went out and bought a super cheap generic $7 HDMI to VGA adapter in the streets, just 12 blocks from my home, and I LUCKED OUT lmaooo. This thing can do 1920x1200i 90Hz (haven't tested further) with perfect blacks and a picture just as sharp as the DVI-I to VGA adapter. This is so ridiculous hahahahah, I thought I was gonna be limited to 1080i 60Hz or some shit. I am soooo happyyyy. Interlaced resolutions work EVEN BETTER on the Intel iGPU than on the 980 Ti with native VGA on 368.81 drivers rofl.
VGA ports on motherboards are mostly pure trash. Get yourself a Quadro M6000 24GB, the fastest card with a DAC and the most VRAM.
 
Enhanced Interrogator https://habr.com/ru/companies/ruvds/articles/524170/

Very insightful article. I learnt we're capped at 3800x2500 without a software workaround.
Lol, that photo in the article is my FW900 on my desk I built in 2015! I keep seeing this photo on every article online wtf :ROFLMAO:
That's the thumbnail I used for that old video:

View: https://youtu.be/vIIWTMOvisw

fw900_youtube_vid_vIIWTMOvisw.png


The 380X has the same 400MHz DAC as all of the R200-R300 series. There hasn't been a single GPU since the ATi 9700 era with a 500+ MHz DAC; that's why everyone here always recommended the Sunix DPU3000-D4 and the DeLock/ICY BOX rebranded adapters. Only those 3 adapters come with a high-quality RAMDAC over 535MHz by default.
Xar, note that the AMD Radeon R9 380X, the R5 250, plus a few others I haven't tested, all have a 655.35MHz RAMDAC after applying the ToastyX Pixel Clock Patcher. And it works absolutely flawlessly!
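Side note on that 655.35MHz figure: it is exactly (2^16 − 1) × 10kHz, which suggests (my assumption from the number alone, not confirmed from the driver) that the patched limit is simply a 16-bit field counted in 10kHz units:

```python
# 655.35 MHz is suspiciously exactly (2**16 - 1) * 10 kHz, hinting the
# patched clock limit is a 16-bit field in 10 kHz units (an inference
# from the value, not from driver source).
MAX_FIELD = 2**16 - 1        # 65535, largest value a 16-bit field holds
UNIT_KHZ = 10                # assumed 10 kHz per count
limit_mhz = MAX_FIELD * UNIT_KHZ / 1000
print(limit_mhz)             # 655.35
```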

I was able to do 3840x2160 at 60Hz progressive once, directly out of the R9 380X on my F520 after carefully adjusting the timings. It was not perfect, I could not even center the picture quite right, but I was very impressed at the time.
I had no idea back then that Intel UHD 7xx iGPUs and Intel DG1 dGPU could reach even higher resolutions when going interlaced over DisplayPort, so that below was the highest resolution I had ever reached on a CRT at the time!

PXL_20220830_164801077.jpg


Now for interlaced on the Radeons: the drivers have very, very dumb limitations. After a lot of testing, I was able to get this...

5fw4f63b7pzc1.png


So basically, the RAMDAC is outputting 4800x3000 interlaced at 60Hz here, and it's possible to go a bit higher as well, to something like 5000x3125. I was able to confirm the horizontal and vertical frequencies on the monitor's OSD, and everything looked perfectly stable.
The issue is, because of the weird AMD drivers, the Windows desktop resolution can't go that high... So it uses the next available resolution below as the desktop resolution, basically upscaling the 2560x1600 desktop up to 4800x3000 to be displayed on the CRT, which of course turns out blurry. How dumb!

So now for the limits on interlaced resolutions. Basically, you can't go above 2728 pixels horizontal. After that, the resolutions stop showing up in Windows settings...

After a lot of testing taking advantage of the weird resolution scaling behavior above, I noticed that past 2728 pixels horizontal, the higher the horizontal resolution, the higher the pixel clock limit. For example:
- At 3840x2400i the pixel clock limit was 597.01MHz
- Going up to 4080x2550i the limit moved up to 634.32MHz
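Interesting detail about those two data points: the limit seems to scale linearly with the horizontal resolution, at roughly 155.5kHz per horizontal pixel in both cases, which would be consistent with the driver capping some horizontal-rate-like quantity rather than the pixel clock itself (my interpretation, not confirmed from the driver). Quick check in Python:

```python
# Sanity check on the two observed data points: the interlaced pixel-clock
# limit appears to scale linearly with horizontal resolution.
limits = {3840: 597.01, 4080: 634.32}     # hres -> observed limit (MHz)
ratios = {h: mhz / h for h, mhz in limits.items()}
print(ratios)   # both ratios ≈ 0.15547 MHz per horizontal pixel (~155.5 kHz)
```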

I tried changing only the vertical resolution, the pixel clock limit remained exactly the same. Only changing the horizontal resolution affected the maximum pixel clock limit. There is some weird math going on in the AMD driver file ATIKMDAG.SYS.
If ToastyX or someone else could figure out a way to patch the driver and remove those weird horizontal pixels limits and maybe also the weird pixel clock variable limit on interlaced resolutions, that would be absolutely incredible! Not sure that can be done, considering it does not look like a value to simply change, but some weird calculations.
Maybe I will drop ToastyX an email to explain to him what I found...

So overall those Radeons definitely have the absolute fastest native VGA output I've ever seen so far. Second best being the Sunix DPU3000, but over DisplayPort. The Radeon wins on progressive with native VGA output, while Intel UHD 7xx & Intel DG1 win for interlaced over DP with the DPU3000.
Both the Radeon and DG1 seem to behave very similarly when used in passthrough with a more powerful GPU taking care of the rendering. The Intel UHD 770 iGPU seems to work quite a bit better, but I need to do some more testing and compare properly.

I didn't get any new display. I've been downsizing and decluttering. My kids don't care about this kind of stuff. I care more about spending time with them. By the time they're out of the house these monitors will be at least 30 years old. No sense in hoarding them to myself.
Oh man, really sad to see you leave the hobby! Family is for sure the most important thing in life!

That said, sad to see you're not keeping any CRT at all, it's a pretty cool hobby especially since you've been around for such a long time!

I was talking to spacediver recently after seeing him sell his defective FW900. He told me he had another one and I was relieved haha.
I was telling him how I remember all you guys here welcomed me and helped me through calibrating my monitor and everything back in the days. I was super happy to join this legendary thread, and I am still impressed all the time seeing how active it remains to this day!

On my side, I feel lucky that both a friend and I had the opportunity to get our hands on a bunch of monitors a few years ago. I will have to go through the recapping process for some if not all of them over the years, to make sure they last as long as possible. Especially the FW900, since I was not able to find a second one as a backup.
I can't imagine myself selling any of that stuff, I would rather open a museum or something haha. I'm planning on making a gaming room one day, with a wall of retro PC hardware and the CRTs aligned on a large table below. But I don't have any room available for that at this time, so later I will figure something out somehow.

I'm currently thinking of getting an AMD-Xilinx Artix UltraScale+ FPGA dev board with a DisplayPort 1.4 sink & source module. The goal is for me to learn how to work with FPGAs, as it would be my first FPGA project, and to try to create a progressive-to-interlaced converter with as low latency as possible.
That way on the PC I can just output anything progressive directly out of any recent GPU with latest drivers, no compatibility issues, no fighting with old drivers for GPU passthrough, no extra complexity and hopefully ultra low added latency.
Resolutions within the CRT limits could remain progressive and be passed through without any processing, and anything above those limits would be automatically converted to interlaced using either GTF timings or custom modelines. Something like that.
I've been looking at every single way I could find to get interlaced with recent hardware, and it's always a pretty meh compromise. So going that route seems to be the only remaining solution.
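In case anyone wonders what the core of such a converter does: the conversion itself is conceptually trivial, splitting each progressive frame into two fields, one with the odd lines and one with the even lines; all the real difficulty is in the timing and latency. A minimal Python sketch of just the field split (real hardware would do this per scanline on the fly, and the field order here is an assumption):

```python
# Minimal sketch of progressive-to-interlaced conversion: each progressive
# frame becomes two fields. Field order (top lines first) is an assumption;
# a low-latency FPGA design would emit lines as they arrive rather than
# buffering whole frames like this toy version does.

def frame_to_fields(frame):
    top_field = frame[0::2]     # lines 0, 2, 4, ... (first field)
    bottom_field = frame[1::2]  # lines 1, 3, 5, ... (second field)
    return top_field, bottom_field

frame = [f"line{i}" for i in range(6)]
f1, f2 = frame_to_fields(frame)
print(f1)  # ['line0', 'line2', 'line4']
print(f2)  # ['line1', 'line3', 'line5']
```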

progressive_to_interlaced_schematic.png
 
By the way, I mentioned the Intel DG1 GPU earlier... I don't think I talked about that here before. Well... imagine my surprise when I learned Intel created a dedicated GPU before ARC existed. ARC is codenamed DG2, and DG1 was its predecessor. It uses PCIe 4.0 x8.
It's a very, very quirky Iris Xe dedicated GPU that absolutely requires the BIOS to have "Resizable BAR" enabled, or it will not even POST. Originally, the card was designed to work only on two specific Asus motherboards, alongside an Intel CPU. But with that option enabled in the BIOS, it will work on any motherboard, and also on a Ryzen system.

That said, there will be no output whatsoever during the boot, video output from the DG1 will only be available after the Windows drivers are loaded. While being a bit annoying, it's acceptable, since the goal is to use it for passthrough alongside another GPU.

Also, when switching resolutions on a CRT, it will very often (if not always) disconnect the monitor... Unplugging and replugging is not the solution; you need to go to the Windows display settings and chain-click "Extend these displays" (sometimes 20 times...) until it finally works! Don't ask me how I even found out about that... let's just say I was incredibly annoyed.
I've experienced this a LOT with interlaced resolutions. I don't remember if it's the same on progressive, since the only reason I use GPU passthrough is for interlaced; I can connect the VGA adapter directly to the Nvidia card for progressive anyway.

So, this bug is not the end of the world though. I made the following little PowerShell script to automate the "Extend these displays" procedure. It retries as many times as needed until it works, then it automatically stops.
I turned it into an exe with the PS2EXE tool, and assigned it a keyboard shortcut Ctrl+Alt+F12. So anytime the CRT shuts off when switching resolution, I use the shortcut and wait a couple seconds for it to get back on.

Code:
# Retry "Extend these displays" until the Intel GPU reports a valid
# refresh rate again ($null or 0 means the CRT dropped out).
while ((Get-WmiObject -Class "win32_videocontroller" -Namespace "root\CIMV2" | Where-Object { ($_.VideoProcessor -eq "Intel(R) UHD Graphics Family") -or ($_.VideoProcessor -eq "Intel(R) Iris(R) Xe Graphics Family") } | Select-Object -ExpandProperty MaxRefreshRate) -in ($null, 0)) {
    DisplaySwitch /extend
    Start-Sleep -Milliseconds 500  # give the mode switch time to settle
}

I believe this only works / works best when using the CRT alongside a secondary monitor plugged in to the main GPU. At least that's how I tried it myself. Now if you launch a game that changes resolution on startup, well... That's another problem.
I tried Counter-Strike Source, and fortunately the game has launch parameters to set the resolution, so problem avoided here.

This monitor disconnection issue seems to have been fixed in drivers 31.0.101.4575 and above. I'm not too sure, but unfortunately support for interlaced resolutions was already long gone at that point. So no good here.

Now, why even bother with this weird GPU that doesn't seem interesting at all? Well, while ARC is very, very limited with interlaced resolutions, at around a 220MHz pixel clock limit (221.71MHz at 1920x1440i, 223.63MHz at 1600x1200i, so not a fixed pixel clock limit here), the DG1 on the other hand has the same 503.23MHz bandwidth as the Intel UHD 770 iGPU when running interlaced over DisplayPort! That's getting very close to the 540MHz limit of the Sunix DPU3000! This is only doable with earlier driver versions; I am using version 31.0.101.3975, but version 31.0.101.4032 also works. Support for interlaced was dropped mid-January 2023 for the DG1, and I seem to remember the iGPUs got one or two more driver revisions before also seeing interlaced resolutions go away.

So this seems to be a very interesting way to do interlaced with GPU passthrough alongside a recent AMD or Nvidia GPU, on a Ryzen system (without an Intel UHD 7xx iGPU). Also, future generations of Intel CPUs most certainly won't support interlaced output, since support is already gone on the latest drivers with the current 12/13/14th gen, so the DG1 is an interesting way to keep upgrading to a more modern CPU in the near future.

With all that said, I finally received my Intel Core i5 13600K last week! So for the time being I will be using it for passthrough with interlaced resolutions. The DG1 is such a hassle, it's still nice to have it for some future use, but for now the current gen Intel is a much better choice.

Now there is the latency issue with running passthrough... a topic for another time. It's very low, but it's there. It seems lower on the iGPU compared to both the DG1 and the old Radeons, though I'm not too sure about that.
Either way, it's enough to be just noticeable in some games, which is why I'm now thinking about the custom FPGA hardware solution I mentioned above. Not sure I can get that done, but I would be very interested in trying.
 
Lol, that photo in the article is my FW900 on my desk I built in 2015! I keep seeing this photo on every article online wtf :ROFLMAO:
That's the thumbnail I used for that old video:

View: https://youtu.be/vIIWTMOvisw

View attachment 655183

Didn't know the Russian who wrote the article was a copycat 🤣
Xar, note that the AMD Radeon R9 380X, the R5 250, plus a few others I haven't tested, all have a 655.35MHz RAMDAC after applying the ToastyX Pixel Clock Patcher. And it works absolutely flawlessly!
Yup. I already noticed you could unlock 🔓 those DACs nearly a decade back with ToastyX's tools. It's just illogical and nonsensical for AMD and NV to cap theirs at the 400MHz standard even though they knew the full capacity. The initial rumour even had Kepler at 500MHz before NV clamped it down to 400 like its predecessors since 2006.
I was able to do 3840x2160 at 60Hz progressive once, directly out of the R9 380X on my F520 after carefully adjusting the timings. It was not perfect, I could not even center the picture quite right, but I was very impressed at the time.
I had no idea back then that Intel UHD 7xx iGPUs and Intel DG1 dGPU could reach even higher resolutions when going interlaced over DisplayPort, so that below was the highest resolution I had ever reached on a CRT at the time!

View attachment 655184
This model costs over 4000€ in my region. It's almost as expensive as an FW900 and has an even better tube. I gave up on buying it since the CRTs I bought never cost more than 350€.

Never expected Intel cards to scale that well with CRTs given how unsupportive they are.
Now for interlaced on the Radeons, the drivers have very very dumb limitations, after a lot of testing, I was able to get this...

View attachment 655185
So basically, the RAMDAC is outputting 4800x3000 interlaced at 60Hz here, and it's possible to go a bit higher as well, to something like 5000x3125. I was able to confirm the horizontal and vertical frequencies on the monitor's OSD, and everything looked perfectly stable. The issue is, because of the weird AMD drivers, the Windows desktop resolution can't go that high... So it uses the next available resolution below as the desktop resolution, basically upscaling the 2560x1600 desktop up to 4800x3000 to be displayed on the CRT, which of course turns out blurry. How dumb!
So now for the limits on interlaced resolutions. Basically, you can't go above 2728 pixels horizontal. After that, the resolutions stop showing up in Windows settings...

After a lot of testing taking advantage of the weird resolution scaling behavior above, I noticed that past 2728 pixels horizontal, the higher the horizontal resolution, the higher the pixel clock limit. For example:
- At 3840x2400i the pixel clock limit was 597.01MHz
- Going up to 4080x2550i the limit moved up to 634.32MHz

I tried changing only the vertical resolution, the pixel clock limit remained exactly the same. Only changing the horizontal resolution affected the maximum pixel clock limit. There is some weird math going on in the AMD driver file ATIKMDAG.SYS.
If ToastyX or someone else could figure out a way to patch the driver and remove those weird horizontal pixels limits and maybe also the weird pixel clock variable limit on interlaced resolutions, that would be absolutely incredible! Not sure that can be done, considering it does not look like a value to simply change, but some weird calculations.
Maybe I will drop ToastyX an email to explain to him what I found...

So overall those Radeons definitely have the absolute fastest native VGA output I've ever seen so far. Second best being the Sunix DPU3000, but over DisplayPort. The Radeon wins on progressive with native VGA output, while Intel UHD 7xx & Intel DG1 wins for interlaced over DP with the DPU3000.
Both the Radeon and DG1 seem to behave very similarly when used in passthrough with a more powerful GPU taking care of the rendering. The Intel UHD 770 iGPU seems to work quite a bit better, but I need to do some more testing and compare properly.
Amazing finding dude 👍🏼
I never played around with 60 Hz or anything lower before. The guys always told me you'd be defeating the purpose of getting a CRT in 2023. So I went with the usual extreme RR tweaking; I capped out at 170 Hz at 1920x1440 max on my IBM P275 with a 3090 Ti using the Sunix DPU3000-D4. Will try lowering the RR and boosting the res like you do. 😍

Have you ever played around with Maxwell (745, Titan X, M6000 24GB) or Pascal (1030) btw?

Across 8 communities since 2003, I've heard thousands of times that ATi/AMD evidently has superior DAC and overall output quality, even per TechYesCity's reply.
But the limitations you encountered might be Radeon-exclusive (whether they come from ToastyX's magic or good ole' Radeon pipeline calculation bugs). I wanna know if I could use ToastyX to push GeForce/Quadro GPUs beyond the DAC MHz limits with my P275 and DPU3000-D4.
 
VGA docks on MBs are mostly pure trash. Get yourself a Quadro M6000 24GB, the fastest and highest VRAM with DAC.
Dude, that's only like 10% more powerful than a 980 Ti. Admittedly it is quite cheap on eBay, I think I saw it for like $200 on average, but just like the 980 Ti, interlaced on modern drivers will be limited to the HDMI port. Modern Nvidia drivers absolutely kill the analog ports; check it out if you don't believe me, you can't even make 1280x800 75Hz render. On my 980 Ti it turned the entire screen into a yellow hue mess.

At that point I think I'd just go down the passthrough route: pay $100 more and buy whatever best used RTX GPU I can get, which will probably destroy all Maxwell GPUs in performance even with less RAM. The 2060 non-Super is already 25% faster in gaming than the Quadro M6000 and Titan X Maxwell.

I will be sticking with this setup for at least 5 more months though; I'm not working currently and uni is killing me.
 
I might be wrong about that, but do the Nvidia Pro GPUs use a completely different driver compared to the gaming GPUs? Meaning you could run both an RTX with the latest drivers, and the Quadro with the correct drivers for interlaced.

Someone mentioned that to me; I completely missed it as I never used any pro Nvidia GPUs before. I need to get my hands on a Quadro to test at some point.

That said, I don't think there is anything better than passthrough for good performance these days. The really annoying part is it adds a little bit of input lag, which is very undesirable.
 
I might be wrong with that, but does the Nvidia Pro GPUs use a completely different driver compared to the gaming GPUs?
Differences are:

- GRD and SD for GeForce RTX and GTX (745 to Titan Xp and Titan V CEO Edition)
- Production Branch/RTX Studio and New Feature Branch/RTX Enterprise for Quadro RTX/RTX xxxx
- DCH package and Standard package
- Windows versions
- Languages
 
Meaning you could run both an RTX with the latest drivers, and the Quadro with the correct drivers for interlaced.

Someone mentioned that to me; I completely missed it as I never used any pro Nvidia GPUs before. I need to get my hands on a Quadro to test at some point.
There might be slight differences in how far you can push the limits with each driver between the GeForce and Quadro DACs, I reckon. ToastyX and Chief Blur Buster might know about this.

In terms of the components used in each GPU they produce, Quadro is the absolute finest. I imagine the same applies to its DAC (it could be even better than Radeon's).
That said, I don't think there is anything better than passthrough for good performance these days. The really annoying part is it adds a little bit of input lag, which is very undesirable.
Yeah. The most convenient method is to pass the analog signal through an external adapter/converter. Even with the added input lag and latency, the FPS compensates for them.
 
I might be wrong about that, but do the Nvidia Pro GPUs use a completely different driver compared to the gaming GPUs? Meaning you could run both an RTX with the latest drivers, and the Quadro with the correct drivers for interlaced.

Someone mentioned that to me; I completely missed it as I never used any pro Nvidia GPUs before. I need to get my hands on a Quadro to test at some point.

That said, I don't think there is anything better than passthrough for good performance these days. The really annoying part is it adds a little bit of input lag, which is very undesirable.
Pascal GPUs including the 1080 Ti and Titan Xp can do interlaced on the newest drivers too, but ONLY on the HDMI port, and only limited to 400MHz bandwidth I think, even if your adapter can go further.
 
Pascal GPUs including the 1080 Ti and Titan Xp can do interlaced on the newest drivers too, but ONLY on the HDMI port, and only limited to 400MHz bandwidth I think, even if your adapter can go further.
On the latest driver?! I remember we used to be stuck on 411.70 as the latest driver that could do interlaced... They added it back later? I just read on the Nvidia forum some guy saying 537.58 was the latest driver supporting interlaced as of early 2024?
Damn, if Nvidia plays with us going back and forth with drivers supporting and not supporting interlaced, it's very annoying and quite an unreliable option.

I'm very curious though, I have a GTX 1080 I will test that.

That said, it's kinda pointless for me, as it should perform just like passthrough with an Intel iGPU, which can do a whole lot better at 503MHz over DisplayPort with better adapters.
And also, if Nvidia requires some trickery to get interlaced resolutions to display, as I remember it used to (is it still like that?), Intel iGPUs just work. You set the resolution in CRU, and once it's done, you just switch resolution in the Windows display options, that's all.

I'm gonna test and see by myself.

To me, the only good use case for this would be if you use the LK7112 adapter, as Intel would be more limited over HDMI than Nvidia here, if Nvidia really goes up to 400MHz interlaced over HDMI.
Or if you intend on using a 1080 Ti standalone, for rendering as well as display. But since interlaced allows for really high resolutions and refresh rates, like 2560x1600 at 140Hz or something like that, the 1080 Ti won't really keep up with the latest and upcoming titles.
 
On the latest driver?! I remember we used to be stuck on 411.70 as the latest driver that could do interlaced... They added it back later? I just read on the Nvidia forum some guy saying 537.58 was the latest driver supporting interlaced as of early 2024?
Damn, if Nvidia plays with us going back and forth with drivers supporting and not supporting interlaced, it's very annoying and quite an unreliable option.

I'm very curious though, I have a GTX 1080 I will test that.

That said, it's kinda pointless for me, as it should perform just like passthrough with an Intel iGPU, which can do a whole lot better at 503MHz over DisplayPort with better adapters.
And also, if Nvidia requires some trickery to get interlaced resolutions to display, as I remember it used to (is it still like that?), Intel iGPUs just work. You set the resolution in CRU, and once it's done, you just switch resolution in the Windows display options, that's all.

I'm gonna test and see by myself.

To me, the only good use case for this would be if you use the LK7112 adapter, as Intel would be more limited over HDMI than Nvidia here, if Nvidia really goes up to 400MHz interlaced over HDMI.
Or if you intend on using a 1080 Ti standalone, for rendering as well as display. But since interlaced allows for really high resolutions and refresh rates, like 2560x1600 at 140Hz or something like that, the 1080 Ti won't really keep up with the latest and upcoming titles.
I was always skeptical about whether Pascal was initially gonna feature more cards with VGA/DL-DVI-I. How they ended up with only the 1030 variants featuring an integrated DAC is just unusual.

It makes more sense to just retire signal converting with the whole Pascal line, or even better, extend it to the Turing GTXs.
 
On the latest driver?! I remember we used to be stuck on 411.70 as the latest driver that could do interlaced... They added it back later? I just read on the Nvidia forum some guy saying 537.58 was the latest driver supporting interlaced as of early 2024?
Damn, if Nvidia plays with us going back and forth with drivers supporting and not supporting interlaced, it's very annoying and quite an unreliable option.

I'm very curious though, I have a GTX 1080 I will test that.

That said, it's kinda pointless for me, as it should perform just like passthrough with an Intel iGPU, which can do a whole lot better at 503MHz over DisplayPort with better adapters.
And also, if Nvidia requires some trickery to get interlaced resolutions to display, as I remember it used to (is it still like that?), Intel iGPUs just work. You set the resolution in CRU, and once it's done, you just switch resolution in the Windows display options, that's all.

I'm gonna test and see by myself.

To me, the only good use case for this would be if you use the LK7112 adapter, as Intel would be more limited over HDMI than Nvidia here, if Nvidia really goes up to 400MHz interlaced over HDMI.
Or if you intend on using a 1080 Ti standalone, for rendering as well as display. But since interlaced allows for really high resolutions and refresh rates, like 2560x1600 at 140Hz or something like that, the 1080 Ti won't really keep up with the latest and upcoming titles.
yeah, I'm Argentinian, and another Argentinian user from reddit, "druidvorse", bought the LK7112 adapter from that Turkish guy on eBay to use with his GTX 1060. That is his only GPU, and his only monitor is the same as mine (Samsung SyncMaster 997MB). I asked him 8 days ago in a private message and he confirmed: he can still run 1920x1200i 144Hz out of his 1060 on the HDMI port with the 7112 adapter on the latest Nvidia drivers. The 400MHz I pulled out of my ass; it's just my guess, because that is traditionally what they limited us to on their own RAMDACs, but I think it's the 7112's limit as well anyways. I think the best use case for this would be a 1080 Ti with the LK7112 to run something crazy like 1680x1050i 165Hz or 1920x1200i 165Hz. The 1080 Ti can still pull that type of performance in some decent games from before 2022, but most importantly, you don't get the added input lag of passthrough.

btw, since I haven't ever done passthrough, I wanna know: is the input lag difference actually noticeable? Do you genuinely feel it when you go from native to passed-through??? be honest wit meee :DDD
 
That is his only GPU, and his only monitor is the same as mine (Samsung SyncMaster 997MB). I asked him 8 days ago in a private message and he confirmed: he can still run 1920x1200i 144Hz out of his 1060 on the HDMI port with the 7112 adapter on the latest Nvidia drivers
Well, that's really interesting stuff then! Tomorrow I will fire up a PC I was gonna put up for sale (GTX 1080 & i7-6700K) and test that before listing it. I plan on getting an EVGA 1080 Ti Kingpin if I ever find one for a good price one day; it would be really nice to know it is the last GPU that can properly run interlaced natively with the LK7112 adapter, all the way up to the adapter's limit, and on recent drivers too!

The 400MHz I pulled out of my ass; it's just my guess, because that is traditionally what they limited us to on their own RAMDACs, but I think it's the 7112's limit as well anyways
I don't remember properly testing the limits of this adapter on interlaced, so I'll finally be able to do that. I will report here when I'm done testing.

btw, since I haven't ever done passthrough, I wanna know: is the input lag difference actually noticeable? Do you genuinely feel it when you go from native to passed-through??? be honest wit meee :DDD
So, I didn't play a whole lot, but I can tell you how it felt for me.

When I was testing interlaced passthrough for the first time on the Intel UHD iGPU (i5 13600K & RTX 4060), I started by playing a few games of Call of Duty MW3 Multiplayer, and I didn't really notice the added input lag at first. Everything felt really smooth, like it should at 140Hz.
Then I went to play some WoW Classic, and there, after just a few minutes of flying over the zones, I really started to feel that the movements were not quite right. It surprised me a bit, because that's something I definitely would have expected to feel in an FPS, and now I was noticing it in WoW o_O
So I switched back to direct output from the RTX (progressive, same resolution, lower refresh rate), and yeah the difference here felt really obvious immediately. Went back to passthrough, and for sure, there is noticeable input lag... unfortunately.

So overall, the added input lag is quite minimal I would say, it did not prevent me from winning in Call of Duty :ROFLMAO:, but in some more competitive games I guess it can become a real problem. I'm more of a casual player, but still it annoyed me for sure.

With a good adapter, on direct output, you can get close to maxing out the CRT on progressive. So going through all that trouble pretty much just for interlaced, getting higher refresh rates at potentially higher resolutions at the cost of a slight but noticeable added input lag... I'm not sure here. Not quite convinced by that setup for now.

That said, I will test some more. The difference here between WoW Classic and MW3 is, I'm running a 4060 and MW3 was running around 130-140fps overall, not really over that. On the other hand, WoW had the potential to go a whole lot higher in fps.
I'm trying to remember if I did cap the fps or not, which could make a noticeable difference in input lag, as I noticed when testing the PCIe GPUs. The iGPU was much more forgiving with that, but still, I need to retest tomorrow and enable VSYNC or just enable the fps limiter in game.
Once again that would be sort of a compromise, but if it lowers the input lag it would be great!

All that is why I'm now thinking of going all out and trying to design my own DisplayPort to DisplayPort adapter (DP1.4) that takes progressive on its input and turns it into interlaced on its output, followed by the 540MHz Sunix DPU3000 for VGA out.
I've always been curious about FPGAs, how to work with them and all, so it's gonna be the occasion for me to take a closer look and see if I can pull this off or not. Very curious about this...
Hopefully that should be as lag-free as it gets, and all Windows will see is a progressive resolution directly from a single GPU, so no weird compatibility issues or anything.
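For what it's worth, the core transform such an adapter has to do, splitting each progressive frame into two fields, is conceptually tiny. Here's a minimal Python sketch of just the field extraction (an illustration of the concept, nothing close to an actual FPGA design):

```python
def to_fields(frame):
    """Split a progressive frame (a list of scanlines) into the two
    interlaced fields: even lines first, then odd lines."""
    even_field = frame[0::2]   # lines 0, 2, 4, ... (top field)
    odd_field = frame[1::2]    # lines 1, 3, 5, ... (bottom field)
    return even_field, odd_field

# A 1200-line progressive frame becomes two 600-line fields, which is
# why each interlaced field needs roughly half the lines (and scan time)
# of a full progressive frame at the same resolution.
frame = [f"line {i}" for i in range(1200)]
even, odd = to_fields(frame)
print(len(even), len(odd))  # 600 600
```

The real work in hardware is of course all in the buffering and retiming; this only shows why each field carries half the lines.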
 
Well, that's really interesting stuff then! Tomorrow I will fire up a PC I was gonna put for sale (GTX 1080 & i7 6700K), and test that before listing it. I plan on getting a EVGA 1080 Ti Kingpin if I ever find one for a good price one day, it would be really nice to know it is the last GPU that can run interlaced properly natively with the LK7112 adapter all the way up to the adapter limit and on recent drivers too!


Hey, could you do me a solid if you try the 1080 Ti interlacing on modern drivers? Could you please try Cyberpunk 2077 at 1080i??? And snap a couple photos at least (if you can shoot a vid, even better!). If you can't, it's all good lol
 
I'm honestly still baffled by how well Intel UHD handles interlaced, this is insane. It's wayyyy better than Nvidia drivers or anything on Linux: I can turn off the computer and it's all fine, I can switch between different resolutions, I don't have to worry if a game changes the resolution, and the transition between progressive and interlaced is super smooth.

I wish there was a way to talk to intel about this, there is gold there.
 
he can still run 1920x1200i 144hz out of his 1060 on the hdmi port with the 7112 adapter with the latest nvidia drivers
Sorry, lots of unexpected stuff, my testing was delayed. I just had some time and did a quick test with the GTX 1080, on drivers 552.22. While I was able to hit the 430-ish MHz with my LK7112 no problem on progressive, I just can't get any interlaced resolution to show up anywhere.

I tried with the Nvidia Control Panel at first: not working, it just rejects interlaced resolutions, as I remembered happens with any recent driver.

Then I followed this below. Thank you a lot @Petrasescu_Lucian for that tutorial, it was super helpful for getting CRU resolutions to work correctly with Nvidia drivers!
I've assembled a small tutorial of how to get the most of your FW900 with the RTX 4000 series with the latest drivers on Windows 10 and I've attached a zip file with all the necessary files.

1. Extract the latest geforce driver and edit nv_dispig.inf like this in order to see ONLY the resolutions you set into CRU:

[nv_commonDisplayModes_addreg]
HKR,, NV_Modes, %REG_MULTI_SZ%, "{*}S 3840x2160x8,16,32,64=FFFF"

2. Install the driver via its Setup (you may need to disable the windows driver signature enforcement via F7 after an advanced restart in Win10 prior to this)

3. Run Nvidia Pixel Clock Patcher

4. Install the attached FW900 inf driver file via Device Manager -> Monitors.

This driver has been created with CRU and contains the vital HDMI datablocks required for the converter to use all of its bandwidth (feel free to load it up in CRU via Import to check it out). You'll see there that the highest pixel clock timing is 2560x1600 73Hz which is ~430MHz. The best I could do with the GTX 980Ti at that resolution was 68Hz (400MHz) so I get an awesome 5Hz upgrade at the resolution I keep my desktop at all times.

5. Restart your PC. You should now see ONLY the resolutions set-up in CRU.

6. Use RefreshLock to lock your resolutions to the highest refresh rate (very useful in games). You can do it globally or per resolution.

7. All current geforce drivers are DCH so you'll notice the Control Panel is missing and you get an annoying Windows Store notification. Copy the Control Panel Client wherever you want and execute the included registry file.

8. Enjoy!
I added the interlaced resolutions to the CRU driver inf file, and every resolution showed up... once again, except the interlaced ones.

So no luck at all here. Do you have any idea how he managed to get interlaced working with recent Nvidia drivers?

When I find more time I will install older drivers and test interlaced correctly over HDMI with this adapter.

On another subject, I was able to quickly test the following adapters:
  • Delock 64002
  • Unitek V1126A
  • Vention TFAHB
I don't know if anyone did test those here before. I will get back on that soon, but it's nothing interesting.

Here is the inside of the Vention TFAHB with the Lontium LT8712X, but it has a USB 3.0 port, so from the start it was probably not gonna be a good performer anyway, sadly.
The USB C Power Delivery injector is a nice thing though, but I'm only interested if the VGA performance is good...

DSC00758.JPG
DSC00759.JPG
DSC00760.JPG
 
Interlaced works fine on driver version 537.58, tried it with the CRU technique. It's a fairly recent driver, dated Oct. 10, 2023, so just last year, but definitely not the latest. Some recent games might not like an older driver like this one.

Tried 2560x1600i at 140Hz, which is 429.95MHz. It displayed fine, but I was starting to see some noise looking closely at the screen.
My LK7112 adapter does not have a custom heatsink yet, just the tiny one from the eBay seller, so after just a few seconds it started cutting out. So no more testing for now, I'll continue tomorrow.

I will get some of those heatsinks from Mouser, I have an order to place soon anyway, but tomorrow I'll try to find something to help dissipate the heat for further testing.

EDIT: Quickly tried it again at 3200x2000... and it crashed the driver. Same thing at 3840x2400 of course. Oh well, that damn Nvidia driver that never allowed going past 2560 pixels wide on interlaced. I forgot about that!
Well... so 2560x1600 at around 140, maybe 144Hz, looks like the best I can do here so far.

EDIT2: 2800x1750 interlaced at 118Hz works. The limit is not 2560 then, it's a bit higher.
Having to reboot into the mode that ignores driver signatures, apply the CRU inf file, and reboot again is not the most efficient way to test resolutions, but well, it gets the job done lol.

EDIT3: 3000x1920 interlaced at 100Hz works as well! Considering 3200 does not, the limit is either 3000 or very close. I will find the exact limit later.
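Since every attempt costs a reboot, a binary search between the known-good and known-bad widths keeps the number of trials down to about eight. A quick sketch, where `mode_works` is a hypothetical stand-in for the manual set-the-mode-and-look step:

```python
def find_h_limit(lo, hi, mode_works):
    """Binary-search the highest working horizontal resolution.
    lo is a known-good width, hi the highest candidate to consider;
    mode_works(w) reports whether width w displays correctly."""
    while lo < hi:
        mid = (lo + hi + 1) // 2
        if mode_works(mid):
            lo = mid          # mid works: the limit is at least mid
        else:
            hi = mid - 1      # mid fails: the limit is below mid
    return lo

# Pretend the true limit is 3065 (what the follow-up testing found);
# starting from good 3000 / bad 3200, ~8 trials pin it down exactly.
print(find_h_limit(3000, 3199, lambda w: w <= 3065))  # 3065
```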
 
I continued the testing today, and it seems 3065 is the maximum horizontal size we can use with interlaced here on the GTX 1080 (running driver 537.58).
This is what I get at 3066... I reverted very quickly; things were not happy at all, judging by the whine it made showing this garbage with completely messed-up sync.

DSC00771_resized.JPG


At 3065 it works flawlessly. So I don't know what weird stuff is going on with interlaced resolutions in the drivers but yeah.

So if we want to stick to standard aspect ratios, these are the highest interlaced resolutions we can get:
  • At 16:9 we have 3040x1710
  • At 16:10 we get 3056x1910 or 3040x1900
Otherwise, 3065x1962 comes close to the same aspect ratio as 1600x1024, which is the aspect ratio of the FW900 tube if I'm not mistaken?
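Those "highest standard aspect ratio" modes can be found mechanically. A small sketch, assuming widths in steps of 8 pixels and requiring an even line count (friendlier for interlacing):

```python
def max_mode(h_limit, ar_w, ar_h, w_step=8):
    """Highest WxH at aspect ratio ar_w:ar_h with W <= h_limit,
    W a multiple of w_step, and H even (interlace-friendly)."""
    for w in range(h_limit - h_limit % w_step, 0, -w_step):
        if (w * ar_h) % ar_w == 0:       # height must be an integer
            h = w * ar_h // ar_w
            if h % 2 == 0:               # and an even line count
                return w, h
    return None

# Under the 3065-pixel horizontal limit found on the GTX 1080:
print(max_mode(3065, 16, 9))   # (3040, 1710)
print(max_mode(3065, 16, 10))  # (3056, 1910)
```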

So for all that testing I ended up strapping the LK7112 converter to an old Pentium 4 heatsink I had around. It's a huge heatsink, the IC does not even get warm anymore so that's great!

DSC00768_resized.JPG


So with that I was able to finally test the pixel clock limit of the adapter, which turns out to be 442.52MHz on mine. I tested at 3040x1900 interlaced just below 105Hz.
Going to 442.53MHz I can see some quick artifact lines showing up from time to time, but the display remains good.
Going past that, at 442.54MHz it starts cutting off the bottom of the display more and more, with about a third cut at that frequency, and more than half of the screen cut at 442.55MHz.

DSC00775_resized.JPG


I didn't test long term yet of course, but 440-442MHz looks good. That's definitely very nice, though still about 100MHz shy of the Sunix DPU3000.
That's a big difference, but that said, when I use interlaced for gaming I want a resolution high enough that the alternating lines blend together and stop being distracting, yet not so high that I give up refresh rate.
So that turns out to be around 2560x1600 on the FW900 for me. At that resolution, the maximum refresh rate I can run interlaced on the FW900 reaching the 121kHz horizontal limit is 140Hz at default GTF timings, and 144Hz no problem after tweaking the timings a bit.
Running that resolution at that refresh rate also turns out to be in the 400-440MHz range for the LK7112 adapter! So I'm basically maxing out everything here, and it all works very well!
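The arithmetic behind those figures is simple: horizontal frequency is lines per field times field rate, and pixel clock is horizontal total times horizontal frequency. The totals below are illustrative GTF-style values I picked to land near the numbers in this post, not values read from a driver:

```python
def interlaced_timings(h_total, lines_per_field, field_rate_hz):
    """Horizontal frequency (kHz) and pixel clock (MHz) for an
    interlaced mode, given total (active + blanking) sizes."""
    h_freq_hz = lines_per_field * field_rate_hz
    pclk_hz = h_total * h_freq_hz
    return h_freq_hz / 1e3, pclk_hz / 1e6

# Roughly 2560x1600i at a 140Hz field rate with GTF-like blanking
# (h_total and lines_per_field are assumed, illustrative totals):
h_khz, pclk_mhz = interlaced_timings(h_total=3552, lines_per_field=865,
                                     field_rate_hz=140)
print(round(h_khz, 1), round(pclk_mhz, 1))  # 121.1 430.1
```

Swap in the totals CRU shows for your own mode to sanity-check it against the monitor's horizontal limit and the adapter's pixel clock limit before trying it.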

Now that 3065 horizontal limit is still very annoying, because I really appreciate the flexibility of being able to reach 4K on interlaced, and in fact going to 16:9 aspect ratio, it's possible to reach 3840x2160 at 72Hz (perfect for movies) with the LK7112 as it gets to 423.06MHz on GTF timings. But the driver won't let me... wtf.

But anyway, I might very well build a small GTX 1080 Ti Mini SFF PC for fun at some point hehe. Probably paired with a Intel 12/13/14th gen or something for the CPU, so I can use the UHD 770 iGPU for fun at 4K with 503.23MHz max pixel clock and direct output from the GTX at 2560x1600 for gaming, with both connected at the same time on inputs 1 and 2 of the CRT.

EDIT: Oh damn, does the max pixel clock of the adapter depend on the resolution?! I was testing 2560x1440i at 144Hz; with custom timings I was at 418.84MHz, and there is only a small band at the top of the screen, everything else is cut! But it was looking perfectly fine at 442MHz at 3040x1900 just before...

EDIT2: That adapter just hates 144Hz at that resolution for some reason. I went below 400MHz, and it was still going back and forth between the full and cut picture. I switched to 140Hz at a 429MHz pixel clock, and it works flawlessly... Really weird behavior.
 
Another thing I was able to retest: HDMI to DisplayPort adapters just don't work at all with interlaced resolutions, at least none of the few I tested did.
  • Akasa AK-CBHD24-25BK
  • Delock 63206 (Lontium LT6711A)
  • Gofanco HDMIUSBC
  • Startech HD2DPVGADVI (works fine with interlaced on the VGA port, but not on the DisplayPort output)
The other way around, DisplayPort to HDMI, things just get weird. Some resolutions work, while others don't show up in Windows at all, at least with Intel; with some adapters more resolutions show up... Overall interlaced works, but it's not a good solution, at least in my experience.

Regarding DisplayPort to Type C, the good old UPD2018 works very nicely. In fact it does not need to be plugged into an actual internal PCIe slot in a PC: I just used a PCIe riser to power the card externally, hooked up DisplayPort to it, and it works just fine on the Type C output.
Since my PC is a Mini ITX, I don't have a spare PCIe slot for that card, so I'm glad that solution worked.

DSC00762_resized.JPG


I tested that with a few Type C adapters, and I also tested my Type C dock Icy Box IB-DK4050-CPD, with the DPU3000 on the dock DisplayPort output. I was able to reach 3840x2400 at 80Hz interlaced no problem, through the UPD2018.

I was also shocked to find out my LattePanda 3 Delta 864 has the same 503.23MHz bandwidth limit as my 13600K iGPU on its Type C port (through the same Icy Box dock, then the DPU3000)!
I did test that very thing twice in the past, and never reached that, I must have done something wrong before... Right now it's on driver 31.0.101.1999.
It has an Intel Celeron N5095 CPU with "Intel UHD Graphics" (Gen 11 Intel graphics, not the same as the UHD 700 series, but the generation just before). So that one works well too!
Not the most useful thing of course, but neat nonetheless. If I need to test an adapter or a random CRT monitor, it's the only thing I have that does interlaced with native Type C output and is very quick to set up, so it's practical.

I'm gonna have to retest my 9700K a bit later with its UHD Graphics 630 and see about that one; I remember it was limited, but I might have been wrong (not the correct driver?). That one is Gen 9.5 Intel graphics if I'm not mistaken, one more generation back.

EDIT: Alright, I didn't know about DisplayPort Dual Mode (DP++), which basically allows DisplayPort to output an HDMI-compatible signal directly.
This is probably the reason why when using DisplayPort to HDMI adapters, and for example the LK7112 over DisplayPort, I see more or less the same limitations as native HDMI with the Intel drivers.
By that I mean, running the Sunix DPU3000 over DP, I can run interlaced no problem at high bandwidth, but when running the LK7112 over DP via an adapter, it suddenly becomes a problem, and most interlaced resolutions don't show up in Windows anymore.
Looks like the adapter is not doing an active conversion from DP to HDMI; the GPU itself is asked to output an HDMI signal through the DP out... Well.

It would also explain why the other way around, converting HDMI to DisplayPort, interlaced resolutions are not supported at all by any adapter I tested.
In this direction, an actual conversion is done by the adapter, since HDMI does not have such a dual mode like DP does.
And I'm sure manufacturers wouldn't go through the trouble of supporting something like interlaced when designing an adapter.
 
On another subject, I was able to quickly test the following adapters:
  • Delock 64002
  • Unitek V1126A
  • Vention TFAHB
I don't know if anyone did test those here before. I will get back on that soon, but it's nothing interesting.
Thanks for all these tests, a lot of good information.
I'm also interested in those adapters, the Algotek AG9300 and the two Lontium chipsets.
 
Haha, it is possible, but I for sure would not want to run my FW900 at 133kHz for long. Definitely don't want to take those risks.

That said, this is only for progressive, but at 4K you can run interlaced, and there is absolutely no visual difference whatsoever compared to progressive at such a high resolution. It looks 100% like progressive to my eyes, even looking very close.
So running interlaced, it goes up to 80Hz at 3840x2400 after adjusting the timings a little, and it does 72Hz no problem on stock GTF timings.

At 72Hz you get 90.108kHz / 471.45MHz and at 80Hz you are around 100kHz at 500MHz or so. So the display is not the limiting factor here, it's the Intel iGPU I use to run interlaced that is limited to 503.23MHz bandwidth.
Otherwise the FW900 could run 96Hz at that resolution on stock GTF timings (121.776kHz, 644.93MHz). That would be mindblowing.
The R9 380X has the RAMDAC to reach that high... but interlaced is way too limited sadly. Always a compromise somewhere... if only a ToastyX patch could fix that, it would be the absolute best solution out there. (y)
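As a sanity check on those numbers: pixel clock divided by horizontal frequency recovers the horizontal total, and horizontal frequency divided by the field rate gives lines per field, which comes out as a half-line value, the classic signature of interlaced GTF timings:

```python
# Quoted figures for 3840x2400i at 72Hz: 90.108kHz / 471.45MHz
pclk_hz = 471.45e6
h_freq_hz = 90.108e3
field_rate_hz = 72

h_total = pclk_hz / h_freq_hz              # total pixels per scanline
lines_per_field = h_freq_hz / field_rate_hz

print(round(h_total))    # 5232
print(lines_per_field)   # 1251.5  (the interlaced half line)
```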
 
I think I remember there's a mode with something like 129kHz horizontal frequency in WinDAS, so that was probably possible at some point during product development, but that mode doesn't work on my screens. There were probably good reasons to lock it out if they did.
 
If only someone figured out how to increase the FW900's horizontal scan limit to 133kHz, then we could have 4K UHD on the GDM-FW900 and, as a species, achieve display singularity...

But I guess it's possible at a slightly lower refresh rate, like 54-55Hz...

You'd still be limited by the electron optics of the tube, which are not powerful enough to properly resolve such a high resolution.
 
Guys,

Do you know if there is a possibility to alter the monitor's serial number via WinDAS, or otherwise? It is not stored in the .DAT file that WinDAS downloads from the EEPROM, and I don't know where else to search.

For example, when you have two monitors (one with a good casing but defective, one with perfect electronics but a broken chassis) and want to combine them into one pristine unit, just so the displayed SN matches the casing SN.
 