
24" Widescreen CRT (FW900) From Ebay arrived,Comments.

You should probably read the words on the adapter.....
But just for the record, the box said the opposite; that's why I didn't check the device, I was too anxious trying to get this thing to work.

 
OK brothers, I gave up and just bought the goddamn GTX 1080 Ti. Now I just gotta figure out how to get interlacing working with modern drivers.
 
For a CRT without external adapters the 1080 Ti is not that good of a choice, as it lacks analog output; you might as well have gotten something newer.

As for interlacing, it has been unsupported on GPUs for a long, long time. The newest system I was able to make interlacing work on was the iGPU of a 9900K, as Intel did support interlaced modes on its iGPUs. I'm not sure about their discrete Arc GPUs.
That said, I personally don't really see the point of interlacing and I don't miss it one bit. In the past it was always an option, but I never used it.
I would much prefer the ability to define custom scalers, e.g. to have 2560x1440 exposed to the game but displayed as 2560x720, with lines blended to provide a form of supersampling anti-aliasing. This is possible on Linux by tinkering with GameScope, but on Windows not so much.
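For anyone who wants to try that GameScope route: below is a minimal, untested sketch of launching a game so it renders at 2560x1440 while 2560x720 is what actually goes out to the display. The flag meanings (-w/-h for the game's resolution, -W/-H for the output resolution) are my reading of gamescope's CLI, and "./mygame" is just a placeholder launch command - check gamescope --help on your system.

Code:
# Rough, untested sketch: run a game under GameScope so it renders at
# 2560x1440 internally while the mode actually scanned out is 2560x720.
# Flag meanings (-w/-h = game resolution, -W/-H = output resolution) are
# assumptions; "./mygame" is a placeholder for your real launch command.
import subprocess

cmd = [
    "gamescope",
    "-w", "2560", "-h", "1440",   # resolution exposed to the game
    "-W", "2560", "-H", "720",    # resolution sent to the display
    "--",
    "./mygame",                   # hypothetical game binary
]
subprocess.run(cmd, check=True)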
 
Interlacing is beautiful and awesome.

The 1080 Ti was the last GPU to support interlacing hardware-wise.

simple as
 
The Radeon Vega and VII GPUs probably support interlaced output since they're GCN architecture. I think they came out after the 1080 Ti.
The Vega VII could be interesting, but it's hard to get in my country.

The 56 and 64 are moot because they get pummeled by the 1080 Ti, and I doubt they're any easier to make run interlaced on Windows.
 
I have a Radeon VII, but I'm on Windows 7. If you let me know how I can help check, I'll try to get it figured out.
 
Remember that if it works, it can only work on the HDMI port; if you are using DP to VGA on your VII, it can't possibly ever work AFAIK.
 
Hello, are there any good USB Type-C to VGA adapters that I can use to connect a nice CRT like the FW900 to a GPD Win Mini handheld PC? Sorry if this has been answered already; I tried skimming through the last few dozen pages and didn't find what I need.

P.S. I see a lot of Type-C X-in-1 docks feature a VGA port among other things. It would be nice if one of those were capable enough, but I doubt it.
 
I've used the Plugable adapter successfully:

https://hardforum.com/threads/24-wi...ived-comments.952788/page-396#post-1043620545
 
Interlacing is beautiful and awesome.
I don't see it that way at all.
Combing artifacts and reduced motion resolution are not my cup of tea.

Still, I cannot imagine there is any benefit to dropping support for interlacing; it is just an issue for the few people who like to use it.

That said, one option for not being limited to the 1080 Ti (which is a very low-end card these days) might be Linux with a modern AMD (or Intel) card.
I'm not sure interlacing is supported out of the box, but at least the drivers are open source, so chances are it could be re-added - unless it is something in the binary blob and/or depends on some hardware feature of the display controller. I did not try it back when I had an AMD card and used Linux, and I could not find information about it. I guess the set of Linux users who also use CRT monitors and also use interlacing might be too small.

More interesting to me was the info that it is apparently possible to use non-square DSR factors on Nvidia cards, so it should be possible to have something like a 2560x1440 160Hz resolution displayed as 2560x720 - with the reduction in vertical resolution acting like anti-aliasing.
It would perhaps be much more useful if I had a much better DAC and got to the point where vertical lines blur together naturally, but then again at that point the horizontal resolution would also suffer severe blurring and hit shadow mask/aperture grille resolution limits, so it would not be that useful. Still, 3840x2400 scaled down to 3840x1200 would look better than scaled down to 1920x1200.
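Not the Nvidia DSR mechanism described above, just for comparison: on Linux/X11 the same non-square downscale can be approximated with xrandr's --scale-from. A minimal sketch, assuming the CRT hangs off an output named "DVI-I-1" (hypothetical) and that a 2560x720 mode already exists on it:

Code:
# Sketch only: render the desktop at 2560x1440 but send a 2560x720 mode to the
# monitor, letting X11 do the vertical downscale. Output name and mode are
# assumptions - adjust to whatever `xrandr` reports on your machine.
import subprocess

subprocess.run([
    "xrandr",
    "--output", "DVI-I-1",        # hypothetical output driving the CRT
    "--mode", "2560x720",         # mode actually sent to the monitor
    "--scale-from", "2560x1440",  # desktop resolution before the downscale
], check=True)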
 
Ehhh, the artifacts are only visible at lower resolutions and refresh rates; once you get to 1920x1440i 120Hz they're not noticeable anymore. I have to go down to 1440x1080i 60Hz to see them in Cyberpunk.

ChatGPT gave me the same answer regarding Linux; it said the RTX 3090 will most likely do interlaced, and I read that AMD GPUs with the amdgpu driver should be able to do interlaced. I have not seen an ounce of proof yet, and I'm not about to buy a modern GPU unless I have full confirmation it actually works.
 
It has more to do with how much the lines overlap - if you can see gaps between scanlines at a given number of displayed lines, then an interlaced resolution won't look so hot. Otherwise... still not great, but combing artifacts and such won't be that visible. Kinda like on old-school 480i/576i TVs - you cannot really see gaps between the lines, but interlaced resolutions still look clearer.

For the FW900, which is my clearest CRT, 1280x720 shows massive gaps between lines, so 2560x1440i isn't ideal. It does run 160Hz at 720 lines, so at least flickering isn't bad, at 80Hz per individual scanline.
I would find interlacing more interesting for reaching much higher resolutions, like 2400i or above. Unfortunately DACs don't have enough bandwidth to really take advantage of such resolutions - especially at wide-screen aspect ratios.

And if I could run such resolutions, then other solutions like non-square DSR ratios would also be very good.
This is something I am personally more interested in and will probably test this weekend.
I mean something like having a 2560x1440 desktop resolution but 2560x720 going to the display - in this case getting 2x the horizontal resolution compared to a 2560x1440 -> 1280x720 downscale.
Personally I like scanlines like that and I like SSAA/downscaling. What I never liked is downscaling the horizontal resolution instead of displaying it as-is, which was never needed for SSAA/DSR. Recently I read somewhere that it is possible to achieve such non-square DSR, so I will test it.

ChatGPT gave me the same answer regarding Linux; it said the RTX 3090 will most likely do interlaced, and I read that AMD GPUs with the amdgpu driver should be able to do interlaced. I have not seen an ounce of proof yet, and I'm not about to buy a modern GPU unless I have full confirmation it actually works.
Hmmm, DeepSeek-R1 said there is little chance of interlacing on RTX, since even the Nouveau drivers don't have access to certain video controller registers. It didn't dismiss the possibility that it could work - it just had no real data, despite using the search function.
AMD and Intel GPUs were stated to be the best bet.

I also asked Qwen2.5-Max (my favorite Chinese AI model 😅) and it was more helpful. It said the amdgpu driver doesn't support interlacing because of this issue: https://gitlab.freedesktop.org/drm/amd/-/issues/1636
In other words, there was a bug which no one wanted to fix, so they disabled the whole feature - which is something I have experienced in the past; I even reverted one X11 blunder-solution for a feature deemed not popular enough to fix. Still much more popular than interlacing... on Linux :)

Anyway, the helpful AI assistant also said Intel doesn't support interlacing, and that Nvidia only has limited support on professional cards and with certain old stereo glasses which use interlacing and which might not be supported anymore.
I also asked about the Nouveau driver (the open-source driver for Nvidia cards) and there is no data. Qwen also said that driver is not really ready and is not recommended.

tl;dr - it looks like there are no real chances of getting interlacing that way either 😪
Unless someone sits down with the amdgpu driver - then it might be possible. "Might be" doesn't mean it is, because part of the display controller core might be in a binary blob and the issue unfixable. Usually it is a skill & effort related issue though.
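For anyone who does want to poke at this on Linux: below is a rough, untested sketch of what adding an interlaced mode would look like via xrandr, plus the back-of-the-envelope pixel clock math. All timing numbers are illustrative placeholders (not validated for any monitor), the output name is hypothetical, and per the amdgpu issue linked above the driver would currently just reject the mode.

Code:
# Sketch only: approximate pixel clock for a 1920x1440i mode at a 120Hz field
# rate, and the xrandr calls that would add it *if* the driver accepted
# interlaced modes. Blanking/timing values below are rough placeholders.
import subprocess

h_active, v_active = 1920, 1440        # full interlaced frame
field_rate_hz = 120                    # two fields per frame -> 60 full frames/s
h_total, v_total = 2560, 1512          # assumed ~25% / ~5% blanking

pixel_clock_mhz = h_total * v_total * (field_rate_hz / 2) / 1e6
print(f"approx pixel clock: {pixel_clock_mhz:.1f} MHz")   # ~232 MHz

mode_name = "1920x1440i_120"
modeline = [f"{pixel_clock_mhz:.2f}",
            "1920", "2048", "2256", "2560",   # horizontal timings (placeholders)
            "1440", "1443", "1448", "1512",   # vertical timings (placeholders)
            "Interlace", "-HSync", "+VSync"]
output = "DVI-I-1"                      # hypothetical output name

subprocess.run(["xrandr", "--newmode", mode_name, *modeline], check=True)
subprocess.run(["xrandr", "--addmode", output, mode_name], check=True)
subprocess.run(["xrandr", "--output", output, "--mode", mode_name], check=True)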
 
I also asked Qwen2.5-Max (my favorite Chinese AI model 😅) and it was more helpful. It said the amdgpu driver doesn't support interlacing because of this issue: https://gitlab.freedesktop.org/drm/amd/-/issues/1636
In other words, there was a bug which no one wanted to fix, so they disabled the whole feature - which is something I have experienced in the past; I even reverted one X11 blunder-solution for a feature deemed not popular enough to fix. Still much more popular than interlacing... on Linux :)

Interesting, now this is a good use for AI: trawling the internet to find the exact moment or the exact reason a thing was broken.

Tracing that issue back, Harry Wentland mentions in this post that his team figured nobody needed interlaced anymore, so they removed it, or didn't bother fixing it, at least within the confines of the Linux stuff they're talking about. So if anybody wanted more insight, maybe he's the guy to ask about it. He did say it would be easy to put back in (seven years ago, anyway).

Now I'd like to hear from the guy at AMD who had to help Sony put 1080i support back in for the PS5 years later.
 
Intel does support interlacing on the newest drivers, but only on their iGPUs; their discrete Battlemage and Alchemist cards killed the feature through drivers. The Alchemist GPUs with day-one drivers will do interlacing fairly well over the DP out, but sadly those day-one drivers are barely usable at all.
 
Something wild just happened with my Dell P992 Trinitron.

I've been putting off doing a WinDAS adjustment, just using the expert color mode to tweak R/G/B gain and bias individually. Without adjustment it is very red-tinted, especially near black.

Adjusting via the user menu worked well enough so far, but randomly today the screen went completely green with retrace lines. And I mean completely green - the video signal from my PC was not visible at all.

It looked almost exactly like this:
[image: glitch 1.jpg]

Except the retrace lines actually started thick and got thinner, or maybe it was vice versa. This lasted maybe 5 seconds, then it faded back to my desktop.

After this, my contrast and gamma were out of whack. I had to readjust the colors, except now I have to have contrast at 100 for the balance to be correct. It was at 83 before.

Hopefully running WinDAS will solve the underlying issue. I hope some component isn't close to failing.
 
Well, being intermittent, the problem must come from a faulty component or a bad solder joint. I doubt it is only a matter of settings, unfortunately. Green + visible retrace lines is the usual result of a very high G2, except G2 isn't supposed to vary wildly over a short period.
 
I was just thinking about how I used to be against using 3D LUTs to color-correct Trinitrons after going for an extra-deep gamma curve in WinDAS for super deep blacks, because those 3D LUTs wouldn't always be applied correctly to content, especially exclusive fullscreen games.

But now, with DX12 and the Windows DWM, I think 3D LUTs probably work with almost everything. Even if you have an older game that does wacky stuff with exclusive fullscreen, there are multiple ways to get it to run in a Vulkan or some other type of wrapper that will most likely let you apply those 3D LUTs.

If anybody has recently experimented with this I'd love to hear your experience.

Of course it's still not exactly a viable solution for me because I like to hook up consoles directly to my monitor. Then maybe I can tweak gamma via an external device in those scenarios.
 
What do you mean by 3D LUTs?
There is the normal LUT, aka the gamma correction LUT (i.e. what level is actually output for each RGB channel for a given input - which is more like 3x 1D LUT and not a 3D LUT), and it would indeed not always work in exclusive fullscreen games - an especially prevalent issue back when the FW900 was absolutely the best gaming monitor money could buy, with nothing coming even close.

A few pages ago, in the post https://hardforum.com/threads/24-wi...-ebay-arrived-comments.952788/post-1045974602, I posted the names of programs for AMD and Nvidia GPUs which can enable gamut correction - which is what a 3D LUT is mostly used for, read: gamut correction. There is some functionality for gamma too - though how useful it is in this case I don't know. From what I did test, it could potentially be used to sidestep the issue with normal LUTs. With the AMD software I figured out I can save some values, change and reload them, and it acts like gamma correction - it would affect gamut correction, but without gamut correction it should be a 1:1 gamma correction. The Nvidia software I only use to enable dithering (6-bit on OLED in HDR for plasma-like dithered looks 🤩). It has an option to load an ICC file and some additional gamma-related options for it. Imho worth checking.

And if you are after a true 3D LUT then I guess the only option is ReShade and a 3D LUT shader. I would not use it though, because I don't think a 3D LUT would offer better image quality than the normal "free" gamut + gamma correction.

---
BTW, from my limited testing, by using LUTs you can indeed improve the black level, but only really for a fully black screen. For actual images the black level rises anyway. Maybe there is some option in WinDAS to affect how much of this automatic compensation happens, which with LUTs could be used to improve ANSI contrast... is something like that there?
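On the "true 3D LUT" point: for anyone who wants to experiment, the snippet below is a minimal sketch of writing an identity .cube 3D LUT with one channel deliberately skewed, which you could then feed to whatever 3D-LUT-capable tool you use. The file name, the 33-point size and the green-channel tweak are arbitrary illustrative choices, and note that some tools (like the ReShade LUT shaders mentioned above) typically expect a texture strip rather than a .cube, so a conversion step may be needed.

Code:
# Minimal sketch: write a 33^3 identity .cube 3D LUT with the green channel
# pulled down ~30%, handy as an obvious "is my LUT actually applied?" test.
# File name and skew factor are arbitrary illustrative choices.
SIZE = 33

with open("test_green_down.cube", "w") as f:
    f.write('TITLE "green -30% test LUT"\n')
    f.write(f"LUT_3D_SIZE {SIZE}\n")
    # .cube convention: red index varies fastest, then green, then blue
    for b in range(SIZE):
        for g in range(SIZE):
            for r in range(SIZE):
                rr = r / (SIZE - 1)
                gg = 0.7 * g / (SIZE - 1)   # deliberate green reduction
                bb = b / (SIZE - 1)
                f.write(f"{rr:.6f} {gg:.6f} {bb:.6f}\n")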
 
I haven't messed with LUTs at all, really, so I know very little about how to use them.

My point was just to bring up how modern Windows DWM probably makes things like color correction work properly across all games.
 
Well, being intermittent, the problem must come from a faulty component or a bad solder joint. I doubt it is only a matter of settings, unfortunately. Green + visible retrace lines is the usual result of a very high G2, except G2 isn't supposed to vary wildly over a short period.
Here's a potential hypothesis about this:

The Dell P992 doesn't have the "color return" feature that's available on many other Trinitrons - the feature that's supposed to automatically recalibrate color and gamma.

I'm wondering if maybe the P992 is supposed to do a sort of color return on its own, perhaps when it detects that some voltage or resistance has drifted.

And maybe this temporary bright screen I saw was the monitor doing an automatic color return calibration or something.

Just a theory, given how orderly and transient it seemed. It was a quick fade in and fade out.
 
If I recall correctly, there is a G2 drift compensation circuit that brightens the screen while warming up. It lowers G2 back to its setting as the monitor warms up, until it stabilizes.

For what it's worth, color return is largely useless if you regularly calibrate your monitors with WinDAS. After I calibrated mine I never used it again.
 
No, that one is just a passive trick using a transistor, leakage current on the ground pin of the main RGB amplifier and the natural variations in behavior of semi-conductors with temperature. G2 isn't impacted at all by that system.

That crap was only there on monitors from the same generation as the FW900, it must be gone on the P992, it is newer I think.
 
I agree 100% - an unlikely scenario. Still, these things can and do happen.


Here it looks actually visible, and it does indeed look like the caps dried up too much.
Is it even worth having someone seek out only the dried caps, or should all the caps be replaced?
 
No, that one is just a passive trick using a transistor, leakage current on the ground pin of the main RGB amplifier and the natural variations in behavior of semi-conductors with temperature. G2 isn't impacted at all by that system.

That crap was only there on monitors from the same generation as the FW900, it must be gone on the P992, it is newer I think.
I thought the P992 was of the same generation, so if I'm mistaken then my bad.
 
Is it even worth having someone seek out only the dried caps, or should all the caps be replaced?
Could you take an esr meter and test the caps first? Given the age of the sets it may make sense to do the whole set of caps but if only one or two are problematic then just do the one or two.
 
I couldn't resist looking for the service manual, and I can confirm that one is the next generation, without the transistor trick. But it wasn't messing with the main RGB amplifier, rather the cutoff amp. I'm getting old and forgetful. :oops:
Welcome to aging then I suppose. :D. Hat tip to you guys still pushing these old beauties. I miss mine.
 
Could you take an esr meter and test the caps first? Given the age of the sets it may make sense to do the whole set of caps but if only one or two are problematic then just do the one or two.
I'm definitely not doing it myself. I just don't have the time nor the patience, lol. This was more of a "what should I tell the tech to do" type of question. With its age, I feel like it would only make sense to just replace them all to be on the safe side.
 
I'm curious about the person you bought it from. Were they still using it for games? Editing? Just sitting in storage from years ago?
I know this is a long time after - I've been busy.
The guy I got it off seemed to be an old TV/cathode professional; I don't think it was used for games, I think he just fixes them up. When he demoed it he had it on terrible settings on his laptop; I knew I could set it up better and bought it anyway.
 

Lol thanks for the response.

So how's it holding up now?
 
I haven't messed with LUTs at all, really, so I know very little about how to use them.

My point was just to bring up how modern Windows DWM probably makes things like color correction work properly across all games.
I would assume older games will behave exactly like they did in the past, with this being less of an issue in newer games - but I have not tested it.
Thankfully you can just check it yourself. It's best to load a LUT with a very specific change, e.g. make one of the color channels much brighter/darker, and check whether the resulting color cast is visible in games.
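If anyone wants a quick way to do that check on Windows, here is a minimal sketch using the old GDI gamma-ramp call via Python/ctypes - it loads an obviously skewed per-channel ramp (green pulled down ~30%) so any game that bypasses it is immediately visible. This is the plain per-channel LUT, not a 3D LUT, the 0.7 factor is arbitrary, and Windows may refuse ramps it considers too extreme; load an identity ramp (or your normal ICC profile) afterwards to undo it.

Code:
# Minimal sketch (Windows only): apply a deliberately green-deficient gamma
# ramp through SetDeviceGammaRamp, then check whether the tint shows up in-game.
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32

ramp = (ctypes.c_ushort * (3 * 256))()
for i in range(256):
    v = i * 257                       # identity mapping, 0..65535
    ramp[0 * 256 + i] = v             # red
    ramp[1 * 256 + i] = int(v * 0.7)  # green, deliberately reduced
    ramp[2 * 256 + i] = v             # blue

hdc = user32.GetDC(None)              # primary display DC
ok = gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp))
print("ramp applied:", bool(ok))      # re-run with an identity ramp to restore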
 
Yeah, I was thinking that by using DXVK, the Vulkan translation layer, it would force a DX9 game to output through the Windows DWM.

But there might be some driver level stuff happening that I don't understand.

I may give it a test like you're describing at some point.
 