24" Widescreen CRT (FW900) From Ebay arrived,Comments.

It depends on what's putting out the heat, and what needs cooling.

With CRT and plasma displays, it's the display tube or panel itself that's emitting the heat, and there's nothing particularly temperature-sensitive in there.

LCDs with fans aren't cooling the panel; they're cooling the scaler electronics. (I've got an IBM T221, which has fans to cool its large, fast FPGAs that act as the scaler system.)
Makes sense that modern flat panels would need processors that require active cooling just for the scaler... but you actually have an IBM T221? How well does that thing hold up today?

It's pretty impressive to realize that they were doing 4K in 2001, with ludicrously high DPI at that owing to its 22.5" size, but that the DVI interface and the graphics cards of the day just weren't up to driving the thing properly. I wonder if the panel would still be worth something if the entire T-CON were to be replaced with a more modern one packing current HDMI and DisplayPort standards that can push 3840x2400 at 60 Hz, or if the existing one can have its FPGAs reprogrammed to similar effect.
 
With CRT and plasma displays, it's the display tube or panel itself that's emitting the heat, and there's nothing particularly temperature-sensitive in there.
I don't know about plasma, but a CRT tube doesn't put out much heat. Most if not all the heat comes from the electronics around it. There are some quite massive heatsinks inside to cool down MOSFETs, amplifiers, rectifier bridges, voltage regulators and some critical transistors/high-power resistors.
 
Makes sense that modern flat panels would need processors that require active cooling just for the scaler... but you actually have an IBM T221? How well does that thing hold up today?
Not well. Power consumption is absolutely ridiculous, response time can almost be measured with a stopwatch (IIRC it's something like 63 ms), contrast ratio is... not modern.

It's pretty impressive to realize that they were doing 4K in 2001, with ludicrously high DPI at that owing to its 22.5" size, but that the DVI interface and the graphics cards of the day just weren't up to driving the thing properly. I wonder if the panel would still be worth something if the entire T-CON were to be replaced with a more modern one packing current HDMI and DisplayPort standards that can push 3840x2400 at 60 Hz, or if the existing one can have its FPGAs reprogrammed to similar effect.

So, funny story with that: IBM was actually thinking along those lines, on the DG5 (the version I have). They decided to develop a packetized protocol that would only update changed areas of the screen - Digital Packetized Video Link - and reprogrammed the FPGAs internally to use it.

DPVL saw some use in mobile applications as a low power interface, but never got used in high-end applications in the real world... except VESA loosely based DisplayPort on the ideas in DPVL.

However, DisplayPort's 4k60 support is done through brute force, so making a DisplayPort scaler firmware for the T221 wouldn't help; the interfaces are the limiting factor, AFAIK.

And, also, while there have been overclocks for the existing scaler to get to 55 Hz (there's actually enough bandwidth on the interfaces to get to 60 Hz across four links, but the scaler only feeds the panel at 48 Hz stock, 55 Hz overclocked, even when it's being fed a 60 Hz signal), you're still dealing with a very slow panel.

...of course I'm feeding my T221 off a Haswell IGP at like 24 Hz, because I just use it for productivity stuff (it was a case of "I had it, might as well use it").
 
I don't know for plasma but a CRT tube doesn't heat much. Most if not all the heat comes from the electronics around it. There are some quite massive heatsinks inside to cool down mosfets, amplifiers, rectifier bridges, voltage regulators and some critical transistors/high power resistances.
Good point. I guess the difference there is that there's relatively low heat density, and space to provide enough heatsinking? Whereas scaler ASICs with cooling demands have a lot less surface area for a heatsink to be effective on, and therefore forced air is needed?
 
I finally bit the bullet and got one of those newer RAMDAC-less cards (nvidia 1080ti), so is there some consensus on which adapter is the best, please? I remember there were issues with faulty pieces and whatnot; how did that play out in the end? Trying to order something as soon as possible. Or is there some new adapter on the horizon and I should rather wait? Thanks in advance for any tips!

And to those questioning the benefits of WPB: I (an amateur) have done it myself on different CRTs and can 100% see the difference; it's not just about making a "dim tube" brighter. Even making a custom LUT was slightly better (it helped a little with the crushed blacks which seem to plague all CRTs) but didn't work at all times (because some games overwrite it with their own, I think), and I don't watch movies (where it does work) on it, so I don't do a LUT anymore. But I would do WPB again, 10/10. The procedure does not damage anything, everything seems within spec and there are many safety features (going "out of range", etc.). Calibrating a CRT monitor manually is like playing million-head whac-a-mole; your brain isn't that good. WPB provides a clear step-by-step process to "solve the puzzle" in the least number of very logical moves (and some heavy math behind the scenes). If you are lucky and there hasn't been too much drift on your tube over the years, you could get away with just adjusting the G2 value in the EEPROM manually, I get that. But in order to do that you still need to set up WinDAS and all that, so you might as well just do the WPB, which takes like 10 minutes after you get the hang of it.
 
I'm currently using a Sunix DPU-3000 with a GTX 1080. Working great with just about any standard resolution on my 21" LaCie monitor, including 2048x1536 @ 85 Hz, a 388 MHz pixel clock.

So if you don't think you're going to be going too crazy with custom resolutions, then you can pretty much stop reading there, but if you are going to play with custom resolutions, then I have seen one issue:

The only issue I've run into is when I go above and beyond standard stuff. Like once in a blue moon while I'm playing Battlefield 1 at 2560x1920 @ 60 Hz (430 MHz pixel clock), the screen will split into 3 pieces, with the right and left sides swapping places. This maybe happens once a day, sometimes never, but all you have to do is alt-tab out for a split second and it will go away. However, for some reason, a slightly lower resolution like 2176x1632 @ 60 Hz (301 MHz pixel clock) will do it more often (though the swapped sides become smaller). And more strangely, 2048x1536 is rock-solid stable, even at higher refresh rates, and that goes for lower resolutions too.

And to make it more confusing, I've never had this issue with Street Fighter 5, which I play at an insane resolution of 3200x1800 (a 492 MHz clock!).

So this issue only really crops up at very high, non-standard resolutions, apparently only in 4:3, and it apparently is more likely to happen as you drop closer to 2048x1536, but somehow doesn't happen at all at 2048x1536 and below, regardless of refresh rate.
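
By the way, for anyone wanting to sanity-check a custom mode against an adapter's pixel-clock ceiling before creating it: the clock is just total pixels per frame (active plus blanking) times the refresh rate. A minimal Python sketch, with made-up totals for illustration (the real ones come out of CRU's detailed resolution dialog):

```python
# Minimal sketch: pixel clock = total pixels per frame, blanking included,
# times the refresh rate. The totals in the example call are made-up
# placeholders -- read the real ones out of CRU's detailed resolution
# dialog ("Automatic - CRT standard" timing).

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """h_total / v_total are the *total* counts per line/frame, blanking included."""
    return h_total * v_total * refresh_hz / 1e6

# Hypothetical totals roughly in the ballpark of a 2048x1536 @ 85 Hz mode:
print(pixel_clock_mhz(h_total=2720, v_total=1589, refresh_hz=85))  # ~367 MHz
```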

I'll try to get some pictures sometime
 
Good point. I guess the difference there is that there's relatively low heat density, and space to provide enough heatsinking? Whereas scaler ASICs with cooling demands have a lot less surface area for a heatsink to be effective on, and therefore forced air is needed?
I took pictures of the boards some time ago; they're worth more than any explanation:

The power supply board

Ecran2-G board1_light.jpg


The video board

Ecran3-Aboard_light.jpg


The deflection board

Ecran3-Dboard4_light.jpg

Note that the huge heatsink on the left is only the tip of the iceberg, since it's mounted to the shielding plate with thermal paste. The shielding itself acts as a cooling surface as well.
 
And to those questioning the benefits of WPB: I (an amateur) have done it myself on different CRTs and can 100% see the difference; it's not just about making a "dim tube" brighter. Even making a custom LUT was slightly better (it helped a little with the crushed blacks which seem to plague all CRTs) but didn't work at all times (because some games overwrite it with their own, I think), and I don't watch movies (where it does work) on it, so I don't do a LUT anymore. But I would do WPB again, 10/10. The procedure does not damage anything, everything seems within spec and there are many safety features (going "out of range", etc.). Calibrating a CRT monitor manually is like playing million-head whac-a-mole; your brain isn't that good. WPB provides a clear step-by-step process to "solve the puzzle" in the least number of very logical moves (and some heavy math behind the scenes). If you are lucky and there hasn't been too much drift on your tube over the years, you could get away with just adjusting the G2 value in the EEPROM manually, I get that. But in order to do that you still need to set up WinDAS and all that, so you might as well just do the WPB, which takes like 10 minutes after you get the hang of it.
Sometimes it is all about convictions and honor, not about what makes more sense :ROFLMAO:
Besides, calibrating via the OSD is not that hard.

Did you validate your calibration using e.g. Colorimètre HCFR?
 
Enhanced Interrogator

Thanks man! Sounds like it works for the most part and there is no better alternative (nor will there be soon), so I ordered the Sunix DPU-3000 today. I will make sure to report back my experience later for anyone else interested in these. And yes, crazy resolutions are right up my alley :) But I prefer BFBC2 :p The other day, on the new 1080ti, I was trying to render some older game (Arma 2) at 66 million pixels (Full HD is only 2 million, FYI, so it's like 33 Full HD displays) with some MSAA on top of that and got around 10 FPS, there is some serious horsepower in that card lol. Just need to connect that goodness to the other goodness now and I am all set for the foreseeable future.

XoR_

I feel you, but calibrating with the OSD is simply harder and the result will probably not be as "correct". You sound like someone who has not attempted the WinDAS WPB procedure yet (I don't know if you did or didn't), because I am having a hard time imagining someone not liking it after trying it. And again, you still need WinDAS to adjust G2 (assuming changing Contrast/Brightness is not enough for your old tube), unless you have a Diamondtron, which has that option directly in a hidden service OSD. Yes, I have used the HCFR program and a DTP-94 colorimeter for the Trinitron WPB procedure and for manually adjusting the Diamondtron via the service OSD.
 
If you connect a 980 Ti directly to the CRT over D-SUB (VGA) then you can set resolutions up to a 400 MHz pixel clock (because the 980 Ti has a 400 MHz RAMDAC). It MIGHT be possible to get more by using the SUNIX adapter http://www.sunix.com/product/DPU3000.html because it will bypass the 980 Ti's RAMDAC by outputting a digital signal straight from the 980 Ti's DisplayPort to the SUNIX - this is something I want to test in a couple of weeks.
 
Selling my FW900 if anyone wants to buy it; I'm in Sedona, AZ. Going to live in a van/RV, not a joke. Maybe down by the river.
 
...(because the 980 Ti has a 400 MHz RAMDAC). It MIGHT be possible to get more by using the SUNIX adapter...

It definitely is possible. Besides the resolutions I mentioned above, I also created 2880x2160 @ 60 Hz, which is 535 MHz. Aktan (sp?) has tested even higher resolutions for short periods.
 
I feel you, but calibrating with the OSD is simply harder and the result will probably not be as "correct". You sound like someone who has not attempted the WinDAS WPB procedure yet (I don't know if you did or didn't), because I am having a hard time imagining someone not liking it after trying it. And again, you still need WinDAS to adjust G2 (assuming changing Contrast/Brightness is not enough for your old tube), unless you have a Diamondtron, which has that option directly in a hidden service OSD. Yes, I have used the HCFR program and a DTP-94 colorimeter for the Trinitron WPB procedure and for manually adjusting the Diamondtron via the service OSD.
I do not believe WPB can give more 'accurate' results than just using the OSD.

...but actually at least now I have a reason to do WPB: to check if its accuracy is any good :rolleyes:

BTW, to have actually accurate colors you also need to correct the gamut. Unfortunately it is only doable on AMD cards and needs an EDID emulator, because of the strange CIE values encoded in the monitor's own EDID.

PS: for videos (and games into which you can inject custom shaders) it is also possible to correct the gamut. On AMD cards with a custom EDID it works for everything, so that is the preferred solution.
 
Willing to ship it to Seattle?
I don't like shipping, but I'm willing to drive there. I might do shipping if you accept all responsibility (meaning if it breaks, you file the insurance claim) plus full insurance, but PM me so we don't pollute this thread. I have shipped 6 CRTs and 5 survived; with the sixth, the tube still worked but the bezel had cosmetic damage due to FedEx basically rough-handling it.
 
I don't like shipping, but I'm willing to drive there. I might do shipping if you accept all responsibility (meaning if it breaks, you file the insurance claim) plus full insurance, but PM me so we don't pollute this thread. I have shipped 6 CRTs and 5 survived; with the sixth, the tube still worked but the bezel had cosmetic damage due to FedEx basically rough-handling it.

Cool, started a separate convo with you (don't know how to send a PM here yet).
 
If you connect a 980 Ti directly to the CRT over D-SUB (VGA) then you can set resolutions up to a 400 MHz pixel clock (because the 980 Ti has a 400 MHz RAMDAC). It MIGHT be possible to get more by using the SUNIX adapter http://www.sunix.com/product/DPU3000.html because it will bypass the 980 Ti's RAMDAC by outputting a digital signal straight from the 980 Ti's DisplayPort to the SUNIX - this is something I want to test in a couple of weeks.

I have two CRTs I want to use with a TriSLI configuration. Only one of the GPUs' display outputs will be active due to TriSLI, and the 980 Tis only have one analog output; therefore, I will need an external adapter and RAMDAC.
 
I have two CRTs I want to use with a TriSLI configuration. Only one of the GPUs' display outputs will be active due to TriSLI, and the 980 Tis only have one analog output; therefore, I will need an external adapter and RAMDAC.
A RAMDAC is a DAC that has RAM.
You need a normal DAC.

Besides, why have two CRTs connected at the same time?

EDIT://
The Delock 62967 is a good DAC.
Not the one with the highest bandwidth (something around 50 MHz on mine), but image quality is pretty good and it's stable.

Unfortunately, on GeForce cards adapters like these will produce banding when correcting gamma.
I wonder if there is one with actually working 10-bit support, as that would be preferred for CRTs, which always need some gamma correction.

Radeons use dithering (very similar to, if not the same as, A-FRC on LCDs, thus very good), so they can correct gamma on 8-bit displays/adapters without any banding, and it works with all adapters.
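
To make it concrete, here's a rough Python/NumPy sketch (just an illustration of the idea, not what any driver actually does) of why adding sub-LSB noise before the final 8-bit rounding hides the banding that plain rounding creates when a gamma curve is applied:

```python
import numpy as np

def apply_gamma_8bit(frame, gamma, dither=True):
    """frame: uint8 array. Apply a gamma curve, then requantize to 8 bits."""
    y = (frame.astype(np.float64) / 255.0) ** gamma * 255.0   # corrected, still float
    if dither:
        y += np.random.uniform(-0.5, 0.5, y.shape)            # sub-LSB noise before rounding
    return np.clip(np.rint(y), 0, 255).astype(np.uint8)

# A smooth 0..255 gradient: plain rounding merges neighbouring codes into the
# same output value (visible bands); dithering spreads that error around instead.
grad = np.tile(np.arange(256, dtype=np.uint8), (64, 1))
print(np.unique(apply_gamma_8bit(grad, 0.9, dither=False)).size)  # fewer than 256 distinct levels
print(np.unique(apply_gamma_8bit(grad, 0.9, dither=True)).size)   # close to the full 256
```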
 
A RAMDAC is a DAC that has RAM.
Unfortunately, on GeForce cards adapters like these will produce banding when correcting gamma.

You can get rid of the banding by switching to 4:4:4 chroma in the Nvidia control panel, instead of going RGB, right? At least that's what worked for me.
 
Unfortunately, on GeForce cards adapters like these will produce banding when correcting gamma.
I wonder if there is one with actually working 10-bit support, as that would be preferred for CRTs, which always need some gamma correction.

Radeons use dithering (very similar to, if not the same as, A-FRC on LCDs, thus very good), so they can correct gamma on 8-bit displays/adapters without any banding, and it works with all adapters.

Nope, GeForce DACs are 10-bit, and you can adjust gamma without banding.

But agreed that it would be wonderful to see an external VGA adapter with a 10-bit DAC.
 
One is a main monitor, one is a side monitor.
A setup consisting of different monitors for different tasks makes more sense than keeping the monitors the same, especially when the display tech used has so many flaws and is inferior at some tasks, like a CRT is at anything other than gaming and watching movies.
But a CRT enthusiast like yourself wouldn't want it any other way than pure CRT bliss, right? ;)

Nope, GeForce DACs are 10-bit, and you can adjust gamma without banding.

But agreed that it would be wonderful to see an external VGA adapter with a 10-bit DAC.
I was not talking about the internal DAC. Besides, it is a thing of the past. Ray tracing is the way to go and RTX cards do not have RAMDACs anymore...
In a sense the FW900, with its ability to run lower resolutions, is a great pick for ray-traced games :)

As for adapters, we would not need 10-bit adapters if NV implemented their outputs properly :mad:

You can get rid of the banding by switching to 4:4:4 chroma in the Nvidia control panel, instead of going RGB, right? At least that's what worked for me.
Which adapter are we talking about that supports deep color again?
 
I was not talking about the internal DAC. Besides, it is a thing of the past. Ray tracing is the way to go and RTX cards do not have RAMDACs anymore...
In a sense the FW900, with its ability to run lower resolutions, is a great pick for ray-traced games :)

As for adapters, we would not need 10-bit adapters if NV implemented their outputs properly :mad:


Which adapter are we talking about that supports deep color again?

Sorry, misread your original post.

And yeah, well-implemented dithering with 8-bit could be a good solution, as you say.
 
Alright gents, the input lag difference between my PG279Q and my FW900 is approximately 16 ms in the FW900's favor, with the PG279Q trailing behind.

The program you used to test the input lag is useless; if you want to test the difference between a CRT and an LCD, try SMTT 2.0.
As for the adapter you tested, most probably it adds no lag.
I have seen several datasheets for these chipsets and the internal timings are all in micro- and nanoseconds.

Nope, GeForce DACs are 10-bit, and you can adjust gamma without banding.

But agreed that it would be wonderful to see an external VGA adapter with a 10-bit DAC.

Can ReShade help with that banding problem?
All HDFury converters are 10-bit capable.
Delock has just launched a new adapter with the new ITE IT6562FN chipset (Delock 63925).
IT6562FN specifications:
- DP input color depth 6/8/10/12 bits
- Triple 10-bit DAC converters (200 MSPS throughput rate)
The DP receiver is four HBR lanes, much better than two HBR2 lanes because HBR is very easy to reach even with a shit DP output from a graphics card.
Input bandwidth is 360 MHz at 24-bit and 288 MHz at 30-bit (see the quick arithmetic after this list).
The adapter is USB-C only, so it requires:
- a graphics card with USB-C output (the new Nvidia RTX cards have it)
- a notebook with USB-C and support for DisplayPort over USB-C
- a card like the Sunix UPD2018 to convert DisplayPort to a USB-C output
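
For the curious, the 360/288 MHz input figures fall straight out of the link arithmetic, assuming the usual 8b/10b coding on HBR. A quick Python check:

```python
# Four HBR lanes at 2.7 Gbit/s each, minus the 8b/10b coding overhead,
# divided by the bits per pixel.
lanes, lane_rate_gbps, coding = 4, 2.7, 8 / 10
payload_gbps = lanes * lane_rate_gbps * coding                 # 8.64 Gbit/s of pixel data
for bpp in (24, 30):                                           # 8-bit vs 10-bit per channel RGB
    print(bpp, "bpp ->", round(payload_gbps * 1e9 / bpp / 1e6, 1), "MHz")  # 360.0 and 288.0 MHz
```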
 
The FW900 is actually a good choice for RTX cards with their ray tracing, as I doubt something like 4K will be attainable even on a 2080 Ti... but we will (meaning I :cool:) see :)
 
The FW900 is actually a good choice for RTX cards with their ray tracing, as I doubt something like 4K will be attainable even on a 2080 Ti... but we will (meaning I :cool:) see :)

Me too. Sold both my 1080 Tis for a 2080 Ti SLI setup. Now if only my FW900 were in the same room as my Z9D. Best of both worlds.
 
Can ReShade help with that banding problem?

From what I understand, ReShade allows 3DLUTs and the like to be implemented in games, rather than only in color-managed software like Adobe's, so it won't address the banding issue (at least not the banding issue caused by not enough LUT precision). See my post here for a detailed account of LUT precision.

All HDFury converters are 10-bit capable.

Ah didn't know this. Did they ever deal with their legal HDMI issues? (I remember hearing something about how that was hampering their efforts for an updated HDMI to VGA converter).

Delock has just launched a new adapter with the new ITE IT6562FN chipset (Delock 63925).
IT6562FN specifications:
- DP input color depth 6/8/10/12 bits
- Triple 10-bit DAC converters (200 MSPS throughput rate)
The DP receiver is four HBR lanes, much better than two HBR2 lanes because HBR is very easy to reach even with a shit DP output from a graphics card.
Input bandwidth is 360 MHz at 24-bit and 288 MHz at 30-bit.
The adapter is USB-C only, so it requires:
- a graphics card with USB-C output (the new Nvidia RTX cards have it)
- a notebook with USB-C and support for DisplayPort over USB-C
- a card like the Sunix UPD2018 to convert DisplayPort to a USB-C output

Ooooh sexy! That will be enough to run the FW900 in its prime mode (1920x1200 @ 85 Hz).
 
For videos, madVR should be used; it can do anything with the image at high precision and has the best upscalers, which use artificial neural networks and tons of GPU power and give great results even for low-resolution content.

For games, using shaders to implement a full 3DLUT is silly, if not completely overdoing it.
Gamma correction plus a 3x3 gamut correction would suffice. Gamut correction does not produce any visible banding, and the gamma shader could be done with some sort of basic dithering.
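
Roughly what I mean, as a Python sketch (a stand-in for what would really be a pixel shader or a driver LUT; the matrix below is a made-up placeholder, a real one comes from the measured primaries and white point):

```python
import numpy as np

# Hypothetical correction matrix -- a real one is derived from the measured
# primaries and white point via the usual RGB -> XYZ -> native-RGB construction.
M = np.array([[0.95, 0.04, 0.01],
              [0.02, 0.97, 0.01],
              [0.00, 0.03, 0.97]])

def correct_gamut(rgb8, gamma=2.2):
    """rgb8: uint8 array of shape (..., 3). Returns gamut-corrected uint8 values."""
    lin = (rgb8.astype(np.float64) / 255.0) ** gamma   # decode to linear light
    lin = np.clip(lin @ M.T, 0.0, 1.0)                 # the 3x3 correction itself
    out = 255.0 * lin ** (1.0 / gamma)                 # re-encode with the same gamma
    out += np.random.uniform(-0.5, 0.5, out.shape)     # cheap dither, as discussed above
    return np.clip(np.rint(out), 0, 255).astype(np.uint8)

print(correct_gamut(np.array([[255, 0, 0]], dtype=np.uint8)))  # pure red gets pulled slightly in-gamut
```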

I have a glimmer of hope that RTX cards do use dithering on 8-bit outputs. Most probably not, but maybe... one can only hope...
 
Anyone know where I could track down a flyback transformer for the FW900? I've got a broken monitor in my closet and I'd love to get it working again.

The member bramabul5353 was offering one a while back; I've tried contacting him but haven't heard back. I assume he sold it to someone else.


On another note, I'm currently using a working FW900 as my main monitor. It's on almost constantly and it still seems to work great. I do wonder how much life it has left though.

Is there a way to estimate how many hours it has on it and how much life the tube has left? I've gone through the White Point Balance routine so things should be tuned up. Maybe some of the parameters you can set give an indication of the health of the tube?
 
Anyone know where I could track down a flyback transformer for the FW900? I've got a broken monitor in my closet and I'd love to get it working again.

The member bramabul5353 was offering one a while back; I've tried contacting him but haven't heard back. I assume he sold it to someone else.


On another note, I'm currently using a working FW900 as my main monitor. It's on almost constantly and it still seems to work great. I do wonder how much life it has left though.

Is there a way to estimate how many hours it has on it and how much life the tube has left? I've gone through the White Point Balance routine so things should be tuned up. Maybe some of the parameters you can set give an indication of the health of the tube?

The only way would be with a Sencore CR70 or CR7000. I don't recommend it since those can be pricey, though, and if you try the rejuvenation process on an H-K short, it could kill whatever little life there was left.
 
Sitrep on the Sunix adapter: it works pretty great with my nvidia 1080ti :) Sometimes the image gets all distorted, but like you guys have said, just refresh CRU with F8, re-plug the DP cable in the graphics card, or similarly trigger a resolution/adapter change to quickly fix it. Happens like once a day, so not an issue at all.

I have had zero issues with specific frequencies or resolutions (yet), like someone mentioned earlier in this thread. I have not noticed any degradation in image quality or higher input lag compared to a direct VGA connection.

Regarding the pixel clock limits, I was able to get anything up to 500 MHz - more precisely, below 499 MHz works and above 500 MHz does not. My new favorite resolution that I was not able to achieve without this adapter is 3056x1910 (16:10) @ 60 Hz (= 499.66 MHz). Sprinkle some 4x DSR and SGSSAA on top and the image quality looks like a video, no jaggies at all (or almost, depending on the game engine of course). I also make the CRT image area as small as possible using the OSD controls, which gives it an extra pop.

One more thing: it took me like half an hour to make the adapter work at all, so if you have that problem, keep trying. I am not sure what fixed it in the end, but try:

- plug the USB cable into Sunix, make sure the USB port you are using for this is powered
- plug in a second monitor as a default one and try to set up Sunix as a second extended monitor. Then set Sunix as default monitor and unplug the other monitor and see if Sunix works on its own.
- power off the computer, re-plug DP cable between Sunix and computer, power on the computer

Hope this helps someone. Overall I am very pleased with the adapter and would highly recommend it.

I have a question about sharpness on the FW900 though: can I stick something into those top ventilation holes to adjust the knob(s) without hurting myself? :) Sticking a metal screwdriver in there just doesn't feel like a good idea. I would prefer to avoid having to tear down the chassis for this if possible. Any tips from fellow FW900 owners, please?

One more question: any ideas on how to tackle the black crush that happens after WPB? I have stopped using my own LUTs because they are buggy, so not that, please. Right now I am solving it by adjusting the gamma sliders in each game separately. This approach works reasonably well (for games) but has 2 issues:

1) Some games have a very limited gamma slider range, so their max value is not enough and the image is still a bit dark.

2) In-game gamma sliders make the whole image lighter, and I only really need to bring out the shadow details in very dark areas (something shaped like the curve sketched below), hope you know what I mean now.
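
To illustrate, this is roughly the shape of correction I'm after (a made-up Python example with arbitrary numbers), as opposed to what the in-game sliders do, which is lift everything:

```python
import numpy as np

def shadow_lift(lift=12, pivot=64):
    """256-entry curve: raise black by up to `lift` codes, fading to zero at `pivot`."""
    x = np.arange(256, dtype=np.float64)
    boost = np.where(x < pivot, lift * (1.0 - x / pivot), 0.0)
    return np.clip(np.rint(x + boost), 0, 255).astype(np.uint8)

curve = shadow_lift()
print(curve[:6])     # near-black codes raised: [12 13 14 14 15 16]
print(curve[64:70])  # everything from the pivot up is untouched: [64 65 66 67 68 69]
```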

Cheers!
 
Sitrep on the Sunix adapter: it works pretty great with my nvidia 1080ti :) Sometimes the image gets all distorted, but like you guys have said, just refresh CRU with F8, re-plug the DP cable in the graphics card, or similarly trigger a resolution/adapter change to quickly fix it. Happens like once a day, so not an issue at all.

For me, on an AMD card, to fix the occasional distortion, I had to change to a refresh rate that is a multiple of 10, like 60, 80, 90, etc. I dunno if this would help your case, but worth a try! I will say, after tinkering with it, and it has now been 6 months of continual use, I've had ZERO problems.
 
For me, on an AMD card, to fix the occasional distortion, I had to change to a refresh rate that is a multiple of 10, like 60, 80, 90, etc. I dunno if this would help your case, but worth a try! I will say, after tinkering with it, and it has now been 6 months of continual use, I've had ZERO problems.

I have not tried many resolutions yet, but off the top of my head 75 Hz and 95 Hz worked fine. Using CRU with a Detailed Resolution and "Automatic - CRT standard" timing. I usually remove all Standard and Established resolutions because they sometimes mess up the nvidia DSR settings.
 
I have a question about sharpness on FW900 though, can I stick something in those top ventilation holes to adjust the knob(s) without hurting myself? :) Sticking a metal screwdriver just doesnt feel like a good idea. I would prefer to avoid having to tear down the chassis for this if possible. Any tips from the fellow FW900 owners please?
Yes you can; use an insulated screwdriver.

But it's not a good idea at all to mess with the knobs; they are there to set the flyback in accordance with the rest of the circuit and/or the tube for optimal focus. If you didn't change the board or the flyback, you're unlikely to improve anything. On the contrary, you may screw up everything and have a nice time struggling to get back to the initial result.
 
Thanks, I have already done it on a Mitsubishi 2070SB, where there is a dedicated hole on the left side. That felt safe since it is a dedicated hole. Ehm. The knobs there were very sensitive indeed, but manageable if you have a reasonably steady hand, good short-distance eyesight and a few minutes. But on the FW900 I see a cage and a whole lot of electronics around the hole, so I wanted to ask whether it's not designed to be taken apart first. And by insulated you mean a normal screwdriver with a normal plastic sleeve around it and a plastic handle? Not some special kind of insulation? But yeah, the image is already pretty sharp as far as CRTs go, so I might reconsider.
 