24" Widescreen CRT (FW900) From Ebay arrived,Comments.

What about HD Fury 4?

I'm not sure how high it can go. I don't know if anyone here has bought it and pushed it to find out.

And I think it also has a frame of lag, since it has scaling capabilities. I'm not sure if that applies in all modes, but the spec page mentions the frame of lag.

It's pretty expensive compared to other DACs, so it was never as popular here.
 
Here is more info.

The spec sheet is here. It says 200 MSPS, which I think means 200 million samples per second, i.e. roughly a 200 MHz pixel clock.

But I know a couple of people using it for their FW900s, and I believe they're running 1920x1200 @ 85 Hz, so it must be more than 200 MHz.
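
If anyone wants to sanity-check that reasoning, here is a rough back-of-the-envelope estimate. It's a minimal sketch only: it assumes GTF-style blanking of roughly 30% horizontal and 5% vertical rather than the exact VESA formulas.

Code:
# Rough pixel clock estimate for CRT modes. Approximation only: assumes
# ~30% horizontal and ~5% vertical blanking, typical of GTF-style CRT timings.
def approx_pixel_clock_mhz(h_active, v_active, refresh_hz,
                           h_blank_frac=0.30, v_blank_frac=0.05):
    h_total = h_active * (1 + h_blank_frac)   # active pixels plus horizontal blanking
    v_total = v_active * (1 + v_blank_frac)   # active lines plus vertical blanking
    return h_total * v_total * refresh_hz / 1e6

print(approx_pixel_clock_mhz(1920, 1080, 60))   # ~170 MHz, inside a 200 MSPS spec
print(approx_pixel_clock_mhz(1920, 1200, 85))   # ~267 MHz, already past 200 MHz
print(approx_pixel_clock_mhz(2304, 1440, 80))   # ~362 MHz

With CRT-style blanking, 1920x1200 @ 85 Hz needs something in the 260-270 MHz range, so the chip really is being driven well past its nominal 200 MSPS, and 2304x1440 @ 80 Hz lands around 360 MHz.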

Interesting. Can you ask them to try higher resolutions like 2304x1440 @ 80 Hz, report on image quality, and check whether they can set 10 bit on the digital output?

The DAC speed is 200 MHz by spec, like the Synaptics chipset in the DPU3000, but a Lontium engineer told me that it can do much more than that.

They test it at that speed because it's enough for 1920x1080 @ 60 Hz, which is what the market asks of these types of converters.

The important thing is that the digital input bandwidth can handle a pixel clock of up to 720 MHz at 8 bpp.
Most other chipsets can handle at most 360 MHz.
 
I'd asked him to do a 10 bit test; I've just emailed him again and asked for the 2304x1440 @ 80 Hz test as well.
 
Just confirmed that 2304x1440 @ 80 Hz works on it. He can't do the 10 bit test yet as he doesn't have a colorimeter.
 
Amazing to see this thread still going! I used to frequent this thread daily for a good year or two, something like ~6-7 years ago, when I purchased a FW900 from UncleVito on eBay. It served me incredibly well for a few years when I was into playing Counter-Strike, but it had been collecting dust for the last couple of years. With the new Call of Duty releasing, I decided to bring the beast out again! Happy to report that after she warmed up, everything looked as good as I last recall.

Back when I frequented this thread it was "THE" FW900 resource for sharing tips & tricks, video settings, software/tools, etc.

...and so I am curious, fellow FW900 owners...

What's your FW900 setup/configuration look like in 2019?

Here are some tools and settings I use on my FW900...

Software/Tools/Resources:
  • PowerStrip (to manage monitor/display settings)
  • ReClock (to keep timings in sync)
  • SVP (smooth video project - GPU playback/conversion of videos to your refresh rate, similar to soap opera/motionflow on TVs)

Video/Game Settings (GTX 970):

I like to use competitive settings (low res, high refresh, low gfx / post process / etc)
  • CALL OF DUTY: MODERN WARFARE
    800x600 @ 150 Hz (200% render resolution)
    1024x768 @ 150 Hz (100-150% render resolution)
    1280x800/1024 @ 140 Hz (100% render resolution)
  • NVIDIA
    Low Latency Mode: Ultra
    Vsync: Off
    Freestyle Filters: Sharpen, Color, Details, and Brightness/Contrast

Other/Misc:
  • FW900 Mods: removed AG coating and also cut out a small square in the plastic casing to have direct access to the sharpen/focus knobs
  • PC/GPU: I run DVI-I to BNC via a GTX 970, despite owning a 1080. Unfortunately it seems that the 9xx series was the last NVIDIA series to have a DVI-I connector. If this isn't the case, someone please let me know!
  • PS3/Console: I ran HDMI -> VGA via HDFury2 1080p from curtpalme.com. It worked great for a year or two until the AC / power started cutting in and out and now it can no longer hold a steady charge.
 
Just confirmed that 2304x1440 @ 80 Hz works on it. He can't do the 10 bit test yet as he doesn't have a colorimeter.

Ok, so now we know that they have connected all four lanes to the chip; I was worried about that.
About the 10 bit: just ask if there is an option in the graphics panel to set 8 or 10 bit on the digital output, because without that I doubt it can work.

Awesome. Derupter, do we know if a DP version of this adapter exists?

There is a miniDP to VGA adapter on Vention's website with no information about the chipset; I asked them in the past but they never answered me.
I think it is an old adapter with an old chipset; the new chipsets are all used on newer USB-C adapters.
All the DisplayPort adapters are old; the most recent is the Delock 62967, and we know why.

The Lontium LT8711X-B is native USB-C, but maybe it can be connected directly to a DisplayPort source; in that case we would need to replace the adapter's cable, surely not an easy task.

The solution is to buy a card like the Sunix UPD2018 or Delock 89582 to connect the Lontium adapter to any card without a USB-C output. This adds 80 dollars to the cost of the adapter, for a total of 100 dollars, but considering the cost of the DPU3000, if the Lontium's performance is confirmed without problems I think it's worth it.

Clearly, if someone is about to buy a new video card, it would be better to focus on a model with a USB-C output.

If someone has only a DisplayPort output and is not interested in resolutions over 1920x1200 @ 96 Hz or 1920x1440 @ 85 Hz, the best solution is the cheap Delock 62967. Be warned that it doesn't work with all video cards and different samples can have different results, but when it works it is perfect and accepts every resolution and refresh rate without a single problem.
 
Ok, so now we know that they have connected all four lanes to the chip; I was worried about that.
About the 10 bit: just ask if there is an option in the graphics panel to set 8 or 10 bit on the digital output, because without that I doubt it can work.

I'll wait till he gets his colorimeter; that way I can ask him to do that check on the graphics panel as well as the 10 bit LUT test (I don't want to bug him too much).
 
Amazing to see this thread still going! I used to frequent this thread daily for a good year or two, something like ~6-7 years ago, when I purchased a FW900 from UncleVito on eBay. It served me incredibly well for a few years when I was into playing Counter-Strike, but it had been collecting dust for the last couple of years. With the new Call of Duty releasing, I decided to bring the beast out again! Happy to report that after she warmed up, everything looked as good as I last recall.

That's an interesting idea to mod the case to have easy access to the focus knobs - might get on that myself!

Here's the WinDAS white point balance guide, in case you missed it.
 
On the product page it's labeled as a DP converter with USB-C as the alternate mode:

http://www.lontiumsemi.com/uploadfiles/pdf/LT8711X_Product_Brief.pdf

I imagine it would be possible to connect a DisplayPort cable, if we had the pinout for the chip. But then it might need new firmware or something, who knows.

Unlike other chipsets, here the USB-PD controller is inside the chip. With the four DP lanes and power it's easy, because they go to the same pins.
The problem is with the AUX channel and the hotplug line: AUX should go to the two SBU pins (SBU1/SBU2) and hotplug to a CC pin, but I don't know if the signals are compatible.
There is the possibility that the chip has dedicated pins for a direct DisplayPort connection, but we need the pinout.
Here is more info about USB-C and DisplayPort:

https://www.anandtech.com/show/8558/displayport-alternate-mode-for-usb-typec-announced
 
On the product page it's labeled as a DP converter with USB-C as the alternate mode:

http://www.lontiumsemi.com/uploadfiles/pdf/LT8711X_Product_Brief.pdf

I imagine it would be possible to connect a DisplayPort cable, if we had the pinout for the chip. But then it might need new firmware or something, who knows.
Keep in mind the power issue. An adapter connected to a USB-C port can draw as much power as it likes; one connected to a DisplayPort input will receive only a limited power supply (hence the external USB power supply on the Sunix). That's why just replacing the USB-C plug with a DisplayPort one is likely to fail.
 
Keep in mind the power issue. An adapter connected to a USB-C port can draw as much power as it likes; one connected to a DisplayPort input will receive only a limited power supply (hence the external USB power supply on the Sunix). That's why just replacing the USB-C plug with a DisplayPort one is likely to fail.

Oh, I forgot that the USB-C voltage is 5 V to 20 V while DisplayPort is 3.3 V; the chipset should have a dedicated pin for that if it is designed to be connected directly to a DisplayPort cable as well as to a USB-C cable.
 
Does anyone know of a good DVI-D to VGA converter that can at least do 1920x1200 properly without the edges cut off?
 
Oh, I forgot that the USB-C voltage is 5 V to 20 V while DisplayPort is 3.3 V; the chipset should have a dedicated pin for that if it is designed to be connected directly to a DisplayPort cable as well as to a USB-C cable.

I guess a modification would need to cut the video signal lines while leaving the power lines intact to go to a USB-C charger or connector? Then splice a DisplayPort cable directly to the chip?

I know we'd need a data sheet, and then there are voltage and current tolerances that would probably need to be considered. And then it might not work at all because the chip may need a different firmware.

I guess we really need a datasheet first.

Does anyone know of a good DVI-D to VGA converter that can at least do 1920x1200 properly without the edges cut off?

What's your GPU?
 
I guess a modification would need to cut the video signal lines while leaving the power lines intact to go to a USB-C charger or connector? Then splice a DisplayPort cable directly to the chip?

I know we'd need a data sheet, and then there are voltage and current tolerances that would probably need to be considered. And then it might not work at all because the chip may need a different firmware.

I guess we really need a datasheet first.



What's your GPU?
Asus R9 390X
 
Asus R9 390X

This is a bug in recent AMD drivers, I think. It's definitely an issue with my 5700 XT: any "standard resolution" ends up using reduced LCD timings instead of CRT timings.

What you need to do is recreate the resolution in AMD's custom resolution tool and choose GTF or CVT timings.

You should also try using CRU to do it, just to do me the favor of finding out whether it works with your card or not, because AMD is definitely ignoring CRU overrides with my 5700 XT for some reason.
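
For anyone recreating modes by hand, below is a small helper for generating CRT-style numbers to type into AMD's custom resolution dialog or CRU. To be clear, this is only a sketch: it uses GTF-like proportions (about 30% horizontal and 5% vertical blanking, with a porch/sync split that is my own guess), not the exact VESA GTF/CVT math. CRU and the AMD tool will compute real GTF or CVT timings for you if you pick those options, so this is mostly useful for sanity-checking the pixel clock you end up with.

Code:
# Rough CRT-style timings for manual entry (sketch only, not exact VESA GTF/CVT).
# Assumes ~30% horizontal / ~5% vertical blanking; the porch/sync split below
# is an illustrative guess, not taken from the standards.
def crt_timings(h_active, v_active, refresh_hz):
    h_blank = round(h_active * 0.30 / 8) * 8   # keep horizontal values on 8-pixel boundaries
    h_sync  = round(h_blank * 0.35 / 8) * 8
    h_front = round(h_blank * 0.20 / 8) * 8
    h_back  = h_blank - h_sync - h_front
    v_blank = round(v_active * 0.05)
    v_sync, v_front = 3, 1                     # GTF-style: 3 sync lines, 1 line front porch
    v_back  = v_blank - v_sync - v_front
    h_total, v_total = h_active + h_blank, v_active + v_blank
    clock_mhz = h_total * v_total * refresh_hz / 1e6
    return dict(h_front=h_front, h_sync=h_sync, h_back=h_back,
                v_front=v_front, v_sync=v_sync, v_back=v_back,
                h_total=h_total, v_total=v_total,
                pixel_clock_mhz=round(clock_mhz, 2))

print(crt_timings(1920, 1200, 85))   # pixel clock comes out around 267 MHz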
 
Hi,

I have two IBM C220p monitors, and have recently put two Delock 62967 (ANX9847) adapters on a Vega 56. However, it's not supporting more than 60Hz, and it's limited to 6 bit colour per channel, so there's banding onscreen. Same with both Windows 8.1 and Windows 10.

Any advice? I have a 780Ti I need to fix that I can potentially test the adapters with as well.

It's the 'almost original reference' Vega 56 as it's an MSI Airstream OC.

(No, I do not want to use another graphics card. I need something that works with PCI passthrough, that isn't too loud, and has decent open source support (RX5700 is too new, anything new Nvidia has poor open source support))
 
Looked quite a way back; OK, it looks like I have to solder on a new DisplayPort connector, as Delock's are awful.

The cable was bought within the last month, so obviously Delock haven't been bothered to improve things.
 
Looked quite a way back; OK, it looks like I have to solder on a new DisplayPort connector, as Delock's are awful.

The cable was bought within the last month, so obviously Delock haven't been bothered to improve things.
Given the limited demand for such devices, it's pretty likely all the adapters you'll find on the market are from the same batch that was tested here in the beginning. Either there is enough demand and they'll make another batch when everything is sold, or that product reference will just disappear.
 
Hi,

I have two IBM C220p monitors, and have recently put two Delock 62967 (ANX9847) adapters on a Vega 56. However, it's not supporting more than 60Hz, and it's limited to 6 bit colour per channel, so there's banding onscreen. Same with both Windows 8.1 and Windows 10.

Any advice? I have a 780Ti I need to fix that I can potentially test the adapters with as well.

It's the 'almost original reference' Vega 56 as it's an MSI Airstream OC.

(No, I do not want to use another graphics card. I need something that works with PCI passthrough, that isn't too loud, and has decent open source support (RX5700 is too new, anything new Nvidia has poor open source support))

This happens when the monitor is seen as non plug and play, i.e. when for some reason the monitor's EDID is not seen by the graphics card.
Try the CRU monitor utility and set a custom EDID, or copy-paste a full EDID backup of your monitor if you have it; it also works with the non plug and play monitor registry keys.
You can make a full backup with CRU's export function, with the monitor connected to the analog output of your old graphics card.
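
If you go the export/override route, here is a quick way to sanity-check a raw EDID binary (e.g. a .bin export) before pasting it back in. This is a minimal sketch: the fixed 8-byte header and the 128-byte block checksum rule come from the EDID spec, and the filename is just a placeholder.

Code:
# Minimal EDID sanity check: verifies the fixed 8-byte header and the
# per-128-byte-block checksum (each block must sum to 0 mod 256).
# It does not validate the timing descriptors themselves.
EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def check_edid(path):
    data = open(path, "rb").read()
    if len(data) == 0 or len(data) % 128 != 0:
        return "bad length: EDID comes in 128-byte blocks"
    if data[:8] != EDID_HEADER:
        return "bad header: first 8 bytes should be 00 FF FF FF FF FF FF 00"
    for i in range(0, len(data), 128):
        if sum(data[i:i + 128]) % 256 != 0:
            return f"block {i // 128} has a bad checksum"
    return "header and checksums look OK"

print(check_edid("fw900_edid.bin"))   # placeholder filename for the exported file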

The cable problem we talked about before is another thing: it happens when the DisplayPort signal isn't strong enough for the chipset's receiver, and it can be solved by replacing the cable with a better one.
The original cable and connector are not that bad; the problem is the design of the adapter and its receiver, which I think isn't as good as the ones in monitors.
When this happens you can't go over 180 MHz without problems, but up to 180 MHz you have no limits on refresh rate or color depth.

60 Hz and 6 bit is a missing-EDID problem; it happened to me several times when I was using a dual monitor configuration (VGA + adapter), and it was always resolved with an EDID override.

Oh, and try with one monitor at a time.
 
it's in decent condition, the only flaws are scratches to the AG film ...

My FW900 has the same problem. Quick warning: wiping with a microfiber cloth will scratch this film!

Did you end up doing anything about it? I've seen a video of someone just removing the anti-glare film, and I'm wondering if there's a good replacement that could be applied, maybe something like ClearCal. I'm worried if I just remove it, reflections are going to drive me nuts.
 
If I have time next week, I'm gonna be hooking two of my units up together, one with AG and one without. Planning on doing a full WPB (white point balance) adjustment on both of them simultaneously, and then I can compare the image quality side by side (it won't be a perfect comparison because the unit without AG happens to have poorer focus).


I've been using my unit without AG for years. I have no problems since I work in a light-controlled environment. I think the anti-glare helps with static attracting dust, though.
 
My FW900 has the same problem. Quick warning: wiping with a microfiber cloth will scratch this film!

Did you end up doing anything about it? I've seen a video of someone just removing the anti-glare film, and I'm wondering if there's a good replacement that could be applied, maybe something like ClearCal. I'm worried if I just remove it, reflections are going to drive me nuts.
Removing the AR film is the most stupid thing to do unless it is so damaged that it makes the screen unusable. It reduces reflections, increases contrast, protects against EMI, and removes static electricity. It's some kind of custom film produced specifically for that application; you won't find any standard equivalent.

Stay away from ClearCal or any standard film advertised as ANTI-GLARE, as that is crap not suitable for a high resolution display; AG means it makes the display blurry. The correct type of film is an AR (ANTI-REFLECTIVE) film, but even then, any standard film you'll find will only be a poor stopgap. It may improve things compared to no film at all, but it won't replace the original properly.
 
Removing the AR film is the most stupid thing to do unless it is so damaged that it makes the screen unusable. It reduces reflections, increases contrast, protects against EMI, and removes static electricity. It's some kind of custom film produced specifically for that application; you won't find any standard equivalent.

Stay away from ClearCal or any standard film advertised as ANTI-GLARE, as that is crap not suitable for a high resolution display; AG means it makes the display blurry. The correct type of film is an AR (ANTI-REFLECTIVE) film, but even then, any standard film you'll find will only be a poor stopgap. It may improve things compared to no film at all, but it won't replace the original properly.

Thanks for the info! The scratches I have are just light lines where some of the darkened material is missing from the surface, with no tears all the way through the film. They're distracting on white backgrounds, but otherwise pretty easy to ignore.

It would be nice if there was something out there that could replace the original film properly.
 
Although I do kind of like the appearance of the raw glass on my debezeled (de-cased?) FW900, I certainly would not have removed the film if I hadn't damaged it.

Thankfully, for much of the day in the room, it's dark enough that it's ok. I think I read that Sony did not apply the film to the BVM displays using the same or similar tubes? If so, that's at least some solace.

Ideally, the FW900 would have gotten a proper baked on AR treatment like the F520, etc. (And the rest of the F520's advancements, but I think the FW900 is close enough that the additional size probably wins out.)
 
Although I do kind of like the appearance of the raw glass on my debezeled (de-cased?) FW900, I certainly would not have removed the film if I hadn't damaged it.

Thankfully, for much of the day in the room, it's dark enough that it's ok. I think I read that Sony did not apply the film to the BVM displays using the same or similar tubes? If so, that's at least some solace.

Ideally, the FW900 would have gotten a proper baked on AR treatment like the F520, etc. (And the rest of the F520's advancements, but I think the FW900 is close enough that the additional size probably wins out.)

I have a PVM monitor that doesn't have the anti-glare. The look is exactly the same as an AG-less FW900. FYI, the FW900 tube and the BVM-24 tube aren't the same. They're very similar, and both employ SMPTE-C phosphors, but as far as I'm aware they're not the same tube.
 
I wonder how long I'd be able to extend the life of my monitors if I removed the anti-glare, since I would be able to reduce brightness by some amount.
 
I wonder how long I'd be able to extend the life of my monitors if I removed the anti-glare, since I would be able to reduce brightness by some amount.

In order to answer that question, we'd have to know two pieces of information:

1: The transmission of the film (i.e. what percentage of light from the phosphors passes through).
2: How much the cathode ages as a function of beam current (I believe beam current is linearly proportional to the luminance). Also would be nice to know the same function but for phosphor aging.

flod and I attempted to deduce 1, but didn't get very good data. See discussion starting here. One way to do it would be to take luminance measurements of a full field white screen while some of the antiglare is off. You measure a spot on the screen where film is still present, then compare it to where it's removed.

2 is likely much trickier to ascertain. And cathode aging is probably not linearly related to beam current.
 
I wonder how long I'd be able to extend the life of my monitors if I removed the anti-glare, since I would be able to reduce brightness by some amount.

I don't think a full A/B is possible unless you calibrate with the film on, then remove the film, and calibrate with it off. By that point it would be too late to go back right? :)

My opinion only, as I'm sure there are some who would love to argue this until the end of time: I think any tube-longevity benefits are overblown, to be honest. Just calibrate your monitor with the film still on and call it a day. Once you remove the anti-glare, there really isn't any going back, and while, sure, it can potentially look better in a *very* light-controlled environment, you limit your options with the film removed. When I had my FW-900, I removed the film, and in hindsight I really wish I had left it on. My two cents.

Also, if you regularly calibrate your monitor (heck, even just pull out WinDAS once a year and do a white point balance), your tube will go for years and years (and years). I'm willing to bet that your electronics will fail before the tube is ever shot.
 
In order to answer that question, we'd have to know two pieces of information:

1: The transmission of the film (i.e. what percentage of light from the phosphors passes through).
2: How much the cathode ages as a function of beam current (I believe beam current is linearly proportional to the luminance). Also would be nice to know the same function but for phosphor aging.

flod and I attempted to deduce 1, but didn't get very good data. See discussion starting here. One way to do it would be to take luminance measurements of a full field white screen while some of the antiglare is off. You measure a spot on the screen where film is still present, then compare it to where it's removed.

2 is likely much trickier to ascertain. And cathode aging is probably not linearly related to beam current.
I can't find the post again, but I answered the transmission question some time ago when I made measurements with a colorimeter with the film on, and then with the film removed. It's about 66%. If I remember correctly, 63% for blue, 66% for green, 67% for red.
 
hah, turns out the measurements from 4.5 years ago were actually pretty accurate!

I just spent half an hour trying to get transmission measurements. Not satisfied with the way I set it up. It was hard to keep the smartphone (which I was using as a flashlight) at the same angle consistently, etc.

But got some decently consistent measurements.

With AG: 4.5 nits
Without AG: 10 nits.

(4.5/10)^0.5 = 0.67 (the square root because the flashlight's light passes through the film twice, once going in and once coming back out).

So 67% transmittance, just as flod suspected :)

Crazy, so removing the AG means the tube only needs about two-thirds of the beam current to achieve the same brightness!

I may repeat this in the future with a better setup. If I get a regular flashlight, I can create a consistent setup from screen to screen fairly easily.
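
Here's the same arithmetic in code form, just to make the assumption explicit: with the flashlight method the light crosses the film twice (in and back out), so the measured ratio is roughly T squared, whereas the phosphor light you actually view only crosses it once. This also assumes the reflection off the faceplate itself is unchanged by removing the film, which is a simplification.

Code:
import math

# Flashlight bounced off the faceplate and measured with the meter:
# the light crosses the film twice, so measured ratio ~ T^2.
with_ag, without_ag = 4.5, 10.0                  # nits, from the measurements above
t = math.sqrt(with_ag / without_ag)
print(f"estimated film transmittance: {t:.2f}")  # ~0.67, matching flod's earlier figure

# Emitted phosphor light only crosses the film once, so for the same on-screen
# brightness the tube needs roughly 1/T times the beam current with the film on.
print(f"extra beam current needed with the film on: {1 / t - 1:.0%}")  # ~49% more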
 
I have a PVM monitor that doesn't have the anti-glare. The look is exactly the same as an AG-less FW900. FYI, the FW900 tube and the BVM-24 tube aren't the same. They're very similar, and both employ SMPTE-C phosphors, but as far as I'm aware they're not the same tube.

Is it possible the anti-glare was removed from your PVM prior to your possession?

When I said I thought the D24 didn't have the anti-glare film and was a very similar tube, I was recalling this discussion from Reddit:
I was even wondering if the reason the FW900 didn't get a proper baked-on coating was the different and conflicting applications.

(Looking back at the Reddit thread, I suppose it's also possible they meant the D24 did have an anti-glare film, but a different one.)
 
Wow, then I stand corrected, on all counts. :) If you can control your ambient light, then I guess I'd say go for it.
 
Hoping to get a definitive answer on this next week. Looking forward to using my StarTech 2-port splitter to do the test properly. It's gonna take a few hours, so I'm gonna try to schedule it in.

Gonna use two laptops, one connected to each monitor, and calibrate them using WinDAS (setting the G2 voltage simultaneously so I get the same black level).

At the end, I'll have two FW900's side by side, one with AG one without.

They'll be color calibrated virtually identically, being fed the same video signal.
 
Oh... I wouldn't remove the AR film unless it was damaged. Even in the dark it's probably still not quite the same, because of the light the monitor itself generates?

I would feel better though if someone could confirm that the D24 didn't even have the film anyway.

I've been using Sony GDM CRTs almost forever. For my personal machines anyway. LCDs can't make a good black. How was that being ok ever even a thing?

That said, by the time I could afford Sony, they had already moved away from a proper badge. However...


 