24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Reading the Dynamic Convergence document. I want to make some changes! :D But it should be known that this was the start of a bigger document. So when you see stuff like "Insert picture of error and correct", it means I intended to take a picture of what an out-of-convergence image looks like versus an in-convergence one when using that specific slider.
 
late to the party, what about no AG or Polarizer at all?


There are some advantages:
- there is much less stress applied to the electron guns, so the image is generally crisper and warm-up time is quicker
- I personally found internal glare to be handled better (again, probably due to less gun stress)
- corners are sharper, again due to less gun stress
- the image is somewhat crisper, maybe because every film has some structural imperfections, maybe because of the reduced gun stress

Problem is, the blacks are TOTALLY ruined in anything other than a perfectly dark room. Reflections are huge. So if I were using the FW900 in a perfectly dark room all the time, I would probably go for no AG. Unfortunately, that rarely happens.

yall are too caught up with the idea of using a "polarizer"
any neutrally tinted film achieves the same effect (and generally will be even better than a linear polarizer)

also idk about the fw900 film, but the original cpd-g520p film isn't "diffusing" like the matte LCD antiglare films. The thin-film coating on it literally reduces reflectance; it doesn't spread reflections out.
(any diffuse reflections you see from a crt are from the phosphor layer)

Yeah, I guess this is true. Polarization might do wonders for certain reflections, but there might be more cons than pros after all.

Reminds me of when I took the film off my old FW900. I really wish I hadn't done that. Yes, under perfect lighting conditions it was technically better. But the lighting had to be damn near perfect to see the benefit.

My thoughts exactly. There are some obvious advantages, but you have to be damn close to perfect conditions to see them. Any other situation and image quality goes underwater.
 
I run a 40 watt lightbulb with a diffusing cover over it in my room. My monitor doesn't face the window, and my blacks are certainly not ruined. I get the impression that many folks here have very bright lighting and/or windows without curtains facing their displays.

Also, a bias light works very well.
 
I run a 40 watt lightbulb with a diffusing cover over it in my room. My monitor doesn't face the window, and my blacks are certainly not ruined. I get the impression that many folks here have very bright lighting and/or windows without curtains facing their displays.

Also, a bias light works very well.

Maybe my mistake was that I didn't use a diffusing cover. Oh well, too late now. I haven't had an FW900 for almost two years now.
 
late to the party, what about no AG or Polarizer at all?

From my experience as a polarizer user (which I use to correct the no-AG mess to some extent), I agree with the others that removing the AG was a bad idea, with more cons than pros.

I did it based on the many testimonials I read on this forum about it being good, even fantastic, and how the monitor would get an even brighter screen, but I was stubborn and didn't pay enough attention to the cons and requirements, such as the importance of using the monitor in a completely dark or properly light-controlled room.

Similar to what igsux3 said, with the monitor turned off the screen area looked gray instead of black, and the blacks stayed gray with the monitor turned on, even with the brightness setting at 0 and even at night with only the ceiling light on. The only way I could perceive blacks was at night with the room light turned off and the room totally dark. But under those conditions the monitor's luminance was so strong on my eyes that I had to reduce the contrast to lower it, which also reduced the only good thing I noticed from not having a coating or polarizer on the monitor: a brighter screen.

The room where the monitor sits has a blackout curtain facing the screen that covers a huge window and blocks natural light; on the left side of the monitor there are curtains covering another large window, which stay closed and block about 50% of the sunlight; and there are no windows to the right of or behind the monitor. Even with all that, the monitor with no AG was a grayish mess during the day.

After all that, and since I don't like having such a dark room, I would now only find it worth removing the original coating if it had considerable damage, like being badly scratched.

Sony's engineers put that AG coating on this monitor for a reason.
 
An update on the AR film.

I couldn't find a company willing to provide this in Europe (there are very few companies selling optical-grade filters here, and they apparently only care about mass production, which is outsourced to Asia).

But I found a company in the USA that may be able to provide something. They don't have the right product at this time, but they are looking for a solution fitting the requirements I sent them. I should have an answer in a few months.
 
Thanks for looking into this Strat. I'm sure there are quite a few ppl here who will benefit from this.
 
You don't have to thank me. I have two screens that are unusable because of a damaged AR film; that alone is quite motivating. ;)
 
Looooooooooong time lurker, first time poster here
I am in central Texas first of all. Thinking of selling or trading the following....

Would anyone be interested in 1 or 2 FW900s? One is missing the housing (case) and the other is painted black. I've never done any WinDAS tweaks, just used the built-in controls. Both have the film removed, as I used them in a dark room and never had to deal with glare.

1. Very good picture, sharp all around, no blurry text anywhere, great colors, excellent blacks. It's been my daily driver for 5+ years now and has never failed me. Been running 1600x1000 at 85Hz for years now. (No case.)
2. Also a very good picture, except the lower right corner (where the clock would be) is a little blurry; I never attempted to fix it. Ran 1440x900 at 100Hz for years on this one. (Black case, not a pro paint job, but it looks good.)

Just putting it out there.

EDIT* 15,000th post woot
 
Good news: Delock sent me a DP to VGA sample converter with the ANX9847 chipset and I tested it.
After loading the system and without any EDID override, you can set 1920x1440 85Hz.
The maximum pixel clock is 340 MHz (like amlet's sample with the ANX6212).
Above 340 MHz it becomes unstable, with strange noise on window borders; after 341 it starts to lose the video signal, and 342 is not accepted.
It can cover all the FW900 resolutions up to 1920x1200 96Hz; I tested 2304x1440 71Hz, stable.
I tested the input lag with SMTT 2.0 (photo and video, 120 FPS at 120 Hz): it is nonexistent.
Image quality seems good and clear, but I have to test on my FW900 to be sure, because I tested on my brother's monitor (a LaCie Electron 22 Blue IV) with a GTX 1070.
The converter is 24 bit and it doesn't support DisplayPort content protection.
I still have to do some tests with video games, check stability after hours of use, and see whether the adapter overheats, but for now it is very good.
The 340 MHz limit seems related to the DP receiver and not the DAC; maybe increasing the chip voltage could help, but the maximum possible is probably 358 MHz, and I don't think it's worth the time for 18 MHz of clock.
Keep in mind that all the tests have been done with CRU's standard CRT timings, and tweaking the blanking can help to reach higher refresh rates.
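
A rough sketch of that arithmetic, for anyone following along (the timing totals below are hypothetical illustrations, not CRU's actual values):

```python
# Sketch: the required pixel clock for a CRT mode is simply
#   pixel_clock = h_total * v_total * refresh_rate
# where h_total/v_total include the blanking around the active area.
# Shrinking the blanking lowers the clock needed for the same active
# resolution and refresh rate.

MAX_PCLK_HZ = 340e6  # observed limit of the ANX9847 sample

def pixel_clock_hz(h_total: int, v_total: int, refresh_hz: float) -> float:
    """Pixel clock in Hz for the given total timings and refresh rate."""
    return h_total * v_total * refresh_hz

# Hypothetical totals for a 2304x1440 @ 74 Hz mode: generous blanking
# versus a tighter, hand-tuned blanking.
for label, h_total, v_total in [("generous blanking", 3096, 1500),
                                ("reduced blanking", 3040, 1490)]:
    pclk = pixel_clock_hz(h_total, v_total, 74)
    verdict = "fits" if pclk <= MAX_PCLK_HZ else "over 340 MHz"
    print(f"2304x1440@74 with {label}: {pclk / 1e6:.1f} MHz ({verdict})")
```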
Delock told me that if I like it they can start production, so it's best not to waste time on vcore, heatsinks and other things like that.
Soon I will test it on my FW900 with my AMD 7950 to see if it reaches the same limits, but first I need to find a mini DP to DP adapter.
I will keep you updated.
 
Awesome news Derupter.

My one crazy suggestion is that they make an adapter with two ANX9847s in parallel. Have a 300 MHz oscillator to switch between them quickly, alternating each pixel, but output the pixels to the same analog signal. That way you could get the pixel clock up to 600+ MHz, allowing you to run resolutions like 4K @ 96Hz interlaced on an FW900, 2560x1920 @ 70Hz on a LaCie, etc. Maybe have a separate signal-generator chip to generate the H and V blanking signals, as well as the front and back porch.
 
Great news!

Especially since I just picked up another Dell P1130 from Craigslist. The listings for CRTs have pretty much dried up in the last couple of years, but luckily I found this one listed for $10 as just a "Dell monitor" with no mention of CRT anywhere. It was being used in a metal fabrication shop, so it was covered in dust and not actually hooked up, but for the price, I was willing to take a chance. Got it home and cleaned it, and it's actually in incredibly good shape -- better even than my other one.

[Image: 2CRTs_k8k38qsmoi.jpg]


Manufactured in 2002! I love this stuff.

So now I have two computers: one with a 1070 for the Samsung TV and the other with a 980 Ti for one of the CRTs, but then I'm out of DVI-I outputs. I have that StarTech DP-VGA adapter, but it's only 240 MHz. We've definitely got a bunch of people here who are willing to buy one (or more) of these ANX9847 adapters.
 
congrats on the score Ashun, and that's a very nice photo (and good to see the DTP-94 making a cameo)
 
Ashun

I have a similar monitor, the Dell P992, basically a 19-inch version, but I can't tell if there is a place to plug in the WinDAS cable. Do I just need to take the back of the monitor off to find it? I know some Sony monitors actually have a little 2cm pop-out panel for the WinDAS cable.
 
I believe the P992 is supported by WinDAS. You might have to dig around, but in the service manual there seem to be schematics for the 4-pin TTL port (top right of page 4-11).
 
Do you have any tests to advise for checking that the DAC is doing its job right?
Better to test 2D or 3D?
 
Do you have any tests to advise for checking that the DAC is doing its job right?
Better to test 2D or 3D?

I asked flood, and he suggested that one thing you could do is check that the DAC isn't interfering with the LUT. So make sure that sending 0 is really black and 255 is really peak white (and test some values in between). You can do this by comparing the DAC output against a VGA output and measuring luminances (if you have a colorimeter).
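
flood's exact procedure isn't spelled out in the thread, but a minimal sketch of one way to do the comparison could look like this, assuming you've already measured luminance at a handful of gray levels on each output with a colorimeter (the level list and readings below are placeholders):

```python
# Sketch: compare the transfer curve measured through the adapter's DAC
# against the one measured through a native VGA output. If the adapter
# leaves the LUT alone, the two curves should track each other closely
# after normalizing to black/peak white.

# Gray levels displayed full-screen (0 = black, 255 = peak white).
levels = [0, 32, 64, 96, 128, 160, 192, 224, 255]

# Placeholder luminance readings in cd/m2; replace with your own
# colorimeter measurements for each output path.
lum_vga = [0.05, 1.2, 4.1, 9.0, 16.5, 26.8, 40.2, 57.0, 78.0]
lum_dac = [0.05, 1.3, 4.0, 9.1, 16.3, 27.0, 40.5, 57.4, 78.5]

def normalized(lums):
    """Scale readings so black maps to 0.0 and peak white to 1.0."""
    black, white = lums[0], lums[-1]
    return [(y - black) / (white - black) for y in lums]

for lv, a, b in zip(levels, normalized(lum_vga), normalized(lum_dac)):
    print(f"level {lv:3d}: VGA {a:.3f}  DAC {b:.3f}  diff {abs(a - b):.3f}")
```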
 
The most important things to look for are blur or interference problems, I think.

The meme pattern from Sony is great to assess sharpness. I also use R/G/B colored variants.

For interference, the usual Windows environment should be enough, like the right-click menu or Windows Explorer. There shouldn't be trails behind characters or ghosting (seen more easily where there's a lot of contrast, like black characters on light gray or white, for example).
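
If you want a quick pattern for that kind of check, a small sketch using Pillow could look like this (the resolution, font, and filename are arbitrary choices, not anything from the thread):

```python
# Sketch: black text on a light gray background, which makes ringing,
# trails, or ghosting to the right of the characters easy to spot.
from PIL import Image, ImageDraw

W, H = 1920, 1200                                 # match your desktop resolution
img = Image.new("RGB", (W, H), (200, 200, 200))   # light gray background
draw = ImageDraw.Draw(img)

line = "interference test -- look for trails or ghosts to the right of these characters"
for y in range(20, H - 20, 40):                   # repeat the line down the screen
    draw.text((20, y), line, fill=(0, 0, 0))      # default bitmap font

img.save("interference_test.png")
```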

edit: Derp, the forum downsized the images, which makes them blurry and unusable. I uploaded the original files zipped instead.
 

Attachments:
  • Meme RGB.zip (11.9 KB)
I finished my tests with the adapter on my FW900 and AMD 7950.
For 2D I used AIDA64's Monitor Diagnostics and the meme pattern; for 3D, video games.
The image quality is the same as, if not better than, the native AMD DAC; the image is sharp and clean up to 340 MHz (same limit as the GTX 1070). The DP link limit is 358 MHz, so the 340 MHz limit is probably the internal DP receiver of the ANX9847.
Tweaking with CRU, I reached 2304x1440 74Hz at 338.79 MHz and 2560x1600 60Hz at 339.03 MHz (both by reducing the blanking).
I had to delete two resolutions (1920x1440 85Hz and 2048x1536 75Hz) from CRU's Standard Resolutions list and recreate them as custom entries in Detailed Resolutions, because they weren't stable (with CRU both resolutions are under 340 MHz, but the video card used different timings, probably GTF, which pushed the pixel clock over 340.5 MHz).
So all the resolutions near 340 MHz (338-339) are better created as custom resolutions rather than left to the video card to handle.
I played a couple of hours at 2304x1440 74Hz and the adapter didn't overheat.
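
To see why modes sitting at 338-339 MHz are so sensitive to the exact timings, here is a tiny sketch in the same spirit; the totals are illustrative, not the ones CRU or the driver actually generates:

```python
# Sketch: the highest refresh rate a given set of totals allows under a
# pixel clock cap. Slightly looser blanking (e.g. GTF-style timings chosen
# by the driver) can push a mode that looks fine in CRU past the
# adapter's real 340 MHz limit.

PCLK_CAP_HZ = 340e6

def max_refresh_hz(h_total: int, v_total: int, cap_hz: float = PCLK_CAP_HZ) -> float:
    """Maximum refresh rate (Hz) these totals allow under the clock cap."""
    return cap_hz / (h_total * v_total)

# Illustrative totals for a 2048x1536 mode: tight custom timings vs.
# looser driver/GTF-style timings.
for label, h_total, v_total in [("custom timings", 2720, 1665),
                                ("driver/GTF-style timings", 2736, 1670)]:
    print(f"{label}: max ~{max_refresh_hz(h_total, v_total):.2f} Hz under 340 MHz")
```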
 
Delock replied to me: they will start producing this converter. The item number will be 62967, and they think it will be in stock at the end of September or early October.
 
Delock replied to me: they will start producing this converter. The item number will be 62967, and they think it will be in stock at the end of September or early October.

Awesome.

One question though: I guess 340 MHz is the fastest chip they have? I'll just be really bummed if we never get a DAC on par with what's in our graphics cards. With ToastyX's pixel clock patcher, you can actually go above 500 MHz on analog-equipped video cards.
 
Awesome.

One question though: I guess 340 MHz is the fastest chip they have? I'll just be really bummed if we never get a DAC on par with what's in our graphics cards. With ToastyX's pixel clock patcher, you can actually go above 500 MHz on analog-equipped video cards.

Delock takes what is available on the electronics market and creates adapters and other devices with their logo; I don't think they make custom chips.
A good solution would be to take a DP receiver and an external DAC like the ADV7123JSTZ330 or the TDA8777 (both are 330 MHz, 10 bit) and put it all together.
A product like that could probably do 400 MHz, but you need someone for the schematic, firmware, PCB and other components, and that costs money and time; no one would do it without the right economic return.
The other solution was to take an all-in-one chipset, ready-made for converters, and build a dongle with it.
I searched all over the web for these chipsets, and the Analogix is the fastest I could find; the others (Lontium, Chrontel, Parade, ST, ITE) are all under 200 MHz on the DAC side.
And the Analogix isn't even a 340 MHz DAC but a 270 MHz 24-bit one; the fact that it can do 340 MHz stable is just our luck.
Personally, the max resolution I use is 1920x1200 85Hz, most of the time 1680x1050 100Hz; I rarely use 2304x1440.
For me the internal rendering resolution matters more than the output resolution. With new Nvidia cards, downsampling from 4x works very well, without blur, and you can choose the resolution to apply it to (with AMD you can't, and every time someone finds a hack they block it with new drivers). This is a must with a CRT: you can set 1680x1050 as the native resolution and render at 3360x2100, or use 1440x900 for 2880x1800, and it works with all graphics APIs.
Look at HDFury: they made custom chips, but how much do they cost?
The Delock will probably be under 40 euro; a hypothetical HDFury 5, how much?
I think that sooner or later an HDFury 5 will come out; I can't believe they invested time and money for nothing.
 
The most important things to look for are blur or interference problems, I think.
The meme pattern from Sony is great to assess sharpness. I also use R/G/B colored variants.
For interference, the usual Windows environment should be enough, like the right-click menu or Windows Explorer. There shouldn't be trails behind characters or ghosting (seen more easily where there's a lot of contrast, like black characters on light gray or white, for example).
edit: Derp, the forum downsized the images, which makes them blurry and unusable. I uploaded the original files zipped instead.
Not sure where you found it, but that's not the correct meme pattern. It's supposed to be non-uniform.
This will tile correctly if you set it as a background image.

[Image: memejzoyh.png]


And I made some RGB ones if that helps:

[Image: meme-r47pr7.png]
[Image: meme-g49olt.png]
[Image: meme-bxaoa2.png]
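
For anyone who wants to regenerate colored variants from a black-and-white pattern themselves, a sketch along these lines should work with Pillow (the filenames are assumptions, not the ones attached here):

```python
# Sketch: turn a black-on-white focus pattern into single-channel variants
# by keeping one channel and zeroing the other two, so the white areas
# become pure red/green/blue while the black lines stay black.
from PIL import Image

src = Image.open("meme.png").convert("RGB")   # the black & white pattern
r, g, b = src.split()
zero = r.point(lambda _: 0)                   # an all-black channel

Image.merge("RGB", (r, zero, zero)).save("meme-r.png")
Image.merge("RGB", (zero, g, zero)).save("meme-g.png")
Image.merge("RGB", (zero, zero, b)).save("meme-b.png")
```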
 
Derupter, this is fantastic news, an adapter available by mid-fall! Thanks for your awesome efforts here.

This might have already been answered, but do you know whether the DAC is higher than 8 bit?
 
Derupter, this is fantastic news, an adapter available by mid-fall! Thanks for your awesome efforts here.

This might have already been answered, but do you know whether the DAC is higher than 8 bit?

From the specs, it is a triple 8-bit DAC.
 
Not sure where you found it, but that's not the correct meme pattern. It's supposed to be non-uniform.
This will tile correctly if you set it as a background image.

[Image: memejzoyh.png]


And I made some RGB ones if that helps:

[Image: meme-r47pr7.png]
[Image: meme-g49olt.png]
[Image: meme-bxaoa2.png]
That's the meme pattern provided with the WPB guide, in black and white. I don't know whether it's supposed to be right or not, but it works perfectly for its original purpose: setting focus.
 
That's the meme pattern provided with the WPB guide, in black and white. I don't know whether it's supposed to be right or not, but it works perfectly for its original purpose: setting focus.
It's a specific pattern that intentionally uses 1px-thick lines and both 1px and 2px gaps in the "letters".
I'm having trouble recalling which technical paper it came from, but what I posted is the meme test pattern that's supposed to be used to adjust focus.
 
Actually, Derupter, do you have a colorimeter? If so, I have a test you can do to see whether your video card supports 10-bit LUT precision when using the Analogix.
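
The test itself isn't described in the thread, but a common way to probe LUT precision on Windows is to load gamma ramps that differ only below the top 8 bits at the level you measure, and check with a colorimeter whether a mid-gray patch changes luminance. A rough ctypes sketch of that idea, assuming the driver accepts the ramps:

```python
# Sketch: load two 16-bit gamma ramps. At mid-gray both quantize to the
# same 8-bit value, but they differ at 10-bit precision, so a measurable
# change in a 50% gray patch suggests more than 8 bits reach the DAC.
import ctypes
import time

gdi32 = ctypes.windll.gdi32
user32 = ctypes.windll.user32
hdc = user32.GetDC(None)                     # DC for the primary display

Ramp = ctypes.c_ushort * (3 * 256)           # R, G, B: 256 words each

def make_ramp(offset: int) -> Ramp:
    """Identity ramp (i * 257) shifted up by a small sub-8-bit offset."""
    ramp = Ramp()
    for ch in range(3):
        for i in range(256):
            ramp[ch * 256 + i] = min(65535, i * 257 + offset)
    return ramp

for offset in (0, 64):
    ok = gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(make_ramp(offset)))
    if not ok:
        print(f"driver rejected the ramp with offset {offset}")
        continue
    print(f"ramp with offset {offset} loaded; measure a 50% gray patch now")
    time.sleep(10)                           # time to take the reading

gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(make_ramp(0)))  # restore identity
user32.ReleaseDC(None, hdc)
```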
 
Actually, Derupter, do you have a colorimeter? If so, I have a test you can do to see whether your video card supports 10-bit LUT precision when using the Analogix.

No, I don't have one. If you have a test that doesn't require special hardware, I can do it.
 