24" Widescreen CRT (FW900) From Ebay arrived,Comments.

here's a macro shot of mine

Yeah, it looks a bit sharper than this in person,

but my monitor is sharp enough that if you're coming from a ~100 PPI LCD, you will not find it blurry in any way.

Great, thank you! This is how it looks on my monitor as well. You're better than me at taking macro photos.

interesting idea. Glad it seems to have worked.

Yeah, but sadly, it really didn't last long... Today, two minutes after powering up, it did it again. If it does it once more, I'll see if I'm quick enough to catch it on video.

Sounds perfectly normal to me. The initial harsh buzz when you first turn the monitor on, and the clicking when you switch resolutions, match the sounds from all my FW900s.

OK, that's good, thank you! So it seems like everything is still fine, except for my flyback showing signs of weakness.
If I replace it, my FW900 should be good to go again... until another thing fails (I hope not).
 
this is based on weak anecdotal evidence, but you could try leaving the tube on for a couple days straight, and maybe that will allow it to work through the flaky section of whatever element is causing the issue.
 
this is based on weak anecdotal evidence, but you could try leaving the tube on for a couple days straight, and maybe that will allow it to work through the flaky section of whatever element is causing the issue.

I take it this would potentially burn out any shorts?
 
Well, I imagine that if you currently had a short you'd get that out-of-focus behaviour all the time. I'm guessing that there's an element in there somewhere that is occasionally flaking off material, and those flakes are causing temporary shorts. So it's just a guess that keeping the tube on for a while will allow it to go through its flaky phase.

This is all pure speculation.
 
That might be a good idea, I'll see. But I don't really want to leave this monitor on while I'm asleep.
If something happens, I don't know what, I won't be there to notice and power it off soon enough.
Btw, after what it did this morning, nothing else has happened all day...

Like a friend of mine said, "You're not afraid? Your monitor seems to be possessed!". :D
 
just had a friend come over. First thing upon seeing my room:

"Woah, that thing has a huge butt!".

"Wait, is that a TV?"

She'd never heard of a CRT, and when I put on some HD reference material she was blown away. Kept saying how the picture looked "perfect".

Freshly calibrated too, looks gorgeous :)
 
My first HDTV experience was relatively uncompressed video from a satellite box feeding an FW900. It was pretty incredible.

It's been long enough now that some people may not even realize what image quality is supposed to look like. (Though to be fair, even back in the day plenty of folks had black levels and other settings messed up on their CRTs, so maybe it's always been so...)
 
Has anyone ever tried to OC their RAMDAC to get speeds like 85 Hz @ 1440p, or can that be done with a default-clocked RAMDAC?

EDIT: What are the differences between the GDM-C520, GDM-F520, and FW900?
 
I think that (for NVIDIA at least) the RAMDAC has a 400 MHz pixel clock limit. 1440p at 85 Hz would be beyond the FW900's hscan limit anyway.

FW900 = widescreen (~16:10), variable aperture grille pitch from 0.23-0.27 mm, 121 kHz max horizontal scan.

F520 = 4:3 aspect ratio, 0.22 mm aperture grille, 137 kHz hscan.

C520 is the Artisan? Or is that another C520? Anyway, the Artisan is 4:3 aspect ratio, 0.24 mm aperture grille, 130 kHz hscan, and supposedly tighter tolerances and finer adjustments for color-related stuff (http://lists.apple.com/archives/colorsync-users/2003/May/msg00452.html).

Is it bad that I somehow have all this memorized?
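
To put rough numbers on the 1440p @ 85 Hz question, here's a quick sketch. The blanking factors below are GTF-style approximations I'm assuming, not the monitor's exact timings:

```python
# Rough feasibility check for 2560x1440 @ 85 Hz on an FW900 + 400 MHz RAMDAC.
# Blanking factors are assumed GTF-style approximations, not real timings.

H_ACTIVE, V_ACTIVE, REFRESH = 2560, 1440, 85
H_BLANK = 1.25   # horizontal total ~25% wider than active (assumption)
V_BLANK = 1.04   # vertical total ~4% taller than active (assumption)

h_total = H_ACTIVE * H_BLANK                 # ~3200 pixels per scanline
v_total = V_ACTIVE * V_BLANK                 # ~1498 lines per frame

hscan_khz = REFRESH * v_total / 1000
pixel_clock_mhz = REFRESH * v_total * h_total / 1e6

print(f"hscan: {hscan_khz:.1f} kHz (FW900 max: 121 kHz)")
print(f"pixel clock: {pixel_clock_mhz:.1f} MHz (RAMDAC limit: ~400 MHz)")
# -> roughly 127 kHz and 407 MHz: over both limits, so 1440p @ 85 Hz
#    is out of reach even if you could overclock the RAMDAC.
```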
 
I tried really, really hard to track down that Seybold white paper on the Artisan. No luck.

I imagine the Artisan supported hardware adjustments at a precision beyond 8 bits, unlike the FW900. But it's somewhat of a moot point if you're comfortable using software LUTs that are respected by the applications you care about.
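
For anyone curious what a "software LUT" looks like in practice, here's a minimal sketch using the Windows gamma ramp API (Windows-only, via ctypes; the identity ramp here is just a placeholder where a calibration would bake in its corrections):

```python
# Minimal sketch: the video card's gamma ramp is 256 entries per channel
# with 16-bit values, which is where a software LUT gets its extra
# precision over the 8-bit signal path. Windows-only.
import ctypes

user32 = ctypes.windll.user32
gdi32 = ctypes.windll.gdi32
user32.GetDC.restype = ctypes.c_void_p

# WORD[3][256]: one 256-entry ramp each for R, G, B.
ramp = ((ctypes.c_ushort * 256) * 3)()
for ch in range(3):
    for i in range(256):
        ramp[ch][i] = i * 65535 // 255   # identity ramp (placeholder)

hdc = user32.GetDC(None)                 # device context for the whole screen
gdi32.SetDeviceGammaRamp(ctypes.c_void_p(hdc), ctypes.byref(ramp))
user32.ReleaseDC(None, ctypes.c_void_p(hdc))
```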
 
I think that (for NVIDIA at least) the RAMDAC has a 400 MHz pixel clock limit. 1440p at 85 Hz would be beyond the FW900's hscan limit anyway.

FW900 = widescreen (~16:10), variable aperture grille pitch from 0.23-0.27 mm, 121 kHz max horizontal scan.

F520 = 4:3 aspect ratio, 0.22 mm aperture grille, 137 kHz hscan.

C520 is the Artisan? Or is that another C520? Anyway, the Artisan is 4:3 aspect ratio, 0.24 mm aperture grille, 130 kHz hscan, and supposedly tighter tolerances and finer adjustments for color-related stuff (http://lists.apple.com/archives/colorsync-users/2003/May/msg00452.html).

Is it bad that I somehow have all this memorized?

The Sony GDM-C520 is the Artisan monitor.

Sony offered this monitor as follows:

1) As a display (GDM-C520),
2) As a Color Reference System (GDM-C520K), with everything included in the box (the CRS software, the color sensor, the Artisan digital cable, and the monitor hood),
3) As a display (GDM-C520) bundled with the Color Reference System GDM-CA20 (which is the CRS software, the color sensor, the Artisan digital cable, and the monitor hood). There were two (2) separate boxes shipped: the monitor and the kit.

I have used the Sony Artisan extensively and have compared the image quality with so many displays, and it is second to none... Not even the BARCO comes close to a properly calibrated and adjusted Sony Artisan. It is just a work of beauty!

But again, beauty is in the eye of the beholder...

Hope this helps...

UV!
 
I tried really, really hard to track down that Seybold white paper on the Artisan. No luck.

I imagine the Artisan supported hardware adjustments at a precision beyond 8 bits, unlike the FW900. But it's somewhat of a moot point if you're comfortable using software LUTs that are respected by the applications you care about.

I have it... You could have asked me for it...

UV!
 
I posted about it last year

though I guess it wasn't obvious which white paper I was referring to in that post. Glad it's alive though, I was saddened at the prospect that it had disappeared forever :)
 
Can someone tell me what these pixel layouts (I think that's what they're called) look like on LCD monitors? These are from CRTs:

[image: jqWYACX.png]


Or can LCDs have both of these types, as shown in this image:

[image: 8WLVmnW.jpg]



In other words, what would my 0.26 mm pixel pitch LCD be like compared to a 0.24 mm aperture grille CRT? Just looking at the pixel density and not other stuff...
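
For a rough sense of the densities involved (a loose comparison only, since LCD pixel pitch is a fixed grid in both axes, while aperture grille pitch only constrains the horizontal direction; a CRT has no discrete vertical structure):

```python
# Convert pitch (mm) to density (per inch) for a loose LCD-vs-CRT comparison.
MM_PER_INCH = 25.4

lcd_pitch = 0.26   # mm per pixel on the LCD in question
crt_pitch = 0.24   # mm per RGB phosphor stripe triplet (aperture grille)

print(f"LCD: {MM_PER_INCH / lcd_pitch:.0f} pixels per inch")           # ~98
print(f"CRT: {MM_PER_INCH / crt_pitch:.0f} stripe triplets per inch")  # ~106
```

So the two are in the same ballpark horizontally, but the numbers aren't directly interchangeable.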
 
My current CRT supports 2560x1920i@120Hz. I've been looking at the FW900, but I read a post by Unkle Vito saying that it couldn't handle above 2560x1600i@60Hz. Can the FW900 handle 3072x1920i@120Hz or similar resolutions? I wouldn't want a FW900 if I would have to drop my resolution that much to use it. Here are my current settings: https://imgur.com/pnd7WLA
 
My current CRT supports 2560x1920i@120Hz. I've been looking at the FW900, but I read a post by Unkle Vito saying that it couldn't handle above 2560x1600i@60Hz. Can the FW900 handle 3072x1920i@120Hz or similar resolutions? I wouldn't want a FW900 if I would have to drop my resolution that much to use it. Here are my current settings: https://imgur.com/pnd7WLA

"You play... You pay..." No rocket science!

The GDM-FW900 has VESA-recommended resolutions (see my previous posts for a full description). We have conducted torture tests on this CRT, and it is capable of achieving higher resolutions and timings, but being able to sustain them without diminishing the life of the tube is the million-dollar question. And I think you all know the answer to that...

Hope this helps...

UV!
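
For a rough sense of why those interlaced modes sit right at the edge, here's a sketch. I'm assuming the 120 Hz figure is the field rate and using approximate GTF-style blanking; real timings will differ:

```python
# Interlaced scanning draws half the lines per field, so at a given field
# rate the horizontal scan requirement is roughly halved vs progressive.
# Blanking factors below are assumptions, not the monitor's actual timings.

def interlaced_demands(h_active, v_active, field_rate_hz,
                       h_blank=1.25, v_blank=1.04):
    lines_per_field = v_active * v_blank / 2
    hscan_khz = field_rate_hz * lines_per_field / 1000
    pclk_mhz = hscan_khz * h_active * h_blank / 1000
    return hscan_khz, pclk_mhz

for w, h, f in [(2560, 1920, 120), (3072, 1920, 120)]:
    hs, pc = interlaced_demands(w, h, f)
    print(f"{w}x{h}i @ {f} Hz fields: ~{hs:.0f} kHz hscan, ~{pc:.0f} MHz pixel clock")
# 2560x1920i: ~120 kHz / ~383 MHz -- right at the FW900's 121 kHz limit.
# 3072x1920i: ~120 kHz / ~460 MHz -- well past a 400 MHz RAMDAC.
```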
 
My current CRT supports 2560x1920i@120Hz. I've been looking at the FW900, but I read a post by Unkle Vito saying that it couldn't handle above 2560x1600i@60Hz. Can the FW900 handle 3072x1920i@120Hz or similar resolutions? I wouldn't want a FW900 if I would have to drop my resolution that much to use it. Here are my current settings: https://imgur.com/pnd7WLA

There's no way your CRT can properly render those resolutions. You gain nothing in image sharpness by driving the electron beam at that frequency; all you get is a degradation in image quality.
 
My current CRT supports 2560x1920i@120Hz. I've been looking at the FW900, but I read a post by Unkle Vito saying that it couldn't handle above 2560x1600i@60Hz. Can the FW900 handle 3072x1920i@120Hz or similar resolutions? I wouldn't want a FW900 if I would have to drop my resolution that much to use it. Here are my current settings: https://imgur.com/pnd7WLA

Can't say that I have ever tried that on my screen. Nor do I plan to. Sorry...
 
Yeah dude, I agree with spacediver. You'd have to be out of your mind to drive it that high and fast. Not only will image quality be crap, but there's also the lifetime of your CRT to worry about.

There's no new replacement for your old CRT when it breaks, remember? And quality used units are getting harder than ever to find.
 
There's no way your CRT can properly render those resolutions. You gain nothing in image sharpness by driving the electron beam at that frequency; all you get is a degradation in image quality.

Once, we were able to achieve 2560x1600i@60Hz... but not sustain it... At any higher resolution, the monitor bombs...

UV!
 
Why would you want to run an interlaced signal? You run into weird artifacts when you interlace, even on a CRT. Hell, it's distracting when I play video games on my CRT at 480i and I can see the two interlaced fields.
 
Why would you want to run an interlaced signal? You run into weird artifacts when you interlace, even on a CRT. Hell, it's distracting when I play video games on my CRT at 480i and I can see the two interlaced fields.

At that high non-VESA resolution, the monitor cannot and will not run non-interlaced.

UV!
 
My current CRT supports 2560x1920i@120Hz. I've been looking at the FW900, but I read a post by Unkle Vito saying that it couldn't handle above 2560x1600i@60Hz. Can the FW900 handle 3072x1920i@120Hz or similar resolutions? I wouldn't want a FW900 if I would have to drop my resolution that much to use it. Here are my current settings: https://imgur.com/pnd7WLA

Also, I have to ask - is this the same monitor that you are experiencing issues on? If it is, then you probably want to drop that timing down. You're almost at 400 MHz...
 
There's no way your CRT can properly render those resolutions. You gain nothing in image sharpness by driving the electron beam at that frequency; all you get is a degradation in image quality.

I know that I am way above the dot pitch, but things still look quite a bit sharper for some reason.
 
Also, I have to ask - is this the same monitor that you are experiencing issues on? If it is, then you probably want to drop that timing down. You're almost at 400 MHz...

This monitor has been clicking, but I don't think that running an extremely high resolution will hurt it.
 
This monitor has been clicking, but I don't think that running an extremely high resolution will hurt it.

EDIT: Sorry - that was rude. I believe that you are doing damage to your monitor by running it a bit higher than spec, especially if you have your monitor capped at the bandwidth limit of your RAMDAC. By the way - I'm shocked that your screen can scan that. Are you absolutely sure it's scanning that rate and not converting it somehow?

My P991, I believe, had a converter in it or something that would scale down a scan rate it wasn't compatible with. I accidentally took the resolution to 1920x1440 at 85 Hz, I believe, and it synced up no problem. But looking at the screen, it was very evident that the monitor wasn't even trying to display all the pixels. And FYI, scaling was disabled on my video card. I really wish I still had it so I could test it; it took me by surprise that it would sync that high, even though it was quite evident from the picture it was displaying that it could not display that resolution properly.

The P991 is a rebranded Trinitron. Your Nokia is also. I believe the P991 actually reported the resolution I was using (the full readout of 1920x1440 at 85 Hz). I wonder if some of these lower-end consumer-model Trinitrons have the ability to accept a higher signal than they're capable of driving, and internally resync it to a good, recommended resolution?

My Sony HDTV does essentially this. All HD signals (1080i and 720p) are resynced to 1080i - regardless.
 
EDIT: Sorry - that was rude. I believe that you are doing damage to your monitor by running it a bit higher than spec, especially if you have your monitor capped at the bandwidth limit of your RAMDAC. By the way - I'm shocked that your screen can scan that. Are you absolutely sure it's scanning that rate and not converting it somehow?

My P991, I believe, had a converter in it or something that would scale down a scan rate it wasn't compatible with. I accidentally took the resolution to 1920x1440 at 85 Hz, I believe, and it synced up no problem. But looking at the screen, it was very evident that the monitor wasn't even trying to display all the pixels. And FYI, scaling was disabled on my video card. I really wish I still had it so I could test it; it took me by surprise that it would sync that high, even though it was quite evident from the picture it was displaying that it could not display that resolution properly.

The P991 is a rebranded Trinitron. Your Nokia is also. I believe the P991 actually reported the resolution I was using (the full readout of 1920x1440 at 85 Hz). I wonder if some of these lower-end consumer-model Trinitrons have the ability to accept a higher signal than they're capable of driving, and internally resync it to a good, recommended resolution?

My Sony HDTV does essentially this. All HD signals (1080i and 720p) are resynced to 1080i - regardless.

I'm pretty sure it's displaying the full 2560x1920 resolution.
 
Because of the electron beam's spot size, there's really no point in going above 1440 lines of vertical resolution...
 
First of all, I would like to say hello to all :)

I read a lot of this thread in the past, and it's a pleasure to see that there are still people around who enjoy the unbeaten FW900 displays. I personally own two of them and also an older W900. I also have a DTP94B and a TTL cable, but I never took the time to get into the WinDAS config stuff. I was always satisfied with the picture of my main screen. The other one is a spare.

I powered up my spare screen yesterday for testing, after 3 years in the cellar. It ran for about 15 minutes and then just went off. No loud sound, nothing smelling. It just went off. If I switch it on now, the relay clicks once, but the usual "zonk" sound never comes and the picture stays black. The power light is green until I press the joystick to the right; then it starts flashing orange at a one-second interval. How can I find out what the problem is?

Since I don't have much time anymore, I would also consider giving all the screens and the DTP94B away to somebody who is into this stuff and has the time to repair and enjoy them. I live in Jena, Germany. If somebody is interested, write me.

Regards, Philipp
 
My P991, I believe, had a converter in it or something that would scale down a scan rate it wasn't compatible with.

Nah, I'm 99.9% sure it doesn't do that. I have a couple of P991s and I sometimes run 2560x1920i at 90 Hz, and it's definitely scanning at that rate (effectively 45 Hz, but interlaced to avoid flicker; great for games where you want to crank up the visuals but can't hit 60 fps). It's incredibly sharp. I know it's not fully resolving that resolution, the pixels will blur together somewhat, but it looks really good. It's almost like analog anti-aliasing.
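
The arithmetic backs that up; using the same assumed ~4% vertical blanking as above, the interlaced mode needs half the horizontal scan rate of its progressive equivalent:

```python
# Same field rate, half the lines per sweep: that's what puts this mode
# within reach of a 19" Trinitron. (~4% vertical blanking is an assumption.)
v_total = 1920 * 1.04
print(f"2560x1920p @ 90 Hz: ~{90 * v_total / 1000:.0f} kHz hscan")       # ~180 kHz
print(f"2560x1920i @ 90 Hz fields: ~{90 * v_total / 2 / 1000:.0f} kHz")  # ~90 kHz
```

~90 kHz is plausible for a high-end 19" tube, while ~180 kHz is far beyond anything a consumer CRT ever scanned.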
 
As some of you may know, I have a Sun GDM-FW9010 from Korea. If anyone has played Alien Isolation, you know it is a very dark game. When there is something bright white, like a light, on a true black background and I look around in a spiral, for instance, I can turn off the monitor and still see the spiral glowing, and it takes forever to go away. These light "tracers" make the game unplayable because they're very distracting. Hopefully that makes sense. Any ideas what is going on? Is the phosphor not aging well? Is there a way to tell how many hours are on the tube, or its age?
 
Just bought an FW900 here in Italy! I'm excited and a bit scared of what I'll end up with once I pick it up this weekend, but at least I finally managed to find one over here. Once I get it, I'll have to look through this thread to fine-tune it ^^
 
As some of you may know, I have a Sun GDM-FW9010 from Korea. If anyone has played Alien Isolation, you know it is a very dark game. When there is something bright white, like a light, on a true black background and I look around in a spiral, for instance, I can turn off the monitor and still see the spiral glowing, and it takes forever to go away. These light "tracers" make the game unplayable because they're very distracting. Hopefully that makes sense. Any ideas what is going on? Is the phosphor not aging well? Is there a way to tell how many hours are on the tube, or its age?

That is by design, unfortunately, and a serious drawback, especially when one of the reasons to get this monitor is its contrast.
Since it was made with graphic design/CAD work in mind, the phosphor has a longer decay time than other CRTs.
Nothing you can do about it, really; maybe try playing around with brightness/contrast settings, but you won't get rid of it.

Just bought an FW900 here in Italy! I'm excited and a bit scared of what I'll end up with once I pick it up this weekend, but at least I finally managed to find one over here. Once I get it, I'll have to look through this thread to fine-tune it ^^

Congrats, man!
You must be very lucky to find one in Italy; even here in Germany there are none available right now, I think.
Take some pics :)
 