24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Yes, but I meant, maybe you had in mind to mount them on the ceiling, for example.
I just had a look at the weight of these. I was thinking maybe 120-140lbs but... 182lbs?! That's definitely not something you'd want to move twice, I'd say. These are monsters!
 
Yes, but I meant, maybe you had in mind to mount them on the ceiling, for example.
I just had a look at the weight of these. I was thinking maybe 120-140lbs but... 182lbs?! That's definitely not something you'd want to move twice, I'd say. These are monsters!

Oh I'm sorry! So yeah, I'm not at that point yet. CRT projectors are monsters to set up. They are definitely not plug and play. I'm also going to be polling opinions on Curt Palme's website as to whether I should use the "used" tubes or just repair the projector with the virgin tubes. I think on Curt's 10-point scale, the tubes are probably 5-7 for the blue and green and an 8 on the red. Unfortunately... G70 tubes are hard to find. Very rare. There are some other 8-inch LC sets that will mate with the G70 with some modification. Thankfully I have two full projectors, so I may be able to modify a couple of the neck boards to fit the different tube pins on the other "sister" models of tubes. But to figure out condition, I'm going to need to pull the lenses off the projector, as there's only so much a flashlight will show.

My thoughts for now are to ceiling mount the projector and aim it at either a 60-inch or 65-inch 16:9 screen. Why so small? Easier to drive, my basement is small, and we're already used to a 46-inch television, so why not? My hope is that the smaller screen will give enough light output that I can turn down the contrast and brightness, letting us milk these tubes for all they're worth.

My next plan is to get a Moome HDMI card for these
 
Good day enthusiasts and image quality connoisseurs!

Some questions.
I have the chance to acquire an FW900.
How well does this monitor display a resolution like 1024x768? Does it look sharp with black bars left and right? Any drawbacks to using lower resolution (one of the main reasons I want to use a CRT, aside from reaction time and blur)?
Does the monitor get violently hot during summer? Is there an unbearable buzzing noise?

Lastly, could someone perhaps take a sharp photo of it running at a low resolution?

Thanks in advance guys.
 
Good day enthusiasts and image quality connoisseurs!

Some questions.
I have the chance to acquire an FW900.
How well does this monitor display a resolution like 1024x768? Does it look sharp with black bars left and right? Any drawbacks to using lower resolution (one of the main reasons I want to use a CRT, aside from reaction time and blur)?
Does the monitor get violently hot during summer? Is there an unbearable buzzing noise?

Lastly, could someone perhaps take a sharp photo of it running at a low resolution?

Thanks in advance guys.

Well, I did the best I could, but I know someone else here can probably do a lot better with the photo. I think I remember seeing a long time ago some really sharp pics from spacediver, maybe I'm wrong, I can't find them anymore.


(You can click on that pic to get the full resolution.)

This is Battlezone II running at 1024x768@85Hz on my monitor, but the refresh rate can go a lot higher than that at such a low resolution. I know it can do 140Hz, but I'm limited by my HDFury3 right now, so I can't really go further. It is really sharp, and I can tell you the sharpness does not seem to be affected at all by higher refresh rates.
Being able to use any resolution the monitor supports, with no compromise in picture quality or sharpness, is something that only CRTs can do nowadays! Drawbacks? No, definitely not, I'd say, but let's wait for more opinions on that.
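
If you're curious about the bandwidth side of that 140Hz claim, here's a quick sanity check. It's only a sketch, not exact GTF/CVT timing math: the ~1.35x horizontal and ~1.06x vertical blanking factors are assumptions that just land near typical CRT timings.

```python
# Rough pixel-clock estimate for a CRT video mode (a sketch, not exact
# GTF/CVT math). The blanking factors are assumptions near typical timings.

def pixel_clock_mhz(width, height, refresh_hz, h_blank=1.35, v_blank=1.06):
    """Approximate pixel clock in MHz for a given mode."""
    total_pixels = (width * h_blank) * (height * v_blank)
    return total_pixels * refresh_hz / 1e6

for hz in (85, 120, 140):
    print(f"1024x768@{hz}Hz needs ~{pixel_clock_mhz(1024, 768, hz):.0f} MHz")
# ~96 MHz at 85Hz and ~158 MHz at 140Hz -- modest clocks for this monitor,
# so the ceiling here really is the converter, not the tube.
```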

During summer, here where I live, the temperature never goes higher than about 30-33°C (86-91°F). I'm using my FW900 without its cover, and it doesn't seem to run that hot, I'd say.

And the monitor makes some noise when switching resolutions, but it's just a nice *click*. Apart from that there is a faint buzzing noise, but you really need to get close, and even then it's barely noticeable.
A while back I posted a YouTube video where I recorded the noises made by my FW900. I was wondering if everything was fine, and it was. I can tell you the microphone was sitting on the metal shielding of the monitor, cover removed. It may not be the best microphone ever (I'm not sure, but I think it was the built-in mic of my Google Nexus 5 phone), but it did its job pretty well here.



Hope I could help!
 
Some questions.
I have the chance to acquire an FW900.
How well does this monitor display a resolution like 1024x768?

- It displays all resolutions superbly. Do note that the lower you go, the more prominent the scanlines will be. As for geometry of the picture - you get to decide if you see black bars or not. Just like a 4:3 monitor can be vertically clamped to view a 16:9 image, a 16:10 CRT can be horizontally clamped to view a 4:3 image.
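
To put some rough numbers on that clamping, here's a quick sketch. The ~22.5-inch viewable diagonal is an approximate figure for the FW900, so treat the results as ballpark:

```python
# Geometry sketch: a 4:3 image clamped to full height on a 16:10 tube.
# The 22.5" viewable diagonal is an approximate figure for the FW900.
import math

diag = 22.5                              # viewable diagonal, inches (approx.)
height = diag * 10 / math.hypot(16, 10)  # screen height
w_full = height * 16 / 10                # full 16:10 width
w_43 = height * 4 / 3                    # width of the clamped 4:3 image

print(f"screen: {w_full:.1f} x {height:.1f} in")
print(f"4:3 image: {w_43:.1f} in wide, {w_full - w_43:.1f} in of black bars total")
print(f"equivalent 4:3 diagonal: {math.hypot(w_43, height):.1f} in")
# roughly a 20-inch 4:3 monitor's worth of picture, bars included
```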

Does it look sharp with black bars left and right?

- See above. Yes, it looks very sharp.

Any drawbacks to using lower resolution (one of the main reasons I want to use a CRT, aside from reaction time and blur)?

- The only issue I can possibly see is that the scanlines are pretty prominent. It may bother you, it may not. I never cared either way. Scanlines start disappearing after the 1024-line vertical resolution mark.

Does the monitor get violently hot during summer? Is there an unbearable buzzing noise?

- For heat? No. It does get warm, but all CRT's do. The FW900 does not get any hotter than any other CRT I've ever used - big or small. For the noise, I don't recall mine ever making a buzzing noise. Some of my friends could hear it though. Depends on your ears.

Lastly, could someone perhaps take a sharp photo of it running at a low resolution?

I don't have one anymore - wish I could. :(

Thanks in advance guys.

No sweat. I hope you end up getting it. If so - welcome to the club. :) If not, welcome to the club. We're a pretty accepting bunch.
 
Just got the Artisan this evening and had to snap a picture ;)

I do, however, still have some tweaking to do on all of them.
 

Very cool. And looking at the link above, they can hit 1200p at close to 90Hz! So you could definitely do 1440p at 60Hz, maybe a little more (it will be a little blurry, though). Given that info, these could be pretty solid for video games.

Let us know how hard they are to get up and working, especially getting convergence dialed in. I've seen some Sony projectors go on sale near me before, so it's something I've thought about.

Forget about 60Hz on this tech, it's pretty much eye cancer.
 
Out of curiosity, what is the Sony FW900 equivalent for 4:3, assuming it's one of the Diamondtrons?
 
Out of curiosity, what is the Sony FW900 equivalent for 4:3, assuming it's one of the Diamondtrons?

For Diamondtrons, any of the 0.24mm 22 inch monitors should be very similar. I'd say the Sony Artisan and F520 are smaller FW900's. The 520's tighter pitch is barely noticeable in comparison.
 
So what's the best card that still has analog support?

I was hoping the 1070 would have it, but it looks like NV dropped it from all of their cards.

Also, does anyone know if the FW900 supports 1:1 pixel mapping? I have an RGB scaler that requires it to get the most out of the scaler.
 
I would say the 290 at this point, or the 980 Ti (in before it's gimped in drivers like the 780 Ti). I'm going to send an email to Moome to see how the new RAMDAC he is making is coming along. I kinda want to get an F520 and crank it to 170Hz to see if I could use it, but man, F520's are harder to find than FW900's.
 
I really do hope they make a new HDFury or something, because I want to build a new PC so I can get back into modern gaming using this monitor.
And yes, to answer the above question, this thing handles low res like a champ. Dreamcast games and Wii/Wii U games look amazing running at 480p on this monitor.
I can post some pics later if you want. There are some thin scanlines at 480p. They don't look as pronounced as they do on my other CRT monitors, but they are there.
 
280X/380X, sorry, it's hard keeping these ridiculous graphics card naming schemes in memory. And for the dear love of god, no, not a Titan X, it sucks as a gaming card or a compute card. Also, I'm not sure, but if the 1080 doesn't have analogue, I'm quite sure the 1070 won't either.

Honestly, it depends on the games you play, but I'd rather go team red considering how much more consumer friendly they are than Nvidia has ever been.
 
Honestly, if HD Fury doesn't come to the rescue, it might be time for some/one of us to take one for the team and learn how to build our own external DACs. Who's with me?! :D
 
Well, I did the best I could, but I know someone else here can probably do a lot better with the photo. I think I remember seeing a long time ago some really sharp pics from spacediver, maybe I'm wrong, I can't find them anymore.


(You can click on that pic to get the full resolution.)

This is Battlezone II running at 1024x768@85Hz on my monitor.



Hope I could help!


- It displays all resolutions superbly. Do note that the lower you go, the more prominent the scanlines will be. As for geometry of the picture - you get to decide if you see black bars or not. Just like a 4:3 monitor can be vertically clamped to view a 16:9 image, a 16:10 CRT can be horizontally clamped to view a 4:3 image.

- See above. Yes, it looks very sharp.

No sweat. I hope you end up getting it. If so - welcome to the club. :) If not, welcome to the club. We're a pretty accepting bunch.

Thank you two so much. I am now completely convinced to get one of these. Is there a significant difference between the FW900 and the Multiscan W900?
 
Yes, huge. The FW900 has a finer dot pitch, and supports higher resolutions and refresh rates. And it's a flat screen, not curved like the W900. There are probably more differences under the hood too.
 
I don't know whether the curve is a problem. I suppose not. The coarser dot pitch essentially means a lower max resolution, but does it affect the overall sharpness of lower resolutions in general?
 
I don't know whether the curve is a problem. I suppose not. The coarser dot pitch essentially means a lower max resolution, but does it affect the overall sharpness of lower resolutions in general?

For CRT's, a tighter dot pitch means that you can usually achieve a higher resolution and still have it look clearer than on a monitor with a coarser dot pitch. I have two GDM monitors - an Artisan and an F520. The Artisan's pitch is .24mm and the F520's dot pitch is .22mm. Both are generally of the same sharpness until 1600x1200. At that resolution the F520 is a little sharper, but it's not by much. Once you start going higher though, the difference becomes more pronounced. The Artisan pretty much tops out at 1792x1344 and even then it's pretty soft. For games it's fine, but for text, you'd better take the refresh down to 70Hz to retain readability. Meanwhile, the F520 is good until 1920x1440 for text (no need to lower the refresh from 85Hz), and for games it's good right up to the max at 2048x1536, 75Hz.
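
A rough way to see why the pitch matters at exactly those resolutions is to compare the number of aperture-grille stripes across the tube with the horizontal pixel count. This is only a sketch: the ~400mm viewable width is an assumed figure for these 21-inch GDM tubes, and real grille pitch varies from center to edge, which this ignores.

```python
# Sketch: phosphor stripes across the tube vs. horizontal pixels.
# Assumes ~400 mm viewable width (approximate for a 21" 4:3 CRT);
# aperture-grille pitch also varies center-to-edge, which this ignores.

VIEWABLE_WIDTH_MM = 400

for name, pitch_mm in (("Artisan (0.24 mm)", 0.24), ("F520 (0.22 mm)", 0.22)):
    stripes = VIEWABLE_WIDTH_MM / pitch_mm
    print(f"{name}: ~{stripes:.0f} stripes across")
    for hres in (1600, 1792, 1920, 2048):
        print(f"  {hres} px -> {stripes / hres:.2f} stripes per pixel")
# Once you drop much below ~1 stripe per pixel, the grille can no longer
# resolve each pixel distinctly, which lines up with where each tube goes soft.
```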

Lower resolutions (1280x1024 and below) don't really mean much. If I hook my Xbox 360 to the Artisan, for example, and set it to 1280x1024, I still need to sit a bit back from it as I can see the scanlines clearly.

If I were you, I'd just get the FW900 for all the reasons spacediver mentioned, and there's another one too. The W900 is not compatible with WinDAS. It uses the old DAS software - which has not been cracked, and is not usable unless you have the proprietary Sony dongle. Which is sad... I suspect the W900 is not only a fantastic monitor but also more reliable than the FW900. You don't typically find anyone with a dead W900 on the internet. :) (while there are droves and droves of people with dead FW900's)
 
If I hook my Xbox 360 to the Artisan, for example, and set it to 1280x1024

Unrelated to what you were talking about, but for most 4:3 compatible games you should choose 1024x768. The Xbox 360 generally isn't powerful enough to render at 1280x1024, so it will actually upscale the game from 1024x768 when you select it. And if you're familiar with GPU scaling on PCs, that actually looks worse than running 1024x768 native.
 
Unrelated to what you were talking about, but for most 4:3 compatible games you should choose 1024x768. The Xbox 360 generally isn't powerful enough to render at 1280x1024, so it will actually upscale the game from 1024x768 when you select it. And if you're familiar with GPU scaling on PCs, that actually looks worse than running 1024x768 native.

I didn't know that. Thanks for the heads up.
 
Yeah, it's an unfortunate side effect of consoles on a CRT. To get the best picture quality you have to try and figure out what the internal resolution is, then select it from the menu before you launch the game. The funny thing is, for some 16:9 games that run at an internal resolution of 720p, 1280x1024 is fine because it letterboxes perfectly with a 1280x720 image. So basically, it depends on what game you're playing. But most games that I've played that support 4:3 will run at an internal res of 1024x768.
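
The letterboxing arithmetic there is easy to check; a quick sketch (the mode numbers come straight from the posts above):

```python
# Why 1280x1024 suits a 720p source: the widths match exactly, so the
# 16:9 image letterboxes cleanly instead of being rescaled.

frame_w, frame_h = 1280, 1024
img_w, img_h = 1280, 720

assert img_w == frame_w                 # no horizontal scaling needed
bars = (frame_h - img_h) // 2
print(f"{img_w}x{img_h} inside {frame_w}x{frame_h}: {bars}-line bars top and bottom")
# -> 152-line bars, and the image itself stays 1:1 with no upscale blur
```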
 
Yeah, it's an unfortunate side effect of consoles on a CRT. To get the best picture quality you have to try and figure out what the internal resolution is, then select it from the menu before you launch the game. The funny thing is, for some 16:9 games that run at an internal resolution of 720p, 1280x1024 is fine because it letterboxes perfectly with a 1280x720 image. So basically, it depends on what game you're playing. But most games that I've played that support 4:3 will run at an internal res of 1024x768.

Yeah, I was doing some looking around, and it appears that most 360 games don't even do 720p. Instead they render at some lower 16:9 or otherwise wonky resolution and are then upscaled. That's a bunch of poo-poo, but it makes sense when you consider the amount of raw power the GPUs in those things have.

In other news, I've officially hauled out one of the projector beasties and am messing around with it. Bottom line is that I may have to do one of two things:

1. Fix the non-working projector and replace the red focus board and use it
2. Swap in the blue and green tubes from the non-working projector.

When I tried to do the raster adjustment, I turned the brightness up on my projector. Both blue and green are displaying retrace lines. I posted about this on Curt Palme's website to get a consensus on what I should do. This only happened after I re-enabled the AKB (ABG, as the G70 calls it), which boosts black levels tremendously (too much). Disabling ABG and maxing out the brightness produces NO retrace lines to speak of.

And now that I finally have the set powered on, I can finally see the number of hours on the thing! Almost 8200. So blue and green are most likely nearing their EOL (though their wear patterns aren't that bad), while red is kicking ass and taking names with no visible wear pattern at all. It's very common for red tubes to last up to 20,000 hours rather than the standard rated life of 10,000.

EDIT: Thankfully, and assuming nothing's functionally wrong with them, the non-working projector's blue and green tubes have no visible wear pattern at all, at least none that I can see. It's not all that bad though. Considering that a typical movie night puts around 2 hours of wear on the tubes, and considering that we will only be watching movies on this thing (I really doubt we'll do any gaming), the ~1800 hours left on a 10,000-hour rated life is still a long time. That's 900 nights of movies. And assuming we watch it every day of the year, that's about 2.5 years of movie nights EVERY night. So long as I can calibrate this baby to a decent light output at 6500K (14 foot-lamberts is considered just right for projectors in a blacked-out room), then we're good.
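
For anyone following along, here's that back-of-envelope tube-life math spelled out (the 10,000-hour rated life, 8,200 hours on the meter, and 2 hours per movie night are the figures from the posts above):

```python
# Tube-life arithmetic from the post above.
rated_life_h = 10_000   # standard rated tube life (hours)
hours_used   = 8_200    # hours already on the meter
per_night_h  = 2        # wear from a typical movie night

remaining_h = rated_life_h - hours_used    # 1800 hours left
nights      = remaining_h // per_night_h   # 900 movie nights
years       = nights / 365                 # ~2.5 years, watching EVERY night
print(f"{remaining_h} h left -> {nights} nights -> ~{years:.1f} years of nightly movies")
```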

Double Edit: According to Curt, older tubes (greater than 5000 hours) act strangely with ABG. Just use whichever setting gives the best B/W balance. Woohoo!

For now, I'll be watching movies in 720p. I know the set can do 1080p, but the component input of the set cannot handle 1080p properly. The picture goes dim. I'll have to get new BNC-5 connectors to try out the RGB mode of the HD Fury and see if that helps take me to 1080p land.
 
I have an external DAC that does 225MHz, bought off Amazon for 10 dollars. It runs 1600x1200 at 85Hz. That's enough for 75 to 80Hz at 1920x1200. Worst case, drop to 1800x1152, etc.
 
I have an external DAC that does 225MHz, bought off Amazon for 10 dollars. It runs 1600x1200 at 85Hz. That's enough for 75 to 80Hz at 1920x1200. Worst case, drop to 1800x1152, etc.

How's quality? I may get one meself. :) Sounds perfect for the Artisan.
 
How's quality? I may get one meself. :) Sounds perfect for the Artisan.

AMAZING!

It blows the NVIDIA one in my 980 GTX away and is even better than AMD DACs! The cheap Amazon DAC I got has LCD-level clarity. I thought the blur of my monitors was just an inherent CRT property, but it was really a case of send shit in, get shit out.
 
AMAZING!

It blows the NVIDIA one in my 980 GTX away and is even better than AMD DACs! The cheap Amazon DAC I got has LCD-level clarity. I thought the blur of my monitors was just an inherent CRT property, but it was really a case of send shit in, get shit out.

How about a linky?
 
So the Radeon RX 480 only has HDMI and DisplayPort. Guess I made the right call with dual 380X's, which may be the best analog-capable cards we'll ever see from AMD.

I mean, people say that certain vendors could add in a VGA/DVI port, but is there a precedent for that sort of thing? I thought cards generally didn't stray too far from the reference model in the number/types of outputs (other than single-slot/dual-slot variation).
 
AMAZING!

It blows the NVIDIA one in my 980 GTX away and is even better than AMD DACs! The cheap Amazon DAC I got has LCD-level clarity. I thought the blur of my monitors was just an inherent CRT property, but it was really a case of send shit in, get shit out.

Could you tell us the exact model of that DAC?

Thanks :)

-----------------------------------------

I've been looking into DP to VGA adapters, and the best DAC I've found is
ANX9833 | www.analogix.com
StarTech uses it in their MDP2VGA converter. Tomorrow I'll try it to see how far it goes beyond the standard 165MHz DACs included in cheap converters...
Maybe this can do the trick until HDFury releases a 400MHz DAC-based converter.

-------------------------------------------

By the way, does anyone know of an adapter using the ADV7125JSTZ330 DAC?
 
A little off topic here: I have owned a couple of FW900s for a while now. Last night I fired up DOOM and was playing on my TV, a Panny plasma, via HDMI. I was easily way off target and could feel the input lag ruin my gameplay. I am a lifelong FPS player, and I can literally feel every shot like never before on the CRT. I jumped back on my FW900 and instantly became a headshot machine. The zero input lag is a complete and utter game changer for accuracy on these units in any twitch shooter. I cannot even stand to think about when my 15-year-old units decide to pass away; I might as well go as well. Nothing can compare to them in FPS unless there is some new tech I am unaware of.
 
For CRT's, a tighter dot pitch means that you can usually achieve a higher resolution and still have it look clearer than on a monitor with a coarser dot pitch. I have two GDM monitors - an Artisan and an F520. The Artisan's pitch is .24mm and the F520's dot pitch is .22mm. Both are generally of the same sharpness until 1600x1200. At that resolution the F520 is a little sharper, but it's not by much. Once you start going higher though, the difference becomes more pronounced. The Artisan pretty much tops out at 1792x1344 and even then it's pretty soft. For games it's fine, but for text, you'd better take the refresh down to 70Hz to retain readability. Meanwhile, the F520 is good until 1920x1440 for text (no need to lower the refresh from 85Hz), and for games it's good right up to the max at 2048x1536, 75Hz.

Lower resolutions (1280x1024 and below) don't really mean much. If I hook my Xbox 360 to the Artisan, for example, and set it to 1280x1024, I still need to sit a bit back from it as I can see the scanlines clearly.

If I were you, I'd just get the FW900 for all the reasons spacediver mentioned, and there's another one too. The W900 is not compatible with WinDAS. It uses the old DAS software - which has not been cracked, and is not usable unless you have the proprietary Sony dongle. Which is sad... I suspect the W900 is not only a fantastic monitor but also more reliable than the FW900. You don't typically find anyone with a dead W900 on the internet. :) (while there are droves and droves of people with dead FW900's)

This means... there is no way whatsoever to use this monitor at its designated specs (100+ Hz)? Also curious, do I need any cable adapters or anything for either of these monitors?
As of right now, my options are a W900 or a GDM F520. If I can't run the W900 at 100+ Hz, I reckon my best option (having competitive gaming in mind) would be the GDM F520, or to pray for another FW900 to pop up, correct?
An FW900 close by in my country actually sold a month ago, for 50 bucks, as I just found out... :(
 
For games it's fine, but for text, you better take the refresh down to 70hz to retain readability. Meanwhile, the F520 is good until 1920x1440 for text (no need to lower the refresh from 85hz), and for games it's good right until the max at 2048x1536, 75hz.

No. The refresh rate has no effect on CRT clarity. You are probably using a crappy DAC that can't handle the bandwidth. I will send you a link to the one I bought on Amazon when I check later. I thought the same thing you do, but once I used an external DAC, I realized the truth. 1600x1200 @ 85 looks as good on my eDAC as 1600x1200 @ 50 does on the on-GPU one. If you interlace, 180Hz on the eDAC is as clear as 100i on the GPU. Too bad it only does 225MHz, but 16x12 @ 85 (or 180i) is OK for me.
 
No. The refresh rate has no effect on CRT clarity. You are probably using a crappy DAC that can't handle the bandwidth. I will send you a link to the one I bought on Amazon when I check later. I thought the same thing you do, but once I used an external DAC, I realized the truth. 1600x1200 @ 85 looks as good on my eDAC as 1600x1200 @ 50 does on the on-GPU one. If you interlace, 180Hz on the eDAC is as clear as 100i on the GPU. Too bad it only does 225MHz, but 16x12 @ 85 (or 180i) is OK for me.
Would I need an expensive cable adapter (DAC..?) or whatever to run a CRT efficiently nowadays? Do I really need WinDAS? I am just looking for an edge in fast competitive shooters and to get rid of my terrible bLurCD :pompous:
High resolutions / HD are of no relevance to me; I just look for a sharp, colorful and fast image, anywhere between 1000-1600 pixels wide.
 
No. The refresh rate has no effect on CRT clarity. You are probably using a crappy DAC that can't handle the bandwidth. I will send you a link to the one I bought on Amazon when I check later. I thought the same thing you do, but once I used an external DAC, I realized the truth. 1600x1200 @ 85 looks as good on my eDAC as 1600x1200 @ 50 does on the on-GPU one. If you interlace, 180Hz on the eDAC is as clear as 100i on the GPU. Too bad it only does 225MHz, but 16x12 @ 85 (or 180i) is OK for me.
could you post the link here for everyone?
 
Would I need an expensive cable adapter (DAC..?) or whatever to run a CRT efficiently nowadays? Do I really need WinDAS? I am just looking for an edge in fast competitive shooters and to get rid of my terrible bLurCD :pompous:
High resolutions / HD are of no relevance to me; I just look for a sharp, colorful and fast image, anywhere between 1000-1600 pixels wide.

WinDAS is for calibrating; it's not really gaming related.
If you can get hold of an FW900, and the 225MHz DAC we're talking about will do 1600x1200@85Hz, it should also do 1440x900@120Hz, since that requires a slightly lower pixel clock.
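
That last comparison is easy to verify, at least for the active pixel rate. Blanking overhead is excluded in this sketch, but to a first approximation it scales both modes by similar factors, so the relative conclusion holds:

```python
# Comparing the two modes by active pixel rate (blanking excluded; real
# pixel clocks are higher, but the overhead is similar for both modes).

def active_mpix_per_s(w, h, hz):
    return w * h * hz / 1e6

a = active_mpix_per_s(1600, 1200, 85)   # ~163 Mpix/s
b = active_mpix_per_s(1440, 900, 120)   # ~156 Mpix/s
print(f"1600x1200@85: {a:.0f} Mpix/s, 1440x900@120: {b:.0f} Mpix/s "
      f"({(1 - b / a) * 100:.0f}% less)")
# -> 1440x900@120 needs a few percent less, so a DAC that manages
#    1600x1200@85 should indeed handle it too.
```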
 