24" Widescreen CRT (FW900) From Ebay arrived,Comments.

My F520 developed a slight flicker in its text display. I think it happened after I moved to a condo complex with its shared power lines, if that makes any sense...

Ran the 65F against the F520. Best solution is not to do this, especially at night. :)

The 65F can have quite pleasing color and all, probably helped by its glossy coating, but it of course fails because of the all-or-nothing nature of its monolithic backlighting when trying to display scenes that should be rendered at different levels of lighting...

And at night, even if one is willing to attenuate the brighter parts of a scene, and even with a range of backlight adjustment that is quite a bit better than many displays, it still can't really go dark enough...
 
My F520 developed a slight flicker in its text display. I think it happened after I moved to a condo complex with its shared power lines, if that makes any sense...

I would look for magnetic fields. Speakers, other CRTs, appliances, AC units, fans. That includes on the other side of the wall in the neighbour's unit.

My F520 will show the mag field effects in my home office when someone turns on the CRT TV with power-on degauss in the family room. They have sensors which determine magnetic north at every power-up; they are very sensitive to magnetism. Don't believe me? Rotate the monitor on your desk 90 degrees while powered on and see the picture change. (I read about it in the service manual, I believe.)
 
Why isn't the FW900 listed as a 23" screen? Because that's exactly what it is. Not complaining, as 1" doesn't make a difference to me; just curious.
 
Sigh, I miss my fw900 so much. Just not willing to drop more than $300 again.
 
Has anyone used the HDFury to go between an HDMI source and the BNC inputs on the FW900? I'd need to go source HDMI port --> HDMI-to-DVI-D cable --> HDFury --> VGA-to-BNC cable --> FW900. Has anyone ever done this before?

That's the setup I used to use with my PS3, and it worked perfectly.
 
Two issues:

1. I'm on Windows 7 64-bit and I cannot get any resolution past 1600x1200 over BNC. I had the same issue on Vista x64. I somehow got the correct driver installed (in Device Manager), yet all software lists the monitor as generic, and I can't even seem to force any higher resolution. I've tried PowerStrip several times.

2. My VGA input is blurry no matter what device I use. Is there anything I can do about this?
 
1) I don't think you can do better over BNC. And I just tried using the HDFury as a possible fix for that... HDFury = fail on all accounts here :(

2) You probably need a better VGA cable. Shielded with ferrite cores on each end.
 
I'm having trouble with "newer" video cards. Every card I've tried past the X1950 generation is much blurrier than before. The analog quality is the issue, probably due to lower-grade RAMDACs and output filters now that LCDs (digital/DVI) are so popular. Can anyone out here recommend a brand that still delivers a sharp analog connection? I had a 3850 and it was a lot blurrier at all resolutions and refresh rates than the extremely sharp X1950 Pro from PowerColor I have right now. Sort of a waste having the best display in the world when there's no equipment that can match it.
 
I'm having trouble with "newer" video cards. Every card I've tried past the X1950 generation is much blurrier than before. The analog quality is the issue, probably due to lower-grade RAMDACs and output filters now that LCDs (digital/DVI) are so popular. Can anyone out here recommend a brand that still delivers a sharp analog connection? I had a 3850 and it was a lot blurrier at all resolutions and refresh rates than the extremely sharp X1950 Pro from PowerColor I have right now. Sort of a waste having the best display in the world when there's no equipment that can match it.

My BFG 7600GTOC and Galaxy 8800GTS512 were perfectly fine running 1600x1200@72 over HD15 analog.

EDIT: I also used both at 2048x1536@70Hz over VGA (not BNC) cabling with acceptable results. I purchased my new BFG GTX260OC(216) after going DVI so I have never tried it with analog.
 
I started out with a 7800 GTX on this monitor, have upgraded through two generations of Nvidia video cards since then, and have noticed no difference in the output to my FW900.
 
I suspect it varies by card manufacturer. Back in the days of the GeForce 2-4, I recall a fair number of discussions about NV-based board manufacturers going cheap on the output filters on gaming cards, resulting in blurry images at high res. Nvidia eventually clamped down on the practice, IIRC. At the time ATI was reported to have better image quality, but then again, at the time most of the ATI cards around were ATI-branded. Perhaps ATI has a similar problem with their board manufacturers now.
 
My GTX 280 is working fine, although I still use an old DVI-to-VGA adapter from an X1900 I had. It is *slightly* better than the other adapters I have.
 
When I set my res to 1600x1200, the monitor reports 1920x1200.

I'm having trouble playing certain games; the games are wayyyy off the screen.

Win 7 x64, 8600GT.
 
When I set my res to 1600x1200, the monitor reports 1920x1200.

Mine does that too, it's pretty common for CRTs. CRT monitors usually just count lines and check the vertical refresh rate to figure out what resolution they're in. In other words, they're just counting sync pulses. A trick I use on mine is to tweak the refresh rates so 16:10 and 4:3 resolutions with the same number of lines run at a different refresh. Then the monitor can tell them apart, though it still doesn't necessarily label them correctly.
I don't really care what mode it thinks it's in; the problem is that if it can't tell the difference between 1600x1200 and 1920x1200, you can't have different geometry settings, etc. for each mode. That gets really annoying. Unfortunately this has gotten more annoying lately, as Nvidia decided to ditch the detailed resolution adjustments from their recent drivers. Of course you can still do it, you just need PowerStrip or something. On the upside, at least pretty much all recent games support 16:10, so I haven't had nearly as much need to distinguish between 1920 and 1600 lately as I used to.
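To illustrate what the monitor actually has to work with, here's a rough Python sketch (the 36-line vertical blanking figure is just an illustrative assumption, not a real timing): all the tube ever "sees" is the scanline count and the sync rates, so a 4:3 and a 16:10 mode with the same line count and the same refresh are indistinguishable until you nudge one of the refreshes.

def crt_view(width, height, refresh_hz, v_blank_lines=36):
    """What the CRT can actually measure: total scanlines per frame and the
    vertical refresh. The horizontal pixel count never reaches the monitor,
    so it drops out of the signature entirely."""
    return (height + v_blank_lines, round(refresh_hz, 1))

# Same line count, same refresh: the monitor cannot tell these modes apart.
print(crt_view(1600, 1200, 85.0) == crt_view(1920, 1200, 85.0))   # True

# Shift one mode by a hertz or so and it gets a distinct signature.
print(crt_view(1600, 1200, 85.0) == crt_view(1920, 1200, 84.0))   # False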
 
Anyone in the MD/PA/VA/WV area interested in this monitor? I might be trying to sell mine, depending on how the ViewSonic 120Hz turns out. I'd be cool with meeting halfway if someone lives too far away, as long as the drive isn't more than an hour or so.

BTW, ForceWare 182.08 fully supports custom resolutions again, even in Windows 7.
 
I suspect it varies by card manufacturer. Back in the days of the GeForce 2-4, I recall a fair number of discussions about NV-based board manufacturers going cheap on the output filters on gaming cards, resulting in blurry images at high res. Nvidia eventually clamped down on the practice, IIRC. At the time ATI was reported to have better image quality, but then again, at the time most of the ATI cards around were ATI-branded. Perhaps ATI has a similar problem with their board manufacturers now.

It's a shame, I tell ya. It's like you have to decide whether the tradeoff of faster graphics but vastly inferior sharpness and overall image quality is worth your time. I think ATI has really let things slide nowadays. For this very reason I will be going to Nvidia this time around to try my luck. I can't tell you how much sharper my X1950 Pro is on the FW900 than the 3850; it's a world of difference.

FYI, all connections going to the FW900 are analog due to the nature of CRT monitors. CRTs are not "digital" like LCDs, so the quality of the filtering and RAMDAC used on the video card determines how sharp and bright the image appears on the CRT. In layman's terms, think of a CRT as a radio: you have to "tune" it to perfection to get the best-quality signal, whereas LCDs are like radio shows broadcast over the internet (podcasts), which arrive as perfect or nearly perfect copies of the source (whatever was recorded in the studio).

The CRTs of the world today are being overlooked; the FW900 is, AFAIK, the only PC monitor that supports resolutions beyond 1600x1200, and thus it's really a "niche" item. I don't expect video card manufacturers to give half a shit about maintaining CRT fidelity, since most people running 1920x1200 or higher will surely be using an LCD, which is 100% sharp at its native resolution regardless of the quality of the RAMDAC and filtering on the video card.
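For a sense of what the analog side actually has to deliver at FW900-class modes, here's a rough pixel-clock estimate in Python. The ~25% horizontal / ~5% vertical blanking fractions are assumptions rather than exact GTF timings, so treat the numbers as order-of-magnitude only.

def pixel_clock_mhz(width, height, refresh_hz,
                    h_blank_frac=0.25, v_blank_frac=0.05):
    """Approximate pixel clock in MHz for an analog mode, including blanking."""
    h_total = width * (1 + h_blank_frac)    # active pixels + horizontal blanking
    v_total = height * (1 + v_blank_frac)   # active lines + vertical blanking
    return h_total * v_total * refresh_hz / 1e6

for w, h, hz in [(1600, 1200, 85), (1920, 1200, 85), (1920, 1200, 96), (2304, 1440, 80)]:
    print(f"{w}x{h}@{hz}Hz -> ~{pixel_clock_mhz(w, h, hz):.0f} MHz")

RAMDACs on cards of this era are typically rated around 400 MHz, so the converter clock itself usually isn't the problem; keeping the output filtering clean at a ~250-350 MHz pixel rate is where the corner-cutting shows up.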
 
At the same time, it can't cost very much per card to keep the analog output quality correct. And as I mentioned before, my GTX 280 appears to be correct. But I am not surprised to hear that about the 3850. I want to be an ATI supporter. My X1900 was great, but it seemed like they cut a lot of corners after AMD purchased them. (Speaking in generalizations.)
 
@christpunchersg, I do not know where you get your info, but my 19" Mitsubishi back in 2001 ran higher than 1600x1200 (it was 1792x1344), and the 22" Mitsubishi SuperBright I got to replace it ran 2048x1536. :)

The Sony FW900 is possibly the only 16:10 CRT, including any other rebadged Sonys. ;)
 
@christpunchersg, I do not know where you get your info, but my 19" Mitsubishi back in 2001 ran higher than 1600x1200 (it was 1792x1344), and the 22" Mitsubishi SuperBright I got to replace it ran 2048x1536. :)

The Sony FW900 is possibly the only 16:10 CRT, including any other rebadged Sonys. ;)

Well, I'm speaking generally: most 19" CRTs you'll find are RECOMMENDED to run at 1600x1200, and only the FW900 is meant to run at 1920x1200. A CRT can run pretty much any resolution because it isn't limited the way an LCD is, but we're talking about general usage. Higher res on a 19-inch monitor will probably not look as sharp or as good as its recommended resolution.
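A back-of-the-envelope way to see where "recommended" resolutions come from is to compare the horizontal pixel count against the number of aperture-grille stripes across the tube. The pitch and viewable-width figures in this little Python sketch are illustrative assumptions, not quoted specs:

def stripes_across(viewable_width_mm, grille_pitch_mm):
    """Approximate count of phosphor stripe triads across the visible screen."""
    return viewable_width_mm / grille_pitch_mm

# Typical 19" 4:3 tube: roughly 360 mm viewable width, ~0.25 mm pitch (assumed).
print(round(stripes_across(360, 0.25)), "stripes vs. 1600 horizontal pixels")

# FW900-class 16:10 tube: roughly 480 mm viewable width, ~0.23 mm pitch (assumed).
print(round(stripes_across(480, 0.23)), "stripes vs. 1920 horizontal pixels")

Once the pixel count passes the stripe count, extra resolution mostly buys softness rather than detail, which is why a 19" tube tops out around 1600x1200 while the FW900 still has headroom at 1920x1200.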
 
Does anyone use the FW900 on Vista 64-bit? Are there drivers already in Windows for it, or do I have to find them somewhere?
 
Anyone in the MD/PA/VA/WV area interested in this monitor? I might be trying to sell mine, depending on how the ViewSonic 120Hz turns out. I'd be cool with meeting halfway if someone lives too far away, as long as the drive isn't more than an hour or so.

BTW, ForceWare 182.08 fully supports custom resolutions again, even in Windows 7.


Hope the LCD works out for you. Do you have a price? A friend of mine was looking into this CRT; he lives in WV. Depending on the price, maybe I will snag this.

Good luck.
 
Does anyone use the FW900 on Vista 64-bit? Are there drivers already in Windows for it, or do I have to find them somewhere?

There is already a driver in Vista 64. Sony has no 64-bit driver, but I have modded a 32-bit driver for Vista 64; all you need to do is accept that it ain't digitally signed.

I have found no gain in using it, so I simply use its colour .ini only and let Vista use its own driver.

I game at 1920x1200 @ 96Hz (not sure why many peeps only seem to be able to get 85Hz at this res).
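Rough arithmetic on why ~96Hz is about the ceiling at 1920x1200 (and why the vertical refresh isn't what binds): the limit is the horizontal scan rate. This Python sketch assumes ~5% vertical blanking and the commonly cited ~121 kHz horizontal limit for the FW900; treat both as assumptions and check your own unit's spec before pushing it.

def h_scan_khz(active_lines, refresh_hz, v_blank_frac=0.05):
    """Approximate horizontal scan rate in kHz: total lines per frame times refresh."""
    return active_lines * (1 + v_blank_frac) * refresh_hz / 1000.0

H_LIMIT_KHZ = 121  # commonly cited FW900 maximum; treated here as an assumption

for hz in (85, 96, 100):
    khz = h_scan_khz(1200, hz)
    verdict = "within limit" if khz <= H_LIMIT_KHZ else "over the limit"
    print(f"1200 active lines @ {hz}Hz -> ~{khz:.0f} kHz ({verdict})")

Whether a given setup stops at 85Hz usually comes down to the driver honouring conservative defaults or a generic monitor entry; a custom-resolution tool will normally let you run right up to the scan-rate limit.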
 
With all this talk of analog output being lower quality on newer cards, BS or not, I would still take the FW900 over an LCD for games/movies due to all its quality advantages versus an LCD. I only need high sharpness for text. With that said, my FW900 doesn't look blurry at all; I just need to use WinDAS to correct some convergence.
 
The DACs on Nvidia cards are fine, so I also smell BS.

Most peeps are on 19" or smaller LCDs, and most (not all) of them have only VGA, not DVI, so you're going from a modern GPU with DVI out (two of them) through a DVI-to-VGA dongle, same as a CRT user, but then you're going back to digital at the LCD end.

That means the signal is digital, then analogue, then back to digital. :p
 
The DACs on Nvidia cards are fine, so I also smell BS.

Most peeps are on 19" or smaller LCDs, and most (not all) of them have only VGA, not DVI, so you're going from a modern GPU with DVI out (two of them) through a DVI-to-VGA dongle, same as a CRT user, but then you're going back to digital at the LCD end.

That means the signal is digital, then analogue, then back to digital. :p

It may be fine with Nvidia, but I'm speaking from personal experience: it is not fine with some of the newer ATI cards.

DVI ports CAN carry analog signals. There are two types of DVI, if I recall correctly: DVI-I, which is most common and can carry both analog and digital, and DVI-D, which is digital only and will not work for CRTs.

VGA into an LCD monitor works because the LCD simply converts the analog signal back to digital at its end.

You can say BS all you want, but I'm not here to argue with you. I've collected 3 FW900s in the past 2 years, and all of them had worse text with newer video cards than with the ones from 3 to 4 years back. It's not an issue of cabling either; I have many VGA cables and a BNC cable from al-kabelshop. There IS an issue of certain manufacturers using lower-quality DACs because they can afford to get away with it nowadays.
 
With all this talk of analog output being lower quality on newer cards, BS or not, I would still take the FW900 over an LCD for games/movies due to all its quality advantages versus an LCD. I only need high sharpness for text. With that said, my FW900 doesn't look blurry at all; I just need to use WinDAS to correct some convergence.

What is not very blurry to one person may not be acceptable to the next. The FW900 is capable of being very sharp when it's properly calibrated, but the quality of the analog components on the video card still accounts for a lot of the equation. I too would never ditch my FW900s for any LCD; I'm merely pointing out a possible trend.
 
At the same time, it can't cost very much per card to keep the analog output quality correct. And as I mentioned before, my GTX 280 appears to be correct. But I am not surprised to hear that about the 3850. I want to be an ATI supporter. My X1900 was great, but it seemed like they cut a lot of corners after AMD purchased them. (Speaking in generalizations.)

Hey, if I were in a position to make more profit by using cheaper parts without many complaints, I would do it too. The masses will be using LCDs and not a bunch of high-resolution CRT monitors, so why not?

Maybe it is strictly an ATI problem, maybe it isn't; maybe I just got unlucky when I tried out Sapphire, PowerColor, and HIS. I'm getting an EVGA GTX 285 this week, so I hope Nvidia won't disappoint me.

To prove my point, I've had lengthy discussions about the FW900 with a fellow forum member. He told me that in an article printed sometime in 2008 in a German PC gaming magazine, there was an analysis of how poor recent video cards have been in terms of analog output. I will try to get him to scan a few of those pages for everyone to see and have them translated.

" Preacher0815 n00bie, 1.2 Years

Preacher0815 is offline
Re: Which BNC cable for fw900?

All I can say is that I was pretty shocked by the degradation of image sharpness when I 'upgraded' from an ATI X1800 to a GeForce 8800 GTS 512. If only there were something like an external DVI-D to DVI-A converter that could do resolutions up to 1920x1200. That would make things independent of the rather poor-quality RAMDACs used on cards. Yeah, the digital/analog converters on cards are crap, but those on the ATIs used to be higher-quality crap. Next thing will be going back to Matrox, who have half-decent RAMDACs but suck for gaming...

Just dug out a magazine where they tested 8800 GTs and HD 3870s. All cards have rather poor VGA (= analog) output. On a scale from 1 to 5, with 1 being the best possible result and 5 the worst, the cards score 3 or 4 for signal quality. Not one with better results.
I'm afraid all of today's gaming cards have poor output. The tested ATIs were from GeCube, MSI, PowerColor, Sapphire and HIS, and not one had good signal quality. There are, however, no cards directly from ATI, since those aren't sold in Europe.

How the picture looks depends on the shape of the signal. Different card, different shape of the signal. The best signal has a rectangular shape; however, today's cards stray greatly from that, from upside-down-U-shaped signals to waves. So it's quite possible that at the point that defines the "strength" of white, the signal comes close to its ideal form, but everything else is just bad."
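For anyone wondering what those "U-shaped" signals actually do to the picture, here's a toy Python sketch: push an alternating black/white pixel pattern through a simple first-order low-pass filter and watch the white pixels stop reaching full level as the filter bandwidth drops. It's purely illustrative, not a model of any particular card's output stage, and the ~250 MHz pixel rate is an assumed figure.

import math

PIXEL_CLOCK_HZ = 250e6        # roughly a 1920x1200@85 class pixel rate (assumed)
SAMPLES_PER_PIXEL = 8         # simulation resolution per pixel

def white_pixel_peaks(cutoff_hz, pixels=8):
    """Simulate alternating 0/1 pixels through an RC low-pass; return the peak
    level reached during each white pixel."""
    dt = 1.0 / (PIXEL_CLOCK_HZ * SAMPLES_PER_PIXEL)
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)                 # discrete first-order filter coefficient
    y, peaks = 0.0, []
    for p in range(pixels):
        level = p % 2                      # 0 = black pixel, 1 = white pixel
        peak = 0.0
        for _ in range(SAMPLES_PER_PIXEL):
            y += alpha * (level - y)       # low-pass step toward the input level
            peak = max(peak, y)
        if level == 1:
            peaks.append(peak)
    return peaks

for cutoff in (400e6, 150e6, 60e6):
    tops = white_pixel_peaks(cutoff)
    print(f"filter cutoff {cutoff/1e6:.0f} MHz: white pixels reach "
          f"~{sum(tops)/len(tops):.2f} of full level")

The lost amplitude and rounded edges are exactly what show up on screen as washed-out fine detail and smeared small text.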
 
@christpunchersg, there is nothing you can argue about with me from my post above. ;)

I specified Nvidia cards as that is all I use.

There are 3 types of DVI, FYI. ;)

Still, the fact remains that a GPU outputs digitally unless you use VGA, in which case it is analogue; but then using an LCD that only has VGA means it's doing this:

Digital > Analogue > Digital.

My setup does this:

Digital > Analogue.

:)
 
I'm not here to try to argue either. But to completely brush this off as a possible issue just frustrates me.

On a different topic, I'm doing what I can to preserve these FW900s until the day OLEDs, or whatever else it will be, can match the CRT's pure prowess: trying not to leave the monitor idle for too long, using an LCD or the laptop when I don't need the FW900 to watch movies or play games, not using it beyond 6 to 8 hours if possible, and doing my best to stick to one set resolution whenever possible. Hell, I've even gotten used to booting up blind until Windows has fully loaded and then turning on the monitor.
 
I don't know how you can work like that.

These Sonys are overbright at startup (every other CRT I have seen is a bit dark until it warms up).

The FW900 looks like you have white cataracts in both eyes when it has been off for 10 minutes or so. I leave it with a blank screensaver, as I cannot bear to have it turn off after the Windows default of 20 minutes.

I have a 15-year-old Mitsubishi TV that has seen more hours than most, and it's still 100%.
 
I don't turn it off if I'm going to be away from the computer for, like, 30 minutes. But if I'm away for an afternoon and will be coming back something like 3 hours later, I will for sure turn it off.

Yes, it is bright on startup, as with every Sony CRT. I just boot up my computer, wait until Windows loads, then turn on the monitor and come back in 30 minutes. Sometimes I can stand the bright screen because all I'm doing is checking my emails. But yeah, it takes a while to warm up; still, I would like to minimize wear when possible.
 
I too subscribe to the idea that if you run one of these, you leave it running for a long time. It doesn't waste that much electricity.

In consideration of the high voltages/currents going around in a CRT, I figure it causes less stress on the power supply components than having them stand up (turn on) and lie down (turn off) many times.

So, generally, I run my FW900 for no less than 5 hours a day.

On the flip side, is there a negative side to leaving your CRT running non-stop? (with a good screen saver obviously).
 
I've heard that screensavers really don't do much for a CRT like the FW900.

He says that it's best to turn it off if you're not going to use it for many hours. Regardless of how much work it takes to power up and warm up, you're still going to wear down the components by having it on for a really long time.

Make what you will of it, but my source is the same guy who sells FW900s in California to the big studios and Photoshoppers. He also calibrates them, and he has worked as a Sony technician on these sorts of things for many years. You may know him as UnkleVito on eBay.
 
If only there were something like an external DVI-D to DVI-A converter that could do resolutions up to 1920x1200. That would make things independent of the rather poor-quality RAMDACs used on cards. Yeah, the digital/analog converters on cards are crap, but those on the ATIs used to be higher-quality crap. Next thing will be going back to Matrox, who have half-decent RAMDACs but suck for gaming...

I'm sure there are if you look in the right places. Professional video editing equipment, I'm sure, has what you need if you want to spend thousands on better RAMDACs. I suppose that eventually we'll all have to get hardcore enough to start replacing the RAMDACs on our $500 video cards?
 
I too subscribe to the idea that if you run one of these, you leave it running for a long time. It doesn't waste that much electricity.

In consideration of the high voltages/currents going around in a CRT, I figure it causes less stress on the power supply components than having them stand up (turn on) and lie down (turn off) many times.

So, generally, I run my FW900 for no less than 5 hours a day.

On the flip side, is there a negative side to leaving your CRT running non-stop? (with a good screen saver obviously).

You'll wear out the tube faster if you leave it on.

(Your approach sounds like a decent balance between the two extremes. Assuming it is in use for much of those 5 hours...)
 