helmutcheese:
@ Rock&Roll, don't cheat yourself out of screen size.
The 24" FW900 has 22.5" of Viewable Screen.
My F520 developed a slight flicker in its text display. Think it happened after I moved to a condo complex with its shared power lines, if that makes any sense...
Has anyone used the HDFury to go between an HDMI source and the BNC inputs on the FW-900? I'd need to go source HDMI port --> HDMI-to-DVI-D cable --> HDFury --> VGA-to-BNC cable --> FW-900. Has anyone done this?
"Why isn't the fw900 listed as a 23" screen? Because that's exactly what it is. Not complaining, as 1" doesn't make a diff to me, but just curious."

It technically is a 24" screen; 1.5" of that is covered by plastic, however.
I'm having trouble with "newer" video cards. Every card I've tried past the X1950 generation is much blurrier than before. The analog output quality is the culprit; this is probably due to lower-grade RAMDACs and output filters now that digital DVI LCDs are so popular. Can anyone recommend a brand that still delivers a sharp analog signal? I had a 3850, and it was a lot blurrier at every resolution and refresh rate than the extremely sharp X1950 Pro from PowerColor I have right now. Sort of a waste having the best display in the world when there's no equipment that can match it.
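To put rough numbers on why the output filter matters: the filter has to pass video frequencies up to about half the pixel clock, and at CRT resolutions the pixel clocks are huge. A back-of-the-envelope sketch in Python, assuming GTF-like blanking overheads of roughly 30% horizontal and 5% vertical (real modelines vary, so treat these as ballpark figures only):

```python
# Rough pixel-clock estimates for modes mentioned in this thread.
# The blanking fractions are assumed approximations of GTF timings.
H_BLANK_FRAC = 0.30  # assumed horizontal blanking overhead
V_BLANK_FRAC = 0.05  # assumed vertical blanking overhead

def pixel_clock_mhz(h_active: int, v_active: int, refresh_hz: float) -> float:
    h_total = h_active * (1 + H_BLANK_FRAC)
    v_total = v_active * (1 + V_BLANK_FRAC)
    return h_total * v_total * refresh_hz / 1e6

for h, v, hz in [(1600, 1200, 85), (1920, 1200, 85), (2048, 1536, 75), (2304, 1440, 80)]:
    print(f"{h}x{v}@{hz}Hz: ~{pixel_clock_mhz(h, v, hz):.0f} MHz pixel clock")
```

That works out to roughly 220-360 MHz. A cheap low-pass filter that rolls off early in that range smears adjacent pixels together, which is exactly the blur being described.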
When I set my resolution to 1600x1200, the monitor reports 1920x1200.
I suspect it varies by board manufacturer. Back in the days of the GeForce 2-4, I recall a fair number of discussions about NVIDIA-based board makers going cheap on the output filters of gaming cards, resulting in blurry images at high resolutions. NVIDIA eventually clamped down on the practice, IIRC. At the time, ATI was reported to have better image quality, but then again, most of the ATI cards around were ATI-branded. Perhaps ATI has a similar problem with its board manufacturers now.
@ christpunchersg, I do not know where you got your info, but my 19" Mitsubishi back in 2001 ran higher than 1600x1200: it ran 1792x1344, and the 22" Mitsubishi Superbright I got to replace it ran 2048x1536.
The Sony FW900 is possibly the only 16:10 CRT, counting any rebadged Sonys.
Anyone in the MD/PA/VA/WV area interested in this monitor? I might be trying to sell mine, depending on how the ViewSonic 120 Hz turns out. I'd be cool with meeting halfway if someone lives too far away, as long as the drive isn't more than an hour or so.
BTW, ForceWare 182.08 fully supports custom resolutions again, even in Windows 7.
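For anyone setting these up: a "custom resolution" in the driver is a full set of timings, not just width x height. A minimal sketch of the fields involved (the numbers below are illustrative placeholders, not a verified FW900 modeline; derive real values with GTF or CVT):

```python
from dataclasses import dataclass

@dataclass
class CustomResolution:
    """The timing fields a driver needs for a custom mode."""
    h_active: int       # visible pixels per line
    h_front_porch: int  # blanking before the hsync pulse
    h_sync: int         # hsync pulse width
    h_back_porch: int   # blanking after the hsync pulse
    v_active: int       # visible lines
    v_front_porch: int
    v_sync: int
    v_back_porch: int
    refresh_hz: float

    @property
    def pixel_clock_mhz(self) -> float:
        h_total = self.h_active + self.h_front_porch + self.h_sync + self.h_back_porch
        v_total = self.v_active + self.v_front_porch + self.v_sync + self.v_back_porch
        return h_total * v_total * self.refresh_hz / 1e6

# Placeholder numbers, NOT a verified FW900 modeline.
mode = CustomResolution(1920, 128, 208, 336, 1200, 1, 3, 38, 85.0)
print(f"Pixel clock: {mode.pixel_clock_mhz:.1f} MHz")  # ~273.6 MHz
```

Custom-resolution support matters because it lets you add modes like this beyond what the monitor's EDID advertises.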
Does anyone use the FW900 on Vista 64-bit? Are there drivers already in Windows for it, or do I have to find them somewhere?
The DACs on Nvidia cards are fine, so I also smell BS.
Most people are on 19" or smaller LCDs, and most (not all) of those have only VGA, not DVI. So you're going from a modern GPU with DVI out (two ports) through a DVI-to-VGA dongle, same as a CRT user, but then converting back to digital at the LCD end.
That means the signal is digital, then analogue, then back to digital.
Whether or not this talk of analog output being lower quality on newer cards is BS, I would still take the FW900 over an LCD for games and movies due to all its quality advantages. I only need high sharpness for text. With that said, my FW900 doesn't look blurry at all; I just need to use WinDAS to correct some convergence.
At the same time, it can't cost much per card to keep the analog output quality right. And as I mentioned before, my GTX 280 appears to be correct. But I am not surprised to hear that about the 3850. I want to be an ATI supporter; my X1900 was great. But it seemed like they cut a lot of corners after AMD purchased them (speaking in generalizations).
If only there were something like an external DVI-D to DVI-A converter that could do resolutions up to 1920x1200. That would make things independent of the rather poor-quality RAMDACs used on cards. Yeah, the digital-to-analog converters on cards are crap, but those on the ATIs used to be higher-quality crap. The next thing will be going back to Matrox, who have half-decent RAMDACs but suck for gaming...
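Worth noting: a converter fed from a single-link DVI-D source (which is essentially what the HDFury mentioned earlier is) is capped at a 165 MHz pixel clock, which is likely where that 1920x1200 ceiling comes from; 1920x1200@60 only fits by using reduced blanking. A quick sanity check in Python, using the standard timing totals for these modes:

```python
# Single-link DVI tops out at a 165 MHz pixel clock, so an external
# DVI-D-to-VGA converter realistically maxes out around 1920x1200@60,
# and only with CVT reduced-blanking timings.
SINGLE_LINK_DVI_MHZ = 165.0

# (h_total, v_total, refresh) from standard VESA/CEA timings;
# the 85 Hz entry uses approximate GTF-style totals.
modes = {
    "1920x1200@60 CVT-RB": (2080, 1235, 60),
    "1920x1080@60 CEA":    (2200, 1125, 60),
    "1600x1200@60 VESA":   (2160, 1250, 60),
    "1920x1200@85 GTF":    (2592, 1242, 85),
}

for name, (h_total, v_total, hz) in modes.items():
    pclk = h_total * v_total * hz / 1e6
    verdict = "fits" if pclk <= SINGLE_LINK_DVI_MHZ else "exceeds single-link DVI"
    print(f"{name}: {pclk:.1f} MHz -> {verdict}")
```

So even a converter built around a good external DAC would need dual-link on the digital side to drive the FW900's high-refresh modes.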
I too subscribe to the idea that if you run one of these, you leave it running for a long time. It doesn't waste that much electricity.
Considering the high voltages and currents involved in a CRT, I figure it puts less stress on the power supply components than making them stand up (turn on) and lie down (turn off) many times.
So, generally, I run my FW900 for no less than 5 hours a day.
On the flip side, is there a downside to leaving your CRT running non-stop (with a good screen saver, obviously)?