24" Widescreen CRT (FW900) From Ebay arrived,Comments.

not bad at all! Did you go the Argyll route after WinDAS? Also, I'm curious about the displays you've used before the FW900. Have you used decent LCDs and what are your thoughts about the picture quality of the FW900 in comparison?
 
Does it make a popping, kind of fizzing sound when it goes out of focus, and when it comes back into focus?

Maybe try leaving the monitor on for a day or two; there were some reports earlier in this thread of that potentially helping.

It makes no sound when going out of focus, but if I wait long enough, it sometimes makes a "pop" sound and goes back into focus.

Now that you mention it, I remember that tip about leaving it on.
 
not bad at all! Did you go the Argyll route after WinDAS? Also, I'm curious about the displays you've used before the FW900. Have you used decent LCDs and what are your thoughts about the picture quality of the FW900 in comparison?

Actually, not at all... I was expecting the gamma to be a bit higher than what I got.

As for the monitors I've used before this one: I had two CRTs. The first was a really old 15", I don't remember which model. It died one day and I replaced it with a Hyundai Q770, which had a flat screen surface and really good image quality; the tube was a bit curved inside as far as I remember, and it was sadly only 60Hz. After that I had various common 17" and 22" VGA LCDs, nothing fancy.
In recent years I've been using an LG W2453V-PF, which I gave to a friend when I bought a used Acer GD245HQ just for the amazing refresh rate. But that Acer had a slightly weird yellow tint. Now I own a BenQ XL2411T alongside my Sony GDM-FW900.

So as you can see, I've only owned TN panels, which are not the best out there for image quality. Something that could really be compared to a high-end CRT would be an IPS panel, though I'm not saying IPS is better than CRT.
I'm not in a position to make a proper comparison, but I can say one thing for sure: between a TN panel and this CRT, the difference in quality is quite noticeable.

And of course there's the matter of blacks. After using a CRT, you really understand what it means to have a backlit display. If we can get our hands on OLED displays anytime soon, I think we will have something to compete with CRTs, in terms of black level at least and probably image quality as well.

What I can say, on the other hand, is that LCDs are sharper, and that's of course because of the technology itself. But when you use a 24" LCD at 1920x1080, or even a smaller display like a 17" laptop at the same resolution, you can clearly see the pixels.
That's something you can't see on the FW900 at 1920x1200. The image may look less sharp, but it feels a lot smoother, and that's something I really appreciate. Some people might think it's blurry, but it's not, at least when everything is set properly.
Some of my wallpapers look awesome on the FW900, and when for some reason I switch to my LCD, it changes everything. Now, if you have high-resolution images, the software you use handles high DPI, and you own one of those "retina" high-DPI displays, on a phone or tablet for example, you won't see the pixels, and the image will be totally smooth and precise.

Now, of course, something truly amazing about CRTs is the way they handle any resolution. If you play a game at a lower resolution, any fonts displayed, or even the crosshair if you're playing an FPS, will look blurry on an LCD. On the CRT you won't even notice the difference, but you will get a better framerate, for example. And the lower the resolution, the higher the refresh rate you can run.
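Roughly speaking, the only hard constraint there is the horizontal scan rate; the figures below are my own back-of-the-envelope numbers, with only the 121kHz limit coming from the FW900 spec:

```latex
f_h \approx f_v \times V_\text{total} \le 121\ \text{kHz}
\quad\Longrightarrow\quad
f_{v,\max} \approx \frac{121\,000}{V_\text{total}}
```

So at 1920x1200 (around 1250 total lines including blanking) you top out in the mid-90s of Hz, while something like 640x480 (around 525 total lines) would in principle allow well over 200Hz; in practice the 160Hz vertical limit stops you long before the scan rate does.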

To conclude: in terms of image quality, LCD monitors look good, especially IPS panels, but a CRT monitor will look better, I suppose.
In terms of sharpness, it's a matter of taste: a CRT looks smoother, but an LCD is best for a sharper image, especially high-DPI ones.
In terms of flexibility, a CRT outperforms an LCD, and probably any newer display technology.
In terms of response time as well, a CRT will be better than any LCD, and if you want a good response time with an LCD, you will sacrifice image quality (IPS panels can't reach low response times).
 
Like I said before, I just put the hood back on my FW900, and I took a last couple of pics beforehand.

DSC02420.JPG


I managed to take a photo of the cathode heater glowing, doing its job. But I wasn't crazy enough to remove the back part of the cage to take a better photo.
There are high voltages around the board attached to the tube neck. Maybe one day. :D

DSC02414.JPG
 
great job
thanks for all the pics

Thank you! ;)

I've been playing around with some old hardware, and I just noticed that at really low resolutions, like text mode or 640x480 in CS 1.6, I still have some convergence issues in some spots... :(

DSC02425.JPG


I'll try to correct this tomorrow or later this week. Do you guys have any advice on how to adjust convergence for low resolutions? I'm asking because I was sure about the settings I made at 1920x1200, so maybe I missed some small details in some spots, but it could be something else...
I know, for example, that when you adjust alignment you have to do it at different resolutions, but when adjusting convergence, WinDAS asks only for Mode 5.
 
Actually, not at all... I was expecting the gamma to be a bit higher than what I got.

You should definitely try out the Argyll step - it'll flatten out your gamma to a virtually perfect 2.4. As it stands right now, you're not too far off, but you're crushing the lower levels relative to the way a lot of content is mastered. Once you've created the LUT via Argyll, you can create a shortcut on your desktop to reset the LUT and another one to load it. That way you can quickly switch between them and see the difference in real time.
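In case it helps, here's a minimal sketch of what those two shortcuts could point to, assuming Argyll's dispwin tool is on the PATH; the calibration file name is just a placeholder for whatever you end up creating:

```python
# Minimal sketch: toggle the video card LUT with ArgyllCMS's dispwin.
# "fw900.cal" is a placeholder name for your actual calibration file.
import subprocess
import sys

def load_lut(cal_file="fw900.cal"):
    # Load the calibration curves from the .cal file into the video LUT.
    subprocess.run(["dispwin", cal_file], check=True)

def reset_lut():
    # -c clears the video LUT back to linear (no calibration).
    subprocess.run(["dispwin", "-c"], check=True)

if __name__ == "__main__":
    if len(sys.argv) > 1 and sys.argv[1] == "reset":
        reset_lut()
    else:
        load_lut()
```

Point one shortcut at "python lutswitch.py" and the other at "python lutswitch.py reset" (script name made up, obviously), or just call dispwin directly from the shortcuts.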

The fact that your gamma wasn't sky high means your G2 isn't super low, but I'm sure the blacks are more than satisfactory :)


So as you can see, I've only owned TN panels, which are not the best out there for image quality. Something that could really be compared to a high-end CRT would be an IPS panel, though I'm not saying IPS is better than CRT.
I'm not in a position to make a proper comparison, but I can say one thing for sure: between a TN panel and this CRT, the difference in quality is quite noticeable.

thanks for the thoughtful reply.

And nice photo of the glowing cathode heater :)

protip: leave the WinDAS cable attached to the monitor even when you put the case back on. Then you won't need to re-attach it next time you calibrate.
 
Sadly, mine goes completely out of focus (unsharp) from one moment to the next. It fixes itself when I turn it off and on again, but now it happens several times a day. Not a good sign.

I'm sure this has been described here before. Was it the flyback?

Very possibly a bad FBT (flyback transformer)... but other HV components may be faulty as well. Hard to tell without running a full diagnosis on the unit.

UV!
 
You should definitely try out the Argyll step - it'll flatten out your gamma to a virtually perfect 2.4.

Ok, I'll do that this afternoon.

protip: leave the WinDAS cable attached to the monitor even when you put the case back on. Then you won't need to re-attach it next time you calibrate.

When I saw the slight convergence issue, I realised I should have. :D
 
Moar photos. :p

I just decided to try my really old Atari 1040ST on my FW900, and took it out of storage this morning. It looks amazing on that huge monitor!

DSC02427.JPG


So this is the monster I'm talking about.

DSC02428.JPG


And you probably can't see it in the pics, but the ghosting is really noticeable on some patterns, and that's of course because of the way I hooked it up. :D
EDIT: Now the ghosting issue is solved.

DSC02433.JPG


This thing does not have a standard VGA connector, but a round one carrying either monochrome or RGB signals, plus an audio input and output. I only hooked up the monochrome signal, since it gives a higher resolution and was easier to wire.
I'll try to build a better adapter later, supporting both monochrome and RGB signals, and audio as well.

DSC02432.JPG


EDIT: I had a lot more cables around before (second pic), because of an older design. But I simplified the thing, and it pretty much eliminated the ghosting. So I can clearly see here how a cheap cable impacts the image quality. I updated the last 2 pics.
 
Thanks for your great posts and pics etienne51, I enjoy reading them :)
This really makes me want to start fiddling with my FW900 again, but I can't seem to find any good deals on eBay for a DTP94, I can't run 1920x1200@85Hz right now anyway, and it's all just so time consuming. I did the WinDAS convergence a year or so ago and haven't removed the cable since :D

Is there some... "amateur" way of getting at least a halfway decent gamma/white level without a meter? My LCD is decently set up with an .ICC profile; my Sony is just way off >.<
 
uh basically just fiddle around with the color controls in expert mode until
1. white looks white
2. you don't see any tint in a gradient: http://www.lagom.nl/lcd-test/gradient.php
3. adjust brightness until it roughly looks correct. black level should be significantly darker than your lcd's but dark greys shouldn't be crushed.
 
So I was at the lab a couple of days ago hunting around for a piece of equipment, and came across an old SGI 24 inch CRT. It had a curved screen, and I think it's the SGI rebrand of the Sony GDM-W900. I'll double check the model number tomorrow, but I was certainly not expecting to see it!

In other news, I spent many hours over the last few days linearizing the gamma of our VIEWPixx display, adjusting the LUT values for each of the 256 levels. Tedious work, but it's virtually perfectly linear now. Too bad I couldn't use more bits in the workflow - ended up having to crush a lot of video levels together.
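For anyone curious, this is roughly the kind of calculation involved; a minimal sketch with made-up measurement data, not the actual VIEWPixx workflow:

```python
# Sketch of building a linearizing LUT from measured luminance,
# assuming one luminance measurement per 8-bit input level.
import numpy as np

# measured[i] = luminance the display outputs for input level i.
# Placeholder data: a display following roughly a 2.2 gamma curve.
levels = np.arange(256)
measured = 100.0 * (levels / 255.0) ** 2.2

# Target: luminance proportional to input level (a linear response).
target = np.linspace(measured[0], measured[-1], 256)

# For each target luminance, find the input level that actually produces it;
# np.interp inverts the (monotonic) measured curve by interpolation.
lut = np.interp(target, measured, levels)

# Quantizing back to 8 bits is where video levels get crushed together;
# more LUT/DAC bits would avoid that.
lut_8bit = np.round(lut).astype(np.uint8)
print(len(np.unique(lut_8bit)), "distinct output levels remain out of 256")
```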
 
So I was at the lab a couple of days ago hunting around for a piece of equipment, and came across an old SGI 24 inch CRT. It had a curved screen, and I think it's the SGI rebrand of the Sony GDM-W900. I'll double check the model number tomorrow, but I was certainly not expecting to see it!

In other news, I spent many hours over the last few days linearizing the gamma of our VIEWPixx display, adjusting the LUT values for each of the 256 levels. Tedious work, but it's virtually perfectly linear now. Too bad I couldn't use more bits in the workflow - ended up having to crush a lot of video levels together.

DUDE! GDM-W900! :) Power that bad boy on, right now! :D
 
hehe I might ask the guy who runs that lab if he wants to part with it and leave it in good hands. Though getting that puppy home would be an absolute monster of a time - I might wait until a friend can drive me home. But even then, I'm running out of room to store my CRTs :p
 
Sorry if this question has been posted already. I'm freaking out a little right now.

I just ordered a 290X and found out it only has DVI-D, which does not support my legendary Sony CRT :(
Is there any way I can get a new video card like a 290X or 980 and still run my CRT?

What is the best way to do this? I don't want to introduce any input lag, since I still use the display for competitive twitch FPS gaming.
 
Is there any way I can get a new video card like a 290X or 980 and still run my CRT?

As far as I know, the GTX 980 still has a DVI-I port, meaning it outputs both analog and digital signals.
I'm not really an ATI/AMD fan, but I have an R9 290 I'll sell soon, and indeed, it does not have the required pins for analog output.
I switched to a GTX 680 in the meantime, until I decide to buy a newer graphics card. I hope the next generation (GTX 1080?) will still have a DVI-I output.
 
Sorry if this question has been posted already. I'm freaking out a little right now.

I just ordered a 290X and found out it only has DVI-D, which does not support my legendary Sony CRT :(
Is there any way I can get a new video card like a 290X or 980 and still run my CRT?

What is the best way to do this? I don't want to introduce any input lag, since I still use the display for competitive twitch FPS gaming.

The Titan X, GTX 970, GTX 960, and GTX 980 all support analog output without the use of active adapters. The 290X does not. To use a proper monitor with a 290X, you will need to buy an active adapter. Do note that even the best active adapters will limit your monitor to 1600x1000@80Hz or 1440x900@96Hz. The best option here is to return that graphics card and purchase a GTX 970, GTX 980, or Titan X.
 
Does the FW900 give an out of range error anywhere above exactly 121kHz, or does the firmware allow slightly higher scan rates (e.g. 122-124kHz)? Would it be possible to tweak the firmware somehow to allow maybe 127kHz, enough for 85Hz at 2304x1440, 96Hz at 2048x1280, or 100Hz at 1920x1200? Those few extra kHz might not sound like much, but they would sure come in handy! I had an older CRT that was rated at 121kHz but allowed me to use modes up to 125kHz. Also, is it possible to override vertical refresh rate caps by modifying the firmware? My current monitor did not come with any vertical refresh rate limit and does 344Hz, and I wouldn't want to drop from that to 160Hz.
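Back-of-the-envelope, the horizontal scan rate is just the refresh rate times the total number of lines (active plus vertical blanking), so you can estimate which modes fit under 121kHz; a rough sketch assuming around 4.5% vertical blanking, which will vary with the exact timings you use:

```python
# Rough estimate of horizontal scan rate for a few candidate modes.
# Assumes ~4.5% vertical blanking; real CVT/GTF timings will differ a bit.
VBLANK_FRACTION = 0.045
SPEC_LIMIT_KHZ = 121.0

modes = [
    (2304, 1440, 80),
    (2304, 1440, 85),
    (2048, 1280, 96),
    (1920, 1200, 96),
    (1920, 1200, 100),
]

for width, active_lines, refresh_hz in modes:
    total_lines = active_lines * (1 + VBLANK_FRACTION)
    hscan_khz = refresh_hz * total_lines / 1000.0
    status = "within spec" if hscan_khz <= SPEC_LIMIT_KHZ else "over spec"
    print(f"{width}x{active_lines}@{refresh_hz}Hz -> ~{hscan_khz:.1f} kHz ({status})")
```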
 
It will display an out of range message if you set anything past the limit. Anyway, I'm sure it wouldn't be a good idea to do that even if it were possible; it would probably shorten the life of the electronic components.

You can do 1920x1200@96Hz if you want, I know someone here uses those settings, but it's probably safer to run at 85Hz, and the image will be sharper.
At 2304x1440, the maximum is 80Hz; there is no extra room, at least not that I know of.
 
I'm sure it wouldn't be a good idea to do that even if it were possible; it would probably shorten the life of the electronic components.

There is no reason that bypassing the vertical refresh rate cap would cause damage. I'm not sure about the effect of bypassing the horizontal scan rate cap, but I don't think it would cause problems.
 
Some monitors will run at higher refresh rates, some won't. But it is not recommended to do so. You will shorten the life of the components.
 
You will shorten the life of the components.
you shorten the life of the components every minute you use the monitor.

the real question/issue is how much more you shorten its life when running at high scan rates than at normal resolutions. unfortunately no one seems to know ;p
 
I know a guy who might know the answer. I'm hoping he'll be at a conference I'm going to this May. If I get the chance, I'll ask him.
 
I know a guy who might know the answer. I'm hoping he'll be at a conference I'm going to this May. If I get the chance, I'll ask him.

Awesome, please do that. I've been running crazy resolutions on all my CRTs and haven't seen any performance issues, but I still wonder how much I'm shortening their life.
 
Talking about refresh rates, here's something I found, and I want to know whether it's true or not. The part in italics is important:

In the past I have used good CRTs - GDM-F400s followed by GDM-F520s. Now I am using an NEC 2490.

First, the absolutely perfect geometry and focus of LCD go a long, long way toward comfort for me. I use a PC for long hours for many things that aren't gaming or movies - pixel perfect text is a big part of comfort for me.

Second, you cannot directly compare "refresh rate" on a CRT and an LCD. On a CRT the screen is constantly being redrawn, locked to the analog signal at the vertical refresh rate. Only phosphor persistence makes it look like a full picture is on the screen. This can clearly be seen in photographs of CRTs; there's usually only a small band of picture visible in the photo. This means that CRTs will appear to flicker at refresh rates below 70Hz or so. Everyone is different and has different sensitivities to this, but you can really pick up on the "scanning" as speeds go down. LCDs accept a (usually) 60Hz framerate and change screen content 60 times per second, but generally redraw from the buffer much faster than 1/60 of a second. I could easily detect 60Hz refresh on a CRT but cannot see any flicker on an LCD. Generally I would be comfortable using a CRT at 70Hz or more. Anything above that is wasting bandwidth, which leads to lower image quality (more below).

Third, CRTs are analog devices. The entire image circuit path has a certain bandwidth/frequency response. If you increase any frequencies (higher resolution, faster refresh or both) you are consuming more frequency bandwidth and pushing all of the analog circuitry harder. I could clearly demonstrate how even a top-tier analog display like the F520 (although to a lesser extent than lower grade displays) would get blurrier as frequencies increased. I could set XWindows up with 60 interlaced, 60 Hz, 70 Hz, 75Hz, 80Hz, 85Hz, 90Hz set up in round robin and use the CTRL+SHIFT+NUM + (or whatever it was) to rapidly toggle these modes and you would see more blur and less crispness at each step. Going to 5BNC cables helps but cannot change the laws of physics. This is why I stopped at 70Hz - maximum sharpness with no eye strain or interference beating effect with household/office lighting.

Running ridiculous refresh rates may gain you e-peen, or be seen as "essential" for "professional" gaming, or even be necessary for certain people to relieve eye strain, but there is a trade off in the analog world for every increase in frequency. It might not show in movies or games, but it's there.


Since I have a great LCD, I prefer LCD to CRT at this point. As mentioned, the perfect sharpness in all conditions (plus the reduced space, power, heat, and added widescreen 16:10) make LCD a winner for me and my mixed use. I was looking for an FW900 for a while - I'm glad I went this route instead.


http://hardforum.com/showpost.php?p=1037286001&postcount=7

Does higher ref. rate result in worse image quality?



Also, another question: Wikipedia says that "On smaller CRT monitors (up to about 15"), few people notice any discomfort between 60–72 Hz. On larger CRT monitors (17" or larger), most people experience mild discomfort unless the refresh is set to 72 Hz or higher."

Do larger monitors require higher refresh rates than smaller ones?
 
yea I dunno about that claim regarding everything above 70Hz being a waste. Easy enough to test with my macrophotography setup; I'll include that in the tests.

About the different sized CRTs, I imagine it has something to do with our peripheral vision being more sensitive to flicker. Larger displays subtend a wider visual angle.
 
Does higher ref. rate result in worse image quality?

I said something about that earlier, about the picture becoming less sharp when increasing refresh rates.

You can do 1920x1200@96Hz if you want, I know someone here uses those settings, but it's probably safer to run at 85Hz, and the image will be sharper.

It was an impression I had when playing around with different resolutions in ToastyX CRU. I wasn't sure, and I even thought it was just me. But it seems to be true after all.
 
I don't see any difference in sharpness on my CRT between running at 60Hz and even 344Hz. The only reason a higher refresh rate would be less sharp would be if you are exceeding the maximum bandwidth of the cables and the picture is degrading.
 
I don't see any difference in sharpness on my CRT between running at 60Hz and even 344Hz. The only reason a higher refresh rate would be less sharp would be if you are exceeding the maximum bandwidth of the cables and the picture is degrading.

I don't think this is true; every CRT I've owned had reduced sharpness at higher refresh rates, even when it was one day old. I used high quality BNC cables on my FW900 and still preferred the sharpness of 75/85Hz. I preferred gaming with Vsync enabled anyway, so lower refresh rates worked out better anyhow.
 
I can't perceive a difference between 1200p 85hz and 1200p 96hz, using a run of the mill VGA cable.

However, thinking back many years, I do remember my old 17" looking noticeably softer at 1024x768 85Hz compared to 1024x768 75Hz. So I guess it's relative, depending on the settings, set of eyes, cable, and monitor.
 
I don't see any difference in sharpness on my CRT between running at 60Hz and even 344Hz. The only reason a higher refresh rate would be less sharp would be if you are exceeding the maximum bandwidth of the cables and the picture is degrading.

But what resolution do you have to run to get 344? I would imagine that the image would be nothing but scanlines at the resolution required to view that refresh. That's why you don't notice a difference. Raise the refresh at higher resolutions and you WILL notice reduced sharpness.

In other words, I think the difference is there, but since you're running so low a resolution to achieve that refresh rate, you won't notice it.
 
i think it's pretty obvious that horizontal sharpness drops as pixel clock/horizontal scan rate increases

for 344hz he was running at like 320x240 or lower
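just to put rough numbers on it, here's a sketch of the per-pixel time at a couple of modes; the timing totals are ballpark guesses, and the point is only that each pixel gets a few nanoseconds at high pixel clocks, so the video amplifier has less time to swing fully between adjacent pixels:

```python
# Ballpark pixel clock and per-pixel time; totals include rough blanking guesses.
modes = [
    # (label, horizontal total, vertical total, refresh Hz) - approximate timings
    ("320x240@344Hz",   400,  260, 344),
    ("1920x1200@85Hz", 2560, 1250,  85),
    ("1920x1200@96Hz", 2560, 1250,  96),
]

for label, htotal, vtotal, refresh in modes:
    pixel_clock_hz = htotal * vtotal * refresh
    pixel_time_ns = 1e9 / pixel_clock_hz
    print(f"{label}: ~{pixel_clock_hz / 1e6:.0f} MHz pixel clock, "
          f"~{pixel_time_ns:.1f} ns per pixel")
```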
 
I do use 1920x1200 @ 96Hz; the horizontal frequency is 121.6 kHz. Of course it will result in a shorter lifespan, but 85Hz is really disturbing to my vision. Unfortunately, my eyes are very sensitive to flickering. For the last 15 years or more I've been using CRTs at 100Hz refresh rates (first at 1024x768, then 1280x960, and lately 1600x1200), and now 1920x1200 @ 96Hz, which is fine. Anything below 90Hz is noticeable to my eyes, even 88Hz. I hope my unit will be the lucky one to survive a few more years until OLEDs arrive.

Oh, and AT LAST my Optix XR is arriving tomorrow. It took 3 weeks for the dude to send it to me.
 
The Optix XR is finally here. So far I've found a weird bug: if my monitor's color temperature is around 7200-9000K, I can easily take measurements (by the way, I found my eye to be pretty spot on, RGB are around 98-101% and gamma is always between 2.1 and 2.3), but beyond these values I get an "encountered improper value" error. The only solution I've found is to try another HCFR version. I hope the XR is not faulty.
 