24" Widescreen CRT (FW900) from eBay Arrived: Comments

Hello, my name is Tristan Kamp and I live in Denmark! I was lucky to acquire a Sony Trinitron GDM-FW900 FOR FREE last week.
I plugged it into my laptop, an MSI GE70 Apache, and it instantly loaded the screen in HD (1000-something x something). After cutting 3 pins in the VGA cable (so Windows would stop whining and throttling my output) I got full HD at 60 Hz, letting me stream the best of YouTube in glorious resolution with no lag!
I then tried the higher refresh rates at lower resolutions, and was amazed.
I would never have believed a CRT could beat an LCD if I hadn't seen it myself (I'm 21).
It was so bright I had to turn gamma down to 0.4 and brightness to minimum +1.
I went to humanbenchmark.com and got a 177 millisecond reaction time, number 34 on the leaderboard. (Apparently I'm not on the leaderboard anymore.)
However, my efforts soon proved to be hubris when I tried to roll back the driver for my Nvidia GTX 850M to allow larger resolutions (the screen was doing fine the whole time). The installation of the older driver failed and bricked the GTX 850M inside my laptop. MSI turned against me like Stuxnet: the normal Nvidia drivers don't work, only MSI's own versions, and that one didn't work either. As a last spite, when I tried installing Ubuntu, I found out my computer has UEFI boot, and that ended the night.

At the start I wanted to sell it, but couldn't find any on eBay. In Denmark a guy had a post up for an entire year on DBA, the go-to place in Denmark for used items; however, when I called him, he had just acquired one last week.
What is a FW900 in near-perfect working condition worth? I know the tubes wear down over time, but this one is amazingly bright even at the lowest settings (it hurts my eyes badly if I use anything else, even at 120 Hz).
 
Cool, could you forward me that email? I'd like to update mine too and see if it updates the EDID to show 1080i (which is good for my TV tuner).
This is the link they gave me for the firmware: http://www.filedropper.com/nanogx_1

The email said to reflash it, and that was it. No further info. I looked on their forum and they have instructions for flashing their other stuff, but I guess the Nano GX got left out.

Anyway, you have to install the Silabs driver before flashing (4.0.0 on Windows 10). Then use the exe in the folder labeled 'original' and pick Update Firmware from the drop-down menu. If it worked, it should say OK then REBOOT, and it's done. I had to use a laptop to flash it because for some reason it would error out on my desktop.

No idea what other changes are in there, but it worked for me.
 
Thanks, I'll give it a try. I could have added 1080i to the Nano via PowerStrip, but it's a somewhat complicated process. Hopefully this firmware does it for me.
 
I'm going to try hooking my FW900 up to a friend's desktop tonight and take some pictures with a high-quality camera. Hopefully you can help tell me whether it needs calibration, fixing, selling, or scrapping.
 
If anyone is in Boston or near the NH/MA border, I have a FW900 that I don't want anymore. It's free, you just have to come get it.

It's been sitting in my closet. It's in good physical condition; IQ and colors are good. I did convergence from WinDAS and also removed the AG film. I have DVI/VGA and BNC cables and a service cable that I'll include.
 

good luck
 
As an Amazon Associate, HardForum may earn from qualifying purchases.
That stinks. I've had to use my Nano GX to get HD video to play from Amazon Prime, because they will only play SD if they detect a CRT monitor or other non-HDCP display. If Sony blacklists the Nano, I'm guessing it's only a matter of time before the rest of the services do the same.

It really feels like the media companies are just trying to force us to buy new monitors, because I highly doubt pirates convert movies to analog to rip them. I'm sure they have better ways.

You could just go to Nassau for the movies. If the studios don't want to sell them without DRM, screw 'em! The pirates have films in high definition that work on any device.
 
This is my adapter. It actually does 236 MHz. It seems to be finicky with the timings; it may need some fiddling with the blanking to make it work.

It does 1600x1200 at 85 Hz; I use it on my GTX 980 as the quality is so much better. I prefer a clean 1600x1200 at 85 Hz over a junked-up 1600x1200 at 100 Hz or 2048x1536 at 85.


Amazon.com: Vcom 6-Inch UltraAV DisplayPort M to VGA F Adapter, White (CG603-6INCH-WHITE): Computers & Accessories

Thanks for the link.

I'm in Europe, so shipping that adapter costs a bit. I'll wait for someone to try it with a FW900.

Could you try running it with a custom res of 1600x1000 @ 96 Hz?
 
It didn't work at 100 Hz or 90 Hz, but I didn't fool with the timings much. It does 200 Hz+ at low res (800x600).
 
It didn't work at 100 Hz or 90 Hz, but I didn't fool with the timings much. It does 200 Hz+ at low res (800x600).

Could you try 1600x1000 @ 96 Hz (200 pixels less in height), please? It uses less bandwidth than 1600x1200 @ 85 Hz. If it works you'd make me a very happy guy :)
 
I bought a card with DVI-I; 1920x1200 at 85 Hz with vsync on is just incredibly smooth, especially for shooters.

Nvidia I hope (so you can adjust the LUT with 10 bit precision).

P.S. For single-player games I like vsync on, but for competitive gaming it's always off (and I play at lower resolutions so I can get even higher framerates and refresh rates).
 
Hello, my name is Tristan Kamp and I live in Denmark! I was lucky to acquire a Sony Trinitron GDM-FW900 FOR FREE last week.
I plugged it into my laptop, an MSI GE70 Apache, and it instantly loaded the screen in HD (1000-something x something). After cutting 3 pins in the VGA cable (so Windows would stop whining and throttling my output) I got full HD at 60 Hz, letting me stream the best of YouTube in glorious resolution with no lag!
I then tried the higher refresh rates at lower resolutions, and was amazed.
I would never have believed a CRT could beat an LCD if I hadn't seen it myself (I'm 21).
It was so bright I had to turn gamma down to 0.4 and brightness to minimum +1.
I went to humanbenchmark.com and got a 177 millisecond reaction time, number 34 on the leaderboard. (Apparently I'm not on the leaderboard anymore.)
However, my efforts soon proved to be hubris when I tried to roll back the driver for my Nvidia GTX 850M to allow larger resolutions (the screen was doing fine the whole time). The installation of the older driver failed and bricked the GTX 850M inside my laptop. MSI turned against me like Stuxnet: the normal Nvidia drivers don't work, only MSI's own versions, and that one didn't work either. As a last spite, when I tried installing Ubuntu, I found out my computer has UEFI boot, and that ended the night.

At the start I wanted to sell it, but couldn't find any on eBay. In Denmark a guy had a post up for an entire year on DBA, the go-to place in Denmark for used items; however, when I called him, he had just acquired one last week.
What is a FW900 in near-perfect working condition worth? I know the tubes wear down over time, but this one is amazingly bright even at the lowest settings (it hurts my eyes badly if I use anything else, even at 120 Hz).

I think the Nvidia drivers had some issues with laptop displays where they would corrupt the EDID or something. You may want to look into that. No idea if there was a fix.

As far as the value of the monitor goes, one in good condition could be worth at least $500 US. I assume people in Denmark would be able to pay as much.
 
Nvidia I hope (so you can adjust the LUT with 10 bit precision).

P.S. For single-player games I like vsync on, but for competitive gaming it's always off (and I play at lower resolutions so I can get even higher framerates and refresh rates).

Yeah, went from a 290 to a 980.

With vsync off I have some minor tearing; with it on I don't get any dips in FPS and it's buttery smooth. I have yet to play around with lower res and high refresh rates.

Anyone here know of any solution for blurry corners? My bottom-right corner is pretty blurry; I might have to swap in a different flyback to see if that fixes it.

Edit: Never mind, I just found a happy medium between the focus in the center and the corners.

Benefits of running the FW900 with the back cover off: you can get at the focus controls easily, and it runs cooler ;)
 
Yeah, vsync on is buttery smooth, but it adds input lag. That's why for competitive Quaking I have it off. It doesn't bug me at all in single-player games, though.
 
Some games have really good vsync implementations that don't add significant input lag, like Battlefield 4; I play that vsynced at 85 Hz. I also found in the past that adding "Gametime.MaxVariableFPS 85" caps the framerate on the CPU at 85 and seems to close the vsync lag gap even further.
 
So the adapter that rabidz was raving about arrives tomorrow. If it's all that and a bag of chips, then I will finally upgrade to a new AMD card.
 
Thanks so much for the reply, @aeliusg
I have yet to upload images, but mine is stunning: 1280x768 at 120 Hz, ultra clear. I would love to acquire more of these exemplary pieces of craftsmanship (after having used this one for a week).
My setup is still just a laptop with a GTX 850M, running over the VGA cable I cut pins off of. By cutting off the right ones I was able to bypass all of Nvidia's security and other bullshit (Windows 8.1) and run full HD at 120 Hz through VGA, straight from the laptop.
It's an amazing piece of technology. I need more of them.
Any idea how long these last? Mine seems to be running like new even though it's one of the white ones (not the later blue version). And if it lasted 16 years... then it might last 16 more? What actually causes them to fail? I read that most parts can be replaced: capacitors, resistors, etc.
 
Thanks so much for the reply, @aeliusg
I have yet to upload images, but mine is stunning: 1280x768 at 120 Hz, ultra clear. I would love to acquire more of these exemplary pieces of craftsmanship (after having used this one for a week).
My setup is still just a laptop with a GTX 850M, running over the VGA cable I cut pins off of. By cutting off the right ones I was able to bypass all of Nvidia's security and other bullshit (Windows 8.1) and run full HD at 120 Hz through VGA, straight from the laptop.
It's an amazing piece of technology. I need more of them.
Any idea how long these last? Mine seems to be running like new even though it's one of the white ones (not the later blue version). And if it lasted 16 years... then it might last 16 more? What actually causes them to fail? I read that most parts can be replaced: capacitors, resistors, etc.
Well, some common failure modes involve the flyback transformer, from what it seems. I'd guess that power cycling, and the fatigue from the thermal stress it causes, is the main issue there. You can best maintain the life of the tube itself by doing the white point calibration procedure found elsewhere in this thread (the user spacediver made a good guide for it). In any case, if yours hasn't failed in 16 years of use, I think it's going to be in it for the long haul.

But, yes, I think most of us here would give at least a limb or two for a warehouse full of these in mint condition. :p
 
Well, some common failure modes involve the flyback transformer, from what it seems. I'd guess that power cycling, and the fatigue from the thermal stress it causes, is the main issue there. You can best maintain the life of the tube itself by doing the white point calibration procedure found elsewhere in this thread (the user spacediver made a good guide for it). In any case, if yours hasn't failed in 16 years of use, I think it's going to be in it for the long haul.

But, yes, I think most of us here would give at least a limb or two for a warehouse full of these in mint condition. :p

This and every other high-end CRT. In the projector realm, there are some diehards who have modified high-end 9-inch sets to do 4K. Whether or not the projectors can actually resolve it is another matter, but there's going to be a CRT projector meet in the fall, and they're squaring off a heavily modified CRT projector against a top-end 4K LCOS projector. Good times indeed.

Just goes to show that if CRT was still in development, it would most likely still be the king of display technologies.
 
This and every other high-end CRT. In the projector realm, there are some diehards who have modified high-end 9-inch sets to do 4K. Whether or not the projectors can actually resolve it is another matter, but there's going to be a CRT projector meet in the fall, and they're squaring off a heavily modified CRT projector against a top-end 4K LCOS projector. Good times indeed.

Just goes to show that if CRT was still in development, it would most likely still be the king of display technologies.
Very cool, I hope they publish some pictures. To think that in the future the art of CRTs might even be lost to us. What happens when the engineers who've worked on this stuff all go the way of the dodo? Sad, indeed.
 
Very cool, I hope they publish some pictures. To think that in the future the art of CRTs might even be lost to us. What happens when the engineers who've worked on this stuff all go the way of the dodo? Sad, indeed.

I hope they post some pictures too. Word on the street is that the CRT picture still wins against the 4K LCOS. There were some serious modifications done to the set, though. I want to say (going off the top of my head here) that it's a heavily modified Marquee 9500 set. Those 9-inch CRTs really are something; seeing a Sony G90 in person was just awe-inspiring. Eventually, if they're still around by the time my G70 conks out, I'd love to get my hands on one with good tubes.

In other news, the adapter arrived a day early. Preliminary opinions on the Artisan monitor (GDM-C520K):

+ Very sharp RAMDAC. Rabidz was right, this thing is sharp. I don't think I've ever seen my Artisan this sharp on my computer before. I may need to check against my pattern generator, because I think this is even sharper. I suspect it's because of the reduced CVT blanking option the adapter seems to set my Nvidia card to by default.
+ I was able to scan up to 1920x1440 at 70 Hz. Not great, but not bad either.

- No HDCP. Bummer. Can't watch Blu-rays using it.
- Could not get it to scan 1920x1200 at 85 Hz either. Another bummer. Reduced blanking mode puts 1920x1200 at 85 Hz right under 225 MHz, but the Artisan doesn't like it. I don't have a FW900, so I can't test that.

For people with an Artisan or below, I can say that this adapter will cut the mustard. I could get it to sync to 1600x1200 85hz no problem. I played a little bit of Quake 3 and didn't detect any input lag. So I can safely say that for $25 on Amazon, you could do a lot worse.

That said, I don't think this is a real option for those with FW900s or F520s, or if you really want to get some good high res out of your CRT.

Bottom line is that I'm not returning this adapter. I'm glad I can squeeze more life out of my Artisan. I may try 1920x1440 at 70 Hz on the F520 and see how that does; it's what I did on my old GTX 560. But honestly, I think we're better off waiting for a new HDFury, or we need to get hold of some video DACs and learn to build our own adapter.

And before anyone asks "what do the CRT projector guys use?": awesome HDMI adapters, that's what. :) Too bad we can't have someone like Moome make some video boards that fit the GDM monitors. I'll bet the architecture is similar enough to the projectors that it's possible, but at great expense, I would assume. Alright, well that's my report on the adapter. Good enough to do Prime Mode on the 21-inchers. Very sharp as well, though see my comments on reduced CVT blanking; I'm very sure that's how it's achieving it.
 
thanks for the detailed reply! Looking forward to the results of that projector meet :)

Re the adapter: would it be possible to achieve the same effect by adjusting the timings in the custom resolution panel in the Nvidia control panel?
 
Can you shed more light on that? I remember it was mentioned here before, but I can't find any trace.

In the WinDAS WPB guide, the last step (adjusting the gamma) requires a 10 bit DAC in order to preserve 256 distinct luminances.

Here's a figure I made to illustrate, which shows the results of simulating a linearization (gamma = 1.0) of a display that has a default gamma of 2.2. With 8 bits, the number of distinct steps after linearization drops from 256 to 184. It wouldn't be that severe when the target gamma is 2.4, but the principle is the same. With a 10 bit DAC, you have 1024 distinct luminance levels from which to choose your 256 working levels, and thus have more flexibility when tweaking gamma without compromising quality (having less than 256 distinct luminances means "brightness gradients" won't be as smooth).

[Attached figure: 230b2g.png]
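The quantization effect described above is easy to reproduce. This is a minimal sketch of my own (not the actual script behind the figure, and the exact count depends on rounding details): it asks for 256 evenly spaced target luminances from a display with native gamma 2.2 and counts how many distinct DAC codes survive at 8-bit versus 10-bit precision.

```python
# Count distinct DAC codes when linearizing (target gamma 1.0) a display
# whose native response is gamma 2.2, for a given DAC bit depth.
# Sketch only; the exact 8-bit count depends on rounding conventions.

def distinct_levels(dac_bits, display_gamma=2.2):
    max_code = 2 ** dac_bits - 1
    codes = set()
    for i in range(256):
        target_luminance = i / 255  # the linear ramp we want to display
        # DAC code whose displayed luminance best matches the target
        codes.add(round(max_code * target_luminance ** (1 / display_gamma)))
    return len(codes)

print(distinct_levels(8))   # collisions: well under 256 distinct levels
print(distinct_levels(10))  # all 256 targets map to distinct codes
```

With 8 bits, many bright-end targets collapse onto the same code (the transfer curve's slope drops below one code per step), which is exactly the loss of distinct luminances the figure illustrates; with 10 bits there is enough headroom for all 256.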
 
Re the adapter: would it be possible to achieve the same effect by adjusting the timings in the custom resolution panel in the Nvidia control panel?

Yes. Actually, I should be more specific: that's how I did it in the first place. When you first hook it up, the adapter advertises a max res of 1280x1024 at 85 Hz. When I created custom resolutions, the video card selected the reduced blanking option by default. I've never seen it do that before, so I figured it was something the adapter is doing. With standard timings, 225 MHz doesn't really get you a lot.

I plan on doing other tests too, because according to the Nvidia custom resolution panel, 1920x1440 at 70 Hz exceeds 225 MHz; it was 236, I think? This was with automatic CVT mode selected. For some reason the Artisan doesn't like the reduced blanks, though 1600x1200 at 85 Hz worked well enough. I don't know... this stuff feels more like a crapshoot sometimes. :D
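For anyone wanting to sanity-check these pixel clocks, the reduced-blanking math can be sketched in a few lines. This is a simplified take on the CVT-RB v1 formula (fixed 160-pixel horizontal blanking, 460 µs minimum vertical blanking, no margins or the spec's 0.25 MHz clock granularity), so treat the results as estimates rather than exact driver output.

```python
# Simplified CVT reduced-blanking (v1) pixel clock estimate.
# Assumptions: fixed 160-pixel horizontal blanking, 460 us minimum
# vertical blanking, no margins/interlace, no 0.25 MHz clock rounding.

def cvt_rb_pixel_clock_mhz(h_active, v_active, refresh_hz):
    RB_H_BLANK = 160          # pixels of horizontal blanking
    RB_MIN_V_BLANK = 460e-6   # seconds of vertical blanking, minimum
    # Estimated line period once the minimum vertical blank is reserved
    h_period = (1.0 / refresh_hz - RB_MIN_V_BLANK) / v_active
    vbi_lines = int(RB_MIN_V_BLANK / h_period) + 1
    h_total = h_active + RB_H_BLANK
    v_total = v_active + vbi_lines
    return h_total * v_total * refresh_hz / 1e6

# 1920x1200@85 with reduced blanking lands just under 225 MHz
print(round(cvt_rb_pixel_clock_mhz(1920, 1200, 85), 1))
# ...and 1600x1000@96 really does need less bandwidth than 1600x1200@85
print(round(cvt_rb_pixel_clock_mhz(1600, 1000, 96), 1))
print(round(cvt_rb_pixel_clock_mhz(1600, 1200, 85), 1))
```

This also bears out the earlier request in the thread: 1600x1000 at 96 Hz comes in around 10 MHz under 1600x1200 at 85 Hz with the same reduced blanking.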
 
^I've found that if you reduce the porches/blanks too much, you get a squished picture on the right and left sides. Like 100 lines of resolution will roll up into one. "Roll up" is the best way I can describe it, because all the lines are sort of packed on top of each other and really bright. The HDFury 1 does that with 1080p signals from my PS3; I have to center the picture as best I can, losing about 2 or 3% of the picture on either side.

Alright, well that's my report on the adapter. Good enough to do Prime Mode on the 21-inchers.

I would say it's more optimal for 19-inchers. My Dell P992 can do 85 Hz at 1600x1200 (which is also around its maximum fully resolvable resolution), but all my 21-inchers can scan up to 85 Hz at 2048x1536.
 
Yes. Actually, I should be more specific: that's how I did it in the first place. When you first hook it up, the adapter advertises a max res of 1280x1024 at 85 Hz. When I created custom resolutions, the video card selected the reduced blanking option by default. I've never seen it do that before, so I figured it was something the adapter is doing. With standard timings, 225 MHz doesn't really get you a lot.

I plan on doing other tests too, because according to the Nvidia custom resolution panel, 1920x1440 at 70 Hz exceeds 225 MHz; it was 236, I think? This was with automatic CVT mode selected. For some reason the Artisan doesn't like the reduced blanks, though 1600x1200 at 85 Hz worked well enough. I don't know... this stuff feels more like a crapshoot sometimes. :D

Could you try a custom resolution of 1600x1000 (not 1200) at 96 Hz, please? Or how far does it go with the standard 16:10 res of 1680x1050?

Thanks :)
 
In the WinDAS WPB guide, the last step (adjusting the gamma) requires a 10 bit DAC in order to preserve 256 distinct luminances.

Here's a figure I made to illustrate, which shows the results of simulating a linearization (gamma = 1.0) of a display that has a default gamma of 2.2. With 8 bits, the number of distinct steps after linearization drops from 256 to 184. It wouldn't be that severe when the target gamma is 2.4, but the principle is the same. With a 10 bit DAC, you have 1024 distinct luminance levels from which to choose your 256 working levels, and thus have more flexibility when tweaking gamma without compromising quality (having less than 256 distinct luminances means "brightness gradients" won't be as smooth).

Thankfully I use an Nvidia card, but that made me wonder: did AMD (ATI) always use 8-bit DACs for their analog outputs?
 
^I've found that if you reduce the porches/blanks too much, you get a squished picture on the right and left sides. Like 100 lines of resolution will roll up into one. "Roll up" is the best way I can describe it, because all the lines are sort of packed on top of each other and really bright. The HDFury 1 does that with 1080p signals from my PS3; I have to center the picture as best I can, losing about 2 or 3% of the picture on either side.

I would say it's more optimal for 19-inchers. My Dell P992 can do 85 Hz at 1600x1200 (which is also around its maximum fully resolvable resolution), but all my 21-inchers can scan up to 85 Hz at 2048x1536.

I agree that if you want to take it to the max, this adapter won't get you there. As far as I know, there's still no option for us. Hopefully something better is around the corner.

As for resolving power, the Artisan can resolve a tad more than 1600 lines going left to right, which is why this adapter is good enough for me. A sharp 1600x1200 at 85 Hz is all I need.
 
Yeah, went from a 290 to a 980.

With vsync off I have some minor tearing; with it on I don't get any dips in FPS and it's buttery smooth. I have yet to play around with lower res and high refresh rates.

Anyone here know of any solution for blurry corners? My bottom-right corner is pretty blurry; I might have to swap in a different flyback to see if that fixes it.

Edit: Never mind, I just found a happy medium between the focus in the center and the corners.

Benefits of running the FW900 with the back cover off: you can get at the focus controls easily, and it runs cooler ;)

Have you tried adjusting the moire? Below is a link to a program that shows some images to help you adjust it with the OSD; it should be images number 7, 8, and 9 in the utility. I was in the same boat and couldn't get anything to focus until I adjusted the moire.
EIZO: test-monitor
 
Have you tried adjusting the moire? Below is a link to a program that shows some images to help you adjust it with the OSD; it should be images number 7, 8, and 9 in the utility. I was in the same boat and couldn't get anything to focus until I adjusted the moire.
EIZO: test-monitor

It didn't really do anything for me, sadly. Going to play around with it some more.
 
Did some more adapter testing. Yeah, there's input lag. Not significant, but the controls are a little floaty.
 
Any chance that DAC can run the FW900 at 1920x1200 @ 75 or 80 Hz? I'd buy it if there's a chance it might run at such a high res.
 
I listed one that is custom made and can do 1920x1200 @ 75 Hz; here is the quoted post. Someone else may be able to chime in on whether an HDFury model could do it as well; I'm not sure about it.
Hi FW900 users who want to keep using AMD cards: I emailed a fellow who makes RAMDACs for high-end projectors, and here is what I have to share.

This is a RAMDAC made by a guy named Moome (email: [email protected]) [Website: Moomecard - Home]

This is a custom order (handmade as well) that he showed me he does (pic below): an HDMI to RGBHV converter. I'm somewhat interested in it myself, but as I told him in the email, 75 Hz is too low for me (I assume it will do higher at lower resolutions, but don't quote me). He did say he has an HDMI 2.0 product he was working on, which he said would be about 6 to 8 months out (depending).

If you have any questions you should email him. He is going to email me something of a spec sheet shortly so I can share it. View attachment 116 View attachment 117 View attachment 119
 
I'll get in touch with this Moome guy and see what kind of DAC he can make.

I was actually curious about the DAC rabidz suggested. He says it's actually capable of a 236 MHz pixel clock.

I ran some tests today using Nvidia's custom resolution settings and a DVI-to-5BNC cable. At 1920x1200 there was practically no perceivable difference between 80 Hz and 85 Hz, other than a slight bit of extra sharpness and clarity at 80 Hz.
However, 80 Hz still has a pixel clock requirement of 260+ MHz.

I then tried 72 Hz, which at 1920x1200 requires 235 MHz, give or take half a MHz. The monitor is visibly sharper and clearer than at 80 Hz/85 Hz. 72 Hz is visibly not as smooth as 80 Hz, but it's not as bad as 60 Hz (which is pretty much unusable).
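As a rough cross-check (my own back-of-the-envelope arithmetic, using only the figures quoted in this post): blanking stays nearly fixed for a given mode, so the pixel clock scales about linearly with refresh rate. Starting from the ~260 MHz needed at 80 Hz, a 236 MHz DAC tops out right around the 72 Hz that worked:

```python
# If 1920x1200@80Hz needs ~260 MHz, estimate the highest refresh a
# 236 MHz DAC can drive at the same resolution, assuming pixel clock
# scales roughly linearly with refresh (blanking held constant).
clock_at_80hz_mhz = 260.0   # figure quoted above for 80 Hz
dac_ceiling_mhz = 236.0     # the adapter's claimed pixel clock limit
est_max_refresh = 80 * dac_ceiling_mhz / clock_at_80hz_mhz
print(f"~{est_max_refresh:.0f} Hz")
```

That lands between 72 and 73 Hz, which matches the 235 MHz measured for 72 Hz above.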

I'd imagine this degradation of clarity is because of the shitty quality of the DACs on most video cards released in the last 8 or 9 years, since LCDs have taken over.

I've had dozens of video cards over the years; Nvidia's analog output was always poorer than ATI/AMD's. I once had a golden Radeon 1950 Pro (?) whose analog output made all my CRTs extra clear, extra bright, and razor sharp. For non-gaming purposes it made my FW900s a total pleasure to use, even for reading fine text. My 5870, while worse, was still ahead of any Nvidia card I've had since.

I suppose I'll give rabidz's DAC a shot. It doesn't cost much, and perhaps I can get used to 72 Hz (it should help the CRTs age better anyhow). Maybe if I'm lucky it can squeeze in 80 Hz, and that would mean a GTX 1080/1070 would be worth buying as well.
 