24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Just briefly looked through the service manual adjustment part. Looks like it's a similar procedure. The guide will be very helpful, as it provides a lot of background info, and, more importantly, explains how to set things up for real time luminance/chromaticity measurements, which you'll need for hitting the targets that are described in the service manual. No need to calibrate a Trinitron as a reference. Use your instruments. Same basic setup - use the laptop to control the colorimeter, and use your PC as a signal generator.
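To make the "real time" part concrete, here's a rough sketch of the measurement loop in Python. This is my own illustration, not anything from your guide or the service manual, and read_xyz() is just a stand-in for however your colorimeter control software actually hands you a reading; the XYZ to chromaticity/luminance conversion is the standard one.

```python
# Minimal sketch of a real-time luminance/chromaticity readout.
# read_xyz() is a placeholder: wire it up to whatever your colorimeter
# control software on the laptop actually exposes.

import time

def read_xyz():
    # Dummy reading for illustration only (roughly a D65 white at 80 cd/m^2).
    return 76.04, 80.0, 87.12

def xyz_to_xyY(X, Y, Z):
    """CIE XYZ -> chromaticity (x, y) plus luminance Y in cd/m^2."""
    s = X + Y + Z
    if s == 0:
        return 0.0, 0.0, 0.0
    return X / s, Y / s, Y

for _ in range(5):                      # poll a few times; loop forever in practice
    X, Y, Z = read_xyz()
    x, y, lum = xyz_to_xyY(X, Y, Z)
    print(f"x={x:.4f}  y={y:.4f}  Y={lum:.2f} cd/m^2")
    time.sleep(1)                       # re-read while you tweak the adjustments
```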

The procedure seems to involve measuring voltages though.

Connect PWB-CRT TP (R200G lead wire) to the probe.

Wonder what this means. There's reference to an R200G, R200B, and R200R, so they have something to do with red, green, and blue. Maybe the RGB inputs on the tube?
 
Yeah, I can't really understand most of what's in the adjustment part of that manual. I'll need to read your guide first to bring me up to speed on the concepts, and I'll probably adjust one of my Trinitrons first so I can understand the process.

Hopefully I can get away with adjusting the LaCie with just the colorimeter, instead of measuring voltages and all that.
 
So focus is still drifting in my FW900. I hoped it would go away, but it certainly won't :( Where can I buy this D board mentioned by master noran?
 
LAGRUNAUER, aka Unkle Vito, was still selling them last time I checked. Try PMing him here or on eBay.
 
What defines the max horizontal frequency of a CRT? Is it just the flyback and yoke, or is there something else? And can the OSD scanrate cap be removed / changed?
 

No, not unless you want to get an FPGA and learn power engineering, plus some reverse engineering.

Also, I'm 99% sure I just found a porno with two FW900's and a PVM, and not an amateur one either; it's one by Mofo's.

that is all.
 
Looks bad. Let's hope non-reference cards from MSI or Gigabyte will still include DVI-I. AFAIK mounting a RAMDAC is up to the third-party board partner, not NVIDIA.

HDFury looks dead, but who knows. I doubt anyone would have an interest in building a >400 MHz converter nowadays.

In that case we are stuck with the 980 Ti or Titan X, and I think their performance should be enough for another 3-4 years with the FW900; after that our only hope is OLED :p

Oh, and the popping in my unit is almost completely gone, and it's working around 10 hours a day. From 10 pops an hour it's down to 2-3 a day. Thank god, as Unkle Vito doesn't seem to look here anymore.
 
Hi FW900 users who want to keep using them on AMD cards: I emailed a fellow who makes RAMDACs for high-end projectors, and here is what I have to share.

This is a RAMDAC made by a guy named moome (Email: [email protected]) [Website: Moomecard - Home]

This is a custom order (handmade as well) that he showed me he does (pic below): an HDMI to RGBHV converter. I'm somewhat interested in it myself, but as I said to him in the email, 75 Hz (I'm assuming it will do higher at lower resolutions, don't quote me) is too low for me. He did say he has an HDMI 2.0 product he is working on, which he said would be about 6 to 8 months out (depending).

If you have any questions you should email him. He is going to email me something of a spec sheet shortly so I can share it.


So I made a post earlier; if the HDFury5 doesn't come through, here is another option for you guys. Just wondering, how many of you could see the AG wires like I could? Rather, how noticeable are they for you?
 
Yea I remember that post. Hopefully it works well.

As for wires, I rarely notice them. They're subtly noticeable with light uniform backgrounds, but it's not often that I work with light uniform backgrounds.

Was that video in which you saw two FW900's a recent one, or a "retro" one? :)
 
A recent one, I'd say maybe 2010? I can private message them to you if you're interested haha, hope you like scene girls lol.

For me, no matter what monitor I had, I always saw them.
 
I'm telling you, spacediver, they're FW900's. Maybe an off-brand rebadge? Do we have a list of the FW900's rebadges?

Got an SFW image too.


What do you guys think, are those FW900s?
 
They look widescreen to me, but not like FW900's. Perhaps they are 4:3 and the perspective distortion is causing it to look widescreen.

 
I'm telling you it's an FW900, and it's widescreen for sure.

Also: NVIDIA GeForce GTX 1080 Does Away with D-Sub (VGA) Support.

Confirmed, boys, the RAMDAC party train is over.

Doesn't look widescreen to me at all. You can see the small broadcast monitor to the left, which is 100% 4:3, and the VGA monitors in question have a similar aspect ratio.

Anyway, I did some digging and they seem to be NEC MultiSync FE700 or similar monitors.

Also, other than the FW900 and its almost identical-looking rebrands, the older W900 or whatever it was that the FW900 replaced, and that massive CRT that Carmack used, there aren't any widescreen PC CRTs AFAIK.
 
great detective work :)

also, imagine how dominating an FW900 would have looked next to those measly 19 inchers :)
 
Terrible news, the NVIDIA decision to remove analog output from their Pascal cards. Let's hope AIBs will still include it in their custom versions.

Is there any adapter that supports 1200p@96Hz? (That's what I use for gaming.)
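For what it's worth, here's my rough math on what an adapter would need to push for 1200p@96. The constants are the usual GTF defaults as far as I know (about 550 µs of vertical blanking, ~30% ideal horizontal duty cycle), so treat the output as a ballpark figure, not a spec.

```python
# Rough pixel-clock estimate for a GTF-style mode; constants are assumed GTF
# defaults (550 us vertical sync+back porch, 30% ideal duty cycle, 300 %/kHz
# slope), so the result is an estimate, not what CRU will print exactly.

def gtf_pixel_clock_mhz(h_active, v_active, refresh_hz):
    # Estimate the horizontal period, then the vertical blanking in lines.
    h_period_est = ((1.0 / refresh_hz) - 550e-6) / (v_active + 1)
    v_blank = round(550e-6 / h_period_est) + 1          # sync + back porch + front porch
    v_total = v_active + v_blank
    h_freq = v_total * refresh_hz                        # Hz

    # GTF "ideal duty cycle": horizontal blanking as a % of the total line.
    duty_pct = 30.0 - 300.0 * (1e6 / h_freq) / 1000.0
    h_blank = h_active * duty_pct / (100.0 - duty_pct)
    h_blank = round(h_blank / 16) * 16                   # round to character cells
    h_total = h_active + h_blank

    return h_total * h_freq / 1e6

print(f"1920x1200@96: ~{gtf_pixel_clock_mhz(1920, 1200, 96):.0f} MHz pixel clock")
print(f"1920x1200@85: ~{gtf_pixel_clock_mhz(1920, 1200, 85):.0f} MHz pixel clock")
```

By this estimate, 1200p@96 alone needs a DAC good for roughly 320 MHz of pixel clock; the >400 MHz figure mentioned earlier is what you'd want for the monitor's even higher modes.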
 
Hi everybody!
I've been reading this forum for some time now as it is a great source for CRT advice.

I had a Sony E500 that I got for free but had to give up while moving. So now I have a Sony G500, which is almost the same monitor.
I love these CRTs for gaming, as I appreciate the IQ, fast response and smoothness I get from them.

Now I have the chance to get an FW900 for pretty cheap that seems to be in good condition. I am quite attracted to it, as it looks like the perfect monitor.
I have a few questions though, that I hope you guys can help me with.
My E500 and G500 lose their image alignment every now and then. When I turn them on, the image sits further to the right than it should. During warm-up it slowly moves back toward the center, but it often ends up imperfect and I have to adjust it every time I use it in order to have a perfect image. Is this normal? Does the FW900 do this too?
The G500 has a function to align the image automatically, just like the FW900, but it doesn't really work. Even if I choose a 4:3 resolution like 1600x1200, the function just moves the image to a completely wrong place, rendering the auto option useless.
Does this have anything to do with my settings or my graphics card? I'm using an HD6870.

Also, what maximum refresh rates can I expect from the FW900 at 1920x1200, 1920x1080 and 1600x900?
My G500 (doing even better than the E500 did) can do 1920x1080@105Hz and 1600x900@123Hz.

Also, thank you guys for all the work you do and the help you offer!
 
Hi HAF.

It's always worth taking the risk of buying an FW900 if it seems to be in good condition, especially if you can get it cheap.
I never had problems with losing alignment, so I guess this unit should be free of that problem. Don't bother with the monitor's auto alignment; just buy a TTL adapter and do a proper geometry calibration.

The G500 and FW900 have exactly the same maximum horizontal frequency, so they can do exactly the same vertical resolutions at the same refresh rates. Here is a table of available rates that I made one day:

https://dl.dropboxusercontent.com/u/14848901/Hardforum/tabela_odswiezania.xlsx

I assumed a 121.5 kHz horizontal frequency, but in fact it is 121.7 kHz, so the real figures would be a tad higher. You can change this value manually in the xls file.

Most people just use 1920x1200@85hz (advised by Sony) or 1920x1200@96hz (the maximum possible at the native resolution). If you want to use higher refresh rates it is highly recommended that you use CRU (Custom Resolution Utility), which handles this task much better than the GPU driver.
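For anyone who doesn't want to open the spreadsheet, the arithmetic behind it is simple. This is my own sketch; it assumes the 121.7 kHz scan limit and roughly 550 µs of vertical blanking plus a few margin lines (GTF-style), so the numbers land within a Hz or two of the real table.

```python
# Rough arithmetic behind the refresh-rate table: the tube tops out around
# 121.7 kHz horizontal scan, and GTF-style timings reserve roughly 550 us of
# vertical blanking plus a few extra lines. Treat the results as estimates.

H_FREQ_MAX = 121_700  # Hz, FW900 / G500 horizontal scan limit

def max_refresh(v_active, h_freq=H_FREQ_MAX):
    h_period = 1.0 / h_freq                  # seconds per scanline
    v_blank = round(550e-6 / h_period) + 4   # ~550 us sync+back porch, plus assumed margin lines
    return h_freq / (v_active + v_blank)

for w, h in [(1920, 1200), (1920, 1080), (1600, 900)]:
    print(f"{w}x{h}: ~{max_refresh(h):.0f} Hz max")
```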
Also, you should set the timings using GTF (Generalized Timing Formula), like in this example:

(screenshot: example timings set with GTF)


So if you decide to buy this FW900 (which I highly recommend), you could use it "as is", or buy a TTL cable and a colorimeter, spend about a day or two setting up this beauty, and enjoy using it the way it was supposed to be used.
 
Oh, and the popping in my unit is almost completely gone, and it's working around 10 hours a day. From 10 pops an hour it's down to 2-3 a day. Thank god, as Unkle Vito doesn't seem to look here anymore.


I am always looking at and reading the posts in the forum... I am fully aware of the latest NVIDIA developments...
 
Got an IBM C220P today; the owner said he hasn't powered it up in 2 years or so and it's just been sitting. Luckily I brought it home and it fired up nicely. I have an i1 Display Pro for calibrating, but up until now I've only had LCDs and have always calibrated them to 2.2 gamma. What should I do with a CRT? Right now things definitely look dark, like 2.4 gamma levels. I was thinking I can either go with 2.2 as usual or try BT.1886, but I'm not sure.
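If you want to see what BT.1886 would actually ask for: the formula just anchors a 2.4 power curve to your measured white and black levels. Here's a minimal sketch (the Lw/Lb defaults are placeholders; plug in your own i1 Display Pro readings):

```python
# Minimal sketch of the BT.1886 EOTF (Rec. ITU-R BT.1886, Annex 1): an effective
# gamma-2.4 curve anchored to the display's measured white (Lw) and black (Lb).
# The default Lw/Lb values are placeholders, not measurements.

def bt1886(V, Lw=100.0, Lb=0.05):
    """Target luminance (cd/m^2) for a normalized input level V in [0, 1]."""
    gamma = 2.4
    a = (Lw ** (1 / gamma) - Lb ** (1 / gamma)) ** gamma
    b = Lb ** (1 / gamma) / (Lw ** (1 / gamma) - Lb ** (1 / gamma))
    return a * max(V + b, 0.0) ** gamma

# Compare against a plain 2.2 power law at a few stimulus levels:
for V in (0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"V={V:.2f}  BT.1886={bt1886(V):6.2f}  gamma2.2={100.0 * V ** 2.2:6.2f}")
```

With a CRT's near-zero black level, b ends up close to 0 and the curve collapses toward a plain 2.4 power law, which is probably why things already look about that dark to you.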
 
Hi guys. I'm scared my FW900 is nearing its end of life. When you look at an icon on the desktop, there is an extremely faint ghost of it to the right. It follows the icon when I move it, so it's not burn-in. You can see this faint ghost with any image, but I guess it's more noticeable with an icon.

I really only notice it when I'm about a foot away from the screen and with a bright background. I mean, you really need to look hard to notice it, but I'm afraid it's only going to get worse. I've already degaussed and tried changing other settings with no luck.
 


Try some different cables, a different source, etc.

It would help if you took a pic.
 
Man, I'm glad to see you around still. :) The GDM-F520 that I bought from you is still as pristine as ever.


I am so glad to have you as a loyal client and a proud owner of a Grade A+ GDM-F520 monitor!!! Like I always said, and am still saying... "You'll get what you paid for..." In your case, you paid for and got a Grade A+ unit that still rocks!!!!
 
Crisis averted! The cable was a bit loose from when I last disconnected it to clean the system!
 
Hi.

I've got a reply from the HDFury guys that there'll be a new HDFury with a 400 MHz RAMDAC when the FPGA price gets lower.
 

Are you serious?! Citation needed (please PM me) if true. The HDFury folks have been going around on AVSForum and other forums stating that, due to a legal settlement, they had to take down all of their analog products, telling us all in the process "the time is now or never". I just bought an HDFury2 for an upcoming CRT projector for this exact reason. If this is true, I'd be happy/mad about it.
 
I would hope they would do something more than the standard 400 MHz, because this is for sure the last RAMDAC they are going to make. The world is just moving on from analogue, with OLED on the horizon.
 