24" Widescreen CRT (FW900) From Ebay arrived,Comments.

It's funny that they had better (or more) colors in 1953, though I'm not sure about the provenance of this image, so it could be wrong.
They wanted this to be the NTSC standard but quickly changed the phosphor formula for something that produced more luminance. A bigger gamut would also mean the filters in cameras would have to block more light, thus worse camera performance, and generally everything would be more expensive.

The NTSC 1954 gamut is so wide that not many of today's color-critical displays can handle it fully.
 
jka: I confirm, don't touch the pots on the FBT. They are set to match that monitor's hardware and only change focus; there wouldn't be any effect on brightness.

About the Windas issue, I have to ask: did you follow the procedure properly (setting it with the 2 black/grey bands picture displayed so that the grey band is only slightly brighter than the black, watching in a dark room)? What was the last G2 value you set (it could be a clue)?
Normally, running WPB several times in a row is useful to fix issues coming from improper ABL values (these usually appear after messing manually with the G2 value), but otherwise a single pass should be enough. About the contrast/brightness values in the OSD, they are normal: after a WPB procedure these are set to the default values for this monitor (which may be different from another Sony model).
 
Appreciate your replies guys, thanks!

The original DAT G2 was 172, and every time I did WPB now it came out as 116. That already seems pretty low to me; I shouldn't be getting the grey pedestal I am getting.

I generally know how to perform WPB. I could have made some user error (wouldn't be the first time) but I don't think so. For example, I was doing WPB on the FW900 just a few days ago and it was fine.

During WPB, all targets were easy to hit and the picture during those tests looked exactly like it should. During the G2 step, the black level was cut off and the 2nd level was barely visible. It is actually not necessary to do WPB in the dark (but I did) if you will not be using the monitor in the dark. Yes, different lighting conditions will skew the WPB, but only a little; the same goes if you are lazy and don't go precisely for the targets, or if you set a G2 about +/- 10 off from what you should. All these "mistakes" would result in relatively small picture errors compared to the grey adventure I am getting here.

I don't think I am hitting ABL; you can test that by alt-tabbing between black and white signals and checking if you see a quick dimming on white (that's ABL in action).

The OSD can hit 200cdm2 because it is only a small area (= few electrons, so it does not require more than the anode's existing maximum voltage to pull them off the gun). I think this is a special ("contrast"?) feature that is not present on the FW900 btw; I think my Mitsu does something similar.
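To put some rough numbers on that small-area effect, here's a crude back-of-the-envelope model (my own simplification, not how the F520 actually implements its limiter): assume ABL caps the average beam current over the whole raster, so a white window covering only a fraction of the screen can be driven much brighter before hitting the cap, up to whatever the gun/video amp itself can deliver.

Code:
# Crude ABL model (my assumption, not the F520's real circuit): the limiter caps
# the *average* beam current over the raster, and luminance scales roughly with
# beam current density. The numbers below are the ones measured in this thread.
FULL_FIELD_LIMIT = 112.0   # cd/m2, full-white ceiling seen in DYNAMIC mode
SMALL_AREA_LIMIT = 200.0   # cd/m2, rough ceiling for a tiny patch like the OSD

def max_window_luminance(area_fraction: float) -> float:
    """Peak luminance of a white window covering `area_fraction` of the screen."""
    abl_limited = FULL_FIELD_LIMIT / area_fraction   # same total current, smaller area
    return min(SMALL_AREA_LIMIT, abl_limited)

for f in (1.0, 0.5, 0.25, 0.05):
    print(f"{f:4.0%} window -> ~{max_window_luminance(f):.0f} cd/m2")
# 100% -> ~112, 50% -> ~200 (already at the non-ABL cap), 25% -> ~200, 5% -> ~200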
 
Okay, I've made some (hacky) progress now.

I have:

1 - hexedited G2 from 116 down to 106 and loaded the DAT file in (see the sketch after this list)
2 - lowered Brightness from 50 down to 36 to achieve a black pedestal (this is a hint for me to go for a G2 slightly lower than 106 next time)
3 - increased Contrast from 85 to 100
4 - switched from PROFESSIONAL to DYNAMIC mode
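For anyone curious what the hexedit step in point 1 looks like in practice, here's a minimal Python sketch. Big caveat: this is not the documented WinDAS DAT layout; the file name, the offset, and the assumption that G2 sits there as a single raw byte are all hypothetical. You have to find the real location yourself (e.g. by diffing two saves made with different G2 values), and always keep a backup of the untouched file.

Code:
# Hypothetical sketch only: the file name, offset and "single raw byte" layout
# below are assumptions, NOT the documented WinDAS DAT format. Diff two saves
# with different G2 values to find the real location, and keep a backup.
from pathlib import Path
import shutil

DAT_FILE = Path("GDM-F520_backup.dat")   # hypothetical save exported from WinDAS
G2_OFFSET = 0x1A2                        # hypothetical offset of the G2 byte
OLD_G2, NEW_G2 = 116, 106                # value WPB wrote vs. value we want

shutil.copy(DAT_FILE, DAT_FILE.with_name(DAT_FILE.name + ".bak"))  # safety copy first

data = bytearray(DAT_FILE.read_bytes())
assert data[G2_OFFSET] == OLD_G2, "byte at offset isn't the expected G2 - wrong offset?"
data[G2_OFFSET] = NEW_G2
DAT_FILE.write_bytes(data)
print(f"patched G2 {OLD_G2} -> {NEW_G2}; load the modified DAT back with WinDAS")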

Now I am getting black pedestal and 107cdm2 fullwhite with correct 0.283/0.298 coords :p

Later I will attempt to do a proper WPB with the correct G2 minus a synthetic 10 or so to compensate for whatever is not working correctly; we'll see how that goes.

I also plan on testing which of the PROFESSIONAL/STANDARD/DYNAMIC is the best for my needs (gaming).

PROFESSIONAL seems a bit dim for gaming, STANDARD forces brighter colors, and DYNAMIC forces colors even brighter at the expense of a brighter pedestal. I am interested to see if there will be a difference between "STANDARD with X OSD Brightness" vs "DYNAMIC with (X-Y) OSD Brightness".

And by the way, the F520 seems to crush the blacks a little less than the FW900; it's still a very small difference, but my eyes noticed it.

But I have actually reached the point where I can safely say this is usable now, even if the approach is not ideal. The picture now looks exactly the same as it did on my FW900, so thank you everyone for your help so far!

Late edit: I meant 106 instead of 96. But in the end, around 100 will be the more correct result.
 
Last edited:
What are you using for your color calibrator? I was unsure if my DTP94 was still accurate but it turned out to be okay.

I use a DTP94 as well; it is perfectly fine, and there is no need to spend any more money on this. It measures a bit slowly on dark pictures, and x/y/Y readings of anything dimmer than 5cdm2 are probably skewed/wrong, but neither of these things is an actual issue for our purposes.
 
Ok, to give you an idea of "right" G2 values, I have 2 Dell P1130s (which are basically F520s with a less accurate tube and the same video board with pretty bad grounding planes). I went through the settings saves I have: one was a bit too bright with a G2 at 161, correct G2 values after WPB were in the 145-150 range, and I managed to go as low as 120 after replacing some of the original ceramic capacitors with much better ones, especially regarding current leakage.
So, if you have to go down to 96 and you're still on entirely original electronics, there's definitely an issue, and a big one.

Please check the very first step in the procedures of Windas (before geometry, convergence, WPB ...); it gives you access to a maintenance log. Does it indicate any problem?
 
Hm, I didn't know there were any maintenance logs. Could you please give me more specific directions? It is not readily apparent from the UI and I don't want to click around WinDAS in case I start something that would be hard to get out of.

There is "Adjustments > Maintenance", is that it? Or is it something in "Adjustments > Procedures"? (where WPB is)
 
Adjustments -> Procedures -> Preparation for alignment
Hit the "FAIL_INFO" button when it appears, you'll see a window with several problem types listed, something wrong happened in the past if one of them isn't labelled "normal". Hit the refresh button to refresh the log. OK to close the window, then another OK or CANCEL to leave the procedure part.
 
Now I am getting black pedestal and 107cdm2 fullwhite with correct 0.283/0.298 coords :p

Just so you know, this is the whitepoint for the 9300K mode, not for 6500K.

For movies, gaming, and regular browsing, you want the 6500K mode, which has a whitepoint of 0.31/0.33
 
jka: Actually I just thought about a mistake you could have made: did you remember to set the correct monitor in Windas ("Set up -> Model sel") before performing the WPB? If you performed it with Windas still thinking it is dealing with an FW900 instead of the F520, that may have messed up settings despite the procedure appearing to run properly. That would make more sense than being so unlucky as to have 2 monitors failing in a row. :)
 
Thanks! FAIL_INFO says everything is normal, even after refreshing.

I actually prefer 9300K. I prefer bluer/greener picture for gaming and greener picture for movies on my TV. So to me, 9300K is yellow enough.

The model is indeed selected correctly as GDM-F520 (CR1 chassis).

I have done some testing and it seems like each of the 3 modes has its own peak limit in terms of cdm2 output.

PROFESSIONAL: 82
STANDARD: 99
DYNAMIC: 112

And every one of those values is an exact target value you are asked to confirm/match during WPB.

So even if I crank brightness/contrast both to 100, full white will never exceed 82cdm2 on PROF or 99 on STD. For this reason I think I must use DYNAMIC.

Using DYNAMIC will make the black pedestal slightly grey (assuming you had your pedestal just borderline black on PROF/STANDARD), so I need to compensate for that by lowering Brightness a little bit. When adjusting the Brightness for this, there is a narrow sweet spot where black is black and white is at its limit of 112cdm2. Other colors look fine this way too; it looks like the FW900 after WPB. There is that black crush of course, but that is to be expected.

So the picture can be made to look as expected but it needs some unusual help.

I will do that WPB soon, with the synthetic G2.
 
Okay, so I loaded the last DAT after a normal WPB. This resulted in that grey picture. Now I have attempted another WPB, but this time, instead of choosing the on-target G2 of 116, I went with a value of 95. I came to the conclusion that it needs to be 95 after some trial/error hexediting and loading the modded DAT files in.

Here are all the WPB values; the left one is the normal WPB (G2=116), the right one is with a "faked" G2 of 95. I hit all the necessary targets in all steps, no problem.

G2:
116,95

9300K:
107,159
96,147
96,149

199,204
216,218
79,73

121,125
107,110

199,201
216,218
103,133

20,47

6500K:

112,165
98,149
89,144

177,180
170,171
77,72

123,121
104,111

180,181
170,171
102,138

22,54

5000K:
116,169
96,149
82,137

165,165
133,135
76,72

124,118
101,114

165,165
133,132
104,142

25,59

SRGB:
215,211

Now to the results: they are definitely different (a bit brighter) than just hexediting the 95 G2 in. So bright, in fact, that they often touch the ABL limit, and text is quite blurry (which is manageable in games).

This time I have only upped the Contrast from 85 to 100. Brightness is pretty much spot on with this 95 G2. No grey issues at all. Black is really black.

Fullwhite signal results:

PROFESSIONAL: 112cdm2
STANDARD: 119cdm2 (ABL)
DYNAMIC: 119cdm2 (ABL)

And just for fun, I downsized the picture to the smallest possible size under prime mode and was getting 170cdm2 - hehe, in your face, OSD!

Did I say it is blurry? That's probably because the guns are a bit overdriven with such a low G2 while still hitting all the targets as asked. If needed, I can just lower the Contrast or switch to PROFESSIONAL, or both, but I think I am keeping this fake WPB for now; it looks really nice (blurring aside).
 
80 nits is actually the sRGB luminance target, though I don't know how color temp interacts with that on the F520, so it could be why it was that low in PROFESSIONAL mode.
 
Usually when the display is blurry it means excessive or "dirty" voltage (with noticeable interference, not a nice stable one).

Out of curiosity, I suppose you saved the original settings before doing anything on the screen? What was the original G2 value? How did the display look? Just with a green hue, or also too bright? Could you see retrace lines?
 
During WPB, the target for 9300K is 112cdm2, 6500K is 97cdm2 and 5000K is 82cdm2. This oddly corresponds to the picture modes.

Yes, I do have a backup of the original state of the monitor. It had 170 for G2. The display was bright, washed out, skewed a little, and green. There were no retrace lines; it wasn't that bad. If I ran IMAGE RESTORATION over that, it would help a lot (while keeping the same 170 G2).
 
Ok, so it is absolutely certain the low G2 isn't normal.

I checked the list in Windas, there are actually two F520s: GDM-F520 and GDM-F520(J), but I don't know what the J stands for (Japan? A different revision of the microcontroller? Any clue on the back of your monitor?). Maybe it would be worth saving your current settings, loading the original ones and trying a WPB with GDM-F520(J) set in Windas to see if it removes the issues.
 
Thanks, I will keep that idea for next time though, because it is working nicely now, I have other issues with it that I need to sort out somehow, and not too much free time on my hands to handle it all :/

The most pressing issue now is that anything that has to do with EDID does not work well. For example:

- modifications via CRU do not work
- adding DSR factors does not work
- the highest selectable resolution is 1600x1200 while it should be 2048x1536; it's not visible even via "List All Modes".

I can add custom resolutions via NCP (NVIDIA Control Panel); that kind of works. But oftentimes when using these new custom resolutions, the screen goes black and the monitor drops into orange-LED standby mode, because it doesn't detect any signal, I think. I am using the Sunix adapter btw.

Hooking this up via BNC to a Win98 machine, I can see and select 2048x1536 just fine.
 
SUNIX DPU3000 USERS!!!

Lurking in the CRT Collective Facebook group, I found something that can be very important and may be the key to why every user seems to have some type of issue with the Sunix DPU3000: it seems related to the way the DPU3000 is being powered. A good quality power adapter (rather than a PC USB port) can be the answer, as these users from the group claim not to have any issues at all powering their DPU3000 with a quality 2.1 amp range USB adapter:

View attachment 164979

Super late reply, I was only visiting to see if there was any news on some new adapter. Glad to see that the Sunix adapter is still great. It's been around 1 1/2 years now with both my adapters working every day, and I only get a split-second black screen once every 3 days. Both monitors are running at 1600x1200@90Hz and I am powering the adapters via USB 3.1 from my motherboard. I dunno if that is what helped make it stable all this time.
 
Apparently some versions of the RX 550, a Polaris card, have VGA out.



Of course it's not a good card for modern games. But it makes me wonder if there are test points on the 580 or 5700 where there are VGA signals ready to be tapped.


That's pretty cool. I wonder if it's a genuine card using Polaris? I've seen a few Chinese cards on eBay that say they're GTX 10xx but end up being rebadged GTX 7xx or 9xx.
 
Setting SMPTE C and gamma 2.50 in madVR works very well for the FW900.
I also like to use it at 1280x720@160Hz, dunno why... probably my eyes :ROFLMAO:
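For anyone wondering what the gamma 2.50 part actually does compared to the more common 2.2: it's just a steeper pure power curve, so shadows and midtones end up darker, which suits a CRT in a dim room. A quick illustration (plain power-law math, nothing madVR-specific):

Code:
# Pure power-law transfer: relative luminance = (video level) ** gamma.
# Just shows why 2.5 looks "deeper" than 2.2; this isn't madVR's internal math.
def eotf(level: float, gamma: float) -> float:
    return level ** gamma

for v in (0.10, 0.25, 0.50, 0.75):
    print(f"level {v:.2f}: gamma 2.2 -> {eotf(v, 2.2):.3f}   gamma 2.5 -> {eotf(v, 2.5):.3f}")
# e.g. a 50% video level lands at ~22% luminance with gamma 2.2 but only ~18% with 2.5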
 
*warning, advanced maintenance stuff*

I'm in the process of fine-tuning a P1130 screen. So far I have found how to get the horizontal edges of the display almost perfectly straight, but that means moving a couple of elements on the tube itself. I still have an issue with horizontal convergence that I cannot fix with the digital settings in Windas.
According to the service manual (attached to this post), a wheel on the side of the tube labelled TLH deals with horizontal convergence, in a way that may be different from the digital settings or the 4-pole/6-pole magnets on the neck.

Problem: I unsealed it but whatever position I put it in, it doesn't seem to affect the display at all ... :ROFLMAO:

Does anyone know if this is normal?

edit: Ok, answering myself, this was normal. The long story is that there was still some paint preventing the wheel from turning freely (which let me think I had reached the end of its travel when I hadn't). I was stuck with about one turn of amplitude, while it takes several full turns to see noticeable differences on the display. :rolleyes:
 

Attachments

  • P1130.pdf
    3.5 MB · Views: 0
Last edited:
Is there a high clock displayport to vga/bnc adapter yet?

Would look through 415 pages of FW-900 goodness but don't have that kinda time :p

Only had to look back a few pages:

Sunix DPU3000
 
Last edited:
The most pressing issue now is that anything that has to do with EDID does not work well. For example:

- modifications via CRU do not work
- adding DSR factors does not work
- the highest selectable resolution is 1600x1200 while it should be 2048x1536; it's not visible even via "List All Modes".

... I am using the Sunix adapter btw.

First, make sure you're plugging in the vga cable from the CRT before the displayport connector to the GPU. That way the Sunix will grab the CRT's EDID.

Then, look next to your monitor's name in CRU, you'll see an "Edit" button. In the next window you will see the max pixel clock reported by your monitor. If this is low, it's going to put an upper limit on the pixel clocks you can hit, and changing it won't help. It seems this entry is always read directly from the hardware by Windows and/or the driver, so overriding it with CRU doesn't do anything.

I think the solution might be to spoof the EDID with an external device like HD Fury's Dr. HDMI, where you can program custom EDIDs. You would be using it to send the information from the ID pins to the Sunix, which then relays it to the GPU. You can do this with an Extron RGB, since it has two outputs: one that is BNC and doesn't receive EDID info, and one that is HD-15 and would relay the ID signal (you'd hook the Dr. HDMI to this). Or you could make a VGA coupling and splice in the ID signals yourself.

I haven't gone too deeply into all this myself, because my LaCie reports a very high pixel clock, thankfully. So I'm limited by my CRT's max horizontal scan rate for 4:3 resolutions, and by the Sunix's ~550MHz limit if I'm trying to run widescreen resolutions. But when I do hook my Dell P992 back up, I'll probably have to try the Dr. HDMI trick.
 
How do I access the service menu on the FW900? I just want to know how many hours of use my monitor has had!? Please help! Gracias!
 
The FW900 doesn't have a service menu with a time counter. Only the newer Sony screens have that (G520, F520 for example).;)
 
Just a quickie: I have been told that the GDM-FW900 can do 160Hz interlaced. Is this possible, or even advisable? If so, I was told that the CRU utility can do it. Has anyone done that, and do you have instructions? If it's a bad idea I will leave it well alone.
 
Wait what? No they don't. Where did you find this?
Ahaha, yes they do. :LOL:

DSC05777_light.jpg

It's in a hidden menu you can activate when the screen is on but no video source is connected. On an HP P1130 you have to hold the " - " key for a couple of seconds (just shut the screen down to close it); I also have that menu on 2 Dell P1130s, which are more or less clones of the G520.

BTW I salvaged that HP screen a few days ago. The plastic casing took some damage (fortunately on the side/bottom), but apart from that it is in very good shape and wasn't used very much (5200 hours!). It had been sitting for 10 years in a former computer shop before they started cleaning out the building.
There was also a GDM-W900 there, but I had to leave it. As much as I would have loved to have one in my collection, I didn't have that much room to store it and I wouldn't have used it because of the low refresh rates.

Anyway, happy archeologist. :happy:
 
Ahaha, yes they do. :LOL:

View attachment 182179

It's in a hidden menu you can activate when the screen is on but no video source is connected. On an HP P1130 you have to hold the " - " key for a couple of seconds (just shut the screen down to close it); I also have that menu on 2 Dell P1130s, which are more or less clones of the G520.

BTW I salvaged that HP screen a few days ago. The plastic casing took some damage (fortunately on the side/bottom), but apart from that it is in very good shape and wasn't used very much (5200 hours!). It had been sitting for 10 years in a former computer shop before they started cleaning out the building.
There was also a GDM-W900 there, but I had to leave it. As much as I would have loved to have one in my collection, I didn't have that much room to store it and I wouldn't have used it because of the low refresh rates.

Anyway, happy archeologist. :happy:

Would have been nice to know when I still had my F520. Can anyone here confirm this is true for it too? I guess that the "down" key would be the functional equivalent of the "minus" key of the G520 (F520 doesn't have a minus key).
 
First, make sure you're plugging in the vga cable from the CRT before the displayport connector to the GPU. That way the Sunix will grab the CRT's EDID.

Then, look next to your monitor's name in CRU, you'll see an "Edit" button. In the next window you will see the max pixel clock reported by your monitor. If this is low, it's going to put an upper limit on the pixel clocks you can hit, and changing it won't help. It seems this entry is always read directly from the hardware by Windows and/or the driver, so overriding it with CRU doesn't do anything.

I think the solution might be to spoof the EDID with an external device like HD Fury's Dr. HDMI, where you can program custom EDIDs. You would be using it to send the information from the ID pins to the Sunix, which then relays it to the GPU. You can do this with an Extron RGB, since it has two outputs: one that is BNC and doesn't receive EDID info, and one that is HD-15 and would relay the ID signal (you'd hook the Dr. HDMI to this). Or you could make a VGA coupling and splice in the ID signals yourself.

I haven't gone too deeply into all this myself, because my LaCie reports a very high pixel clock, thankfully. So I'm limited by my CRT's max horizontal scan rate for 4:3 resolutions, and by the Sunix's ~550MHz limit if I'm trying to run widescreen resolutions. But when I do hook my Dell P992 back up, I'll probably have to try the Dr. HDMI trick.

Thanks for the pointers! You are right, it is reported as 220MHz in CRU. CRU has an option to change that value but it does not seem to work for me. Didn't have much time to fiddle with it yet.
 
Would have been nice to know when I still had my F520. Can anyone here confirm this is true for it too? I guess that the "down" key would be the functional equivalent of the "minus" key of the G520 (F520 doesn't have a minus key).

GDM-F520: if I hold the MENU button for a few seconds I get almost the same thing as Strat_84 is showing above ... but without the 2 bottom lines, so nope :) Usage time could be logged in the DAT file though; has anyone looked there?

I definitely saw a usage clock on a 2070SB in the service menu; it had like 10-30k hours there and the colors were just fine (after adjustments of course).

edit: I was holding the MENU button WITH a signal connected. If I switched to slot 2, where I had no signal, the screen would flash white with retrace lines (0.1s maybe), then stabilize on a light grey color and just be stuck there. The buttons did not seem to do anything, but I was able to switch back to slot 1 and it's fine now. Perhaps I was supposed to do some secret combo on the grey screen?
 
Last edited:
CRU has an option to change that value but it does not seem to work for me.

Yeah, I think that's because Windows or the GPU drivers are ignoring pixel clock overrides from CRU; instead they're reading directly from the hardware. So as I was saying, you need an external device to spoof the pixel clock in the EDID, like a Dr. HDMI or some other dongle. Which means you're probably going to have to either splice wires or use some sort of splitter with EDID passthrough (like an Extron RGB).

You are right, it is reported as 220MHz in CRU.

You sure that's the EDID from your Sony? Sounds more like the EDID from the Synaptics chip in the Sunix; the model number starts with "VM" I think. It's the EDID your PC picks up if you plug in DisplayPort before the VGA cable and power cable are connected to the Sunix.
 
GDM-F520: if I hold the MENU button for a few seconds I get almost the same thing as Strat_84 is showing above ... but without the 2 bottom lines, so nope :) Usage time could be logged in the DAT file though; has anyone looked there?
That's because you didn't hit the right button. I also get almost the same window displayed, with model/serial number but without the time count, when hitting one of the up or down buttons. The other one (up or down, I don't remember) triggers the display of a white screen slowly fading out. If I remember well, it is a display mode intended to burn off potential residues on the cathodes after tube manufacturing, so it's better to avoid playing with that.
 
Thanks, got it now. Had to switch to no signal and then hold the OK key for a few seconds (GDM-F520). 15 000 hours :)
 
Yeah, I think that's because Windows or the GPU drivers are ignoring pixel clock overrides from CRU; instead they're reading directly from the hardware. So as I was saying, you need an external device to spoof the pixel clock in the EDID, like a Dr. HDMI or some other dongle. Which means you're probably going to have to either splice wires or use some sort of splitter with EDID passthrough (like an Extron RGB).



You sure that's the EDID from your Sony? Sounds more like the EDID from the Synaptics chip in the Sunix; the model number starts with "VM" I think. It's the EDID your PC picks up if you plug in DisplayPort before the VGA cable and power cable are connected to the Sunix.

It is indeed a Sony EDID (SNY06B0 - GDM-F520). The VM one is usually stuck at 1280x1024 I think; I am getting 1600x1200 max. I looked into the monitor.inf file and it states:

[GDM-F520.Install]
DelReg=DCR
AddReg=GDM-F520.AddReg, 2048, DPMS

I would think that should register the 2048x1536 resolution, but nope, probably because it is hitting that pixel clock EDID limit from somewhere...
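If you want to see exactly where that 220 number lives, it's in the EDID's display range limits descriptor (tag 0xFD), which stores the maximum pixel clock in units of 10 MHz. Here's a small sketch that reads it from a binary EDID dump (for example a .bin exported from CRU); it assumes a standard 128-byte EDID base block:

Code:
# Prints the monitor range limits (tag 0xFD) from a 128-byte EDID base block.
# The max pixel clock field is stored in units of 10 MHz, so a monitor reporting
# 220 MHz has the value 22 (0x16) there.
import sys

edid = open(sys.argv[1], "rb").read()              # e.g. an EDID .bin exported from CRU
assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "not an EDID base block"

for off in (54, 72, 90, 108):                      # the four 18-byte descriptor slots
    d = edid[off:off + 18]
    if d[0:2] == b"\x00\x00" and d[3] == 0xFD:     # display descriptor, range limits tag
        print(f"V: {d[5]}-{d[6]} Hz, H: {d[7]}-{d[8]} kHz, max pixel clock: {d[9] * 10} MHz")
        break
else:
    print("no range limits descriptor found")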

Wired EDID spoofing is nice and all, but it is a project in itself, so I am still looking for any other possible solutions. Isn't CRU an EDID spoof anyway? If I change 220 to 500, restart, and then re-open CRU, I can see it is saved in CRU, but it's like it's getting ignored by the system.

edit: Just something I found, looks like a simple EDID Emulator adapter https://www.aten.com/au/en/products/professional-audiovideo/converters/vc010/
 
Last edited:
edit: Just something I found, looks like a simple EDID Emulator adapter https://www.aten.com/au/en/products/professional-audiovideo/converters/vc010/

Very nice! I was looking for something like that months ago when I first got the Sunix, but I could never find one that was programmable, only ones that copied another display or had low-res VESA presets. I wonder if it takes the same format of BIN files that CRU saves?

I would think that should register 2048x1536 resolution but nope, probably because it is hitting that pixel clock EDID limit from somewhere...

I bet if you made a 2048x1536@50Hz mode, it would work. That's a 219MHz pixel clock.

Not that you'd actually want to watch anything at 50Hz, but just for a test.
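For reference, the pixel clock is just the total pixels per frame (active plus blanking) times the refresh rate, so you can sanity-check any custom mode against the 220 MHz EDID figure or the Sunix's ~550 MHz ceiling before trying it. A rough sketch with CVT-ish blanking fractions (my guesses; the exact timings your driver or CRU generates will differ a little):

Code:
# Pixel clock = horizontal total * vertical total * refresh rate.
# The blanking fractions are rough CVT-style guesses, not exact timings; the real
# number depends on the timing formula your driver or CRU uses.
def pixel_clock_mhz(h_active: int, v_active: int, refresh_hz: float,
                    h_blank_frac: float = 0.30, v_blank_frac: float = 0.04) -> float:
    h_total = h_active * (1 + h_blank_frac)
    v_total = v_active * (1 + v_blank_frac)
    return h_total * v_total * refresh_hz / 1e6

print(f"2048x1536@50Hz -> ~{pixel_clock_mhz(2048, 1536, 50):.0f} MHz")   # ~213
print(f"2048x1536@85Hz -> ~{pixel_clock_mhz(2048, 1536, 85):.0f} MHz")   # ~362
print(f"2304x1440@80Hz -> ~{pixel_clock_mhz(2304, 1440, 80):.0f} MHz")   # ~359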

If I change 220 to 500, restart, and then re-open CRU, I can see it is saved in CRU, but it's like it's getting ignored by the system.

Yeah, that's the behavior I've seen; it seems changing that value has no effect. I don't know if it's a Windows problem or a driver problem. I think it's Windows, because both AMD and Nvidia show the same behavior. But a nice thing about Nvidia is that you can use their own custom resolution feature and create resolutions beyond the limit (I probably should have mentioned that way back). But I'm switching to an RX 5700 XT soon, so I'll probably need to pick up that adapter you just linked.
 
Yes, the 2048x1536@50Hz mode works fine. Same with NCP resolutions beyond 220MHz, but I cannot use DSR that way, and there was the weird issue I posted a few posts above this one.

Regarding the adapter, it looked like the real deal to me from the outside, but you might want to study its tech documentation first to make sure you are not buying something that is not good enough.

On the topic of adapters, is anyone here using an external LUT box (ideally VGA in / VGA out)? I think that would allow me to get around some limitations of regular ICM profile loading. I don't know much about these, and being more of a professional piece of equipment, there is not much info on them.
 