24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Oh crap, I didn't know VGA on Nvidia couldn't be patched. I've mostly used Radeon cards over the years.

Maybe it's time to go ahead and buy a Sunix DPU-3000. You're going to need one before your next GPU upgrade anyway.
 
But I do have a LaCie electron blue IV, which I believe has a higher max scan rate (140kHz) than the FW900. So let me know if you guys can think of any high resolution-related tests I should try.
I actually have that same monitor. I can run some tests on it as well if need be although I'm not sure if the thin client attached to it can do much more than 2560x1600.
Today I tried it with my LaCie 22 blue IV, which tops out at 140 kHz, and I was able to select WAY higher resolutions. Stuff like 2560x1920 @ 70 Hz (495 MHz pixel clock) or 2720x1530 @ 86 Hz (520 MHz pixel clock), and they were all stable.

But I still eventually hit a point where Windows wasn't showing some resolutions. For example, I couldn't select 2880x2160 @ 60 Hz on the Sunix, even though I could over VGA.
Wait, you got 2560x1920 on the LaCie?!? How? Was this via VGA or using the Sunix adapter?
 
Both.

When on VGA, I have to use ToastyX's pixel clock patcher on my AMD card, since the resolution is over the 400 MHz pixel clock limit for VGA connections. DP 1.2 doesn't have that limitation, so I can run that resolution with no patching on the Sunix.

From what jka is saying above, I think the Sunix might be your only option on Nvidia if you want to go for the silly high pixel clocks.

But you can also always create an interlaced version of the resolution if you just want to try it out and stay below 400 MHz.
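
For anyone who wants to sanity-check these numbers before creating a mode, here's a minimal Python sketch that estimates the pixel clock from the active resolution, refresh rate, and an assumed blanking overhead. The ~25%/~5% blanking figures are rough assumptions, not the real GTF/CVT math (use CRU or the cvt/gtf tools for exact timings), but they're close enough to tell whether a mode will bump into the 400 MHz VGA limit and whether an interlaced variant would duck under it.

```python
# Rough pixel-clock estimate for a CRT mode.
# NOTE: the blanking overheads below are assumptions, not the actual
# GTF/CVT formulas -- use CRU or the cvt/gtf tools for exact timings.

def estimate_pixel_clock_mhz(h_active, v_active, refresh_hz,
                             h_blank_frac=0.25, v_blank_frac=0.05,
                             interlaced=False):
    """Return an approximate pixel clock in MHz."""
    h_total = h_active * (1 + h_blank_frac)   # active + horizontal blanking
    v_total = v_active * (1 + v_blank_frac)   # active + vertical blanking
    clock_hz = h_total * v_total * refresh_hz
    if interlaced:
        clock_hz /= 2                         # only half the lines per field
    return clock_hz / 1e6

for w, h, r in [(2560, 1920, 70), (2720, 1530, 86), (2880, 2160, 60)]:
    print(f"{w}x{h} @ {r} Hz  ~{estimate_pixel_clock_mhz(w, h, r):.0f} MHz "
          f"(interlaced ~{estimate_pixel_clock_mhz(w, h, r, interlaced=True):.0f} MHz)")
```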
 
Thank you so much! I thought 2560x1600 was going to be the last CRT resolution I would ever use, but it looks like the LaCie has some extra life in it with the right card and adapter. (y)
 
jka - I just read the rest of your post. What was the step? Was it to verify luminance for 9300K? That you couldn't reach that luminance value suggests to me that one of two things is going on:

1. You set your G2 too low on the initial G2 setting step
2. Your tube is so damn old that it cannot reach the target.

Assuming #2 isn't the case, I'd suggest redoing your white balance, and when you set your G2, use a ten-step grayscale ramp, like this:

EDIT: I really doubt that your monitor is so old that it cannot reach 115 cd/m2 for 9300K. In my experience calibrating monitors, it's the lower color temperatures - like 5600K - that give older monitors the most trouble. I strongly suspect you set your G2 too low to begin with.

grayscale.jpg

Follow the directions. They state to set the G2 so that the first bar is invisible and the second bar is barely visible. Do it in your light-controlled environment, and then continue to adjust from there. You may need to either download or create a 10-bar grayscale pattern for your prime mode (1920x1200 @ 85 Hz).
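
If you can't find a ready-made pattern at your prime mode, here's a minimal Pillow sketch that generates a 10-bar ramp at 1920x1200. The bar values are an assumption on my part - a plain 0-100% ramp in equal steps - so they won't necessarily match the attached image exactly, but they'll do the job for finding G2.

```python
# Generate a simple 10-bar grayscale ramp at 1920x1200 for G2 adjustment.
# Assumption: bars go from 0% to 100% in equal steps; adjust to taste.
from PIL import Image, ImageDraw

WIDTH, HEIGHT, BARS = 1920, 1200, 10

img = Image.new("RGB", (WIDTH, HEIGHT))
draw = ImageDraw.Draw(img)
bar_w = WIDTH // BARS

for i in range(BARS):
    level = round(i * 255 / (BARS - 1))      # 0, 28, 57, ... 255
    draw.rectangle([i * bar_w, 0, (i + 1) * bar_w - 1, HEIGHT - 1],
                   fill=(level, level, level))

img.save("grayscale_10bar_1920x1200.png")
print("wrote grayscale_10bar_1920x1200.png")
```

Display it full screen at 1920x1200 @ 85 Hz and adjust as described above.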
 
Yes, something like that. I don't remember now, but it was one of these steps: 48, 52 or 55.

I only perform the 9300K adjustment on my monitors because that's the only temperature that I use, so I simply skip the others. I personally dislike the yellowish whites of 6500K and lower.

But when I measure full white now I am getting 105 cd. If I bump contrast from 90 to 100 I still get 105, which means it probably reached the default ABL cutoff on the total sum of electrons emitted. Or is that a power or heat limit applied to the cathodes? It takes up to a minute to stabilize, so it is probably something analog. I was also wondering whether it isn't tied more to the anode, where you are limited to a maximum number of kV and are producing more electrons than that amount of positive charge can attract. I think I have had ABL fail on me many times, and it resulted in one of two things (one OR the other):

1) monitor shut down

2) the picture became super blurry and stretched (by about 300%) until it faded away. In 2D animation terms it was like applying ZoomIn + FadeOut at the same time.

And I think number 2 could look exactly the same as if you had too many electrons in the tube without them being able to escape through the anode at the top. Again, I am no electrical engineer, so these are very wild theories on my part.

My FW900's G2 is at 149 I think, and it's indeed possible that it is too low. I will try to up it a notch and redo the WPB procedure. I have defined the G2 by looking at the "2 bars" pattern by spacediver, where I set G2 so that the right side (7 IRE, I think) is only very barely visible. Perhaps it should be a little more than barely visible, because if I look at it now after WPB and LUT adjustment, it's quite visible with the room lights on. I don't want to use the full greyscale pattern because the right side is too bright and basically lights up the whole tube. But maybe I should anyway - it is indeed an actual instruction in WinDAS to load exactly that pattern to find G2.

UPDATE: Contrary to some things I said above about ABL, I just tested upping my OSD brightness level and I was able to get up to 125 cd on full white. Now how to get that with 31% brightness, hmm :)
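
Side note on the "7 IRE" bar: if you ever need to recreate a near-black bar like that yourself, the conversion from IRE to an 8-bit pixel value is straightforward. This little sketch assumes "7 IRE" means 7% of full swing and shows both full-range PC levels (0-255) and limited video levels (16-235); which one applies depends on how the original pattern was made.

```python
# Convert an IRE value to an 8-bit code value.
# Assumption: "7 IRE" means 7% of full swing; pick the range
# (full 0-255 vs. limited 16-235) that matches the pattern.

def ire_to_8bit(ire, limited_range=False):
    if limited_range:
        return round(16 + ire / 100 * (235 - 16))    # video levels
    return round(ire / 100 * 255)                    # PC levels

for ire in (0, 7, 10, 100):
    print(f"{ire:3d} IRE -> full {ire_to_8bit(ire):3d}, "
          f"limited {ire_to_8bit(ire, limited_range=True):3d}")
```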
 
Does anyone know what exactly the IMAGE RESTORATION feature does? It's also available on Diamondtrons, by the way. Also wondering if it's a good idea to use it in between several passes of the WPB procedure.
 
I don't know about you, but these look very similar in a room with the lights on:

1) Brightness 31% with LUT adjustment

2) Brightness 44% without LUT adjustment

I am bringing this up because the LUTs are a pain to make "stick" in a lot of the games I play, so I am considering my other options. I even made a batch script that would load my LUTs after loading a game, but when it did work the game looked nothing like it should or like what I was used to. It's possible game devs use LUT adjustments for artistic changes, so if you load a technically correct LUT the game will look like shit (usually washed out).

But LUTs are very good for TV/movies, as I have found; The Wire or House of Cards look just perfect, very natural.
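
On making LUTs stick: one possible workaround is a small launcher that starts the game and then periodically re-applies the video-card gamma ramp with an external LUT loader. The sketch below is only a rough idea, assuming ArgyllCMS's dispwin is on the PATH (where `dispwin -L` reloads the installed profile's calibration) - swap in whatever loader you actually use, and note the game path is just a placeholder. It won't help with games that reset the ramp every frame.

```python
# Hypothetical launcher: start a game, then re-apply the video-card LUT
# every 30 seconds while the game is running.
# Assumes ArgyllCMS's "dispwin -L" (or your LUT loader of choice) is on PATH.
import subprocess
import time

GAME_EXE = r"C:\Games\SomeGame\game.exe"   # placeholder path
RELOAD_CMD = ["dispwin", "-L"]             # reload installed calibration

game = subprocess.Popen([GAME_EXE])
try:
    while game.poll() is None:             # while the game is still running
        subprocess.run(RELOAD_CMD, check=False)
        time.sleep(30)
finally:
    subprocess.run(RELOAD_CMD, check=False)  # restore once more on exit
```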
 
I don't really know how to answer this. You should never skip any of the temperatures, as the service software is designed to handle the WPB adjustment as an entire process, with no steps skipped. Skipping steps is just begging for trouble. I don't know exactly what the software does (since it's closed source), but from watching my monitor react as it goes through the steps, I can tell the system is making calculations based on your adjustments at each step, so skipping any step is not smart.

125 cd/m2 is way too bright for long-term use on these monitors. You're just going to wear out your tube. Full white at 100 cd/m2 is plenty.

So, no matter what your opinion is on 6500K and 5400K, you really should do the whole procedure and not stop in the middle. My two cents - take it or leave it. I only use the 6500K mode on my monitor, but whenever I WPB-calibrate my screens, I always do 9300K and 5400K too.
 
Hi guys. I've been out of the loop. I was wondering if anyone can tell me what is currently the best HDMI/DP adapter to buy for the FW900. There are quite a few products mentioned in this thread. Could anyone help please?

EDIT: is this the best adapter everyone has been talking about? The Sunix 3000?

https://www.amazon.com/Sunix-DisplayPort-miniDP-DP-Cable-DPU3000-D3/dp/B00JARYTVK

Yes, it is the best.
The DPU3000-D2 is for video cards with Mini DisplayPort; the DPU3000-D3 is for video cards with full-size DisplayPort.
It seems that for full stability the USB Type-A to Micro USB cable is required for additional power, and I don't know if it is included.
On the Sunix website the USB cable is shown as included, but better to wait for confirmation from other users.
 
It does not include the USB cable. The adapter does seem more stable with it, since some cards don't provide enough power, but you may be able to get by without it.
 
The D3 version I received included the USB cable, and the D2 should as well. That's why I said earlier that I wondered whether the ones on sale on Amazon were refurbished (which could mean problems and/or missing parts).
 
You may be right. I did pay a cheaper price. I guess I got lucky in that there are no problems with the unit itself.
 
Okay so the USB cable: does that need to be plugged into a phone charger or something or is it supposed to be plugged into a computer’s USB port?

I ordered the D3 last night. Will test it out in a few days.
 
You should be able to plug it into the computer's USB.
 
I bought the Sunix DPU3000-D3 from Amazon about a week ago. The package includes, besides the splitter itself, a miniDP-to-DP cable and a micro USB cable, both 10 cm long.
I have a GTX 980 running on Windows 7 x64 and two 21" CRTs, the Dell P1130 and Sony GDM-F520.
I use CRU, the NVIDIA pixel clock patcher and the GeForce control panel to set up custom resolutions.
If I only connect the included DP cable, the LED on the Sunix splitter lights up faintly, but when I also connect the USB cable the LED intensity increases.
THE GOOD NEWS: On both CRTs the image quality and responsiveness are as good as, if not better than, the integrated RAMDAC. I can see no ghosting, text doubling or shadows around icons; the image is just as sharp and colourful, while input lag is virtually non-existent. While playing a fast-paced shooter like Call of Duty 1 I have the feeling it's even faster than the 980's RAMDAC, but that could also be my imagination. Definitely as good!
THE BAD NEWS: No matter what I do, I can't get the splitter to be stable above 1920×1440 :(
I can do 85 Hz on the Dell and 90 Hz on the Sony at that resolution and everything is perfect. As soon as I try 2048x1536 the monitors lose sync and go to standby about 50% of the time. It doesn't matter what refresh rate I try from 60 to 80 Hz (85 Hz on the F520): the signal is lost either when I restart Windows, when I change resolutions and return to 2048x1536, or simply when doing nothing and waiting long enough. I do find it weird that 85 Hz seems more stable than 60 Hz. I tried all modes - GTF, DMT, CVT, even manually adjusting polarities - same thing. What is stranger is that if I try higher resolutions, until I saturate the horizontal bandwidth of the monitors, they do not lose sync and shut down as they did at 2048x1536. Instead I see some rippling, wavy artefacts, or the image is displaced in such a way that, for instance, the right side goes to the far left of the monitor while the left part of the image is moved to the middle. It doesn't stay like that for long, and soon I get rapid black-screen flickering on the top part of the image. I didn't bother much with resolutions above 2048x1536 as the text becomes very small, blurry and hard to read.
I tried a different DP cable which did not light up the LED and required the use of the included USB cable to power up the splitter. The results were the same.
The included USB cable doesn't provide any benefit: everything works the same with or without it, but I do recommend keeping it plugged in to be safe, as the LED lights up brighter with it.
I've also tried to power the splitter with a different USB cable, plugged either into a motherboard USB port or a 5.3V 2.0A USB phone charger in the wall. Same behaviour.
Quite disappointed actually :(
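
For what it's worth, a quick way to rule out "the monitor itself is out of range" is to compute the horizontal scan frequency and pixel clock of the exact timings CRU or the driver reports and compare them against the tube's limits. The limits in the sketch below are assumptions (roughly 137 kHz max horizontal for the F520, 400 MHz as a generic clock ceiling) - plug in your own spec-sheet numbers and the real totals from CRU.

```python
# Check a mode's horizontal frequency and pixel clock against monitor limits.
# Use the *total* timings reported by CRU / the driver, not just the active
# resolution. The monitor limits below are assumptions -- check your spec sheet.

def check_mode(h_total, v_total, refresh_hz,
               max_hfreq_khz=137.0, max_pixclk_mhz=400.0):
    hfreq_khz = v_total * refresh_hz / 1000.0          # lines drawn per second
    pixclk_mhz = h_total * v_total * refresh_hz / 1e6  # pixels per second
    ok = hfreq_khz <= max_hfreq_khz and pixclk_mhz <= max_pixclk_mhz
    print(f"{h_total}x{v_total} totals @ {refresh_hz} Hz: "
          f"{hfreq_khz:.1f} kHz, {pixclk_mhz:.1f} MHz "
          f"({'OK' if ok else 'out of range'})")

# Example: 2048x1536 @ 60 Hz with GTF-style totals (approximate)
check_mode(h_total=2800, v_total=1589, refresh_hz=60)
```

If a mode that keeps dropping out comes back well within range here, the problem is more likely the adapter or the link than the monitor.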
 
Thanks for your report! Glad to have more Nvidia reports. Have you tried a refresh rate of, say, 72 Hz? Would you know how to try these out in Ubuntu as well? I'm curious to figure out if it's the unit or something else.
 
Yes, I tried increments of 1 Hz from 60 to 85 at resolutions very close to 2048x1536, like modifying x to 2040 or 2060 while increasing y to about 1600, etc.
For a 4:3 picture the monitors shut down from 1530 to about 1600 vertical lines, no matter the refresh rate. Above 1600 the splitter is "stable" but I get the artefacts and symptoms mentioned earlier.
I don't have a Linux PC available to test. Sorry.
 
How about a live USB stick of Ubuntu? That's what I did, lol.
 
I've never used Ubuntu before, but I think I can give it a shot when I have some free time again. Does CRU work in Ubuntu? I use it to generate the INF monitor driver file with only the resolutions I'm interested in.
Honestly, I doubt it's the OS. I initially suspected the DP cable, but after trying a different one that does not power the splitter, I can safely say the instability is the splitter's fault.
 
THE GOOD NEWS: *clip*
I'm a bit dubious about a responsiveness improvement, but it is absolutely possible that the image quality is better with the adapter than with the internal DAC of a 980. Nvidia analog outputs have a reputation for mediocre quality.
 
THE BAD NEWS: *clip*

Interesting, my Sunix could do 1920x1440 @ 90 Hz no problem. Likewise with 2560x1920 @ 70 Hz, as well as other crazy resolutions.

One thing the Sunix does is pass the EDID from the monitor to the GPU. On Radeon cards, it seems I get a hard limit on the max pixel clock because of that. I'm not sure if EDID data has an influence on Nvidia's behavior, but I'm just throwing that out there.

Also make sure Nvidia is reporting the full 5.4 Gbps link rate for DP 1.2 connections. On my AMD card you can view that somewhere in the driver control panel.
 
Interesting. One particular thing that I did not report is that the Dell P1130 acts differently from the F520 when I install the INF monitor driver created with CRU. While on the integrated DAC there's no difference between the two, with the Sunix splitter the Dell monitor doesn't register the 2048x1536 resolution in either the Windows 7 display resolution list or the NVIDIA control panel; I have to manually add it in the custom resolutions section of NVCP. With the F520 that resolution is registered immediately, just like when it's connected to the integrated DAC.
I suppose this could be fixed on the Dell by editing the EDID data? Still, the splitter is equally unstable on either monitor at that resolution...
I have no idea how to check the DP link speed for my GTX 980 card.
 
Oh, you're new to Ubuntu. Um, it might not be that easy unless you're okay with the command line. In Ubuntu you don't need a third-party app like CRU; the equivalent functionality is built in, and you can add any resolution.

Edit: On Windows on AMD, we can't add interlaced resolutions after a certain point, but it works fine in Ubuntu, so it could still be a driver issue.
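
To spell that out a bit for anyone new to it: under X11 the usual route is to generate a modeline with `cvt` or `gtf` and feed it to `xrandr`. Below is a minimal Python wrapper around those commands - it assumes an X11 session, that the xorg `cvt` utility is installed, and that the output name (e.g. "DP-1") matches whatever `xrandr` lists on your system.

```python
# Minimal sketch: create and apply a custom mode under X11 with cvt + xrandr.
# Assumptions: X11 session, xorg's `cvt` and `xrandr` installed, and OUTPUT
# matches the name shown by `xrandr` (e.g. "DP-1").
import subprocess

OUTPUT = "DP-1"                      # change to your actual output name
W, H, HZ = 2048, 1536, 60

# `cvt 2048 1536 60` prints a line like:
#   Modeline "2048x1536_60.00"  <clock>  <horizontal timings>  <vertical timings> -hsync +vsync
result = subprocess.run(["cvt", str(W), str(H), str(HZ)],
                        capture_output=True, text=True, check=True)
line = next(l for l in result.stdout.splitlines() if l.startswith("Modeline"))
_, name, *timings = line.split()
name = name.strip('"')

subprocess.run(["xrandr", "--newmode", name, *timings], check=True)
subprocess.run(["xrandr", "--addmode", OUTPUT, name], check=True)
subprocess.run(["xrandr", "--output", OUTPUT, "--mode", name], check=True)
print(f"Applied {name} on {OUTPUT}")
```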
 
Not sure if I've mentioned this, but another issue for AMD on Windows is that it will read the monitor's EDID and set a max pixel clock based off of it. I'm not totally sure how it arrives at that max pixel clock, but I think it sets it at 110 MHz above whatever is reported as the max clock in the EDID. For example, my LaCie states a max of 420 MHz but AMD allows up to 530, and my Dell P992 states 240 MHz but AMD allows up to 330. No workaround that I can see right now; I need to figure out how to reprogram my HDMI dummy plug to report a higher max pixel clock.
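
If anyone wants to check what their monitor (or dummy plug) actually reports, the range-limits block in the base EDID is easy to read: it's one of the four 18-byte descriptors, tagged 0xFD, and the max pixel clock is stored in units of 10 MHz. The sketch below parses a 128-byte EDID dump; getting the dump itself is left out (it can be exported with CRU on Windows, or read from /sys/class/drm on Linux).

```python
# Read the display range limits (tag 0xFD) from a 128-byte base EDID dump.
# The max pixel clock field is stored in units of 10 MHz.
import sys

def parse_range_limits(edid: bytes):
    for off in (54, 72, 90, 108):            # the four 18-byte descriptors
        d = edid[off:off + 18]
        if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
            return {
                "v_min_hz": d[5], "v_max_hz": d[6],
                "h_min_khz": d[7], "h_max_khz": d[8],
                "max_pixel_clock_mhz": d[9] * 10,
            }
    return None

with open(sys.argv[1], "rb") as f:           # e.g. an edid.bin exported by CRU
    limits = parse_range_limits(f.read(128))

print(limits or "no range-limits descriptor found")
```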
 
I'm having trouble getting the Sunix to give me proper 1920x1200. The monitor shows up as "Synaptics Inc". I installed the FW900 driver on top of that. Every time I add a custom resolution using Nvidia's control panel, it keeps showing me a very distorted version of 1920x1200. Can anyone provide a step-by-step guide on how to properly get the Sunix running on DisplayPort?

Edit: Never mind, I've set it to "GTF" mode and it seems to be working okay now. Is this the correct mode for a CRT?
 
Use CVT. GTF is the older version.
 
Thanks. It's working fine now. Everything looks good so far; I'm able to get high resolutions and high refresh rates not possible with the previous adapter.
 
Most CRT monitors, if not all, are GTF compliant.
For a CRT, by default (without custom resolutions), the display driver uses GTF or DMT timings.
DMT isn't a timing formula; it is a list of resolutions at various refresh rates. If you set a resolution on a CRT that is on that list, the driver uses those timings; otherwise it uses the GTF formula.
Timings on the DMT list can be custom (old resolutions), GTF or CVT (new resolutions), so for some resolutions setting DMT or GTF is the same thing, for others DMT is the same as CVT, and for old resolutions like 1024x768 85Hz, DMT is different from both GTF and CVT.
CVT reduced blanking and secondary GTF are only for LCD monitors.
When using custom resolutions you can set any timing formula you like, but keep in mind that if you change the formula of a resolution previously adjusted and stored in the monitor's memory, you will need to adjust the geometry again.
Some time ago I asked ToastyX to implement GTF in the CRU utility, and it seems the last preview version has it.
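
A quick way to see how the formulas differ for a given resolution is to generate both modelines and compare them side by side. The sketch below shells out to the xorg `gtf` and `cvt` utilities, so it assumes a Linux box with both installed; on Windows, CRU shows the same timings when you switch the timing-formula drop-down.

```python
# Compare GTF and CVT timings for the same mode using the xorg utilities.
# Assumes a Linux system with `gtf` and `cvt` installed.
import subprocess

def modeline(tool, w, h, hz):
    out = subprocess.run([tool, str(w), str(h), str(hz)],
                         capture_output=True, text=True, check=True).stdout
    return next(l.strip() for l in out.splitlines()
                if l.lstrip().startswith("Modeline"))

for w, h, hz in [(1024, 768, 85), (2048, 1536, 60)]:
    print(f"--- {w}x{h} @ {hz} Hz ---")
    print("GTF:", modeline("gtf", w, h, hz))
    print("CVT:", modeline("cvt", w, h, hz))
```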
 
UPDATE:
I borrowed a GTX 1070 and the results are identical to my 980.
I even tried a VGA-5BNC cable (same outcome), but I had the added inconvenience of CRU resolutions not working this time, since with this cable the only information transmitted is RGBHV and no EDID data. I had to manually create ALL resolutions in the NVIDIA custom resolution panel, so I recommend using a standard VGA cable with all pins active.
Again, resolutions at or close to 2048x1536 make the monitors go into standby very often.
I toyed with the highest resolution at which my F520 produces a somewhat sharp image at 60 Hz, namely 2533x1900.
The monitor doesn't shut down, but I get the rippling/waving and displaced-image symptoms.
I will keep lowering the resolution in small increments to see if this behavior continues.
 
What's the difference between the Sony FW900 and the SGI FW9011? Purely badging? I noticed on my FW9011 that Windows 10 doesn't pick up the EDID info; it just shows a generic plug-and-play monitor.

EDIT: GTX 980M for me. I often have to add a 2304x1440 custom resolution via the Nvidia control panel, but it only outputs it at 75 Hz; I should give 80 Hz a try again. I often have to manually reset the resolution when I turn the monitor on if I've walked away for too long with it turned off.
 
So you can do 1920x1440 at 90 Hz, but 2048x1536 isn't stable even at 60 Hz?
Indeed very strange.
Do other users have problems with the 2048x1536 resolution?
 
I have the GTX 980 and the Sunix adapter. I also got the rippling effects when plugged into the DisplayPort closest to the bottom of the card. I unplugged that, tried the topmost DisplayPort, and the rippling went away. Previously I thought this was related to the USB power supply, but it appears not. From what I read, all the DP ports on the card are the same, so I'm not sure what is going on. Resolution is 2304x1440 @ 80 Hz.
 