Sony GDM-FW900 and 2560x1600 @ 75 Hz or higher... possible, but how exactly?

atwix

Limp Gawd
Joined
Oct 7, 2012
Messages
169
I use Windows 7 64-bit and an HD 6950 DirectCU II video card with an HDMI-to-VGA converter and a VGA-to-VGA cable into a Sony GDM-FW900 CRT monitor (don't start flaming, the FW900 is still a *very* good gaming monitor). I usually play games like Skyrim, The Witcher 2, etc. at 2304x1440 @ 80 Hz. I recently used ToastyX's guide to edit the monitor drivers and managed to get a working resolution of 2560x1600 @ 68 Hz, right below the max pixel clock bandwidth of 400 MHz (http://111.120hz.net/showthread.php?683-Overclocking-Quick-Start-Guide&p=12344&viewfull=1).

Problem: is it even worth increasing the resolution above 2304x1440? If so, I'd like to get rid of the flickering and crank the refresh rate at 2560x1600 up to 75 Hz or even more.

Question 1: I read on another forum that the FW900 can handle 2560x1600 @ 75 Hz. If so, how? I would need around 460 MHz of pixel clock for that, and the max on an ATI card is 400 MHz.
I can use 2560x1600 @ 68 Hz; installing that resolution with ToastyX's CRU was not hard at all. But I'm looking for specific values for lowering the timing parameters so I can squeeze more hertz out of 2560x1600 without a lot of flickering lines in the image. I would love some values I can enter in CRU. Anyone got a clue?
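
For reference, here is the arithmetic CRU is doing under the hood, as a minimal Python sketch. The blanking figures are illustrative guesses, not tested FW900 timings:

```python
# Pixel clock (MHz) = horizontal total * vertical total * refresh rate.
# Totals = active pixels + blanking (front porch + sync + back porch).
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

h_active, v_active = 2560, 1600

# Blanking values below are illustrative, not values verified on an FW900:
loose_h_blank, loose_v_blank = 880, 62   # roughly GTF-style blanking
tight_h_blank, tight_v_blank = 560, 48   # aggressively reduced blanking

for hz in (68, 75):
    loose = pixel_clock_mhz(h_active + loose_h_blank, v_active + loose_v_blank, hz)
    tight = pixel_clock_mhz(h_active + tight_h_blank, v_active + tight_v_blank, hz)
    print(f"{hz} Hz: ~{loose:.0f} MHz with loose blanking, "
          f"~{tight:.0f} MHz with tight blanking")
```

With these example numbers, 2560x1600 @ 75 Hz drops from roughly 429 MHz with loose blanking to roughly 386 MHz with tight blanking, i.e. back under 400 MHz. The catch: a CRT needs real horizontal blanking time for beam retrace, and shaving it too far produces exactly the flickering lines being described, so tight values have to be crept up on experimentally in CRU.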

Question 2 (related to question 1):

Can the 400 MHz pixel clock limit be circumvented somehow with a specific cable and external converters? Something like BNC with a dual-link DVI-to-VGA adapter? Would that give more hertz at 2560x1600? The 400 MHz limit comes from converting the digital signal to analog INSIDE the video card, right? I don't know a lot about all the signal types and converters, sorry if it's a silly question.
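
A quick spec-sheet sanity check on the dual-link DVI idea (nominal figures only; no particular adapter verified here):

```python
# DVI spec: 165 MHz TMDS clock per link; dual link runs two links in
# parallel, doubling the nominal pixel rate.
DVI_SINGLE_LINK_MHZ = 165
DVI_DUAL_LINK_MHZ = 2 * DVI_SINGLE_LINK_MHZ  # 330 Mpixel/s nominal

# Approximate requirement for 2560x1600 @ 75 Hz with conventional
# blanking (same illustrative totals as the sketch above):
needed_mhz = 3440 * 1662 * 75 / 1e6

print(f"dual-link DVI nominal ceiling: {DVI_DUAL_LINK_MHZ} MHz")
print(f"2560x1600@75 needs roughly:   {needed_mhz:.0f} MHz")
```

So even dual-link DVI at its nominal spec ceiling (330 MHz) would not carry 2560x1600 @ 75 Hz with conventional blanking (~429 MHz). The only appeal of an external converter is that it sidesteps the card's internal RAMDAC limit; everything then hinges on how far the adapter's own DAC can be pushed.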

Question 3:

I actually wonder if the FW900 can even display *ALL* 4.096 million pixels at 2560x1600, or the full pixel count at 2304x1440, in a fullscreen game because of dot pitch limitations. Any techie know? I think 2560x1600 on a 22.5" viewable diagonal gives a pixel pitch that's lower than the minimum value listed in the FW900's manual. But I get 0.21 mm at 2304x1440, and that's lower than the listed 0.23 mm as well... or did I do something wrong there? Does the image get distorted somehow if I game fullscreen at a resolution whose pixel pitch is below the monitor's minimum dot pitch? Still a noob in these matters, sorry if I'm asking a silly question.
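
For anyone checking the arithmetic, here is the pixel-pitch calculation spelled out. The ~474 mm viewable width is an approximation taken from the FW900's spec sheet; treat it as an assumption and check your own manual:

```python
# Horizontal pixel pitch = viewable width / horizontal resolution.
# The 474 mm viewable width is approximate (FW900 spec sheet figure).
VIEWABLE_WIDTH_MM = 474
GRILLE_PITCH_MM = 0.23  # aperture grille pitch at screen center

for h_res in (1920, 2304, 2560):
    pitch = VIEWABLE_WIDTH_MM / h_res
    verdict = "resolvable" if pitch >= GRILLE_PITCH_MM else "finer than the grille"
    print(f"{h_res} px wide -> {pitch:.3f} mm per pixel ({verdict})")
```

This reproduces the 0.21 mm figure for 2304x1440 (0.206 mm, below the 0.23 mm grille pitch) and shows why 1920x1200 (0.247 mm) is about the last width whose pixels are at least as wide as the grille.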
 
Your intuition and calculations are correct. The dot pitch ranges from 0.23 to 0.25 mm as you move from the center outward. Even with perfect focus, 1920x1200 is about the highest resolution it will fully resolve across the entire screen. As you calculated, 2304x1440 won't even be fully resolved in the center. My favorite resolution to run it at is 2048x1280.
 

Hmm, but the image quality doesn't seem to suffer. Can it ever be noticeable?
I don't quite get what the downside is of using resolutions whose pixel pitch is less than the monitor's minimum dot pitch. What exactly happens then?

All I know is: I installed custom 4096x4096 resolution textures in Skyrim, and at 2304x1440 it looked as if the image quality improved thanks to those textures.

Are you suggesting I shouldn't play at 2304x1440 or 2560x1600 then? I tried gaming at 2560x1600 and couldn't see graphical anomalies. Maybe I should look closer. Where do they usually occur?

edit: did some googling..

The image will only get a bit blurry, depending on the amount of "bleeding" within each phosphor. It's in the nature of the phosphors to handle the up- or down-conversion between resolutions, unlike a digital monitor, which has to use a scaling algorithm and can't paint a partial pixel. The point is that CRT phosphors are designed to work well with partial-pixel painting.

He makes it sound as if it isn't a big issue... :confused:
 
There shouldn't be a 400 MHz pixel clock limitation on a 6950 if you used ToastyX's display driver modification tool. "But AMD says..." blah blah blah, ignore them. Just apply the resolution and refresh rate you want to use. If the high resolution is too blurry for your taste, use a lower resolution. Up to you.
 