24" Widescreen CRT (FW900) From Ebay arrived,Comments.

I've tested two of these Startech DP2VGAHD20 adapters/transcoders and the best I can get out of them on my FW900 is a pixel clock of 320 MHz with a max resolution of 1920x1200 @ 95 Hz. Not the reported 375 MHz. Still great, and better than my super expensive HDFury X3 HDMI to VGA adapter ($200) that has a max pixel clock of 225 MHz at a resolution of 1920x1200 @ 60 Hz.


What resolutions did you try to arrive at that 320 MHz limit? I ask because at 1920x1200 the max you will be able to get with the FW900 is 96 Hz (323 MHz), but that limit in this case is because of the FW900's horizontal scan limit, not the adapter's pixel clock limit.

What happens if you try something like 2560x1600 @ 63 Hz, which is around 367 MHz, or the same at 64 Hz (373 MHz)?

3dfan does have a point. With the tests that I did at 1920 x 1200 with that Startech, 97 Hz (325 MHz pixel clock) was out of the FW900's range, but I was able to get 96 Hz viewable (322 MHz pixel clock) and under. At 2304 x 1440 I was able to get 78 Hz viewable (372 MHz pixel clock). Above 78 Hz at 2304 x 1440 the Startech started giving me snow. When I had an analog video card, I was able to push the FW900 to 2304 x 1440 at 80 Hz, but the adapter obviously can't handle that. I wanted a higher refresh rate than 78 Hz, so I settled for 2235 x 1397 (2304 x 1440 scaled down by 3%) at 83 Hz (375 MHz pixel clock).

So it is a fine balance between what the adapter can support and the limitations of our FW900s. People think I am a weirdo for wanting 83 Hz and running that oddball resolution, but everyone's eyes are different, and I can totally tell the difference between running 78 Hz and 83 Hz. I always get asked why I am running that weirdo resolution, so I figured I'd just get that out of the way LOL.
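For anyone curious why 96 Hz ends up being the ceiling at 1920x1200: the FW900's horizontal scan limit (roughly 121 kHz per its spec, if I remember right) caps the refresh rate before the adapter's pixel clock does. A rough sketch of that calculation; the vertical blanking value is an assumed GTF-ish number, the real vertical total comes from CRU:

```python
# Max refresh allowed by the FW900's horizontal scan limit (~121 kHz per spec).
# v_blank_lines is an assumed GTF-ish value; read the real vertical total from CRU.

FW900_MAX_HSCAN_KHZ = 121

def max_refresh_hz(v_active, v_blank_lines=45):
    v_total = v_active + v_blank_lines
    return FW900_MAX_HSCAN_KHZ * 1000 / v_total

for v_active in (1200, 1440):
    print(f"{v_active} active lines -> ~{max_refresh_hz(v_active):.0f} Hz max")
# 1200 -> ~97 Hz, 1440 -> ~81 Hz, which lines up with 96-97 Hz and ~80 Hz being the practical ceilings
```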
 
What resolutions did you try to arrive at that 320 MHz limit? I ask because at 1920x1200 the max you will be able to get with the FW900 is 96 Hz (323 MHz), but that limit in this case is because of the FW900's horizontal scan limit, not the adapter's pixel clock limit.

What happens if you try something like 2560x1600 @ 63 Hz, which is around 367 MHz, or the same at 64 Hz (373 MHz)?

You guys are right. I thought the FW900 at 1920x1200 could do over 100 Hz, which is why I thought the Startech adapter topped out at around a 320 MHz pixel clock (1920x1200 at 95 Hz). I tried the other higher resolutions at max refresh rate and was able to get right around the 375 MHz pixel clock.

Cheers on the correction!
 
Just saw someone post up on craigslist in Atlanta, GA.

https://charlotte.craigslist.org/sop/d/atlanta-sony-gdm-fw900-crt-monitor/7241113907.html

 
Still great, and better than my super expensive HDFury X3 HDMI to VGA adapter ($200) that has a max pixel clock of 225 MHz at a resolution of 1920x1200 @ 60 Hz.
That's weird, because my much cheaper HD Fury Nano GX can hit around 280 MHz.

Did you try using the HDMI.dat profile for CRU? That should remove any driver limitations.
 
That's weird, because my much cheaper HD Fury Nano GX can hit around 280 MHz.

Did you try using the HDMI.dat profile for CRU? That should remove any driver limitations.

No kidding!? I also have the Nano and couldn't get over 175 MHz pixel clock. I bought the HDFury X3 so I could get 1920x1200 at 60 Hz on my FW900.

I use CRU and when adding a "Detailed resolution", I set the Timing to "Automatic - CRT standard". But I've never messed with the HDMI.dat profile in the "Extension blocks" field.

The HDFury X3 already has an extension block file so I just changed the "Maximum TMDS clock" from 225 MHz to 340 MHz and tested it out. The best I could previously do was 1920x1200 at 60 Hz. Now after setting it to 340 MHz I'm able to get 1920x1200 at 70 Hz with a pixel clock of 228 MHz. Anything above 70 Hz produces a compromised picture on my FW900.
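For reference, if I'm reading the EDID format right, that "Maximum TMDS clock" field lives in the HDMI vendor-specific data block (CEA-861) and is stored as a single byte in units of 5 MHz, so the 225 to 340 MHz change is a one-byte edit. A tiny sketch of the conversion; this is my reading of the spec, not anything from CRU's or HDFury's documentation:

```python
# Rough sketch of how the "Maximum TMDS clock" field is stored in the EDID's
# HDMI vendor-specific data block (CEA-861): a single byte in 5 MHz units.
# This encoding is an assumption from the spec, not something CRU documents directly.

def tmds_mhz_to_edid_byte(mhz: int) -> int:
    """Convert a max TMDS clock in MHz to the raw EDID byte (5 MHz units)."""
    return mhz // 5

def edid_byte_to_tmds_mhz(byte: int) -> int:
    """Convert the raw EDID byte back to MHz."""
    return byte * 5

print(tmds_mhz_to_edid_byte(225))  # 45 (0x2D) - the X3's stock limit
print(tmds_mhz_to_edid_byte(340))  # 68 (0x44) - the value after the edit
```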
 
$4000 :ROFLMAO::ROFLMAO::ROFLMAO:

The question is, will there be someone stupid enough to buy it. I'm afraid of the answer. :rolleyes:
That's 40 times more than I paid for mine 9 years ago 🤯
Maybe I should sell mine... or do it in like two years for $5000-6000 🤩
 
3dfan does have a point. With the tests that I did at 1920 x 1200 with that Startech, 97 Hz (325 MHz pixel clock) was out of the FW900's range, but I was able to get 96 Hz viewable (322 MHz pixel clock) and under. At 2304 x 1440 I was able to get 78 Hz viewable (372 MHz pixel clock). Above 78 Hz at 2304 x 1440 the Startech started giving me snow. When I had an analog video card, I was able to push the FW900 to 2304 x 1440 at 80 Hz, but the adapter obviously can't handle that. I wanted a higher refresh rate than 78 Hz, so I settled for 2235 x 1397 (2304 x 1440 scaled down by 3%) at 83 Hz (375 MHz pixel clock).

So it is a fine balance between what the adapter can support and the limitations of our FW900s. People think I am a weirdo for wanting 83 Hz and running that oddball resolution, but everyone's eyes are different, and I can totally tell the difference between running 78 Hz and 83 Hz. I always get asked why I am running that weirdo resolution, so I figured I'd just get that out of the way LOL.

As I mentioned in my reply to 3dfan, you are correct. I thought the FW900 at 1920x1200 could do over 100 Hz, which is why I thought the Startech adapter topped out at a 320 MHz pixel clock (1920x1200 at 95 Hz). I tried the other higher resolutions at max refresh rate and was able to get around the max 375 MHz pixel clock.

Question... if you prefer the higher refresh rate, why not go with 1920x1200 @ 95 Hz? I don't see any image quality improvement going over 1920x1200 resolution on the FW900.
 
Question... if you prefer the higher refresh rate, why not go with 1920x1200 @ 95 Hz? I don't see any image quality improvement going over 1920x1200 resolution on the FW900.
Sometimes you're CPU limited below 95fps, but not GPU limited. Or it might be a fighting game or platformer that's locked at 60fps. In those cases I go for the highest resolution my GPU can handle at 60hz or whatever my CPU can maintain.

That's my reasoning anyway.
 
So, currently the best adapter is Startech DP2VGAHD20?
Depends on what you're trying to do. It can only go up to 375 MHz, I think.

On the other hand, the Sunix and other converters with the Synaptics VMM2322 chip can hit almost 550 MHz. But they are buggy at certain resolutions and frequencies.
 
So, currently the best adapter is Startech DP2VGAHD20?
It's one of the cheapest and one of the best. A good one at least.

Recently I was able to compare some 240 Hz IPS panels and an OLED CX to my 2070SBs, so maybe some of you guys will find the info interesting.
So my friend and I got two 240 Hz IPS panels: I got a MAG251RX because I needed a 24-incher, and he got an XG270 Elite (the one with the Blur Busters certified strobing).
Funny thing is that I'm really liking the experience I've got from this monitor in FPS games - the motion artifacts at 240 Hz strobed seem OK to me compared to a CRT, and the extra features like black equalizer + higher resolution make using this monitor a charm compared to a 2070SB. I also like the extra vibrant colors from its wider-than-100%-sRGB color space.
My friend, on the other hand, does not like his XG270 just because he can still see the pixel transition blur when flicking really hard, compared to the 2070SB he's got.

I've also tested the 48CX a bit. The very wide strobe window with low brightness for such a big screen in the max strobing mode makes it hard to swallow. But each pixel being its own light source makes the strobing really good at the top and the bottom of the screen compared to traditional LCD panels, which is really nice.
Honestly, such a big screen is not really suited for FPS gaming with a mouse and kb. It's a brilliant media consumption screen tho, no complaints there.
 
This is really sad but not totally unexpected.
I mean, I wouldn't be surprised if a good share of the units that recently popped up and were sold after the Digital Foundry buzz were more or less defective ones. Some people just exploited this to get rid of junk taking up too much room.

This should be a lesson to anyone: electronic devices wear out, and paying thousands of dollars/euros for a 20-year-old monitor is total nonsense (just like paying insane premiums for items released in ridiculously low numbers, see the latest CPU/GPU releases).
As a previous FW900 owner, I thought it was odd to see DF promote it so much, as it becomes a diamond in a shit field when trying to find an operable one at this point. I do mildly regret donating mine when I did... but at the end of the day, that was over a decade ago at this point, and fixing it would have cost waaay more than it was worth at the time. I will say that having 85 Hz during the peak LCD craze was a godsend, and I had many an argument about why it was important.
 
Depends on what you're trying to do. It can only go up to 375 MHz, I think.

On the other hand, the Sunix and other converters with the Synaptics VMM2322 chip can hit almost 550 MHz. But they are buggy at certain resolutions and frequencies.
I've read your other post:
The Synaptics can almost get to 550 MHz, which actually blows Nvidia's analog cards like the 980 Ti out of the water. Those topped out at 400 MHz and couldn't be unlocked like AMD's cards. But if you only have a 19" monitor, 375 MHz is enough for most resolutions you could run.
And I've decided that I want to get absolutely the best experience out of the Sony FW900 and the NEC MultiSync FE1250+. What is the best adapter with the Synaptics VMM2322 that I can buy? And how can I calculate how many Hz I'm using?
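For the "how do I calculate" part: the pixel clock is just the horizontal total times the vertical total times the refresh rate, with the totals (active + blanking) taken from CRU. A rough sketch; the example totals below are assumed GTF-ish numbers, not values pulled from an actual CRU profile:

```python
# Pixel clock = horizontal total x vertical total x refresh rate.
# h_total / v_total are the "Total" values CRU shows for a detailed resolution.
# The example numbers below are assumed GTF-style totals, not read from CRU.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

def h_scan_khz(v_total, refresh_hz):
    # Horizontal scan rate; the FW900 is specced to roughly 30-121 kHz.
    return v_total * refresh_hz / 1e3

h_total, v_total, refresh = 2696, 1245, 96   # ~1920x1200 @ 96 Hz with GTF-style blanking
print(f"{pixel_clock_mhz(h_total, v_total, refresh):.0f} MHz, "
      f"{h_scan_khz(v_total, refresh):.1f} kHz")   # ~322 MHz, ~119.5 kHz
```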
 
Recently I was able to compare some 240 Hz IPS panels and an OLED CX to my 2070SBs, so maybe some of you guys will find the info interesting.
So my friend and I got two 240 Hz IPS panels: I got a MAG251RX because I needed a 24-incher, and he got an XG270 Elite (the one with the Blur Busters certified strobing).
Funny thing is that I'm really liking the experience I've got from this monitor in FPS games - the motion artifacts at 240 Hz strobed seem OK to me compared to a CRT, and the extra features like black equalizer + higher resolution make using this monitor a charm compared to a 2070SB. I also like the extra vibrant colors from its wider-than-100%-sRGB color space.
My friend, on the other hand, does not like his XG270 just because he can still see the pixel transition blur when flicking really hard, compared to the 2070SB he's got.

I've also tested the 48CX a bit. The very wide strobe window with low brightness for such a big screen in the max strobing mode makes it hard to swallow. But each pixel being its own light source makes the strobing really good at the top and the bottom of the screen compared to traditional LCD panels, which is really nice.
Honestly, such a big screen is not really suited for FPS gaming with a mouse and kb. It's a brilliant media consumption screen tho, no complaints there.


Thanks, it's always interesting and helpful when regular people come here to share their experiences comparing modern monitors/TVs to CRTs, especially since it's becoming harder to find reliable, unbiased monitor reviews or specialized tech sites covering how monitor tech is evolving. Many seem to be doing marketing brainwashing instead, like creating those monitor "certifications" and "approvals" claiming superiority over CRTs - clearly done to boost sales and commissions from the monitor manufacturers rather than to offer a product of genuinely CRT-comparable quality - and they aren't transparent with the end user, hiding or understating those monitors' massive flaws compared to CRTs.


As for the MAG251RX, being an IPS it surely has noticeable glow and backlight bleed. I'm curious to know, if you don't mind testing, whether it supports a 60 Hz motion clarity mode. If so, how is its motion clarity compared to the CRT? Does it have the typical modern-monitor strobing flaws like crosstalk? Brightness loss compared to the CRT? More noticeable, aggressive flicker?
 
Thanks, it's always interesting and helpful when regular people come here to share their experiences comparing modern monitors/TVs to CRTs, especially since it's becoming harder to find reliable, unbiased monitor reviews or specialized tech sites covering how monitor tech is evolving. Many seem to be doing marketing brainwashing instead, like creating those monitor "certifications" and "approvals" claiming superiority over CRTs - clearly done to boost sales and commissions from the monitor manufacturers rather than to offer a product of genuinely CRT-comparable quality - and they aren't transparent with the end user, hiding or understating those monitors' massive flaws compared to CRTs.


As for the MAG251RX, being an IPS it surely has noticeable glow and backlight bleed. I'm curious to know, if you don't mind testing, whether it supports a 60 Hz motion clarity mode. If so, how is its motion clarity compared to the CRT? Does it have the typical modern-monitor strobing flaws like crosstalk? Brightness loss compared to the CRT? More noticeable, aggressive flicker?
Here you go :)
Bleed and backlight glow are very well controlled on my MAG251RX. There's a slight amount, but only in the lower outer corners of the screen and it's minimal. I could take a photo on the weekend.

60 Hz strobing is not available on any modern fast panel as far as I know, but it is available on the LG CX and I assume the BX TVs.
I also could not recommend the MAG251RX for low-Hz strobing like 120 Hz. The overdrive settings are not tuned properly and the strobe window timing is quite far off the ideal value.
On the other hand, the XG270 is unbelievably good at 119 Hz. It's also better than the MAG251RX at 240 Hz due to proper strobe window timing.

Yes, at 240 Hz you can still see crosstalk on both monitors, but it works fine for me. That "for me" qualifier is very important for FPS gaming - as I mentioned earlier, my friend who got the XG270 still prefers the CRT. He just sees a CRT as a better monitor for his playstyle because of the way he aims, with a lot of flicking in FPS games, and the instant and consistent "pixel" response helps with that.
Those new 240 Hz AUO IPS panels are very fast, but only compared to anything else based on LCD tech. It's still an average total response time of ~5-10 ms, as measured by a5hun and the VG279QM review by RTINGS. Hardware Unboxed reviews also show similar results.

Brightness is higher on the 240 Hz IPS panels. The MAG251RX is always brighter as it does not allow changing the strobe window length. The XG270 with strobing set to ultra mode uses a strobe window of 0.85 ms, which limits its max brightness to a miserable 70 nits. One tick lower, the strobing uses a window of 1.68 ms and the brightness goes up to 145 nits, which is higher than any CRT without something like a super bright mode.
As for strobing flicker, I have not tested it myself yet, but judging from my friend's words you will feel the flicker "more" in that ultra mode on the XG270 than on a CRT at the same 119-120 Hz.
With the MAG251RX, it feels like the strobe window is really close to the extreme mode on the XG270, so around 1.5-2 ms. I have not tested it at 120 Hz because it's trash at 120 Hz strobing, so I can't say how that feels myself.
By the way, I can't use a CRT at lower than 100 Hz because of the eye strain and headaches, and my friend has a very similar tolerance to flicker. You can compare that with your own tolerance level and draw some conclusions :)
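Side note on those brightness numbers above: with the refresh rate fixed, the strobed brightness should scale roughly with the strobe pulse width (the duty cycle), and the quoted figures line up with that. A quick sanity check using only the numbers from this post:

```python
# Quick sanity check of the strobe-brightness numbers above: at a fixed refresh
# rate, average brightness should scale ~linearly with the strobe pulse width
# (duty cycle), so the nits ratio should roughly track the pulse-width ratio.

def duty_cycle(pulse_ms, refresh_hz):
    return pulse_ms / 1000 * refresh_hz

ultra   = {"pulse_ms": 0.85, "nits": 70}    # XG270 "ultra" strobe (numbers from the post)
extreme = {"pulse_ms": 1.68, "nits": 145}   # one tick lower

print(f"pulse ratio: {extreme['pulse_ms'] / ultra['pulse_ms']:.2f}")  # ~1.98
print(f"nits ratio:  {extreme['nits'] / ultra['nits']:.2f}")          # ~2.07
print(f"duty at 120 Hz: {duty_cycle(0.85, 120):.1%} vs {duty_cycle(1.68, 120):.1%}")
```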


Oh, and I forgot something. Here's the LG CX at 120 Hz in its max strobing mode.
To be honest I'm confused by that result. It looks just like in the photo in real life, which confuses me even more. OLED pixel response is on average lower than 1 ms, so there's no crosstalk at all. Each pixel being its own light source also helps with that at the top and the bottom of the screen.
But why is the picture so blurry? And it's not a focus problem with the photo; it really looks similar in real life, as I said.
 
I got a problem with my FW900. I use a Delock 62902 to go from DP to VGA. I have my FW900 running at 1920x1200@96Hz. It works great most of the time but occasionally the picture rolls vertically from bottom to top (like if you would screw around with the vhold on an old TV). It only does that once very quickly and it will be back in sync again. It does it maybe every 5 to 10 minutes or so.

As far as I can tell it doesn't happen at lower resolutions. Does anybody have any suggestion or idea what I can do to get rid of the problem?

Thanks!
 
Well, either it's an issue with the adapter (and you need to plug the monitor into a graphics card with a native analog output, or try another adapter, preferably another model, to rule that out), or it's an issue with the electronics of the monitor itself and it needs repairs (where the problem would be isn't exactly obvious given the symptoms you describe).
 
I got a problem with my FW900. I use a Delock 62902 to go from DP to VGA. I have my FW900 running at 1920x1200@96Hz. It works great most of the time but occasionally the picture rolls vertically from bottom to top (like if you would screw around with the vhold on an old TV). It only does that once very quickly and it will be back in sync again. It does it maybe every 5 to 10 minutes or so.

As far as I can tell it doesn't happen at lower resolutions. Does anybody have any suggestion or idea what I can do to get rid of the problem?

Thanks!

Try lowering the refresh rate to 95 Hz.
 
Recently did a G2 adjustment on my FW900; luckily it only needed to come down from 151 to 149. I assume the lower the G2, potentially the more hours the monitor has on it? How often should this be done or checked? If I wait a year, should I see that number rise again on its own? After I did this I now have a few vertical wavy lines near the right-hand side of the screen - not as visible once warmed up, but they weren't there before the adjustment. I've tried different VGA cables with no difference. Made sure the cables were run away from power and other cables; no difference. It's not terrible, just annoying as they weren't there before. Also tried degaussing. Any ideas, or does anyone else have the same issue?

Thanks in advance for any help.

P. S. Can't really get a good enough picture to show it.
 
Recently did a G2 adjustment on my FW900; luckily it only needed to come down from 151 to 149. I assume the lower the G2, potentially the more hours the monitor has on it? How often should this be done or checked? If I wait a year, should I see that number rise again on its own? After I did this I now have a few vertical wavy lines near the right-hand side of the screen - not as visible once warmed up, but they weren't there before the adjustment. I've tried different VGA cables with no difference. Made sure the cables were run away from power and other cables; no difference. It's not terrible, just annoying as they weren't there before. Also tried degaussing. Any ideas, or does anyone else have the same issue?

Thanks in advance for any help.

P. S. Can't really get a good enough picture to show it.

Would it be possible for you to post a clear video of the issue? Thanks so much in advance! Unkle Vito!
 
Would it be possible for you to post a clear video of the issue? Thanks so much in advance! Unkle Vito!
I can't really get a clear video, unfortunately. I've noticed it's only there with an input active. No input, no problem lol. And when the monitor is fully warmed up it's much more difficult to notice. Maybe it's something I'll have to live with until a recap is required. As for now, whites are white, blacks are black, and everything in between is gorgeously rich and deep. I don't think this monitor had much use before I got it a year ago. I don't think a day has gone by that I haven't put some hours on it.
 
In my opinion, based on the age of the monitors at this point, G2 levels shouldn't be a measuring stick for anything other than if the monitor's having issues. Adjust the G2 so that the image is correct per Sony's WinDAS WPB procedure and go from there.
 
Is there any way to get 10-bit deep color out of the FW900?
Hardest part of this is getting the whole pipeline to be 10 bit - this includes software and hardware. You could probably get it working with a quadro (or maybe an AMD card).
 
Hardest part of this is getting the whole pipeline to be 10 bit - this includes software and hardware. You could probably get it working with a quadro (or maybe an AMD card).

It's up to the game to support 10/12-bit color (software), but what are the hardware limitations? The video card should support it, but whatever adapter you are using (Startech DP2VGAHD20 or HDFury X3) to convert digital to analog would probably be the issue. Correct?
 
Yes, if you are using an external DAC (an adaptor), then it would have to support 10 bit, and that may be challenging. Alternative is to use a video card with an onboard DAC, and with video drivers that support 10 bit. Apparently, the latest geforce drivers unlock 10 bit even if you're not using quadro, but not sure if this has been tested.
 
Yes, if you are using an external DAC (an adaptor), then it would have to support 10 bit, and that may be challenging. Alternative is to use a video card with an onboard DAC, and with video drivers that support 10 bit. Apparently, the latest geforce drivers unlock 10 bit even if you're not using quadro, but not sure if this has been tested.

Supposedly the HDFury X3/X4 supports 10/12 bit. I own the HDFury X3 and the Startech DP2VGAHD20. With the HDFury, 10/12 bit are selectable in the Nvidia control panel, but doing a color banding test I still see banding. With the Startech DP2VGAHD20, only the 8-bit option is available in the Nvidia control panel. Does anyone know if the Startech DP2VGAHD20 supports 10 bit, or is it only 8 bit? Or is there any confirmed analog adapter for the FW900 that supports 10/12 bit?

*I had my duplicate account deleted (Madrok) because I couldn't log in to this account (Foe-hammer). So Madrok/Foe-hammer are the same (me).
 
My friend got the Vention AFVHB, so here's a report on it.
It makes the picture blurry above 247 MHz.
It does interlaced resolutions fine, like 1440x1080 at 160 Hz or 2048x1536 at 107 Hz (244 MHz, just before that mosaic effect).
Only 247 MHz is very strange, considering that even the shittiest chipset can do more.

Funny thing I've noticed with the Startech DP2VGAHD20: if I disable the monitor using the Win-P combo, the adapter / AMD driver freaks out and the display starts connecting and disconnecting in Device Manager at about a 2-second interval, causing mouse stutters.
Derupter Flybye have you ever noticed an issue like that? I'm trying to find out if it's AMD specific or if it's just the adapter.
Does the adapter see the monitor's EDID correctly?
What happens next, does it keep doing that until you restart?
I don't have that adapter, but with my Delock 62967 I have never seen anything like that, and I use Win-P to disable the FW900 very often.

I got a problem with my FW900. I use a Delock 62902 to go from DP to VGA. I have my FW900 running at 1920x1200@96Hz. It works great most of the time but occasionally the picture rolls vertically from bottom to top (like if you would screw around with the vhold on an old TV). It only does that once very quickly and it will be back in sync again. It does it maybe every 5 to 10 minutes or so.

As far as I can tell it doesn't happen at lower resolutions. Does anybody have any suggestion or idea what I can do to get rid of the problem?

Thanks!
I had this problem a while ago with my adapter, a single quick loss of vertical sync once or twice every gaming session. I solved the problem by cleaning the DisplayPort connector of the adapter and the port on the graphics card with some isopropyl alcohol.
Maybe try cleaning the VGA port on the adapter as well.
 
Supposedly the HDFury X3/X4 supports 10/12 bit. I own the HDFury X3 and the Startech DP2VGAHD20. With the HDFury, 10/12 bit are selectable in the Nvidia control panel, but doing a color banding test I still see banding. With the Startech DP2VGAHD20, only the 8-bit option is available in the Nvidia control panel. Does anyone know if the Startech DP2VGAHD20 supports 10 bit, or is it only 8 bit? Or is there any confirmed analog adapter for the FW900 that supports 10/12 bit?

*I had my duplicate account deleted (Madrok) because I couldn't log in to this account (Foe-hammer). So Madrok/Foe-hammer are the same (me).
What software are you using to render the 10-bit image?
 
A video game with a 3080 GPU. But you are correct in asking, because there might be some hangup with the game or GPU sending a 10-bit image.
I think a more reliable test would be to use Adobe Photoshop with 10-bit color enabled:

https://www.eizoglobal.com/support/compatibility/gpu/photoshopcc2017_18-nvidia-amd/

I'm guessing that if you have 10-bit color enabled in Photoshop (termed 30-bit color in their software, I think), you'll have 1024 individual steps per color channel to play around with.
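Another quick check that doesn't need Photoshop is counting how many distinct levels survive in a full-width grayscale ramp at each bit depth; fewer levels means wider, more visible bands. This is just a rough numpy sketch of the idea, not a calibrated test pattern:

```python
import numpy as np

# Count how many distinct levels a full-width grayscale ramp keeps after being
# quantized to a given bit depth. A true 10-bit pipeline gives a 1920-px ramp
# four times as many unique steps (so much narrower bands) than an 8-bit one.

def unique_levels(width_px: int, bits: int) -> int:
    ramp = np.linspace(0.0, 1.0, width_px)        # ideal 0..1 gradient
    quantized = np.round(ramp * (2**bits - 1))    # what the pipeline can represent
    return len(np.unique(quantized))

for bits in (8, 10):
    levels = unique_levels(1920, bits)
    print(f"{bits}-bit: {levels} steps, band width ~{1920 / levels:.1f} px")
```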
 
Tested the Nvidia dithering tweak a few days ago and so far it seems to be working correctly.
https://hub.displaycal.net/forums/topic/how-to-enable-dithering-on-nvidia-geforce-with-windows-os/
I tested it a year or two ago and it worked, but dithering was lost e.g. after hibernation.
With this there is little need for a 10-bit adapter or monitor.

Nvidia should enable dithering by default because this affects multiple users. Even when it works, configuring it is pretty bothersome...
 
Tested the Nvidia dithering tweak a few days ago and so far it seems to be working correctly.
https://hub.displaycal.net/forums/topic/how-to-enable-dithering-on-nvidia-geforce-with-windows-os/
I tested it a year or two ago and it worked, but dithering was lost e.g. after hibernation.
With this there is little need for a 10-bit adapter or monitor.

Nvidia should enable dithering by default because this affects multiple users. Even when it works, configuring it is pretty bothersome...

This is great information! I hadn't even considered dithering or known anything about it. A few questions...

1) Is the Nvidia dithering tweak different from the Windows registry dithering tweak?

2) If you do the Nvidia or Windows registry dithering tweak, does it still need to be reset (enabled, disabled, and rebooted) when the PC hibernates, etc.?

3) Do you know if Nvidia's new 3000 series GPUs support dithering by default? Do AMD's GPUs support dithering by default?

4) Is there a noticeable difference between dithered 8-bit and true 10/12-bit deep color? I would imagine that 10/12-bit color would still look better.
 
I recently purchased a DP2VGAHD20 in preparation for getting a new GPU next year, and while it works perfectly for the higher pixel clock resolutions I use (1680x945@120hz, 1920x1080@100hz, 1920x1200@85hz, etc.), it doesn't appear to support the low pixel clock resolutions I use for retro games at all (256x224@120hz, 320x200@140hz, 320x240@120hz, etc.).

Is there any workaround to make these resolutions work using the DP2VGAHD20? Or an alternative DP to VGA converter without such a limitation?
 
I recently purchased a DP2VGAHD20 in preparation for getting a new GPU next year, and while it works perfectly for the higher pixel clock resolutions I use (1680x945@120hz, 1920x1080@100hz, 1920x1200@85hz, etc.), it doesn't appear to support the low pixel clock resolutions I use for retro games at all (256x224@120hz, 320x200@140hz, 320x240@120hz, etc.).

Is there any workaround to make these resolutions work using the DP2VGAHD20? Or an alternative DP to VGA converter without such a limitation?

Are those resolution limitations coming from the adapter or from the monitor?
 
Are those resolution limitations coming from the adapter or from the monitor?
The monitor is able to run those resolutions when hooked up to the DVI-I/VGA port on the same video card, so the limitation is either the adapter or the DisplayPort output on my current video card.

(I'm given to understand that it isn't a hard limitation of DisplayPort in general, as I have read about people using DP to VGA converters to drive 15 kHz displays at 320x200@70hz and other similar resolutions.)
 
the low pixel clock resolutions i use for retro games at all (256x224@120hz, 320x200@140hz, 320x240@120hz, etc.).

You shouldn't be using those resolutions anyway. They're double the refresh rate, meaning you get a double-scan effect on 60 fps games. Go to TestUFO.com at 120 Hz and then at 60 Hz and notice how much clearer the 60 fps UFO is at 60 Hz.
 