24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Here is what I have compiled so far to get an idea:

The Digital to VGA Adapters Spreadsheet

Still very incomplete. I will keep going through this thread to catch the different adapters and their specs, and I'll keep looking for more data from adapter manufacturers and chipset manufacturers.

Please let me know if you guys feel this is too redundant, and I won't bother with it anymore.
 
You have three limits:
- horizontal refresh rate - monitor limits it, DAC does not
- vertical refresh rate - monitor limits it, DAC does not
- pixel clock - DAC limits it, monitor can do "infinite" pixel frequency

Older GPUs, like the last generation of GeForce/Radeon cards with analogue outputs, had a pixel clock limit of 400MHz.
I guess there are DACs that do at least 400MHz or even more. My Delock 62967 does about 340MHz.
But you probably do not need a very high pixel clock for >200Hz modes, because they are usually limited by the horizontal refresh rate long before they need a large pixel clock.
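A minimal sketch of how those three limits interact, in Python, assuming illustrative limits for an FW900-class monitor (roughly 121 kHz horizontal, 160 Hz vertical) and a 340 MHz DAC; the blanking figures in the example are rough guesses, not a real modeline:

```python
# A toy "will this mode work?" check against the three limits above.
# The monitor/DAC limits and the blanking sizes are illustrative assumptions.
def check_mode(h_active, v_active, v_refresh, h_blank, v_blank,
               max_h_khz=121.0, max_v_hz=160.0, max_pclk_mhz=340.0):
    h_total = h_active + h_blank                     # pixels per line incl. blanking
    v_total = v_active + v_blank                     # lines per frame incl. blanking
    h_freq_khz = v_total * v_refresh / 1000.0        # horizontal scan rate (monitor limit)
    pclk_mhz = h_total * v_total * v_refresh / 1e6   # pixel clock (DAC limit)
    return {
        "h_freq_kHz": round(h_freq_khz, 2),
        "pixel_clock_MHz": round(pclk_mhz, 2),
        "h_ok": h_freq_khz <= max_h_khz,
        "v_ok": v_refresh <= max_v_hz,
        "pclk_ok": pclk_mhz <= max_pclk_mhz,
    }

# Example: 1920x1200 @ 96 Hz with CVT-ish blanking
print(check_mode(1920, 1200, 96, h_blank=640, v_blank=49))
```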
Awesome, thanks very much. I know that the 454 has different specs than the 514 (132kHz vs 142kHz), so I'm hoping it can also go above 200Hz at lower resolutions like the 514 has been confirmed to. I have a Delock 62967 as well, so I will post an update once I get my monitor.
 
Awesome, thanks very much. I know that the 454 has different specs than the 514 (132kHz vs 142kHz), so I'm hoping it can also go above 200Hz at lower resolutions like the 514 has been confirmed to. I have a Delock 62967 as well, so I will post an update once I get my monitor.
210Hz should be doable at 800x600 if you drop some vertical blanking lines. Usually you can drop a few, though not too many or you get geometry issues at the top/bottom of the screen. Similarly, when you reduce horizontal blanking you can save some pixel clock, but you cannot reduce it too much or you get geometry issues, this time a kind of horizontal warping on the left/right side of the image.
The pixel clock is so small (less than 160MHz) at this screen mode that even the cheapest HDMI to VGA converter should be sufficient for it.
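A rough back-of-the-envelope check of that claim; the trimmed blanking values are illustrative guesses (not a tested modeline), assuming a monitor with the ~132 kHz horizontal limit mentioned above:

```python
# 800x600 @ 210 Hz with blanking trimmed well below CVT/GTF defaults.
# h_blank and v_blank are illustrative assumptions, not a verified modeline.
h_active, v_active, v_refresh = 800, 600, 210
h_blank, v_blank = 224, 28

h_total = h_active + h_blank                      # 1024
v_total = v_active + v_blank                      # 628
h_freq_khz = v_total * v_refresh / 1000           # ~131.9 kHz, just under a 132 kHz monitor
pclk_mhz = h_total * v_total * v_refresh / 1e6    # ~135 MHz, well under 160 MHz

print(f"h_freq = {h_freq_khz:.1f} kHz, pixel clock = {pclk_mhz:.1f} MHz")
```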
 
So I got the Icy Box SPL1031 today, but I didn't know it only takes input from a Mini DisplayPort source (only a miniDP-to-miniDP cable in the box), so now I need to get an adapter or cable to use it with my 1080 Ti, which has DP and HDMI outputs?

edit: ordered minidp to dp cable, should get it tomorrow.
 
Please let us know how it goes when it arrives.
Unfortunately those Vention converters are no longer in production and they will never be again. At least that's what a Vention rep told me.
AliExpress cancelled my order.
To be honest, the best way to use a CRT is with the GTX 980Ti. One thing I don't see mentioned around here is the image quality of these converters. On all my monitors (Dell P1130, GDM-F520, and FW-900 to a lesser degree) the Sunix and Delock produce a blurrier output compared to my GTX 980.
 
Unfortunately those Vention converters are no longer in production and they will never be again. At least that's what a Vention rep told me.
AliExpress cancelled my order.
To be honest, the best way to use a CRT is with the GTX 980Ti. One thing I don't see mentioned around here is the image quality of these converters. On all my monitors (Dell P1130, GDM-F520, and FW-900 to a lesser degree) the Sunix and Delock produce a blurrier output compared to my GTX 980.
I can't get stuck at the speed of a 980Ti forever, though. I started playing with my 2070 Super yesterday, and I'm already seeing increases of 30-50% over the 980Ti.

My Delock 87685 looks fine to me...before I push it to its limit. It's rated on the VGA port for 1920 x 1200 @ 60Hz. It looks real sharp and great up to 96Hz. Things get a little funky at 2304 x 1440 but still tolerable. At 2304 x 1440, 80Hz gives an odd mirroring issue at times. I have been the most stable at 2304 x 1440 @ 79Hz, but I get obvious flicker. Obvious to me, anyway, but I don't even notice that flicker in a game. It is one of those things like when you are refinishing and painting a ceiling, and you notice inconsistencies. You try to be anal about it and then realize no one is going to stand a foot away from the ceiling or look straight up while walking around the house. I will admit I am unhappy with the 2304 x 1440 performance in Windows and the flicker that I notice, but I only game with my FW900. Things are moving around so much, and with darker colors in games, that I don't even notice that I am at 79Hz.
 
Unfortunately those Vention converters are no longer in production and they will never be again. At least that's what a Vention rep told me.

Man that suuucckkksss.

And the 980Ti isn't really gonna cut it next year when the new Xbox and PlayStation have much more powerful GPUs.

have been the most stable at 2304 x 1440 @ 79Hz, but I get obvious flicker.

Are you saying you don't notice flicker at 80Hz but do at 79Hz? That's definitely in your head man. That's like a 1.5% difference in scan time.
 
...Are you saying you don't notice flicker at 80Hz but do at 79Hz? That's definitely in your head man. That's like a 1.5% difference in scan time.
I do notice some at 80, but it's less. A minor difference, but still less. But keep in mind flicker perception can be very subjective. Kinda like how some people can't tell the difference between skim milk and regular milk, and some people can't tell the difference between DVD and Blu-ray quality. Sometime in the 1990s I discovered the monitor refresh rate setting in Windows, and ever since then I've been able to tell the difference between refresh rates. I've been championing rates higher than 60Hz since the 90s :D.
 
To be honest, the best way to use a CRT is with the GTX 980Ti. One thing I don't see mentioned around here is the image quality of these converters. On all my monitors (Dell P1130, GDM-F520, and FW-900 to a lesser degree) the Sunix and Delock produce a blurrier output compared to my GTX 980.
Well, honestly I think it's an issue on your side, whatever it could be. I did test the Sunix adapter carefully and I didn't notice the display being blurrier. Nothing significant with the Delock 62967 regarding sharpness either. And I'm comparing this against the analog output of an AMD card, and those were well known for generally being of better quality than the analog outputs on Nvidia cards. ;)
 
So using the Delock 87685, I'd say I see zero difference in the Delock's performance jumping from a 980 Ti to a 2070 Super. 2304 x 1440 @ 80Hz is still problematic with that odd mirroring effect. I am sure this is just a limitation of the chipset the 87685 uses. I am open to trying a USB-C adapter or two that anyone suggests since they are pretty cheap. I am very curious now how an adapter using a different chipset behaves.
 
One thing I don't see mentioned around here is the image quality of these converters. On all my monitors (Dell P1130, GDM-F520, and FW-900 to a lesser degree) the Sunix and Delock produce a blurrier output compared to my GTX 980.

i indeed mentioned my related experience as a sunix dpu3000 user here. as i wrote there, i personally don't see an image quality difference, including sharpness, between my sunix and the native analog output of the video cards i tested (gtx 760 and integrated intel hd 4600).

there are other adapters mentioned that also seem to work straight away with no issues and high enough pixel clocks, take a look at what Derupter wrote on the previous page.



2304 x 1440 @ 80Hz is still problematic with that odd mirroring effect

you said you were able to use that resolution and refresh for 3 hours with 0 issues with some method you used previously in your early tests. are you sure something wasn't changed that could be the reason you are seeing that issue again? for example, your delock not detecting your monitor properly even if you installed its drivers, and so ignoring the 2304 x 1440 @ 80Hz you created in the nv panel and using the default (which is listed in the monitor osd).

i would like to ask: when you use 2304 x 1440 @ 80Hz and start to have issues, check by pushing the osd button to see if that resolution is listed, or if it only lists its horizontal and vertical refreshes. also check in the nvidia control panel, resolution section, how your adapter is being detected when that happens, whether as "sony digital display" or "synaptics inc". i ask because in my case, if my monitor is not detected properly, it tends to ignore the custom resolutions and use the default, problematic ones.
 
I can't get stuck at the speed of a 980Ti forever, though. I started playing with my 2070 Super yesterday, and I'm already seeing increases of 30-50% over the 980Ti.

My Delock 87685 looks fine to me...before I push it to its limit. It's rated on the VGA port for 1920 x 1200 @ 60Hz. It looks real sharp and great up to 96Hz. Things get a little funky at 2304 x 1440 but still tolerable. At 2304 x 1440, 80Hz gives an odd mirroring issue at times. I have been the most stable at 2304 x 1440 @ 79Hz, but I get obvious flicker. Obvious to me, anyway, but I don't even notice that flicker in a game. It is one of those things like when you are refinishing and painting a ceiling, and you notice inconsistencies. You try to be anal about it and then realize no one is going to stand a foot away from the ceiling or look straight up while walking around the house. I will admit I am unhappy with the 2304 x 1440 performance in Windows and the flicker that I notice, but I only game with my FW900. Things are moving around so much, and with darker colors in games, that I don't even notice that I am at 79Hz.
I've been getting the same results with the Icy Box: 2304x1440 at 79Hz is stable, while 80Hz has a random mirroring effect.
Found 1600x1000 at 96Hz to provide very good sharpness and performance, but I will try more resolutions. So far very impressed with the display in a competitive online shooter like Rainbow Six Siege. Every detail in the room remains crystal clear as you pan across rooms looking for bogeys.
 
you said you were able to use that resolution and refresh for 3 hours with 0 issues with some method you used previously in your early tests. are you sure something wasn't changed that could be the reason you are seeing that issue again? for example, your delock not detecting your monitor properly even if you installed its drivers, and so ignoring the 2304 x 1440 @ 80Hz you created in the nv panel and using the default (which is listed in the monitor osd).

i would like to ask: when you use 2304 x 1440 @ 80Hz and start to have issues, check by pushing the osd button to see if that resolution is listed, or if it only lists its horizontal and vertical refreshes. also check in the nvidia control panel, resolution section, how your adapter is being detected when that happens, whether as "sony digital display" or "synaptics inc". i ask because in my case, if my monitor is not detected properly, it tends to ignore the custom resolutions and use the default, problematic ones.
The problem with the Delock at 2304 x 1440 @ 80Hz is that it is EXTREMELY random. Yes, I’ve tried with both Sony GDM-FW900 and Synaptics displayed.

An example of how random this problem is:
1) I select 2304 x 1440 @ 80Hz in windows. Screen looks perfectly fine. Maybe 5-10 minutes later the mirroring begins. It pulsates for a few minutes between the mirroring and normal as if the Delock is struggling to maintain normality, and then the mirroring stays.

With the mirroring in Windows, I start a game (also at 2304 x 1440, with only 120.8 kHz / 80Hz showing in the OSD in Windows and the game) and the mirroring is gone. And in the game sometimes the mirroring never comes back. If it does, I alt-tab to Windows, the mirroring is gone, and I alt-tab back into the game with the mirroring still gone. Sometimes the mirroring comes back. Sometimes it doesn’t.

2) If I am at 2304 x 1440 @ 80Hz and the mirroring begins, I can flip to any resolution with a different refresh rate, the mirroring is gone, flip back to 2304 x 1440 @ 80Hz, mirroring is still gone, but it may come back after x time.

Even though windows and the game are both at 2304 x 1440, and OSD shows 80Hz, something causes the “click” in the FW as if the refresh rate is changing. Anytime I make a change that changes the refresh rate, the mirroring goes away. It either goes away only for a little while or for a long time. Any change in refresh rates resets the Delock in some way that causes the mirroring to go away for either a short time or a long time.

I have not had 2304 x 1440 display in the OSD for probably over a decade. I’ve tried a few different monitor drivers and still have never had it display in the OSD.
 
Well, I've got two strikes trying to find an Nvidia card to use as my dedicated "interlaced" GPU to run alongside my Radeon 5700 XT. Because I know next year there are going to be some games I can't run at 60fps, and I'll want to run at a lower locked frame rate with a 2x or 3x refresh rate.

I tried an 8500 GT first, and now I'm trying a GT 730 (Fermi). Neither can display interlaced resolutions over 2048x1536. They both give blank or artifacted output, but at least the 730 realizes it and will switch back to a working resolution. And the 730 is working well as a PhysX GPU haha.

But does anybody have experience with an Nvidia card that can go beyond 1536i? Or if you have a Kepler/Maxwell card, could you give it a shot?
 
Unfortunately those Vention converters are no longer in production and they will never be again. At least that's what a Vention rep told me.
AliExpress cancelled my order.
To be honest, the best way to use a CRT is with the GTX 980Ti. One thing I don't see mentioned around here is the image quality of these converters. On all my monitors (Dell P1130, GDM-F520, and FW-900 to a lesser degree) the Sunix and Delock produce a blurrier output compared to my GTX 980.

So I was right: Vention replaced the old adapters with those new models, which have lower performance.
As for the old models, they produced thousands of them; I can't believe they've all been sold.
Searching for LT8711X-B you can find the CGMHA or an identical adapter under a different name. For example, here there are 997 pieces available; if that's accurate, try asking the seller if there is a way to ship it to your area.
Unfortunately, most of the shops with availability are mainly in China, India, the Philippines and other places that do not ship worldwide.

I have not had 2304 x 1440 display in the OSD for probably over a decade. I’ve tried a few different monitor drivers and still have never had it display in the OSD.

To have that info in the OSD it should be enough to use the original timings which are inside the EDID.
The 2304x1440 timing from the Sony EDID is custom, different from CVT and a little different from GTF; you can recognize it by the sync polarities (they are both negative) and the vertical frequency of 79.999 Hz.
If you want to create it from scratch using CRU:
-put 2304x1440 79.999 Hz with old standard timings
-set timing to manual, put vertical sync polarity to negative, set the front porch to 176 and the back porch to 448
The horizontal frequency should be 120.559 kHz
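A quick sanity check of that timing in Python. The horizontal sync width and the vertical total below are assumptions (they are not given in the post), picked so the derived numbers land near the figures quoted in this thread:

```python
# Rebuild the Sony-style 2304x1440 timing described above and derive its rates.
# Values marked ASSUMED are guesses, not taken from the post.
h_active, h_front, h_sync, h_back = 2304, 176, 240, 448   # h_sync width ASSUMED
v_active, v_blank = 1440, 67                               # v_blank ASSUMED
v_refresh = 79.999                                         # from the post

h_total = h_active + h_front + h_sync + h_back   # 3168
v_total = v_active + v_blank                      # 1507

h_freq_khz = v_total * v_refresh / 1000           # ~120.56 kHz (post says 120.559)
pclk_mhz = h_total * v_total * v_refresh / 1e6    # ~382 MHz, close to what this thread reports for the mode

print(f"h_total={h_total}, v_total={v_total}")
print(f"horizontal frequency ≈ {h_freq_khz:.3f} kHz, pixel clock ≈ {pclk_mhz:.1f} MHz")
```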


About adapters: in one of my previous posts I talked about the ITE IT6564 chipset as one of the alternatives to Synaptics.
The last time I checked I found nothing, but now there is an adapter available, the Startech DisplayPort to HDMI 2.0 and VGA - model code DP2VGAHD20 here
One of the good things about the ITE IT6564 is that it is native DisplayPort, so it can be used by everyone.
-it is not limited to 360 MHz and can go up to 720 MHz
-it is available and the price is not bad
-it can be a good alternative to the DPU3000 and other Synaptics-based adapters, or it can be a shit adapter with a horrible DAC; only a good test will tell.

So if someone wants to try it...
 
It looks like they used a cheapo cable/connector without a lock, like Delock does, but otherwise a very interesting find! :)
 
....About adapters: in one of my previous posts I talked about the ITE IT6564 chipset as one of the alternatives to Synaptics.
The last time I checked I found nothing, but now there is an adapter available, the Startech DisplayPort to HDMI 2.0 and VGA - model code DP2VGAHD20 here
One of the good things about the ITE IT6564 is that it is native DisplayPort, so it can be used by everyone.
-it is not limited to 360 MHz and can go up to 720 MHz
-it is available and the price is not bad
-it can be a good alternative to the DPU3000 and other Synaptics-based adapters, or it can be a shit adapter with a horrible DAC; only a good test will tell.

So if someone wants to try it...
I will try your resolution suggestion later on today. TY.

Now about this Startech...I am curious!
I found the chipset specs: http://www.ite.com.tw/zh-tw/product/view?mid=137

"...can support VESA resolution up to WUXGA ( 1920x1200 @120Hz ) or WQXGA ( 2560X1600 RB @120Hz)..." :woot: That is the highest I have seen! Do the adapter manufacturers modify the chip capabilities in anyway? If they typically just buy the chips and package them into their product, then I am going to order it to test.
 
That's what they do: stick the chip on a piece of PCB with circuitry usually close to what is already advised by the chip manufacturer, plus some programming.

I noticed a funny thing: the older IT6562 is reported to have a DAC supporting 10-bit deep color, while the upgraded version IT6564 has only 8-bit mentioned on the product page. There might be something to investigate there. :geek:
 
It looks like they used a cheapo cable/connector without a lock, like Delock does, but otherwise a very interesting find! :)

Yeah, I have a Startech adapter and they always use that shitty tiny cable, but if it can sustain the bandwidth, who cares.

I will try your resolution suggestion later on today. TY.

Now about this Startech...I am curious!
I found the chipset specs: http://www.ite.com.tw/zh-tw/product/view?mid=137

"...can support VESA resolution up to WUXGA ( 1920x1200 @120Hz ) or WQXGA ( 2560X1600 RB @120Hz)..." :woot: That is the highest I have seen! Do the adapter manufacturers modify the chip capabilities in anyway? If they typically just buy the chips and package them into their product, then I am going to order it to test.

The important thing is the input bandwidth: with four DisplayPort HBR2 lanes it can do up to 720 MHz. The Startech adapter has its HDMI 2.0 output certified for 4K 60 Hz, so we are sure that all the input lanes are connected. The DAC is inside the chip together with the HDMI transmitter, and nothing limits it.

That's what they do: stick the chip on a piece of PCB with circuitry usually close to what is already advised by the chip manufacturer, plus some programming.

I noticed a funny thing: the older IT6562 is reported to have a DAC supporting 10-bit deep color, while the upgraded version IT6564 has only 8-bit mentioned on the product page. There might be something to investigate there. :geek:

Yeah, I had already asked ITE about this long ago, along with other things about high-end CRTs, etc.
They confirmed it, but I don't think they have a different DAC; they usually use the same old circuit in all of their chipsets.
 
What effects could it have being 8-bit vs 10-bit?

Our 32-bit color mode in windows is 16.7 million colors. But I read that 8-bit video is 16.7 million colors so I'm not sure how the two correlate? 🤷‍♂️
 
What effects could it have being 8-bit vs 10-bit?

Our 32-bit color mode in windows is 16.7 million colors. But I read that 8-bit video is 16.7 million colors so I'm not sure how the two correlate? 🤷‍♂️

10 bpc is necessary if you want to do a monitor calibration using a colorimeter; here is a guide by spacediver.
Once you have the custom LUT you can load it even if the output is 8 bpc, but if the video card doesn't support dithering you will have banding in colors.
At least that's what I understood; the expert here is spacediver.
 
What effects could it have being 8-bit vs 10-bit?

Our 32-bit color mode in windows is 16.7 million colors. But I read that 8-bit video is 16.7 million colors so I'm not sure how the two correlate? 🤷‍♂️
8-bit video means 8 bits per component (bpc). There are 3 components, either RGB or YCbCr.
6 bpc: 2^(6*3) = 2^18 = 262144 = 262 thousand (I've seen this on Nvidia in Windows, not sure about AMD, not in macOS).
8 bpc: 2^(8*3) = 2^24 = 16777216 = 16.7 million (the other 8 bits of the 32 bits may just be padding in the frame buffer - they are not transmitted)
10 bpc: 2^(10*3) = 2^30 = 1073741824 = 1.07 billion

Extra bits give a better image (they reduce banding in gradients when not using dithering) but reduce the max pixel clock.
Chroma sub sampling (used with YCbCr color) reduces components (usually just Cb and Cr) per group of pixels which effectively reduces the average bits per pixel (bpp); (4:4:4 means no chroma sub sampling, = 3 components per pixel, just like RGB)
4:4:4 = 24 components per 8 pixels; using HBR2*4, max pixel clock = 720 MHz 8bpc (24 bpp), 576 MHz 10 bpc (30 bpp).
4:2:2 = 16 components per 8 pixels; using HBR2*4, max pixel clock = 1080 MHz 8bpc (16 bpp), 864 MHz 10 bpc (20 bpp).
4:2:0 = 12 components per 8 pixels; using HBR2*4, max pixel clock = 1152 MHz 10bpc (15 bpp), 1440 MHz 8bpc (12 bpp).

I think the three numbers mean the following:
4: = horizontal width in pixels for the group of pixels
:4: = number of pixels in the first line out of the 4 that have chroma (4 means all pixels in the line have chroma)
:2: = number of pixels in the first line out of the 4 that have chroma (2 means there is chroma for every 2 pixels horizontally)
:4 = in the second line of 4 pixels, this is the number of new chroma (4 means all pixels in the line have chroma)
:2 = in the second line of 4 pixels, this is the number of new chroma (2 means there is chroma for every 2 pixels horizontally again - like in the first line). For 4:2:2, a chroma sub sampling test image will show blurriness in the horizontal direction only (rotate the test image 90° and the blurriness changes).
:0 = in the second line of 4 pixels, there is no additional chroma which means the chroma in the first line applies to the chroma in the second line. In the case of 4:2:0, one set of chroma represents 2x2 pixels, so there is chroma every 2 pixels horizontal AND vertically. A chroma sub sampling test image will show blurriness in both the horizontal and vertical direction (rotating the test image 90° will not show a difference).

Other chroma sub sampling modes are not seen in DisplayPort or HDMI: 4:4:0 (16 cp8p), 4:1:1 (12 cp8p), 4:1:0 (10 cp8p)

Display Stream Compression (DSC) uses a visually lossless compression algorithm (reduces bandwidth more than chroma sub sampling but gives much better quality because it chooses where to remove information and what kind of information to remove depending on the image).
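A small sketch of where those max pixel clock figures come from, assuming DisplayPort HBR2 at 5.4 Gbit/s per lane on four lanes with 8b/10b encoding; the bits-per-pixel values follow the list above:

```python
# Max pixel clock over a DisplayPort link for a given pixel format.
# HBR2 is 5.4 Gbit/s per lane; 8b/10b encoding leaves 80% as payload.
def max_pixel_clock_mhz(bits_per_pixel, lanes=4, lane_gbps=5.4):
    payload_mbps = lanes * lane_gbps * 1000 * 8 / 10   # usable Mbit/s after 8b/10b
    return payload_mbps / bits_per_pixel                # Mbit/s divided by bpp = Mpixel/s

formats = {
    "RGB or 4:4:4, 8 bpc (24 bpp)": 24,
    "RGB or 4:4:4, 10 bpc (30 bpp)": 30,
    "4:2:2, 8 bpc (16 bpp)": 16,
    "4:2:2, 10 bpc (20 bpp)": 20,
    "4:2:0, 8 bpc (12 bpp)": 12,
    "4:2:0, 10 bpc (15 bpp)": 15,
}
for name, bpp in formats.items():
    print(f"{name}: {max_pixel_clock_mhz(bpp):.0f} MHz")   # 720, 576, 1080, 864, 1440, 1152
```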
 
What effects could it have being 8-bit vs 10-bit?

a 10 bit DAC will potentially allow you to gain a fourfold increase in the precision of your LUT adjustments. Instead of being limited to the same set of 256 shades of gray to determine the values for each of the 256 steps per channel, you can now choose from a palette of 1024 shades of gray when determining each of the 256 values. This means you can adjust gamma (up to a certain point) without sacrificing the total number (256) of unique shades of gray.

btw for a really good discussion on video bit depth in general, see the discussion I had with zone74 here.

Our 32-bit color mode in windows is 16.7 million colors. But I read that 8-bit video is 16.7 million colors so I'm not sure how the two correlate? 🤷‍♂️

It's a confusing naming situation. 8 bit generally means the number of bits per pixel per color, whereas 32 bit generally means the number of bits per pixel (across all colors).

if you have 8 bits per pixel per color (typically referred to as 8 bits per color), it means you have 2^8 = 256 values of Red, 256 of Green, and 256 of Blue. So if you have 3 color channels (e.g. not a monochrome video system), then you have 8+8+8 = 24 bits per pixel. If you add 8 bits for a transparency channel, you end up with 32 bit color.
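As a small illustration of the LUT precision point above (my own sketch, not from the linked discussion): push a mild gamma adjustment through a LUT and count how many of the 256 input gray steps stay distinct when the output is quantized to 8 bits versus 10 bits. The gamma value is arbitrary.

```python
# Count distinct output levels after a gamma tweak, for 8-bit vs 10-bit output.
def unique_output_levels(out_bits, gamma=1.2, in_steps=256):
    out_max = (1 << out_bits) - 1
    levels = {round(((i / (in_steps - 1)) ** gamma) * out_max) for i in range(in_steps)}
    return len(levels)

print("8-bit output:", unique_output_levels(8))    # fewer than 256 -> some steps merge (banding)
print("10-bit output:", unique_output_levels(10))  # all 256 steps stay distinct with this gamma
```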
 
Has anyone tried dithering on Nvidia cards with the newer drivers and Windows versions?
Last time I used it, it was very finicky, e.g. it disabled itself after a sleep cycle. When it worked it looked very good, just like native 10-bit.
 
Has anyone tried dithering on Nvidia cards with the newer drivers and Windows versions?
Last time I used it, it was very finicky, e.g. it disabled itself after a sleep cycle. When it worked it looked very good, just like native 10-bit.

oh interesting. Is it done through the nvidia control panel?
 
So I finally tried the StarTech DP2VGAHD20.

Sadly, the results are not what I had hoped for.
With the Delock 87685 you can at least see the screen at 2304 x 1440 @ 80Hz, though you get an odd mirroring effect on the left and right sides. With the StarTech you can't even see the screen, as shown in the pics. I first tried CRU's CRT standard, then I tried the old standard with the few tweaks Derupter mentioned. You can see some of the image, but it is still very poor.

Same at 79Hz; the Delock is fine at 79Hz at least. At 78Hz and under you can see the screen with no issues.

1920x1200 I was able to test up to 96Hz, and it looks great.

For monitor detection I tried both the name of the adapter and FW900. I’m open to suggestions. 😊

Why does the chipset have higher specs than what it is capable of after StarTech has their way with it?

http://www.ite.com.tw/zh-tw/product/view?mid=137
 

Attachments: two photos of the output (CDA298EC-F69A-4EA2-A456-590EB6B0431F.jpeg, 9F584778-06DD-47C4-B47F-904FC0DA83CB.jpeg)
So I finally tried the StarTech DP2VGAHD20.

Why does the chipset have higher specs than what it is capable of after StarTech has their way with it?

http://www.ite.com.tw/zh-tw/product/view?mid=137

My best guess is that those specs are for the HDMI output, and they didn't really bother to test the limits of the VGA output.

Probably an obvious question: Does it have a USB port for you to hook up for power? I assume you would hook that up, but just checking

Also, did you try other resolutions that are at similar pixel clocks? Like 2560x1600 at 70Hz or something? I ask because the Sunix DPU 3000 hates 2048x1536 and anything just above that. But if you go WAY above it, the problems go away. Like I mentioned before, I've never seen the side-swapping problem at 2880x2160 @ 60Hz. So a lot of the Sunix's problems are independent of pixel clock.
 
So I finally tried the StarTech DP2VGAHD20.

Sadly, the results are not what I had hoped for.
With the Delock 87685 you can at least see the screen at 2304 x 1440 @ 80Hz, though you get an odd mirroring effect on the left and right sides. With the StarTech you can't even see the screen, as shown in the pics. I first tried CRU's CRT standard, then I tried the old standard with the few tweaks Derupter mentioned. You can see some of the image, but it is still very poor.

Same at 79Hz; the Delock is fine at 79Hz at least. At 78Hz and under you can see the screen with no issues.

1920x1200 I was able to test up to 96Hz, and it looks great.

For monitor detection I tried both the name of the adapter and FW900. I’m open to suggestions. 😊

Why does the chipset have higher specs than what it is capable of after StarTech has their way with it?

http://www.ite.com.tw/zh-tw/product/view?mid=137

Judging from the pics it seems that the DAC can't handle such a high pixel clock.
The official specs are:
Input - DP Receiver up to 720 MHz with 24 bit
Output:
-HDMI 2.0 up to 600 MHz with 24 bit
-DAC up to 200 MHz with 24 bit

Can you test the HDMI output at 4K 60 Hz?
Do you have a TV or a monitor that can handle that?
Just to be sure that the cable handles the bandwidth without problems.

How far can you push the pixel clock before encountering problems with the image?
Is there an option in the graphics card panel to set the bit depth of the DP adapter input (i.e. the DP output of the video card)?
 
So first to answer a few questions:
1. The StarTech DP2VGAHD20 doesn't have a USB power port. The Delock 87685 does, but it never made a difference. I'm sure it would if I was using the Delock's 2 or 3 outputs at the same time.
2. I don't have anything to test 4K with.
3. I don't have a bit depth option other than color depth.

So now for my results!

2560 x 1600 new and old CRT timings.

Delock
71-80Hz = Out of scan range
70Hz / 411MHz pixel clock - did fine for 15 minutes with no side mirroring.

StarTech
74+ Out of scan range
73 Rainbow snow - 429Mhz pixel clock (curious why I was out of scan range with Delock but not with StarTech)
72 Rainbow snow - 423Mhz pixel clock
70 Rainbow snow - 411Mhz pixel clock
66 Rainbow snow - 385Mhz pixel clock
65 Rainbow snow - 379Mhz pixel clock
64 Viewable - 372Mhz pixel clock
63 Viewable - 366Mhz pixel clock
62 Viewable - 360Mhz pixel clock
61 Viewable - 354Mhz pixel clock
60 Viewable - 340Mhz pixel clock

2304 x 1440

Delock
80Hz side mirroring - 382Mhz pixel clock
79Hz Viewable - 377Mhz pixel clock

StarTech
80Hz Rainbow snow - 382Mhz pixel clock
79Hz Rainbow snow - 377Mhz pixel clock
78Hz Viewable - 372Mhz pixel clock


I decided to try a few oddball resolutions
2189 x 1368 (5% smaller than 2304 x 1440)

Delock
85Hz Out of scan range
84Hz Bad flickering - 363Mhz
83Hz Bad flickering - 358Mhz
82Hz Bad flickering - 354Mhz


StarTech
85Hz Out of scan range
84Hz Viewable - 363Mhz pixel clock


2235 x 1397 (3% smaller than 2304 x 1440)

Delock
84Hz side mirroring - 380Mhz pixel clock
83Hz Viewable - 375Mhz pixel clock

StarTech
84Hz Out of scan range - 380Mhz pixel clock
83Hz Viewable - 375Mhz pixel clock


1920 x 1200
Delock & StarTech
97Hz Out of scan range - 325Mhz pixel clock
96Hz Viewable - 322Mhz pixel clock
95Hz Viewable - 318Mhz pixel clock
90Hz Viewable - 300Mhz pixel clock


So it's pretty clear their limits are in the 375-385MHz range. What is interesting is that I can push 2235 x 1397 @ 83Hz vs 2304 x 1440 @ 79Hz. Kinda makes me wonder if running an odd resolution is worth it for the higher refresh rate and better clarity at a slightly lower resolution. Because I can definitely notice the reduction in flicker from 79Hz to 83Hz. I mean, why not, right? That is why we have CRTs: because we love having the freedom to choose whatever resolution we want. :) Staying at 1920x1200 @ 96Hz is so tempting.
 
I basically create a new resolution for every game I play. Halo 3 is 1440x1080 @ 120Hz. Street Fighter 5 (60fps cap) I play at 3200x1800 @ 60Hz. Hellblade, I went with 2024x1518 (not 2048, cuz of Sunix) at 75Hz. They all look spectacular.
 
So first to answer a few questions:
1. The StarTech DP2VGAHD20 doesn't have a USB power port. The Delock does, but it never made a difference. I'm sure it would if I was using the Delock's 2 or 3 outputs at the same time.
2. I don't have anything to test 4K with.
3. I don't have a bit depth option other than color depth.

So now for my results!

2560 x 1600 new and old CRT timings.

Delock
71-80Hz = Out of scan range
70Hz / 411Mhz pixel clock Did fine for 15 minutes with no side mirroring.

StarTech
74+ Out of scan range
73 Rainbow snow - 429Mhz pixel clock (curious why I was out of scan range with Delock but not with StarTech)
72 Rainbow snow - 423Mhz pixel clock
70 Rainbow snow - 411Mhz pixel clock
66 Rainbow snow - 429Mhz pixel clock
65 Rainbow snow - 429Mhz pixel clock
64 Viewable - 372Mhz pixel clock
63 Viewable - 366Mhz pixel clock
62 Viewable - 360Mhz pixel clock
61 Viewable - 354Mhz pixel clock
60 Viewable - 340Mhz pixel clock

2304 x 1440

Delock
80Hz side mirroring - 382Mhz pixel clock
79Hz Viewable - 377Mhz pixel clock

StarTech
80Hz Rainbow snow - 382Mhz pixel clock
79Hz Rainbow snow - 377Mhz pixel clock
78Hz Viewable - 372Mhz pixel clock


I decided to try a few oddball resolutions
2189 x 1368 (5% smaller than 2304 x 1440)

Delock
Did not like this resolution at all. It kept flickering.

StarTech
84Hz Out of scan range
83Hz Viewable - 384Mhz pixel clock


2235 x 1397 (3% smaller than 2304 x 1440)

Delock
84Hz side mirroring - 380Mhz pixel clock
83Hz Viewable - 375Mhz pixel clock

StarTech
84Hz Out of scan range - 380Mhz pixel clock
83Hz Viewable - 375Mhz pixel clock


1920 x 1200
Delock & StarTech
97Hz Out of scan range - 325Mhz pixel clock
96Hz Viewable - 322Mhz pixel clock
95Hz Viewable - 318Mhz pixel clock
90Hz Viewable - 300Mhz pixel clock


So it's pretty clear their limits are in the 375-385MHz range. What is interesting is that I can push 2235 x 1397 @ 83Hz vs 2304 x 1440 @ 79Hz. Kinda makes me wonder if running an odd resolution is worth it for the higher refresh rate and better clarity at a slightly lower resolution. Because I can definitely notice the reduction in flicker from 79Hz to 83Hz. I mean, why not, right? That is why we have CRTs: because we love having the freedom to choose whatever resolution we want. :) Staying at 1920x1200 @ 96Hz is so tempting.

Thanks for testing it

You are over 360 MHz, so the cable and the receiver are doing their job (HBR2 on all four lanes).
Sadly the DAC can't go much further than that, but 375 MHz is good for a thing certified for 200 MHz.
Also, with 375 MHz and a little tweaking of the blanking, you can do 2304x1440 at 80 Hz.
Considering the price (in my country 40 euro shipped), if it works with all video cards without those fucking cable problems like the 62967, it's a very good DP to VGA adapter, and you have the plus of an additional HDMI 2.0 output.
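A rough sketch of that blanking tweak in Python (illustrative, not a tested modeline): assuming the ~1507-line vertical total used earlier and trimming the horizontal blanking in 8-pixel steps until the 80 Hz mode fits under 375 MHz.

```python
# Shrink horizontal blanking for 2304x1440 @ 80 Hz until the pixel clock
# drops to 375 MHz or below. The 1507-line vertical total is an assumption.
h_active, v_total, v_refresh = 2304, 1507, 80
max_pclk_hz = 375e6

h_blank = 864                      # blanking from the Sony-style timing above (partly assumed)
while (h_active + h_blank) * v_total * v_refresh > max_pclk_hz:
    h_blank -= 8                   # keep the usual 8-pixel granularity

h_total = h_active + h_blank
pclk_mhz = h_total * v_total * v_refresh / 1e6
h_freq_khz = v_total * v_refresh / 1000
print(f"h_blank={h_blank}, pixel clock ≈ {pclk_mhz:.1f} MHz, h_freq ≈ {h_freq_khz:.2f} kHz")
# Expect roughly h_blank=800, ~374 MHz, ~120.6 kHz (still under the FW900's ~121 kHz limit)
```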

About your test, I think there are some errors here and there, for example:
2560x1600 65 Hz here is 378.77 MHz with CVT and not 429, same thing with 66 Hz
2189x1368 83 Hz here is 358.41 MHz with CVT and not 384

Is image quality good up to 375 MHz?
 
I tried to get a Vention with the older chip a few months ago, but my order was cancelled. I do have one with the newer and I guess weaker chip, but I don't have anything with USB-C to try it with yet.

My needs are probably low though. I normally use 1600 by 1024 at 100 Hz. (To align with the display's aspect ratio, and with the idea that it fits within the roughly 1920-across aperture grille with a margin for error as the screen is painted, which should increase sharpness. Not that 1920 by 1200 looks bad, though. Though I guess I'd make that 1920 by 1228 given this display's slightly odd dimensions.)
 
Thanks for testing it

You are over 360 MHz, so the cable and the receiver are doing their job (HBR2 on all four lanes).
Sadly the DAC can't go much further than that, but 375 MHz is good for a thing certified for 200 MHz.
Also, with 375 MHz and a little tweaking of the blanking, you can do 2304x1440 at 80 Hz.
Considering the price (in my country 40 euro shipped), if it works with all video cards without those fucking cable problems like the 62967, it's a very good DP to VGA adapter, and you have the plus of an additional HDMI 2.0 output.

About your test, I think there are some errors here and there, for example:
2560x1600 65 Hz here is 378.77 MHz with CVT and not 429, same thing with 66 Hz
2189x1368 83 Hz here is 358.41 MHz with CVT and not 384

Is image quality good up to 375 MHz?
Thank you. I corrected my numbers. Not sure what happened as I had the correct numbers in my notes. Probably just got distracted.

My clock rates are based off the CRT standard and Old Standard in CRU, which are usually about 1-1.5MHz off. The only difference I see between the CVT option in Nvidia and the CRT Standard in CRU is the sync width: Nvidia puts the sync width at 10 while CRU puts it at 6. All other figures stay the same.

Yesterday I ran the Delock 87685 at 2235 x 1397 @ 83Hz - 375MHz pixel clock for a few hours in Windows and gamed with it for another 2 hours or so with no issues. I've had it running for an hour now so far with no issues. I didn't do an extended test with the StarTech DP2VGAHD20 because I noticed the StarTech either works or doesn't work (snow). With the StarTech I would stay at some of the viewable resolutions for 30+ minutes, and they were fine. The Delock is a little finicky. The Delock at 2304 x 1440 @ 80Hz can be fine for an hour or a few hours before it starts mirroring. It is very random. Going back to 79Hz fixes it. If the Delock seriously doesn't want to work with a pixel clock, it will immediately show mirroring vs slowly creeping to the mirroring error like it sometimes does.

I can do some extended tests for a few hours with the StarTech if you would like, at 2235 x 1397 @ 83Hz / 375MHz pixel clock. It is a great alternative for the price, and is a great backup to my Delock should it ever die.
 
thanks for your test again ;) i personally have been having somewhat different results from you and other people here with my dpu3000. in my case i have been able to run custom-made res/refresh combos like 2304x1440 80hz, 2048x1536 55-60hz, 1920x1200 90hz with zero issues using the method i already mentioned, combos that otherwise have issues. it seems to me the dpu3000, at least my model with my 1080 ti, rather hates some standard resolutions and refreshes as they come by default from the edid.

however, i think the important thing here is that, thanks to one of the many advantages crts have, namely the real flexibility to use many different resolutions and refreshes without worrying about image degradation from not using the native resolution, users can simply use whatever custom res/refreshes work for them, as me, you, Enhanced Interrogator, and other Synaptics-based adapter users have managed to do, and everyone is happy!
 
I think it is VERY understandable that we all get slightly different results near the adapters' limits, especially since we are pushing them FAR beyond their specs :)! I am actually very happy I experimented some more and discovered a resolution just a hair smaller than 2304x1440 that I can run at 83Hz. I have always been sensitive to flicker, and I can tell a world of difference between 79 and 83Hz. Even from 80 to 83Hz I can tell the difference.

Why is it so difficult to find a converter that supports a high pixel clock? Is it something with the physical chip design? Is it internal programming? I mentioned this problem the community is having finding high pixel clock converters to an electrical engineering student friend. If I can get details on what goes into a converter chip, he might be able to look into it for his graduation project.
 
I think it is VERY understandable that we all get slightly different results near the adapters' limits, especially since we are pushing them FAR beyond their specs :)! I am actually very happy I experimented some more and discovered a resolution just a hair smaller than 2304x1440 that I can run at 83Hz. I have always been sensitive to flicker, and I can tell a world of difference between 79 and 83Hz. Even from 80 to 83Hz I can tell the difference.

Why is it so difficult to find a converter that supports a high pixel clock? Is it something with the physical chip design? Is it internal programming? I mentioned this problem the community is having finding high pixel clock converters to an electrical engineering student friend. If I can get details on what goes into a converter chip, he might be able to look into it for his graduation project.
Chipsets capable of a 400MHz pixel clock and beyond have been widely available for decades; I doubt there is any technical difficulty with this.

The problem is probably rather that the companies that had that kind of expertise ditched their products because there's no longer a wide need for them, and everything we can find now comes from second-class chip makers. And since it's barely believable to the average Joe nowadays that a monitor with a VGA input may display anything beyond 1920x1080@60Hz, that's what they'll design their products for.
 
I think it is VERY understandable that we all get slightly different results near the adapters' limits, especially since we are pushing them FAR beyond their specs :)! I am actually very happy I experimented some more and discovered a resolution just a hair smaller than 2304x1440 that I can run at 83Hz. I have always been sensitive to flicker, and I can tell a world of difference between 79 and 83Hz. Even from 80 to 83Hz I can tell the difference.
If you are sensitive to vertical refresh rate, then why do you need an adapter with a higher pixel clock?
Anything beyond 1600/1920x1200 is completely pointless on CRT monitors, and at 1200 lines you will hit the horizontal refresh limit before you hit the pixel clock limit with current adapters, e.g. the Delock 62967.

For the FW900, 1920x1200@97Hz is 319MHz.
If I want a higher refresh rate I need to lower the resolution, and then even with the higher refresh rate the pixel clock will be lower, e.g. 1600x1000@115Hz is 294.5MHz.

EDIT://
Why is it so difficult to find a converter that supports a high pixel clock? Is it something with the physical chip design? Is it internal programming? I mentioned this problem the community is having finding high pixel clock converters to an electrical engineering student friend. If I can get details on what goes into a converter chip, he might be able to look into it for his graduation project.
Back when no converter could do 1920x1200@97Hz I was pondering the idea of using an FPGA for that, or even going without an FPGA and just using chips alone.
The issue is that you need a digital input chip (DP or HDMI) and then a DAC chip. I found a 330MHz DAC chip which would be fine.
With an FPGA I could do color correction (gamma/gamut), which would be much preferable. Ultimately I dropped the idea when ready-made converters became available.
With an FPGA alone I can receive about 1280x720@60Hz and output it with an R2R ladder just fine, but that is pretty far from the required 320MHz...
 