24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Really appreciate your great answer here Derupter.

And before we get to this question, are you saying there are three independent bit depths going on here?

1) bit depth of DAC
2) bit depth of digital input
3) bit depth of digital output

Either way, my hunch is that digital output being set to 8 bits would still allow 10 bit LUT. But this is based on my speculative understanding of how things work.

If anyone has any adapters they want tested for LUT bit depth, feel free to PM me. Alternatively, if anyone has a colorimeter and a DAC they want to test for bit depth, I can provide instructions on how to do so.

By digital input I mean the input of the adapter, which is the digital output of the graphics card.
By digital output I mean what exits the graphics card and what you set in the graphics options.
Digital receivers in these adapters usually pass the signal through as it is; if the DAC is 8 bit, the receiver trims the lower 2 bits (in the case of a 10 bit input).
From what I understood, to preserve all the information in the custom LUT you need to set the digital output of the graphics card to 10 bit, but I don't know much about these things.
The downside of this is that you lose bandwidth on the digital side; in this case the only chip with enough bandwidth for 10 bit is the Lontium LT8612UX, the ITE IT6562 can do max 288 MHz.
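As a rough sanity check on those bandwidth numbers (this is just a generic GTF-style blanking estimate in Python, not the exact modelines, so take the precise figures with a grain of salt):

Code:
# Rough pixel-clock estimate, assuming GTF-like blanking overhead
# (~30% horizontal, ~5% vertical) - real modelines vary.
def approx_pclk_mhz(h_active, v_active, refresh_hz):
    return h_active * 1.30 * v_active * 1.05 * refresh_hz / 1e6

for h, v, hz in [(1920, 1200, 85), (2304, 1440, 70)]:
    print(f"{h}x{v}@{hz}: ~{approx_pclk_mhz(h, v, hz):.0f} MHz")

# ~267 MHz for 1920x1200@85 squeaks in under the IT6562's 288 MHz,
# while ~317 MHz for 2304x1440@70 needs an LT8612UX-class chip.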
 
Thanks for clarification.

Yea, it's a good question, I suppose we could do a test with an adapter that has a 10 bit DAC and see whether we get 10 bits of LUT precision even when working with 8 bit digital output.

It would indeed be nice to preserve bandwidth by maintaining 8 bit digital output while still benefiting from 10 bit LUT precision.

I do know that my Nvidia GTX 660 has 10 bits of LUT precision, yet I can run 2304x1440 @ 70 Hz (which requires > 330 MHz), although to be fair, I've only tested the bit depth of the LUT at 1920x1200 @ 85 Hz, which is ~281 MHz. I've no way of knowing if this is in the context of 8 bit or 10 bit digital output though.

It's hard to imagine how the DAC would be able to achieve 10 bit precision with only 8 bit output though... If I'm not mistaken, 8 bit output means that the DAC is receiving a time series, where each sample (for each of three channels) is an 8 bit integer (e.g. 00001111). So there are 256 unique values that the DAC gets fed for each sample, and without access to the videoLUT, it will have no idea how to "scale" these values to convert into voltages (i.e. it would just scale the voltages linearly between peak voltage and 1/256 * peak voltage). The only way I see around this is having 10 bit integers fed into the DAC (which are still scaled linearly), but which can now encode 10 bit precision of the LUT.

(the alternative is to have a 10 bit LUT that can be loaded onto the DAC itself, which could do the remapping of 8 bit integers onto voltages - I believe some monitors have this capability for the same reason).
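To make the quantization argument concrete, here's a toy sketch in Python (pure illustration with a made-up gamma curve, not tied to any particular adapter):

Code:
# A smooth 10-bit gamma LUT collapses to at most 256 distinct codes
# once the link feeding the DAC is 8 bit.
lut_10bit = [round(((i / 1023) ** (1 / 2.2)) * 1023) for i in range(1024)]

# What an 8-bit receiver effectively sees (lower 2 bits trimmed):
lut_as_8bit = [v >> 2 for v in lut_10bit]

print("distinct levels on a 10-bit link:", len(set(lut_10bit)))   # well above 256
print("distinct levels on an 8-bit link:", len(set(lut_as_8bit))) # at most 256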

Really would be interesting to see someone try and cannibalize a DAC from these older video cards and merge it with an HDMI or DP to VGA converter - I don't know enough to know whether this is even feasible though.
 
All these issues with 8 bit/10 bit are caused by Nvidia's bad implementation of digital outputs, which does not use any dithering. There are zero issues with banding on AMD cards when using an 8 bit connection, and the implementation is so good I doubt anyone would pass an ABX test with it vs. true 10 bit.

A DAC with its own LUTs, and preferably even gamut correction, would be awesome, but it is too much to ask for imho.
 
It would indeed be nice to preserve bandwidth by maintaining 8 bit digital output while still benefiting from 10 bit LUT precision....(snip)....Really would be interesting to see someone try and cannibalize a DAC from these older video cards and merge it with an HDMI or DP to VGA converter - I don't know enough to know whether this is even feasible though.

Well, the DAC integrated on old graphics cards is inside the GPU, so it seems impossible to cannibalize it.
Also, with the integrated DAC you don't have to worry about the digital output, for the same reason.
But with an external adapter with a proper 10 bpp DAC we must take care of it, because there is a digital output interface before the DAC, with all its problems (limited bandwidth, dithering, bit depth, etc.).
I'd like to try one of these new chips, but all the new adapters are USB-C and I don't have this output.
Maybe I will buy a card like the Sunix UPD2018 so I can connect those things to any video card, or get a nice USB-C connection on my next card.
 
All these issues with 8 bit/10 bit are caused by Nvidia's bad implementation of digital outputs, which does not use any dithering. There are zero issues with banding on AMD cards when using an 8 bit connection, and the implementation is so good I doubt anyone would pass an ABX test with it vs. true 10 bit.

A DAC with its own LUTs, and preferably even gamut correction, would be awesome, but it is too much to ask for imho.

I seem to have read not long ago about some registry keys to enable dithering on Nvidia cards with one of the latest drivers.

EDIT
Here https://forums.geforce.com/default/...rver-to-geforce-driver-/post/5934577/#5934577
 
Flybye, that's very strange. I have a Sunix DPU3000 and can do 2304x1440 @ 80 Hz, and can even get the maximum 160 Hz and all the refresh rates the FW900 supports.
You seem to be the first of all the Sunix DPU3000 users here reporting that 60 Hz max limit, from what I remember... surely something is not correctly set up in your graphics card - Sunix - FW900 combo, or maybe your Sunix is defective? Or maybe Sunix downgraded the chip frequency in your model? (hopefully not)

I strongly suggest you and other users recently asking about digital-to-analog video converters search this thread for words like sunix, delock, dac, plugable, displayport to vga, hdmi to vga... it's not a long 400+ page search, just look at the more recent results; the digital-to-analog discussion is relatively new, since the last VGA-compatible graphics card was produced a few years ago.

Also, as a quick help: did you try software like CRU (Custom Resolution Utility), or creating the custom resolution via your graphics card control panel?
Sunix's website only mentions 60 Hz, even at 1920x1080. I figured that is all it can push. I sent them an email a few days ago and have NOT heard from them, to see if they can confirm 2304x1440 @ 80 or 85 Hz. I would like them to confirm it can, just in case those saying they can run at 80 Hz+ are just really lucky ones.
 
Sunix's website only mentions 60 Hz, even at 1920x1080. I figured that is all it can push. I sent them an email a few days ago and have NOT heard from them, to see if they can confirm 2304x1440 @ 80 or 85 Hz. I would like them to confirm it can, just in case those saying they can run at 80 Hz+ are just really lucky ones.
60Hz is usually what you'll find on the datasheet of all converters because it's the default refresh rate with LCDs. That doesn't mean it can't run higher, but you'll usually need to use CRU to set custom resolutions with a different refresh rate.
 
Sunix's website only mentions 60 Hz, even at 1920x1080. I figured that is all it can push. I sent them an email a few days ago and have NOT heard from them, to see if they can confirm 2304x1440 @ 80 or 85 Hz. I would like them to confirm it can, just in case those saying they can run at 80 Hz+ are just really lucky ones.

There are a dozen people on this forum who have confirmed it can run beyond a 500 MHz pixel clock. Sunix the company has been selling it as a multi-monitor splitter, so they never stress-tested it like we have.
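For reference, a rough check of what a 500 MHz ceiling buys (same caveat as usual: generic GTF-style blanking assumed, real modelines differ):

Code:
# Approximate pixel clocks for two high-end CRT modes
# (~30% horizontal / ~5% vertical blanking assumed).
def approx_pclk_mhz(h, v, hz):
    return h * 1.30 * v * 1.05 * hz / 1e6

for h, v, hz in [(2304, 1440, 80), (2560, 1920, 70)]:
    print(f"{h}x{v}@{hz}: ~{approx_pclk_mhz(h, v, hz):.0f} MHz")

# ~362 MHz and ~470 MHz - both under 500 MHz, consistent with reports
# in this thread of 2304x1440@80 and even 2560x1920 working.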
 
Well that is interesting to know! The only problem is I cannot find it anywhere in the US for sale. :( Has it been discontinued? I never even got a reply email from Sunix. What other adapters have been verified to do 2304x1440 @ 80 Hz if the Sunix will no longer be available?
 
I played around this morning and last night; no matter what resolution I choose I still get jiggly, shaky vision, whether I use my video card directly or the Sunix, at any resolution, even the super low ones. I must have some sort of jacked-up vertical and horizontal frequencies. PowerStrip used to automatically adjust all those; CRU sure didn't, and it's shaky. I'm too ignorant to know what the hell I'm doing.
 
Well that is interesting to know! The only problem is I cannot find it anywhere in the US for sale. :( Has it been discontinued? I never even got a reply email from Sunix. What other adapters have been verified to do 2304x1440 @ 80 Hz if the Sunix will no longer be available?

The Delock 87685 seems to be the same product; no one has tested it, and you probably have to order it from Europe.
 
I played around this morning and last night; no matter what resolution I choose I still get jiggly, shaky vision, whether I use my video card directly or the Sunix, at any resolution, even the super low ones. I must have some sort of jacked-up vertical and horizontal frequencies. PowerStrip used to automatically adjust all those; CRU sure didn't, and it's shaky. I'm too ignorant to know what the hell I'm doing.

Are you choosing "CRT standard timings" in the dropdown box?
 
The Delock 87685 is available here: https://www.grooves-inc.com/delock-...ock-hardware-electronic-pZZa1-2097988797.html

Looks like a reliable shop and you can use PayPal without creating another extraneous account.

The specs given do say 1920x1200 at 60 Hz max on the VGA output as opposed to 2560x1600. The DPU3000 specs claim support for 2560x1600 on all outputs. The chipset is the same, though. Could be a difference in documentation only.
 
The specs given do say 1920x1200 at 60 Hz max on the VGA output as opposed to 2560x1600. The DPU3000 specs claim support for 2560x1600 on all outputs. The chipset is the same, though. Could be a difference in documentation only.

Yeah, I'm going to take the specs on a website ending with ".land" as gospel. It's got the Synaptics chip, it's good to go.
 
DPU3000/4 user here. I too am getting the screen jiggling, noticeable after around a few hours of use, but at complete random. It only occurs at certain resolutions like 1024x768 and similar, though I'm trying to run at 120 Hz.

And there's also what seems to be an issue with scaling, at least on AMD cards. Sometimes I can only get a resolution to work "within" another resolution, and it will only display within the bounds of that one no matter what I try to change. Other times the scaling completely dies. I had to turn GPU scaling off or the entire setup would just turn to a blank screen after a few seconds of attempting to put on 1366x768 100 Hz - it tries to do so at 1280x1024 or something like that. A lot of the other smaller and larger resolutions garner the same result. Is there a way to rectify this in CRU?
 
I'm not even using the DPU, I'm going straight into my video card. I don't even get CRT timings with PowerStrip; in fact PowerStrip only shows my second monitor. It's as if this version of PowerStrip has the resolution features stripped out. I can't run past 60 Hz anymore, ever since some major Windows 10 update. Either the timings are weird now or the monitor has crapped out. Windows 10 major updates love to jack up drivers and settings.
 
I'm not even using the DPU, I'm going straight into my video card. I don't even get CRT timings with PowerStrip; in fact PowerStrip only shows my second monitor. It's as if this version of PowerStrip has the resolution features stripped out. I can't run past 60 Hz anymore, ever since some major Windows 10 update. Either the timings are weird now or the monitor has crapped out. Windows 10 major updates love to jack up drivers and settings.

Why are you using PowerStrip? Custom Resolution Utility is the new standard; it's way easier to set up and is still getting updates.

Sometimes I can only get a resolution to work "within" another resolution, and it will only display within the bounds of that one no matter what I try to change....(snip)....Is there a way to rectify this in CRU?

I think the best way is to go into Windows' advanced display settings and choose the resolution and refresh rate manually. On Nvidia you can also do it through the control panel. I don't think AMD has a way to do this through the driver anymore though, only through Windows. There are probably some 3rd party programs that can make it a quicker process. One program I know of is RefreshLock, though it's still tedious if you use multiple refresh rates for one resolution.
 
Why are you using PowerStrip? Custom Resolution Utility is the new standard; it's way easier to set up and is still getting updates.



I think the best way is to go into Windows' advanced display settings and choose the resolution and refresh rate manually. On Nvidia you can also do it through the control panel. I don't think AMD has a way to do this through the driver anymore though, only through Windows. There are probably some 3rd party programs that can make it a quicker process. One program I know of is RefreshLock, though it's still tedious if you use multiple refresh rates for one resolution.

I had the PS menus memorized; CRU is what I used when I got the jiggle-wiggle vision. I'll try and force myself to sit down with it. The Nvidia panel DEFINITELY does not hold your hand with the timings, and it also gave me squiggle vision.

I miss the days when you could just install a .inf file or share a profile file or something, instead of manually setting timings for god knows what.
 
SUNIX DPU3000 USERS!!!

Lurking the CRT Collective Facebook group, I found something that can be very important and may be the key to why every user seems to have some type of issue with the Sunix DPU3000: it seems related to the way the DPU3000 is being powered. A good-quality power supply adapter, rather than a PC USB port, can be the answer, as these users from the group claim not to have any issues at all powering their DPU3000 with a quality 2.1 amp USB adapter:

[attached screenshot: Facebook group posts reporting no issues when powering the DPU3000 from a 2.1 A USB charger]
 
Very interesting. I'll be trying that soon. I get some waviness at 1920x1440 @ 75 Hz.

I have my doubts that this will fix the issue where the left and right margins of the screen randomly swap at certain very high resolutions. I've only seen it happen above 2048x1536, and it seems more like a bug in the Synaptics chip than an issue with power. Weirdly, the issue seems to happen more frequently the closer you get to 2048x1536 without going below. So for example I think I saw it quite a bit when I created a 2160x1620 resolution. But running 2560x1920 for Battlefield 1, I'm not sure if I ever saw it.
 
Sunix DPU3000 power consumption: 1.45 W (don't know if max or typical)
DisplayPort: 3.3 V, 500 mA (1.65 W max)
Classic USB 2.0 power: 5 V, 500 mA (2.5 W)
USB 3.0 power: 5 V, 900 mA (4.5 W)
The adapter works with only the DisplayPort cable (the included one), so the chip is probably powered by the 3.3 V line; not sure without the datasheet, but it certainly doesn't work directly from the 5 V line of the additional USB power.
I don't think it consumes more than what is supplied by a simple USB port, otherwise the chip temperature would be prohibitive.
Anyway, if someone wants to try: connect the USB cable to a USB 3.0 port and see if anything changes, or try with a Y-USB cable to double the power.
I don't think there is digital communication between the adapter and the USB host, but you can check in Device Manager how much power is requested from the USB hub.
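Putting those numbers side by side (just restating the figures above, and assuming the 1.45 W draw is accurate):

Code:
# Power budget vs. the claimed 1.45 W draw (figures from this post).
draw_w = 1.45
budgets_w = {
    "DP 3.3 V @ 500 mA":    3.3 * 0.5,  # 1.65 W
    "USB 2.0 5 V @ 500 mA": 5.0 * 0.5,  # 2.50 W
    "USB 3.0 5 V @ 900 mA": 5.0 * 0.9,  # 4.50 W
}
for source, budget in budgets_w.items():
    print(f"{source}: {budget:.2f} W budget, margin {budget - draw_w:+.2f} W")

# Even the bare DisplayPort rail leaves ~0.2 W of headroom, which
# would explain why the adapter runs from the DP cable alone.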
 
SUNIX DPU3000 USERS!!!

Lurking the CRT Collective Facebook group, I found something that can be very important and may be the key to why every user seems to have some type of issue with the Sunix DPU3000: it seems related to the way the DPU3000 is being powered. A good-quality power supply adapter, rather than a PC USB port, can be the answer, as these users from the group claim not to have any issues at all powering their DPU3000 with a quality 2.1 amp USB adapter: (snip)
No. This is incorrect. The issues I presented in earlier posts once you reach 2048x1536 are independent of the way you power the splitter. It's either the chip (most likely) or the firmware.
Has anyone tried the Delock 87685? It's the Sunix equivalent.
 
I am sure it has been mentioned here already, but here it is again for posterity:

If you experience waviness/image swapping with the Sunix adapter, do something that will trigger a resolution switch. Sometimes that will fix it on the first try, sometimes on the 10th try, but it always fixes it (at least for a little while).

Examples on how to trigger resolution switch:

A) Change in-game resolution back and forth

B) ALT+TAB from the game to Windows and back (works only if the resolutions are different or if your game is in exclusive fullscreen mode)

C) Use the restart.exe program that is bundled with CRU

etc.

I would still buy this adapter again; the above issues are a little annoying when they happen, but I can live with them, no probs. Once (if) a superior adapter is released, I would of course get the better one.
 
I ran off a fast charger yesterday during a long Apex Legends session, and I still got the wavy-pixel issue. It seemed like it was less frequent, but that was probably down to luck.
 
That's correct jka, I remember those temporary fixes being mentioned some pages back, but those people from the Facebook group seem not to have any issues at all after using that power adapter, at least that is the message I perceive from them.


I ran off a fast charger yesterday during a long Apex Legends session, and I still got the wavy-pixel issue. It seemed like it was less frequent, but that was probably down to luck.

Thanks for sharing; sad that there is no real permanent fix yet then.
Before the Facebook group post, I emailed Sunix some days ago asking them about the possibility of a fix for those issues. Their answer: "let us study that a bit, and not sure if it is an issue or something that we can fix easily by firmware. I will get back to you later on that."

Talking about the Sunix DPU3000, the device seems to be officially available again on Amazon for those interested: https://www.amazon.com/Sunix-Displa...d=1559694456&s=gateway&sr=8-1#customerReviews At the moment of posting this, 4 are left in stock.
 
Hhhmm, I even did the Nvidia patch even though I'm on VGA. I'll try a whole other machine and OS to see if it's the CRT; it would be one thing if I was even using an adapter.
 
but those people from the Facebook group seem not to have any issues at all after using that power adapter,

That's only two people, and they may not mess with things like CRU as much as other users do. Like 1920x1440@75hz, the mode I see the problem in most frequently, wasn't available until I added it in CRU.

But I am curious to see other people try it. Maybe my charger wasn't switching into high-amps mode or something. I don't know how fast chargers work.
 
Has there been a better adapter than the Sunix DPU3000? I haven't kept up with this thread.

Also, in my experience, the adapter loves even numbers for refresh rates. I'd get wavy lines occasionally, and a blank screen where I'd be required to restart and reconnect the adapter. This was when I was running it at 85 Hz prime mode for the FW900. Once I used 84 Hz, all problems went away. I'm using an original Apple iPhone charger as well to power the adapter, btw. I don't think it matters much though.
 
Has there been a better adapter than the Sunix DPU3000? I haven't kept up with this thread.

Last year HD Fury confirmed they're still developing an HD Fury 5, but that's literally all they had to say about it: that it's still being developed. Of course the longer they wait, the smaller the potential customer base gets.

Well, unless they also make it work for 15kHz standard def CRTs, in which case it could have a larger number of customers in the retro-games community.
 
Has anyone tried the Delock 87685? It's the Sunix equivalent.

They are advertising it as 4K2K at 30 Hz only on DisplayPort, while Sunix says this about its DPU3000:
4096x2160 60 Hz in a single monitor setup.

I have a similar one, the Logilink CV0109, with the same specs on paper as the Delock, which cost me 19 USD, and its max working resolution was only 1280x1024 85 Hz with a CRT monitor. The 3840x2160 30 Hz pixel clock is 297 MHz in digital mode, so it should at least work at 1600x1200 100 Hz (270 MHz), but it doesn't (tested with an NVIDIA card). I forced it to 1600x1200 (but the real resolution reported by the monitor was still 1280x1024...).

Because of that I use a GTX 970 for now, while the Sunix is still unavailable. Or I'll wait for a price drop on cards with USB-C output and use the cheap Sunix C2VC7A0, which is 7 USD on Amazon, or e.g. the Plugable, which was reported on this forum as: 330-335 MHz.

From the leaflet of the Delock 87685:
VGA resolution up to: 1920x1200 60 Hz

Somebody however said it works at higher res/refresh with some adjustments: restricting the colour space to 4:2:2, as I remember. But in the Nvidia panel I can't do that with my Logilink adapter; there is no such option. I should probably add this in CRU: CEA-861 extension block / video capability / selectable RGB and YCC range, and try again. (Tried - not working with the latest Nvidia drivers for a GTX 970.)

For example here: "Delock 62967, 1920x1200 96 Hz is guaranteed, but if it doesn't work you will need to replace the cable, so if you are able to do it this is the most economical solution".

But Delock says the 62967 is only a 270 MHz pixel clock, so... I don't know where the truth is, because on paper 1920x1200 96 Hz (331 MHz) is far beyond those specs. That Delock was however reported on this forum as: 340-355 MHz. Why can't those makers just publish the true specs?!
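For what it's worth, checking those pixel clocks in Python (the 4K30 figure is the standard CEA-861 timing, so it's exact; the CRT modes use a rough blanking estimate, so the real modelines land a bit higher):

Code:
# 3840x2160@30 is a standard CEA-861 timing: 4400 x 2250 total.
print("3840x2160@30 (CEA):", 4400 * 2250 * 30 / 1e6, "MHz")  # 297.0

# Rough estimate for CRT modes (~30% h / ~5% v blanking assumed).
def approx_pclk_mhz(h, v, hz):
    return h * 1.30 * v * 1.05 * hz / 1e6

for h, v, hz in [(1600, 1200, 100), (1920, 1200, 96)]:
    print(f"{h}x{v}@{hz}: ~{approx_pclk_mhz(h, v, hz):.0f} MHz")

# ~262 and ~302 MHz; the 270 and 331 MHz figures quoted in this post
# come from specific modelines with wider blanking.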

Last year HD Fury confirmed they're still developing an HD Fury 5, but that's literally all they had to say about it: that it's still being developed. Of course the longer they wait, the smaller the potential customer base gets.

Well, unless they also make it work for 15kHz standard def CRTs, in which case it could have a larger number of customers in the retro-games community.

For 15 kHz, the best and cheapest option is to buy an old Radeon card for a few bucks, e.g. an HD 3470. In contrast to NVIDIA, low-res interlaced resolutions like 576i still work under Windows; I made them with CRU and tested on Win 8 and 10. The funny thing is that when using such small resolutions you can even severely underclock such a card in its BIOS using Radeon BIOS Editor, lowering both GPU/RAM speed and voltage to bring power consumption down to just a few watts (I am using a 36" Sony CRT for watching movies).
 
Last year HD Fury confirmed they're still developing an HD Fury 5, but that's literally all they had to say about it: that it's still being developed. Of course the longer they wait, the smaller the potential customer base gets.

Well, unless they also make it work for 15kHz standard def CRTs, in which case it could have a larger number of customers in the retro-games community.
I see. Is there a version of the DPU3000 that supports DVI, not just VGA? I have a DVI-to-5BNC cable for the FW900; it should be superior to the VGA-to-5BNC that I'm using.
 
I have a similar one, the Logilink CV0109, with the same specs on paper as the Delock, which cost me 19 USD, and its max working resolution was only 1280x1024 85 Hz with a CRT monitor....(snip)....Why can't those makers just publish the true specs?!

That Logilink CV0109 is a dual-chip design: PS8339 + ANX9833.
The Parade PS8339 is for DVI and HDMI.
The Analogix ANX9833 is for VGA, and its digital input bandwidth is limited to 180 MHz.
So with this adapter you can do max 180 MHz on the analog output.

The Delock 87685 and Sunix DPU3000 are a one-chip design with the Synaptics VMM2322, whose digital input bandwidth is 720 MHz at 8 bpc.

About the specs: manufacturers consider VGA to be for old flat panels with max 1920x1080 60 Hz, so they usually test the DAC only up to 200 MHz.
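Turning those chip limits into rough maximum refresh rates at 1920x1200 (same generic blanking assumption as the earlier estimates):

Code:
# Invert the pixel-clock estimate to get an approximate max refresh
# at 1920x1200 for each chip's digital input bandwidth.
def max_refresh_hz(h, v, pclk_mhz):
    return pclk_mhz * 1e6 / (h * 1.30 * v * 1.05)

for chip, mhz in [("ANX9833 (Logilink)", 180), ("VMM2322 (Sunix/Delock)", 720)]:
    print(f"{chip}: ~{max_refresh_hz(1920, 1200, mhz):.0f} Hz")

# ~57 Hz vs ~229 Hz - consistent with the Logilink topping out around
# 1280x1024@85 (roughly 152 MHz) on its analog side.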
 
I found an adapter that can potentially have the same performance as the DPU3000.
The chipset is the Lontium LT8711X-B, which has the same specs as the Synaptics VMM2322.
I asked Lontium about the DAC precision and it seems to be 10 bit, but it's a thing that must be tested.
The adapter is the Vention USB-C to VGA (model code CGMHA).
Vention seems to be a company that focuses on quality, and indeed the adapter looks well built.
With a card like the Sunix UPD2018 it can be connected to any video card with a DisplayPort output.

If someone has an Nvidia RTX card or a laptop with USB-C video output and wants to try, look here:

https://www.amazon.com/Vention-Adapter-Support-Compatible-Pixelbook/dp/B07C4TP4BJ
 