24" Widescreen CRT (FW900) from eBay Arrived, Comments

Thanks, but damn, it's getting very confusing. Now the guys from Plugable replied that they don't have a 90 Hz compatible adapter :s Maybe they decided to discontinue it due to lack of demand...

Maybe getting the Sunix only then, thinking about it...

View attachment 144236
They probably mean 90 Hz is not supported by them, not that the hardware can't do it.
 
Thanks, but damn, it's getting very confusing. Now the guys from Plugable replied that they don't have a 90 Hz compatible adapter :s Maybe they decided to discontinue it due to lack of demand...

Like aeliusg said, officially all these adapters support something lower than what users in this thread have tried. So while officially it only supports up to 60 Hz, someone here has tested the adapter to have a higher max pixel clock and got high refresh rates working fine.
 
Thanks, but damn, it's getting very confusing. Now the guys from Plugable replied that they don't have a 90 Hz compatible adapter :s Maybe they decided to discontinue it due to lack of demand...

Maybe getting the Sunix only then, thinking about it...

View attachment 144236

Don't worry, most of the adapters are officially 1920x1080 60 Hz; that's why we search for chipset specs and test them to see how much they can really do.
The most important thing is the digital input bandwidth; the DAC can do much more than the official specs.
Sunix DPU3000: max input 720 MHz 24 bit (tested up to 500 MHz)
Plugable USBC-VGA: max input 360 MHz 24 bit (tested up to 330-335 MHz)
Delock 62967 with ANX9847 chipset: max input 360 MHz 24 bit (tested worst sample 340 MHz, best sample 355 MHz)
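Since all these limits are quoted as pixel clocks, you can sanity-check any mode against them. Here is a rough sketch, assuming typical CRT-style blanking overheads (~30% horizontal, ~5% vertical); the exact numbers from a GTF/CVT calculator or CRU will differ slightly:

```python
# Rough pixel clock estimate for a CRT video mode. The overhead factors
# approximate typical GTF-style blanking; real timings vary per mode.
def pixel_clock_mhz(width, height, refresh_hz,
                    h_overhead=1.30, v_overhead=1.05):
    h_total = width * h_overhead    # active pixels + horizontal blanking
    v_total = height * v_overhead   # active lines + vertical blanking
    return h_total * v_total * refresh_hz / 1e6

# Modes discussed in this thread:
for w, h, hz in [(1920, 1200, 90), (2048, 1536, 60)]:
    print(f"{w}x{h}@{hz}Hz ~ {pixel_clock_mhz(w, h, hz):.0f} MHz")
```

Both example modes come out under 300 MHz, comfortably inside even the Plugable's ~330 MHz tested limit; the Sunix's extra headroom only matters for more extreme modes.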
 
jka, are you still having those issues you reported with the Sunix some time ago? I was reading some of your past posts regarding the Sunix; for example, you reported that using it at 120 Hz gave you issues.

My goal would be to use some custom resolutions and refresh rates depending on the game, from 2048x1536 @ 55 Hz and 60 Hz up to 1920x1200 @ 90 Hz. Can I ask how your Sunix behaves at those resolutions with your 1080 Ti?
 
jka, are you still having those issues you reported with the Sunix some time ago? I was reading some of your past posts regarding the Sunix; for example, you reported that using it at 120 Hz gave you issues.

Yeah, about that... I thought I mentioned it in later posts, but maybe I forgot. IIRC it was one of these issues:

1. "Scanline sync" from the RTSS program not working properly in Unreal Tournament 1999 (under both XP and later Windows); it seems that game works best with the "framerate limit" setting, which is very rare in my experience.

Also, the DOOM 3 engine runs internally at 62.5 FPS (= 1000/16 ms ticks), resulting in microstuttering unless you manually change it, which you can only tweak very slightly before the audio gets out of sync. So being able to set the CRT to 63 Hz is very handy here; even 60 Hz would have the audio out of sync in longer scenes. I did not bother trying exactly 62.5 Hz since 63 Hz was okay on the first try. For posterity and googlers:

Create an "AutoExec.cfg" file in the /base/ directory. For the RoE expansion, do it in the /d3xp/ directory.

Add the following commands:

// allows console and enables the extra tweaks below
com_allowConsole "1"

// disables the in-game 60 FPS limit (careful: can make the game run too fast)
com_preciseTic 0
com_fixedTic 1

// -1 activates custom resolution mode
r_mode -1

r_customHeight 1200
r_customWidth 1600

// 0 = 4:3, 1 = 16:9, 2 = 16:10
r_aspectRatio 0

// 0 = vsync OFF, 1 = vsync ON
r_swapInterval 0

// must force the 63 Hz refresh rate this way too
r_displayRefresh 63
r_fullscreen 1

2. The Logitech mouse software was not installed under Windows XP, so the mouse used some default settings with only 2 DPI modes (one of them probably being somewhat "native"), which felt smoother than using it on later Windows with some shitty intermediary Logitech software that was also probably misconfigured, with polling rates lower than maximum. In both cases the mouse was used over a USB cable. Could be placebo since I have no means to measure it, but still. My next mouse will be one where I can adjust all settings directly on the mouse itself.

-----------------

All in all, I had gotten the wrong impression and didn't find anything wrong with the Sunix adapter in terms of lag (yet).

My goal would be to use some custom resolutions and refresh rates depending on the game, from 2048x1536 @ 55 Hz and 60 Hz up to 1920x1200 @ 90 Hz. Can I ask how your Sunix behaves at those resolutions with your 1080 Ti?

55 Hz would make your eyes bleed; are you sure it's not a typo? But yes, both resolutions (even at 55 Hz) should work.

The Sunix handled like 99.9% of the resolutions I threw at it (up to a 500 MHz pixel clock). If some resolution is problematic, you can just adjust the problematic value a notch higher or lower and it works again.

I use the CRU program to manage resolutions on later Windows; you can save each into a profile and just load the desired profile before you launch the game. I need this profiling because of DSR (which works automatically only with the highest resolution in the profile, not any other). If you don't need DSR, you can easily specify 20+ resolutions in one profile.

-----------------

By the way, since it hasn't been mentioned yet I think, the perfect compromise between high refresh rate and high resolution on the FW900 is 98 Hz; here is a rough table for illustration purposes: https://imgur.com/a/J0meakU
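That compromise can also be ballparked from the tube's horizontal scan limit. A small sketch, assuming the FW900's roughly 121 kHz maximum horizontal frequency and ~5% vertical blanking (both approximations, so expect real-world figures to differ by a few Hz):

```python
# Max vertical refresh for a given line count, limited by horizontal scan rate.
H_SCAN_MAX_KHZ = 121.0  # approximate FW900 horizontal scan limit

def max_refresh_hz(active_lines, v_overhead=1.05):
    total_lines = active_lines * v_overhead  # active + vertical blanking
    return H_SCAN_MAX_KHZ * 1000 / total_lines

for lines in (1080, 1200, 1440):
    print(f"{lines} lines: ~{max_refresh_hz(lines):.0f} Hz max")
```

At 1200 lines this lands around 96 Hz, the same ballpark as the 98 Hz figure above.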
 
Thanks. I use 55 Hz for some emulated games, such as the classic Mortal Kombats that run at 55 fps, since I like the really-close-to-arcade feeling of playing those with no stuttering, no blur, good MAME filters, and vsync off with scanline sync for good latency; it really feels like the real thing. No eye bleeding while playing those.

I am about to order the Sunix, along with the following cables, which from what I understand are not bundled.

MiniDP to DP cable:

https://www.amazon.com/AmazonBasics...7Ta7QOHfkKKS0Brik-2hvPPz_p1fFrJjcDG56o1lFFKcY



and USB 2.0 A-Male to Micro-B for additional power:

https://www.amazon.com/AmazonBasics...BtEvpfRrbNioqs8Yzk50qKc89Ewq3vR5XvLiJRRxzbI6M

Hope everything works.
 
All cables were included in my box, purchased from the link I gave you somewhere above: MiniDP to DP, MiniDP to MiniDP, and Micro-B USB to A USB. So unless you would be mad if you by any chance had to order those separately later, I would say don't order them now :)
 
The idea is to order from your link, but according to the Sunix page they don't include the power cable, and the DPU3000-D2 only includes MiniDP to MiniDP, not MiniDP to DP. Also, someone from the US will soon bring those to Colombia, where I am, and I don't have much time to wait and see whether the cables come, then order afterward if they don't. I may be able to get those cables here, but here those types of cables tend to be generic garbage or very overpriced, so better not to take the risk since they are cheap on Amazon.
 
Hm, could you expand on that, please? I wasn't able to go over 500 MHz (1080 Ti here).

That is the max possible digital input bandwidth, because the DisplayPort receiver is a four-lane HBR2, so 720 MHz at 24 bit or 576 MHz at 30 bit.
The max analog output is another story; I said "tested up to 500 MHz" thanks to your report. If I remember correctly, someone else in the past has tested resolutions over 500 MHz, but I don't know if it was stable.
The reasons you can't go over 500 MHz could be:
- the DisplayPort receiver is not good enough (possible)
- the DisplayPort cable can't handle it (I doubt it, because at 500 MHz you are in HBR2 mode, and with at least three lanes it should reach 540 MHz)
- the internal DAC can't handle such a frequency (most probable)
- a firmware or PCB design problem (possible)

The input lag of the Sunix DPU3000 has been tested by Strat_84 with SMTT 2.0, and it's null.
Anyway, I think that none of these all-in-one chipsets have any input lag.
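For anyone wondering where the 720 MHz and 576 MHz figures come from, it's straight DisplayPort link arithmetic: HBR2 is 5.4 Gbit/s per lane, and 8b/10b encoding leaves 80% of that for pixel data. A quick check:

```python
# DisplayPort input bandwidth -> max pixel clock at a given bit depth.
def max_pixel_clock_mhz(lanes=4, gbit_per_lane=5.4, bpp=24):
    payload_gbit = lanes * gbit_per_lane * 0.8  # 8b/10b encoding overhead
    return payload_gbit * 1000 / bpp            # Gbit/s -> Mpixel/s

print(round(max_pixel_clock_mhz(bpp=24), 1))  # 720.0 MHz at 24-bit color
print(round(max_pixel_clock_mhz(bpp=30), 1))  # 576.0 MHz at 30-bit color
```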
 
Let us know what cables you ended up receiving in your box later, just out of curiosity.
 
That is the max possible digital input bandwidth, because the DisplayPort receiver is a four-lane HBR2, so 720 MHz at 24 bit or 576 MHz at 30 bit.
The max analog output is another story; I said "tested up to 500 MHz" thanks to your report. If I remember correctly, someone else in the past has tested resolutions over 500 MHz, but I don't know if it was stable.
The reasons you can't go over 500 MHz could be:
- the DisplayPort receiver is not good enough (possible)
- the DisplayPort cable can't handle it (I doubt it, because at 500 MHz you are in HBR2 mode, and with at least three lanes it should reach 540 MHz)
- the internal DAC can't handle such a frequency (most probable)
- a firmware or PCB design problem (possible)

The input lag of the Sunix DPU3000 has been tested by Strat_84 with SMTT 2.0, and it's null.
Anyway, I think that none of these all-in-one chipsets have any input lag.

Thanks for the details, man! 500 is possibly some arbitrary limit, hard-coded somewhere, because >= 501 just doesn't work for me at all.

Edit: possibly even 500.01 doesn't work, but I don't remember that for sure off the top of my head.
 
The idea is to order from your link, but according to the Sunix page they don't include the power cable, and the DPU3000-D2 only includes MiniDP to MiniDP, not MiniDP to DP. Also, someone from the US will soon bring those to Colombia, where I am, and I don't have much time to wait and see whether the cables come, then order afterward if they don't. I may be able to get those cables here, but here those types of cables tend to be generic garbage or very overpriced, so better not to take the risk since they are cheap on Amazon.

I don't want to break balls, but are you sure you want to buy the DPU3000?
Because if you use resolutions like 1920x1200 @ 90 Hz or 2048x1536 @ 55 Hz, there are much more economical solutions.
Anyway, if the card you will buy (already bought?) doesn't have a USB-C output, or you want to experiment with very high resolutions, then OK, the DPU3000 is the best for performance.
 
I have a Sunix, and there are a couple of limits to consider:

1) Windows will read the hardware EDID from the monitor, ignoring software EDID overrides like CRU, and set an upper limit on custom resolutions. I imagine you could get around this by creating a custom EDID and writing it to something like a Dr. HDMI, then feeding it to the ID pins on the VGA cable.

2) The Synaptics chip seems to stop working around 550 MHz. I've created resolutions a little higher than that and got a blank signal: the monitor didn't go to sleep, but it didn't display anything either. This needs more testing though, there may be other variables.
 
I have not ordered anything yet, Derupter. It worries me that a cheaper adapter running near its pixel clock limit at 1920x1200 @ 90 Hz would not last long, since it would be working near its limits for long stretches and would surely get too hot = reduced life span for sure. Also, I would prefer more widely available ports such as DisplayPort, and the Sunix seems to be more popular and more tested by users here.

At this moment I am not completely sure if I will get an RTX card; a 1000-series instead is possible, since RTX cards are currently very expensive, even the midrange ones. (Not interested in AMD cards, but AMD needs to do something competitive to stop Nvidia's overpriced monopoly!!)

So a more expensive and somewhat overkill adapter such as the Sunix, still capable of what I want, could have a better life span since I would not push it to its limits... yes, more expensive, but potentially more reliable. Also considering that digital-to-analog adapters in my country are complete garbage, even for 1920x1080 @ 60 Hz... (When I ask in stores here about adapters handling better than 60 Hz, they just don't know what I am talking about, and here you don't have the chance, as in the US, to return a product if you don't like it or it does not work as you expect :(.)

Enhanced Interrogator, thanks for your info. As long as it reads custom resolutions created in the Nvidia control panel, it would be useful. And well, jka claimed to be able to create custom resolutions that worked on his Nvidia GTX 1080 Ti, which seems positive.
 
I have the Sunix DPU3000-D3 and have no use for it. I live in Romania and am willing to sell it. I'm thinking 80 euros, but it's not set in stone.
 
To anyone who cares. I ended up selling my Samsung. Ultimately, the inability to adjust brightness while in ULMB mode was the deal-breaker for me. So I'm back to using my Artisan. Not as sharp as my old F520 but whatever. It's still more than adequate for the task at hand.
 
To anyone who cares. I ended up selling my Samsung. Ultimately, the inability to adjust brightness while in ULMB mode was the deal-breaker for me. So I'm back to using my Artisan. Not as sharp as my old F520 but whatever. It's still more than adequate for the task at hand.

Welcome back! :hugs:

I would have thought the brightness would be lower on a CRT anyway? Or that wasn't the issue for you?
 
Welcome back! :hugs:

I would have thought the brightness would be lower on a CRT anyway? Or that wasn't the issue for you?

The issue was that the monitor's brightness was way too high in ULMB mode. For whatever reason, Samsung doesn't allow you to adjust the backlight level with it enabled. So to get the full contrast the monitor was capable of with ULMB enabled, the white point was something stupid like 230 cd/m2! Way too bright for my environment, which has already been set up for a CRT's low brightness levels. If I disabled ULMB mode, then I could take it down as low as the CRT, but then I threw away the main reason I got the stinking thing. :)

The only way I could get ULMB mode to work and not have it be too bright was to lower the contrast level. But, as you would expect, that threw away the contrast advantage a VA monitor would have given. So it was very much an "either-or" kind of deal, and in the end it was a deal-breaker. With a CRT, yes, it's not as bright, but I don't have to compromise on contrast. There's another monitor out there made by AOC which is like the Samsung but apparently allows you to adjust the backlight while using ULMB mode, letting you keep your ~3800:1 contrast while not being eye-bleedingly bright. I may look at that one next.
 
To anyone who cares. I ended up selling my Samsung. Ultimately, the inability to adjust brightness while in ULMB mode was the deal-breaker for me. So I'm back to using my Artisan. Not as sharp as my old F520 but whatever. It's still more than adequate for the task at hand.

Much appreciated, thanks for that info. After all, we still (well, me at least) keep using CRT monitors, since there still doesn't seem to be a really good substitute... yet... in 2019!! :cry: Just imitations with drawbacks like the one you mentioned.

I have personally witnessed some monitors losing so much brightness (even below CRT levels) or creating a double-image effect that makes ULMB mode pointless, but I wasn't able to test latency. I have read that ULMB also introduces input lag; have you experienced that with that Samsung?
 
Much appreciated, thanks for that info. After all, we still (well, me at least) keep using CRT monitors, since there still doesn't seem to be a really good substitute... yet... in 2019!! :cry: Just imitations with drawbacks like the one you mentioned.

Talking about ULMB mode, I have read that it also introduces input lag; have you experienced that with that Samsung?

Yeah, sucks that we're still looking for that alternative, doesn't it? :( As far as I could tell, I didn't perceive any input lag difference with ULMB mode. Honestly, if Samsung had just let me lower the backlight on the damn thing, I would have kept it. Why they didn't...
 
I have personally witnessed some monitors losing so much brightness (even below CRT levels) or creating a double-image effect that makes ULMB mode pointless, but I wasn't able to test latency. I have read that ULMB also introduces input lag; have you experienced that with that Samsung?
The double-image effect is present on all strobed displays when the framerate is half the refresh rate. On CRTs too.
 
Right, I know that. However, from my own experience: some time ago I saw a modern monitor with a blur reduction option in the OSD. When I activated it, I first noticed a big reduction in brightness, and when I ran the TestUFO street map motion test I noticed the text was clearly readable but doubled. I don't remember exactly what the framerate was, but I do remember that the refresh rate and the framerate reported at the bottom of the page were the same.

Also, from what I have read, on most modern monitors the backlight strobing meant to mimic CRT motion clarity has more noticeable flicker than a CRT, so you need higher refresh rates on those than on a CRT to achieve the same motion clarity without badly noticeable flicker. You would need something like 90 Hz+ on most modern monitors to achieve the clarity and flicker perception a CRT manages at 60 Hz. So the backlight strobes at twice the refresh rate when using refreshes below around 90 Hz to reduce the bad flicker, with the drawback of creating double images (strobe rate = twice the refresh rate), even when the framerate matches the refresh rate. So one is stuck at higher refresh rates on modern strobed monitors to achieve the smooth-flicker, non-double-image, non-blurry motion that is achievable at lower refreshes on CRTs. (I even play some games at 55 Hz on my FW900 and personally don't notice any annoying flicker.)

Quote from a moderator on the Blur Busters forum: "60Hz strobing on LCD looks like 30Hz CRT or something. It's really bad."

Also, it seems there is only one modern monitor family able to single-strobe at lower refresh rates, with not-so-bad flicker and no double images at refreshes such as 60 Hz, with the drawback of being a TN monitor with crap colors, black levels, etc...
 
Right, I know that. However, from my own experience: some time ago I saw a modern monitor with a blur reduction option in the OSD. When I activated it, I first noticed a big reduction in brightness, and when I ran the TestUFO street map motion test I noticed the text was clearly readable but doubled. I don't remember exactly what the framerate was, but I do remember that the refresh rate and the framerate reported at the bottom of the page were the same.

Also, from what I have read, on most modern monitors the backlight strobing meant to mimic CRT motion clarity has more noticeable flicker than a CRT, so you need higher refresh rates on those than on a CRT to achieve the same motion clarity without badly noticeable flicker. You would need something like 90 Hz+ on most modern monitors to achieve the clarity and flicker perception a CRT manages at 60 Hz. So the backlight strobes at twice the refresh rate when using refreshes below around 90 Hz to reduce the bad flicker, with the drawback of creating double images (strobe rate = twice the refresh rate), even when the framerate matches the refresh rate. So one is stuck at higher refresh rates on modern strobed monitors to achieve the smooth-flicker, non-double-image, non-blurry motion that is achievable at lower refreshes on CRTs. (I even play some games at 55 Hz on my FW900 and personally don't notice any annoying flicker.)

Quote from a moderator on the Blur Busters forum: "60Hz strobing on LCD looks like 30Hz CRT or something. It's really bad."

Also, it seems there is only one modern monitor family able to single-strobe at lower refresh rates, with not-so-bad flicker and no double images at refreshes such as 60 Hz, with the drawback of being a TN monitor with crap colors, black levels, etc...
I'm pretty sure ULMB is single strobe only. Some others, like BenQ blur reduction or Lightboost, have double strobing at 60 Hz, I think, but it can be turned off in the service menu. Only the Eizo FG2421 was double strobe at all refresh rates. The problem with strobing is that if the backlight is not bright enough, the image will be very dim at lower duty cycles (lower persistence and better motion clarity). Nowadays LED backlights have no problem being overbright, so even a 2 ms strobe will still be more than bright enough for 120 nits or more. The FG2421 also had a slow VA panel, so the double strobing was needed to smooth out the artifacts.
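The brightness/persistence trade-off here is simple duty-cycle math: average brightness scales with the fraction of each refresh the backlight is actually on. A sketch with illustrative numbers (not from any specific monitor):

```python
# Average brightness of a strobed backlight = steady brightness x duty cycle.
def strobed_brightness_nits(steady_nits, refresh_hz, pulse_ms):
    period_ms = 1000.0 / refresh_hz   # length of one refresh period
    duty = pulse_ms / period_ms       # fraction of time the backlight is on
    return steady_nits * duty

# A 500-nit backlight with a 2 ms strobe at 120 Hz (24% duty cycle):
print(round(strobed_brightness_nits(500, 120, 2.0)))  # 120
```

This is why a shorter pulse (lower persistence, better motion clarity) needs a much brighter backlight to keep the same average output.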

Most people perceive no flickering at 85 Hz and above, so that's what Nvidia went with for the lower limit, I believe. I can see a perceptible difference between CRT flickering and LCD strobing, mediated by the scanning of a CRT versus the whole-screen flashing of an LCD.
 
Yes, I agree that ULMB is single strobe. I don't think it's just Nvidia; monitor manufacturers decide to make it strobe twice below something like 85-90 Hz on many modern monitors, if not most of them, because of how badly it flickers (even worse than a CRT), so they deem it unusable and compensate with the drawback of double-strobe double images, even when matching framerate to refresh rate, an issue nonexistent on CRTs regardless of their scanning nature. My point is that, based on jbltecnicspro's experience with his ex-Samsung monitor, which sadly seems like another failed modern attempt to match CRT motion clarity without such drawbacks (in this case, being too bright with no option to reduce it), there is something I don't really understand: since the LED backlight is overbright anyway, I don't see the point of not letting it be adjustable. So many struggles still to consider in 2019.
 
Interesting; however, as usual and expected, strobing only officially works on that monitor at 100 Hz or higher (n)

"(Motion Blur Reduction) setting causes the backlight to flicker at a frequency matching the refresh rate – with 100Hz, 120Hz and 144Hz selectable"

And surely, as with other modern monitors, it may be enabled below 100 Hz with CRU or something like that, but with double strobing or other drawbacks.

Honestly, when I had the Samsung this wasn't that big of a deal. You get used to 120 Hz really quickly. :D
 
Well, that's a matter of taste. For me, the problem is the massive GPU power needed: with that monitor you would need a GPU that constantly renders at 100 fps or faster to enable strobing and get clarity similar to what a CRT achieves at a much lower refresh rate. And that's without counting games that run at 60 fps; there seems to be no way to play those with the CRT's smoother flicker and clarity.
Thanks for sharing the monitor info anyway (y)
 
And that's without counting games that run at 60 fps

Yeah, this is one of the most important points. Lots of games are hard-locked to 60, like Street Fighter 5, Mega Man 11, and Rayman Legends, not to mention basically every game you run on an emulator. They all look way clearer on a CRT.
 
2005? Are you sure you didn't write that wrong? That review says 2015; it's a monitor from 2015.

About the Lightboost hack, that's correct; I've been following the progress of CRT-like clarity on modern monitors since that Lightboost hack discovery back around 2014.
2015 is correct, it was a typo =P

Some Lightboost 2 monitors had pretty bad calibration and issues like a pink tint.
Also, the TN panels used in these monitors were pretty bad, and the W-LED backlights and color filters were absolute garbage, with colors nothing like the sRGB gamut standard specifies. I would not get such a monitor because of its atrocious colors, even without counting Lightboost color degradation.

LCD backlight strobing has a few drawbacks inherent to it:
- a slight increase in input lag, because the LCD pixel color shift happens during the screen draw time, after which the backlight turns on to present the image; though of course, input lag being dependent on refresh rate, a high refresh rate will offset it compared to a CRT at higher resolutions (some people are so often claiming to use...)
- some pixels (at the bottom of the screen) do not have enough time to switch color; this can be improved with custom display modes using a faster pixel clock and more blanking; basically, using CRT timings instead of reduced LCD timings gives better results
- LCD flicker is more visible at a given refresh rate, though I would not go as far as comparing a 60 Hz CRT to a 100 Hz LCD, let alone 120 Hz

Overall, from what I have seen, the Lightboost 2 monitor was pretty good in motion, with CRT-like clarity, and the input lag was not noticeable.
Imperfect pixel transition times can be dismissed when comparing to CRTs, because a CRT is also not perfect in motion clarity; one could argue that while it has better text readability when moving windows around, it has more motion artifacts, like long bright smudges. The motion clarity of strobed LCDs is simply "good enough" to the point that it becomes a non-issue.

For 60 fps content, a CRT running at 60 Hz will still be far superior to competing display technologies. For console play, the FW900 with an HDMI-VGA adapter is the best possible display; nothing can beat it, PERIOD.
With console 30 fps games on a CRT, image doubling is unfortunately pretty excessive... though still good enough, and certainly having the lowest possible input lag is great for competitive gameplay. For PS3/X360/WiiU games, a CRT is a pretty awesome choice due to native 720p support.
 
Yeah, I personally saw no flicker with my monitor at 100 Hz, and definitely not at 120 Hz or 144 Hz. The VA panel's slow transition times made 144 Hz a waste, though. Honestly, for high-res gaming, if I could give CRT a 10/10, then the Samsung was like a 7.5/10 without ULMB enabled and an 8.5-9.0 with it enabled. If the backlight were adjustable so that I could get some good contrast, bump it up to 9.5. It was pretty darn close to CRT.
 
Could somebody point me to a good, working copy of WinDAS? I had it working before, but I don't remember where I found it or how I set it up.

Thanks.
 