24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Discussion in 'Displays' started by mathesar, Sep 13, 2005.

  1. aeliusg

    aeliusg Gawd

    Messages:
    726
    Joined:
    Jan 14, 2016
    The 5V line was meant to power the TTL side; since your adapter chip should already get power over USB, you usually don't have to plug it in.
     
  2. crt_modern_user

    crt_modern_user n00b

    Messages:
    8
    Joined:
    Apr 18, 2019
    So maybe that's the reason my adapter doesn't work? I will try it out.

    Also, another question: when I plug the +5V cable into the monitor, it seems to give the monitor some power, because the standby light turns green. What's that about?

    EDIT: nope, that +5V cable wasn't the problem, so that adapter definitely won't work anyway. What about this other adapter I found on Amazon (https://www.amazon.com/MonkeyJack-MAX3232-Serial-Converter-Connector/dp/B06Y2CXMQT)? One review says it worked with WinDAS...
     
    Last edited: Apr 28, 2019
  3. aeliusg

    aeliusg Gawd

    Messages:
    726
    Joined:
    Jan 14, 2016
    I've never used RS232 with WinDAS, but the product looks well enough put together that it should meet the bare minimum of functionality... Most of what I know came from here: http://www.massmind.org/images/com/geocities/www/gregua/windas/cable.htm and from whatever else was in the guides put together by users here.
     
  4. crt_modern_user

    crt_modern_user n00b

    Messages:
    8
    Joined:
    Apr 18, 2019
    So the next adapter I buy will be a USB-based one. Would it also be a good idea to buy it from the original seller? He told me to remind him about resending one.
     
  5. ellover009

    ellover009 [H]ard|Gawd

    Messages:
    1,844
    Joined:
    Jul 17, 2005
    I sorta did. You won't quite find anything like it, 100%. OLEDs are still a bit expensive, and they still aren't burn-proof.
    I tried a few things and found something I like. The first replacement was one of those Korean overclock monitors; mine would max out at 95Hz. I ran mine at 80-85ish and had to run a program to compensate for the color shift/dimming from running overclocked. It was fine until I saw tearing. Drove me nuts at first, then I got used to it. It was around 24-25 inches. It lasted around 2 years, and by then I was ready to try something more name-brand. Got a ViewSonic XG2703-GS with G-Sync. No more tearing with G-Sync, no more blanking out when you switch refresh rates, it felt more like my old CRT, and this thing will hit 165Hz.
    My observations so far are that the FW900's days are limited due to the aging of the components and the weight, cost, and complexity of keeping one running in modern times. As much as I loved mine, the benefits of an LCD are that it occupies less space behind the screen, I can go bigger and more of it is actually screen, there are fewer uniformity issues ("a circle will be a perfect circle"), less power consumption and radiation output, I can flip it vertically if I want, and it's no longer crushing my desk with its weight.
     
  6. cdoublejj

    cdoublejj Limp Gawd

    Messages:
    229
    Joined:
    Apr 12, 2013
    CRTs aren't burn-proof either. RTINGS did a burn-in test with a ton of OLEDs and one IPS panel, and the IPS burned in just as bad as the OLEDs :ROFLMAO: I just bought a 55" LG B8 and it was a hair over 1K (USD) shipped. Several OLED factories are being spun up right now using the new ink-jet printing manufacturing process, so hopefully prices will be coming down. BTW, funny thing: with power saver off I can see the power drop on my watt meter during darker scenes. :LOL:

    I've noticed CRT collecting on Reddit is slowly rising. Seems to be in line with retro gaming. There are a few people on YouTube restoring old, old, old CRTs; I imagine the true hardcore will have the expertise to recap boards or, in the worst case, refurbish the tubes like Hawk-Eye Picture Tube MFG did in recent years.
     
  7. Strat_84

    Strat_84 Limp Gawd

    Messages:
    255
    Joined:
    Jul 16, 2016
    Indeed, CRTs aren't burn-proof. I retrieved an IBM P275 some time ago with a burnt tube; I couldn't believe my eyes. The screen didn't have a very high hour count, but I would bet it came from an office and the idiots made it display some sort of DOS program for its entire life, with a fixed window and full contrast. Now the program's interface is printed into the phosphors. It's a pity, but nothing could be done except saving the boards for spare parts. :(
     
    cdoublejj and DooLocsta like this.
  8. XoR_

    XoR_ Gawd

    Messages:
    678
    Joined:
    Jan 18, 2016
    IPS panels do have image retention issues. Some less, some more, but all of them seem to have it. It's hardly an issue, though, because it seems to always go away after displaying content that forces the pixels to change a lot, like movies. The crystals need to be shaken a little to realign themselves properly - LCD panels actually have moving parts and thus aren't purely solid state like, for example, a CRT... I always find this absolutely crazy XD

    With OLED, the worry is permanent image burn-in.
    Plasma panels had both non-permanent image retention and permanent phosphor burn-in to deal with.
    I'm not sure whether OLED has only the permanent burn-in or a temporary form as well?
     
  9. cdoublejj

    cdoublejj Limp Gawd

    Messages:
    229
    Joined:
    Apr 12, 2013
    I think OLED has both, because my TV has a routine it can run when you turn it off to help mitigate burn-in. I've also heard there are some stuck-pixel-type videos you can run that sometimes help reduce retention. I remember I had an iMac freeze overnight once and didn't catch it till the next day; it took a looooong time of "stuck pixel" videos on YouTube before it got better. (IPS)
     
  10. cdoublejj

    cdoublejj Limp Gawd

    Messages:
    229
    Joined:
    Apr 12, 2013
    Is it normal to get left/right jitter @ 2304x1440?

    EDIT: it seems that when I force it to 70, 75, or 80Hz the picture jiggles and shakes by a few "pixels"; it's constant and noticeable.
     
    Last edited: May 5, 2019
  11. 3dfan

    3dfan [H]Lite

    Messages:
    82
    Joined:
    Jun 2, 2016
    It seems to be "normal" to get some type of issue with the Sunix and the FW900. It looks like every Sunix + FW900 user here has had some type of issue, unfortunately.
    Since the Sunix doesn't seem to have been made with high-refresh, high-resolution analog monitors in mind, it's unknown whether those issues will ever be fixed, sadly.
    From my limited testing with the Sunix over a week, 2560x1600@72Hz was the only resolution/refresh combination that never developed any sign of issues, but that was only a 2-hour test, not long enough for a true verdict.
     
    Last edited: May 7, 2019
  12. Enhanced Interrogator

    Enhanced Interrogator Gawd

    Messages:
    858
    Joined:
    Mar 23, 2013
    1920x1440@75Hz, the resolution I play Apex Legends at, will get the "jitter" sometimes. But I just alt+tab, switch resolution real quick (I have a program that does this from the taskbar), and then switch back. Takes about 5 seconds. Sometimes the jitter comes back in 10 minutes, sometimes it doesn't come back at all. It does seem to happen eventually if I play for long enough, though, like more than a couple of hours.

    At other resolutions it never happens. I play Battlefield 5 at 1792x1344@80Hz and I've never seen it there. Likewise for the 2880x1800@60Hz I use for Mega Man 11.
     
  13. SwedishMeatball

    SwedishMeatball n00b

    Messages:
    3
    Joined:
    Mar 18, 2019
    So I managed to get my hands on an FW900 last month. It suffered from the brightness problem, so I did the "color restoration," which made it much better. However, it still takes the monitor about half an hour to settle in terms of brightness/contrast. So when I power it on I have to lower brightness to 0 for black levels to be where they should be, otherwise the picture is washed out. Once it settles I crank brightness up to around 24 (or I do it gradually over time). I have gotten the necessary equipment for a full calibration according to the WinDAS guide here on the forum, but I'm questioning whether changing the G2 value will help with this? I read a few pages back and this seems to be a different issue? I'm thinking that fixing G2 will just move the sliders up, so that I'll start with a value of 10-15 and end up with something in the 30-40 range?

    What's your advice?

    I'm thinking of doing the WinDAS adjustment anyway, as I have problems with too much red in the near-black IRE range. Also, when the screen starts from cold, the elevated blacks are reddish, which I believe confirms this.

    Is it safe to run WinDAS on Windows 10? Should I run it in compatibility mode as Win 7?
     
  14. spacediver

    spacediver 2[H]4U

    Messages:
    2,468
    Joined:
    Mar 14, 2013
    the 30 min warmup is normal. My best guess is that it is the tube's attempt to compensate for the time it takes for the cathode to heat up.
     
  15. cdoublejj

    cdoublejj Limp Gawd

    Messages:
    229
    Joined:
    Apr 12, 2013
    I'm straight into my laptop, actually. The cord my dw came with seems thinner than most VGA cables, so that made me wonder if it could have something to do with it; maybe it's not shielded or something.
     
  16. Ashun

    Ashun n00b

    Messages:
    38
    Joined:
    Mar 4, 2016
    SwedishMeatball WinDAS works fine on Win 10. Just make sure to register msflxgrd.ocx. It's been a while, but I think I had to use the SysWOW64 directory instead of System32.
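
    If it helps, the registration step in script form (a sketch, untested on my end; run it elevated and adjust the path to wherever you copied the OCX):

        # Register msflxgrd.ocx for WinDAS on 64-bit Windows 10.
        # The quirk: the 32-bit regsvr32 lives in SysWOW64, and that's
        # the one that has to register the 32-bit OCX.
        import subprocess

        subprocess.run(
            [r"C:\Windows\SysWOW64\regsvr32.exe",
             r"C:\Windows\SysWOW64\msflxgrd.ocx"],
            check=True,  # raises CalledProcessError if registration fails
        )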
     
    spacediver likes this.
  17. Flybye

    Flybye Limp Gawd

    Messages:
    188
    Joined:
    Jun 29, 2006
    It is not just the parts/repair cost of the FW900 or the physical advantages of a new panel. Some games are recommending 8+ GB of VRAM. The 980 Ti only has 6 GB, and the only other option is a Titan X with 12 GB. The point will come when games recommend over 12 GB of VRAM, and then what? I keep reading that OLEDs will make us happy in the future, though.
     
  18. Enhanced Interrogator

    Enhanced Interrogator Gawd

    Messages:
    858
    Joined:
    Mar 23, 2013
    My GTX 1080 + Sunix DPU3000 combo is working out great. Approximately 550 MHz pixel clock, higher than Nvidia will let you go on the analog output of a 980 Ti.
     
  19. Strat_84

    Strat_84 Limp Gawd

    Messages:
    255
    Joined:
    Jul 16, 2016
    Yes, graphics card speed limitation isn't a relevant point, since there are a couple of quality adapters that let you keep using the FW900 (or any VGA screen) with digital-only outputs.
     
  20. spacediver

    spacediver 2[H]4U

    Messages:
    2,468
    Joined:
    Mar 14, 2013
    Has anyone tested the ADV7123 chipset? (10 bit, and it looks like it does at least 330 MHz)

    Or know of any 10+ bit adaptors capable of ~330 MHz?
     
  21. Flybye

    Flybye Limp Gawd

    Messages:
    188
    Joined:
    Jun 29, 2006
    But I still have not discovered one that outputs at 80Hz. Do you know of one? The Sunix DPU3000 spec only shows 60Hz, and I would like to be able to do 2304x1440 @ 80Hz.
     
  22. 3dfan

    3dfan [H]Lite

    Messages:
    82
    Joined:
    Jun 2, 2016
    Flybye, that's very strange. I have a Sunix DPU3000 and can do 2304x1440 @ 80Hz; I can even get the maximum 160Hz and all the refresh rates the FW900 supports.
    You seem to be the first of all the Sunix DPU3000 users here reporting that 60Hz max limit, from what I remember... surely something is not correctly set up in your graphics card / Sunix / FW900 combo, or maybe your Sunix is defective? Or... maybe Sunix downgraded the chip frequency in your model? (hopefully not)

    I strongly suggest that you, and the other users recently asking about digital-to-analog video converters, search this thread for words like Sunix, Delock, DAC, Plugable, DisplayPort to VGA, HDMI to VGA... it's not a long 400+ page search, just look at the more recent results; the digital-to-analog discussion is relatively new, since the last VGA-compatible graphics card was produced only a few years ago.

    Also, as a quick help, did you try creating the custom resolution with software like CRU (Custom Resolution Utility) or via your graphics card control panel?
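
    For a ballpark of what those custom modes ask of the adapter, here's a rough pixel clock estimate I sketched in Python (standard VESA GTF blanking; CRU's "CRT standard" timings should land close to these numbers):

        # Rough GTF pixel-clock estimate (VESA GTF default parameters).
        # Ballpark numbers only, not a substitute for CRU's exact timings.

        def gtf_pixel_clock_mhz(h_active, v_active, refresh_hz):
            CELL = 8              # pixels of character-cell granularity
            MIN_PORCH = 1         # lines
            MIN_VSYNC_BP = 550.0  # us of vsync + back porch
            C_PRIME, M_PRIME = 30.0, 300.0  # GTF blanking constants

            h_active = round(h_active / CELL) * CELL
            # Estimate the horizontal period (us) from the vertical budget
            h_period = ((1.0 / refresh_hz) - MIN_VSYNC_BP / 1e6) \
                       / (v_active + MIN_PORCH) * 1e6
            v_total = v_active + round(MIN_VSYNC_BP / h_period) + MIN_PORCH
            # Refine so the field rate hits the requested refresh
            h_period *= (1e6 / (h_period * v_total)) / refresh_hz
            # Blanking duty cycle shrinks as scan lines get shorter
            duty = C_PRIME - (M_PRIME * h_period / 1000.0)
            h_blank = round(h_active * duty / (100.0 - duty) / (2 * CELL)) * 2 * CELL
            return (h_active + h_blank) / h_period  # Mpixels/us == MHz

        for mode in [(2304, 1440, 80), (2304, 1440, 70), (1920, 1200, 85)]:
            print(mode, "->", round(gtf_pixel_clock_mhz(*mode), 1), "MHz")
        # roughly 384, 332, and 283 MHz respectively

    So 2304x1440 @ 80Hz wants roughly a 384 MHz pixel clock with GTF blanking, well within the 500+ MHz people here have pushed through the DPU3000.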
     
    Last edited: May 13, 2019
    cdoublejj likes this.
  23. cdoublejj

    cdoublejj Limp Gawd

    Messages:
    229
    Joined:
    Apr 12, 2013
    What's weird is that I used to be able to do 2304x1440 @ 70Hz without all the wiggling, jiggling, shaking picture. I wonder if it's my monitor itself. It was supposed to be low-hours.

    EDIT: you know what, all the specs in CRU and the Nvidia utility might not be quite right, unless the horizontal and vertical frequencies don't matter that much?
     
    Last edited: May 13, 2019
  24. Derupter

    Derupter Limp Gawd

    Messages:
    153
    Joined:
    Jun 25, 2016
    That is only a DAC; it needs a digital receiver and everything else, and I don't know of custom adapters with that chip.

    Adapters with a 10 bit DAC (not 100% sure, but these seem to be the correct specs):
    HDFURY Nano GX: 10 bit DAC, but digital input is 8 bit
    HDFURY2: 11 bit DAC, but digital input is 8 bit
    HDFURY3: 11 bit DAC, digital input up to 10 bit
    HDFURY4: 11 bit DAC, digital input up to 12 bit

    Chipsets with a 10 bit DAC:
    ITE IT6562: digital input is 360 MHz at 24 bit (288 MHz at 30 bit), DAC is 10 bit, tested up to 200 MHz (by the manufacturer)
    Adapters with this chip are the Delock 63924 and 63925

    The new Lontium LT8612UX is out:
    input is HDMI 2.0 (600 MHz at 24 bit and 480 MHz at 30 bit)
    DAC seems to be 10 bit, but sadly it has only been tested up to 200 MHz
    The funny thing is that the chip it derives from (LT8612SX) has been tested up to 300 MHz
    No adapters with this chip for now, because it has just come out

    ITE has a new chip, the IT6564, with 720 MHz input; its DAC seems to be 8 bit, tested as always up to 200 MHz. This is a good competitor to the Synaptics chipset.

    Now the question is: with an adapter capable of 10 bit analog output, do you need to set the digital output to 10 bit in order to use your custom LUT?
     
    Last edited: May 17, 2019
    spacediver likes this.
  25. spacediver

    spacediver 2[H]4U

    Messages:
    2,468
    Joined:
    Mar 14, 2013
    Really appreciate your great answer here, Derupter.

    And before we get to this question, are you saying there are three independent bit depths going on here?

    1) bit depth of the DAC
    2) bit depth of the digital input
    3) bit depth of the digital output

    Either way, my hunch is that the digital output being set to 8 bits would still allow a 10 bit LUT. But this is based on my speculative understanding of how things work.

    If anyone has any adapters they want tested for LUT bit depth, feel free to PM me. Alternatively, if anyone has a colorimeter and a DAC they want to test for bit depth, I can provide instructions on how to do so.
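
    (For the curious, the gist of the test on Windows is to load video LUTs that differ by less than one 8-bit step and check with the colorimeter whether the output actually changes. A minimal sketch of the loading side, using the Win32 SetDeviceGammaRamp call; the measurement side is whatever your meter's software provides:)

        # Load a flat offset on top of an identity gamma ramp (Windows only).
        # If the pipeline keeps 10-bit LUT precision, an offset of 64/65536
        # (1/1024 of full scale) should measurably change the output;
        # an 8-bit pipeline only reacts to steps of 256/65536.
        import ctypes

        user32 = ctypes.windll.user32
        gdi32 = ctypes.windll.gdi32
        user32.GetDC.restype = ctypes.c_void_p
        gdi32.SetDeviceGammaRamp.argtypes = [ctypes.c_void_p, ctypes.c_void_p]
        hdc = user32.GetDC(None)

        def set_ramp(offset):
            # identity ramp (i * 257 maps 0..255 onto 0..65535) plus the offset
            ramp = (ctypes.c_ushort * (3 * 256))()
            for ch in range(3):
                for i in range(256):
                    ramp[ch * 256 + i] = min(i * 257 + offset, 65535)
            if not gdi32.SetDeviceGammaRamp(hdc, ctypes.byref(ramp)):
                raise OSError("SetDeviceGammaRamp failed")

        set_ramp(64)   # measure, then compare against set_ramp(0)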
     
  26. Derupter

    Derupter Limp Gawd

    Messages:
    153
    Joined:
    Jun 25, 2016
    By digital input I mean the input of the adapter, which is the digital output of the graphics card.
    By digital output I mean what exits the graphics card and what you set in the graphics options.
    The digital receivers in these adapters usually pass the signal through as it is; if the DAC is 8 bit, the receiver trims the lower 2 bits (in the case of 10 bit input).
    From what I understand, to preserve all the information in the custom LUT you need to set the digital output of the graphics card to 10 bit, but I don't know much about these things.
    The downside of this is that you lose bandwidth on the digital side; in this case the only chip with enough bandwidth for 10 bit is the Lontium LT8612UX, since the ITE IT6562 can do 288 MHz at most.
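
    (The 24 bit vs 30 bit numbers above are just the same link bandwidth divided differently; the maximum pixel clock scales by 24/30 when you go to 10 bits per channel. A quick check against the figures already quoted:)

        # Max pixel clock at 30 bpp = max at 24 bpp x (24/30)
        for name, mhz_at_24bit in [("LT8612UX", 600), ("IT6562", 360)]:
            print(name, "->", mhz_at_24bit * 24 / 30, "MHz at 10 bit")
        # LT8612UX -> 480.0 MHz, IT6562 -> 288.0 MHz, matching the specs above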
     
  27. spacediver

    spacediver 2[H]4U

    Messages:
    2,468
    Joined:
    Mar 14, 2013
    Thanks for clarification.

    Yea, it's a good question, I suppose we could do a test with an adaptor that has 10 bit DAC and see whether we get 10 bits of LUT precision even when working with 8 bit digital output.

    It would indeed be nice to preserve bandwidth by maintaining 8 bit digital output while still benefiting from 10 bit LUT precision.

    I do know that my Nvidia GTX 660 has 10 bits of LUT precision, yet I can run 2304x1440 @ 70 Hz (which requires > 330 MHz), although to be fair, I've only tested the LUT's bit depth at 1920x1200 @ 85 Hz, which is ~281 MHz. I've no way of knowing whether this is in the context of 8 bit or 10 bit digital output, though.

    It's hard to imagine how the DAC would be able to achieve 10 bit precision with only 8 bit output though... If I'm not mistaken, 8 bit output means that the DAC is receiving a time series, where each sample (for each of three channels) is an 8 bit integer (e.g. 00001111). So there are 256 unique values that the DAC gets fed for each sample, and without access to the videoLUT, it will have no idea how to "scale" these values to convert into voltages (i.e. it would just scale the voltages linearly between peak voltage and 1/256 * peak voltage). The only way I see around this is having 10 bit integers fed into the DAC (which are still scaled linearly), but which can now encode 10 bit precision of the LUT.

    (the alternative is to have a 10 bit LUT that can be loaded onto the DAC itself, which could do the remapping of 8 bit integers onto voltages - I believe some monitors have this capability for the same reason).

    Really would be interesting to see someone try and cannibalize a DAC from these older video cards and merge it with an HDMI or DP to VGA converter - I don't know enough to know whether this is even feasible though.
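
    (To make the quantization argument concrete, here's a toy numpy sketch of what happens when a 10-bit-precision calibration curve has to squeeze through an 8 bit integer link to the DAC; illustrative only:)

        # A 10-bit LUT squeezed through an 8-bit link loses its fine steps.
        import numpy as np

        levels = np.linspace(0.0, 1.0, 256)      # the 256 source values
        lut = levels ** (1.0 / 2.2)              # a stand-in calibration curve
        via_10bit = np.round(lut * 1023) / 1023  # 10-bit path to the DAC
        via_8bit = np.round(lut * 255) / 255     # 8-bit path to the DAC

        print("worst error, 10-bit path:", np.abs(via_10bit - lut).max())  # <= 1/2046
        print("worst error, 8-bit path: ", np.abs(via_8bit - lut).max())   # <= 1/510
        print("distinct 8-bit codes used:", len(np.unique(via_8bit)))      # < 256:
        # where the curve is shallow, neighboring inputs collide on one code

    So without 10 bit integers on the link (or a LUT inside the DAC itself), the extra precision of the videoLUT simply can't reach the voltage stage.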
     
    Last edited: May 15, 2019
  28. XoR_

    XoR_ Gawd

    Messages:
    678
    Joined:
    Jan 18, 2016
    All these issues with 8 bit vs 10 bit are caused by Nvidia's bad implementation of its digital outputs, which doesn't use any dithering. There are zero issues with banding on AMD cards when using an 8 bit connection, and the implementation is so good I doubt anyone would pass an ABX test of it vs. true 10 bit.

    A DAC with its own LUTs, and preferably even gamut correction, would be awesome, but that is too much to ask for, imho.
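
    (Dithering closes the gap because the average of the two nearest 8 bit codes can land exactly on the 10 bit target. A toy model; real GPUs use smarter spatio-temporal patterns:)

        # Temporal dithering: alternate the two nearest 8-bit codes so the
        # *average* output lands on a 10-bit level with no exact 8-bit code.
        import numpy as np

        rng = np.random.default_rng(0)
        target = 517 / 1023                    # 10-bit level, no exact 8-bit code
        lo, frac = divmod(target * 255, 1.0)   # sits between codes 128 and 129

        frames = lo + (rng.random(10_000) < frac)  # pick 129 with prob = frac
        print("dithered 8-bit average:", frames.mean() / 255)  # ~0.50538
        print("10-bit target:         ", target)               # 0.50538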
     
  29. Derupter

    Derupter Limp Gawd

    Messages:
    153
    Joined:
    Jun 25, 2016
    Well, the DAC integrated on old graphics cards is inside the GPU, so it seems impossible to cannibalize it.
    Also, with the integrated DAC you don't have to worry about the digital output, for the same reason.
    But with an external adapter with a proper 10 bpp DAC we must take care of it, because there is a digital output interface before the DAC, with all its problems (limited bandwidth, dithering, bit depth, etc.).
    I'd like to try one of these new chips, but all the new adapters are USB-C and I don't have that output.
    Maybe I will buy a card like the Sunix UPD2018 so I can connect those things to any video card, or get a nice USB-C connection on my next card.
     
  30. Derupter

    Derupter Limp Gawd

    Messages:
    153
    Joined:
    Jun 25, 2016
    I seem to have read, not long ago, about some registry keys that enable dithering on Nvidia cards with one of the latest drivers.

    EDIT
    Here https://forums.geforce.com/default/...rver-to-geforce-driver-/post/5934577/#5934577
     
    Last edited: May 17, 2019
  31. Flybye

    Flybye Limp Gawd

    Messages:
    188
    Joined:
    Jun 29, 2006
    Sunix's website only mentions 60Hz, even at 1920x1080, so I figured that's all it can push. I sent them an email a few days ago and have NOT heard back on whether they can confirm 2304x1440 @ 80 or 85Hz. I would like them to confirm it, just in case those saying they can run at 80Hz+ are just the really lucky ones.
     
  32. Strat_84

    Strat_84 Limp Gawd

    Messages:
    255
    Joined:
    Jul 16, 2016
    60Hz is usually what you'll find on the datasheet of every converter, because it's the default refresh rate with LCDs. That doesn't mean it can't run higher, but you'll usually need to use CRU to set custom resolutions with a different refresh rate.
     
  33. Enhanced Interrogator

    Enhanced Interrogator Gawd

    Messages:
    858
    Joined:
    Mar 23, 2013
    There are a dozen people on this forum who have confirmed it can run beyond a 500 MHz pixel clock. Sunix, the company, has been selling it as a multi-monitor splitter, so they never stress-tested it like we have.
     
  34. Flybye

    Flybye Limp Gawd

    Messages:
    188
    Joined:
    Jun 29, 2006
    Well, that is interesting to know! The only problem is I cannot find it for sale anywhere in the US. :( Has it been discontinued? I never even got a reply email from Sunix. What other adapters have been verified to do 2304x1440@80Hz if the Sunix will no longer be available?
     
  35. cdoublejj

    cdoublejj Limp Gawd

    Messages:
    229
    Joined:
    Apr 12, 2013
    I played around this morning and last night; no matter what resolution I choose I still get jiggly, shaky vision, whether I use my video card directly or the Sunix, at any resolution, even the super low ones. I must have some sort of jacked-up vertical and horizontal frequencies. PowerStrip used to adjust all of those automatically; CRU sure didn't, and it's shaky. I'm too ignorant to know what the hell I'm doing.
     
  36. Derupter

    Derupter Limp Gawd

    Messages:
    153
    Joined:
    Jun 25, 2016
    The Delock 87685 seems to be the same product; no one has tested it, and you'll probably have to order it from Europe.
     
    spacediver likes this.
  37. 3dfan

    3dfan [H]Lite

    Messages:
    82
    Joined:
    Jun 2, 2016
  38. Enhanced Interrogator

    Enhanced Interrogator Gawd

    Messages:
    858
    Joined:
    Mar 23, 2013
    Are you choosing "CRT standard timings" in the dropdown box?
     
  39. aeliusg

    aeliusg Gawd

    Messages:
    726
    Joined:
    Jan 14, 2016
    The Delock 87685 is available here: https://www.grooves-inc.com/delock-...ock-hardware-electronic-pZZa1-2097988797.html

    Looks like a reliable shop and you can use PayPal without creating another extraneous account.

    The specs given do say 1920x1200 at 60 Hz max on the VGA output, as opposed to 2560x1600. The DPU3000 specs claim support for 2560x1600 on all outputs. The chipset is the same, though, so it could be a difference in documentation only.
     
    Last edited: May 18, 2019
    Flybye likes this.
  40. Enhanced Interrogator

    Enhanced Interrogator Gawd

    Messages:
    858
    Joined:
    Mar 23, 2013
    Yeah, I'm going to take the specs on a website ending with ".land" as gospel. It's got the Synaptics chip, it's good to go.