I remember ViewSonic having one of the sharpest, clearest images, right up there with Sony and the other champs. Anyone remember what type of tubes they used, whether they made their own or sourced them?

That reminds me, I found this sole listing of a G90f on eBay: https://www.ebay.com/itm/Serial-339...323481?hash=item1aa475a699:g:BcsAAOSw7Nlfbkfq
View attachment 282956
I feel luckier and luckier by the day about my local snag of a mint-condition G90f for $120 this year!
I finally took the plunge on a CRT a couple of weeks ago (an Iiyama Vision Master Pro 451 for about £88 excl. delivery). Mighty fine tube, much better than any display I've had (save for the Kuro I bought a couple of months ago, though they each have their own merits). I tested the display in Halo CE and things really popped off the screen, looking almost 3D-like, and the 0 ms response time at refresh rates going all the way up to 200 Hz is unlike anything I've ever seen before. However, I'm having quite a bit of trouble getting things set up properly.
I first bought an Icy Box converter thanks to the reports of many users on this forum, however I apparently didn't read far enough to learn that DisplayPort doesn't support interlaced output at all on Windows; presumably not because of any theoretical incapability, but because manufacturers can't be bothered to support such a niche market. This was quite the dealbreaker for me, as being able to run modes like 2048x1536i at 120 Hz, and certain lower resolutions even higher, was the dream.
So I invested in a couple more bits of kit to test: an active HDMI 2.0 to miniDP converter, hoping to harness HDMI's proficiency for interlacing that way instead, along with a £7 HDMI to VGA adapter (not the exact Benfei one people here use, but a Rankie instead) in case that setup didn't work, given its supposed ability to push up to a 345 MHz pixel clock.
The HDMI to miniDP converter didn't work at all with the setup: the Icy Box had no idea what was going on and refused to output any image using that method, so I guess daisy-chaining is out the window until further notice. The Rankie adapter, however, was another story. To my surprise, its image quality completely rivalled the Icy Box (at least to my untrained eye), and after raising the max pixel clock in CRU, I was indeed able to run the interlaced resolutions the £77 Icy Box couldn't.
However, as other users have reported, past a certain pixel clock the darker areas of the image develop a rather nasty green outline. This can be mitigated to an extent by turning up the image brightness in software using Radeon Settings, but once you breach roughly 310 MHz, the green outlining is there to stay no matter what.
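For a rough sense of which modes land where relative to these clock limits, here's a back-of-the-envelope estimator. The ~25% horizontal and ~8% vertical blanking overheads are my own assumptions (real GTF/CVT timings differ), but it shows why interlacing is so attractive here:

```python
def est_pixel_clock_mhz(width, height, refresh_hz, interlaced=False):
    """Back-of-the-envelope pixel clock: total pixels per field times
    field rate. Assumes ~25% horizontal and ~8% vertical blanking
    overhead (rough GTF-style figures, not exact timings)."""
    h_total = width * 1.25
    v_active = height / 2 if interlaced else height  # interlaced scans half the lines per field
    v_total = v_active * 1.08
    return h_total * v_total * refresh_hz / 1e6

# 2048x1536i at a 120 Hz field rate fits under a 345 MHz cap,
# while the same mode progressive at 120 Hz is roughly double and does not
print(est_pixel_clock_mhz(2048, 1536, 120, interlaced=True))
print(est_pixel_clock_mhz(2048, 1536, 120))
```

Interlacing halves the lines scanned per field, so for the same field rate it needs about half the pixel clock of the progressive mode.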
Except, maybe not. According to reports from other users of this adapter on this forum, setting the colour space to YCbCr instead of RGB resolves this issue instantly. However, no matter what I do, Radeon Settings will not expose any YCbCr colour space to me unless I run TV-like modes such as 1920x1080 at 59 Hz, which is obviously no good for a 4:3 CRT, and at that resolution and refresh rate the workaround isn't even necessary any more. Running at that mode does prove the adapter is YCbCr-capable, though, suggesting the issue is indeed some arbitrary software limit, compounded by the fact that NVIDIA GPUs apparently have no problem doing this, according to other user reports. I've tried everything I can think of to get the option to show up, including swapping out the monitor's EDID entirely with custom drivers and downgrading to the earliest AMD driver available for my Vega 64, but nothing makes it appear.
This is the one thing stopping me from enjoying this monitor at its full potential, and also from getting my £77 back for the Icy Box that won't accept interlaced at all. I've already emailed AMD support about the issue in the hope that they can provide some kind of workaround, but if anyone here has figured it out since the last sub-thread about these adapters in particular, I'd be incredibly grateful if you could let me, and anyone else interested in getting this kind of setup working, know. It's a long shot, I know, but if anyone has found a way to solve this problem, it'll likely be one of you fine chaps on this HardForum thread.
Did you have the external USB power connected to the Icy Box when you tested the daisy-chain with those adapters?
About the YCbCr with the AMD card, try a few things with CRU:
-open CRU and add an extension block (type CEA-861)
-add a data block inside this extension block (HDMI support with default settings)
-optionally, try to add some interlaced resolutions (in the detailed resolutions of the extension block)
-save it all and restart the drivers
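For anyone curious what that CEA-861 extension block actually looks like at the byte level, here's a rough sketch that builds a minimal one containing an HDMI vendor-specific data block (roughly what CRU's "HDMI support" data block adds). This is an illustration of the structure only, not a drop-in EDID, and the function name is mine:

```python
def cea_extension_with_hdmi_vsdb():
    """Build a minimal 128-byte CEA-861 extension block containing an
    HDMI vendor-specific data block. Illustration only, not a full EDID."""
    blk = bytearray(128)
    blk[0] = 0x02                    # extension tag: CEA-861
    blk[1] = 0x03                    # revision 3
    blk[3] = 0x00                    # no underscan/audio flags, 0 native DTDs
    # HDMI VSDB header: tag 3 (vendor-specific) in bits 7-5, length 5 in
    # bits 4-0, then IEEE OUI 00-0C-03 (stored little-endian) and a
    # source physical address of 1.0.0.0
    vsdb = bytes([(3 << 5) | 5, 0x03, 0x0C, 0x00, 0x10, 0x00])
    blk[4:4 + len(vsdb)] = vsdb
    blk[2] = 4 + len(vsdb)           # offset where DTDs would start (none here)
    blk[127] = (256 - sum(blk[:127])) % 256  # whole block must sum to 0 mod 256
    return bytes(blk)

blk = cea_extension_with_hdmi_vsdb()
print(len(blk), hex(blk[0]), sum(blk) % 256)
```

The HDMI VSDB (with that 00-0C-03 OUI) is how a sink advertises itself as HDMI rather than plain DVI, which is why adding it in CRU can change what colour spaces the driver offers.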
These problems should be resolved when an HDMI 2.0 to VGA adapter with the Lontium LT8612UX comes out.
I've attempted these changes, however YCbCr (thanks for showing me how to capitalise that properly, by the way!) still won't expose itself at any non-TV resolution and refresh rate (1920x1080@60 is as high as I can go).
Interlacing at 60 Hz (for an effective 120) at 1920x1080 doesn't get the colour space to show up, however the included 30 Hz interlacing for an effective 60 does indeed work fine with it.
I did have USB power hooked up to the Icy Box (it doesn't work at all for me without it). I just took another look at things, and the miniDP to HDMI adapter does work on my laptop, just not in that configuration, so perhaps it's a one-way adapter and I'm just a poopyhead?
Something in the DisplayPort output of that adapter is incompatible with the Icy Box, maybe the hotplug signal.
About the YCbCr, you can only try some of those CRU settings and see if something works.
About the interlaced resolutions, I can't test much because with my card (AMD 7950) they work without problems through my DisplayPort to VGA adapter (Delock 62967) on AMD 19.3.3 under Windows 7 (never tested on Windows 10).
So most probably it's a problem with those stupid drivers; they remove useful things and add stupid things every year.
I just got mine a few weeks ago and it's A LOT sharper than my Delock, like it's blowing my mind. The only thing I've noticed is that my adapter doesn't present resolutions as if it were a VGA output (I get "DP Monitor" in Nvidia Control Panel) and all available resolutions are capped at 60 Hz. Other than that, this thing is awesome! I was able to get a 1600x1200 that isn't the scrambled screen I got with the Delock. I had a bit of a problem after clicking 1440x1080 and switching back to my normal res, but it was just a slight hiccup. Well worth the purchase overall (it behaves as if it were plugged directly into a VGA output).
For the PerfectFlat tubes used in the G series, if I'm not mistaken, they were ViewSonic's own make. There weren't many flat shadow masks out there.
Ah, funny. Maybe that model for some reason is detected well on AMD cards and not on Nvidias, when it was the contrary with the Delock one.
I have over 200 hours on mine with no issues. If it doesn't do it on the PC, then maybe it's a power setting on the laptop?
Fun to see a fellow Windows 7 user out in the wild! I just tested 19.3.3 on my rig with the Icy Box; interlaced still doesn't appear to work. My display switches continuously on and off, like it's attempting to change resolution, when applying an interlaced mode this way, so perhaps it's actually a hardware limitation of the chip on the Icy Box. Damn shame. It would be good to know if anyone here has that StarTech DP to VGA converter to test interlacing on, so we know for sure.
In the meantime, would you happen to know where or when we can expect an HDMI to VGA converter with that Lontium LT8612UX onboard? Checking their website, it seems that chip entered production sometime in mid-2019.
Over the last couple of days, with the StarTech connecting my laptop to the FW900, I've noticed the display disappearing/clicking once in a while. The screen blanks out almost like it's changing resolution, but it's not. Anyone else noticed such a thing? I think (hopefully) I've ruled out the monitor itself, because on my desktop machine, which still has an analog out, it doesn't happen. It might be my laptop though.
AMD broke something in the drivers with the new cards (from Vega onwards; I'm not sure about Polaris).
In the meantime, you can try an HDMI adapter that doesn't require YCbCr to reach high pixel clocks, which is a difficult find because we have tested only a few models.
I can only recommend the Vention AFVHB, which uses the LT8612SX; this chipset is the last one before the LT8612UX and should handle the full HDMI 1.4 bandwidth. No one has tested it, so there's no guarantee it works well without YCbCr, but on paper it seems a good chipset.
Also, it seems that with HDMI the input bandwidth limit can be exceeded a little by using the graphics control panel to force custom resolutions (over 340 MHz).
Yes, I tried that and it works, but I guess it will depend on the motherboard's capabilities. The one I tried is an Asus Z87-Pro, which has the option mentioned in the video; tested with a GTX 1080 Ti, and it worked. Performance and latency seem the same as outputting directly from the GeForce's output, though that was a quick test, not an absolute reference.
I had to select the primary display in the BIOS as "iGPU", otherwise the motherboard's VGA output wouldn't display anything.
I also wasn't able to access the Nvidia Control Panel; when trying to launch it, I just got a message about not using an Nvidia GPU. Only resolutions available in or created from the integrated GPU's utility were usable; everything created with ToastyX's Custom Resolution Utility (CRU) was ignored, and the iGPU utility only allowed me to create a few resolutions like 1920x1080 at 60 Hz. I wasn't able to create something like 1920x1200 at 60 Hz.
Interesting finding, but the loss of the discrete card's own graphics utility and the lack of resolution/refresh combos make it rather worthless in my opinion. Definitely better to stick with a good digital-to-analog active adapter.
Which operating system are you running? If you are running Catalina (macOS 10.15.x), there have been several reports of external monitor flickering, and some users have suggested fixes for the issue. Please search the internet for "external monitor flickering while running Catalina"... Other macOS versions, including High Sierra, may cause flickering issues as well...
Hope this helps...
Has anyone succeeded with getting interlaced resolutions to work on Windows 10 with DisplayPort adapters?

AFAIK not possible at all on DP adapters. Probably possible with HDMI adapters; maybe Derupter can shed some light on that question. Also wait until Rtas tests out the new Vention HDMI adapter.
If what J5 Create told me is correct, that adapter uses the ITE IT6516BFN chipset plus another VIA chip to handle the USB-C connection.
So the max clock should be 180 MHz at 8 bpc and 240 MHz at 6 bpc. Can you see anywhere in the graphics control panel whether the color depth changes when you are over 180 MHz?
What happens if you go over 240 MHz?
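That 180 vs 240 MHz split follows directly from the data rate: if the converter is limited by total bits per second, the maximum pixel clock scales inversely with bits per pixel. A quick sanity check on those numbers:

```python
def max_pclk_mhz(base_pclk_mhz, base_bpc, target_bpc):
    """For a data-rate-limited link, pixels/sec times bits/pixel is
    constant, so the max pixel clock scales inversely with bits per
    channel (the factor of 3 channels cancels out)."""
    return base_pclk_mhz * base_bpc / target_bpc

# 240 MHz at 6 bpc implies 180 MHz at 8 bpc, matching the figures above
print(max_pclk_mhz(240, 6, 8))  # 180.0
```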
I updated the adapters summary here with the new entries; I'll add more when your tests with the Tendak and J5 Create adapters are finished.
That's way weaker than the GPU in the PS5. You're not going to be running any modern games on three monitors with that card.

Of course not. But they're not comparable. People keep saying this as if older architectures hold their value well.
Titan X Maxwell is what I still have. Got it used a few years back from eBay. Came with water cooling, which presumably addresses a design weak spot it had. At the time, it seemed like a nice bookend on the GPU front for my presumably last system anchored by a CRT.

I'm getting a 3080, but I only really crave the extra performance for speedier GPU-based media work. While most games I play need something better than my current GTX 1050 for a stable 720p@125fps, a 3080 is definitely overkill. Also, the 3080 has a hugely better price-to-performance ratio than the 1080 and 2080 did when they were new, so that's tempting as well.
Might be fun to throw an RTX 3080 at it though, to get a taste of ray tracing before I upgrade from the FW900 to whatever is next.
I am still at 8 bits per channel up to 240 MHz. Here's DisplayCal:
View attachment 286038
I use xrandr for custom resolutions. When I tell it to output any resolution/refresh rate with a pclk higher than 240 MHz, my CRT very briefly flashes off and on (like it's trying to change resolution), and I get "xrandr: Configure crtc 0 failed" as an output in the terminal.
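One way to avoid that flash-and-fail cycle is to check the pixel clock before handing the mode to xrandr. A sketch of that idea, where the output name, mode name, and 240 MHz cap are examples for this particular adapter, and `timing_args` is whatever modeline tail a tool like `cvt` gives you:

```python
import subprocess

def try_mode(output, name, pclk_mhz, timing_args, cap_mhz=240.0):
    """Create and apply an xrandr mode only if its pixel clock is under
    the adapter's cap, instead of letting the CRTC configure fail and
    flash the CRT. timing_args is the rest of the modeline (the h/v
    timing numbers and sync flags)."""
    if pclk_mhz > cap_mhz:
        raise ValueError(f"{name}: {pclk_mhz} MHz exceeds the {cap_mhz} MHz cap")
    subprocess.run(["xrandr", "--newmode", name, str(pclk_mhz)] + timing_args, check=True)
    subprocess.run(["xrandr", "--addmode", output, name], check=True)
    subprocess.run(["xrandr", "--output", output, "--mode", name], check=True)
```

This doesn't fix the 240 MHz ceiling, of course; it just fails cleanly in the terminal instead of making the monitor blink.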
I only play FPS games; a 6700K + Titan X Maxwell will be more than enough to maintain over 200 fps in most modern FPS titles, so don't worry. I just want one monitor for games and 1-2 side monitors for stuff like Discord/music, but it's not a big deal either way, because I have another PC with a 3700X + 2070S that I can use for other games if I want.