24" Widescreen CRT (FW900) From Ebay arrived,Comments.

That reminds me, I found this sole listing of a G90f on ebay: https://www.ebay.com/itm/Serial-339...323481?hash=item1aa475a699:g:BcsAAOSw7Nlfbkfq

I feel luckier by the day about my local snag of a mint-condition G90f for $120 this year! :joyful:
Crazy listings.
 
Greetings!

I finally took the plunge on a CRT a couple of weeks ago (an Iiyama Vision Master Pro 451 for about £88 excl. delivery). Mighty fine tube, much better than any display I've had (save for the Kuro I bought a couple of months ago, though the two have their own merits). I tested the display in Halo CE and things really popped off the screen and looked almost 3D, and the 0ms response times at refresh rates going all the way up to 200 Hz are unlike anything I've ever seen before. However, I'm having quite a bit of trouble getting things set up properly.

I first bought an Icy Box converter on the strength of many user reports on this forum; however, I apparently didn't read far enough to learn that DisplayPort doesn't support interlaced output at all on Windows, presumably not because it theoretically can't, but because manufacturers can't be bothered to support such a niche market. This was quite the dealbreaker for me, as being able to run things like 2048x1536i at 120 Hz, and certain lower resolutions even higher, was the dream.
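(For a rough sense of why interlacing is the dream here, a back-of-the-envelope sketch; the ~30% blanking overhead below is an assumption standing in for real GTF/CVT timing math:)

```python
# Rough pixel clock estimate for 2048x1536 at 120 Hz.
# The 30% blanking overhead is an assumed stand-in for real GTF/CVT timings.
BLANKING = 1.30

def pixel_clock_mhz(width, height, refresh_hz, interlaced=False):
    # An interlaced mode scans half the lines per vertical sweep,
    # so it needs roughly half the pixel clock of the progressive mode.
    lines = height / 2 if interlaced else height
    return width * lines * refresh_hz * BLANKING / 1e6

print(pixel_clock_mhz(2048, 1536, 120))                   # ~491 MHz progressive
print(pixel_clock_mhz(2048, 1536, 120, interlaced=True))  # ~245 MHz: fits a ~345 MHz adapter
```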

So I invested in a couple more bits of kit to test: an active HDMI 2.0 to miniDP converter, hoping I'd be able to harness HDMI's proficiency for interlacing that way instead, along with a £7 HDMI to VGA adapter (not the usual Benfei, but a Rankie instead) in case that setup didn't work properly, given its supposed ability to push up to a 345 MHz pixel clock.

The HDMI to miniDP converter didn't work at all with this setup: the Icy Box had no idea what was going on and refused to output any image that way, so I guess daisy-chaining is out the window until further notice. The Rankie adapter, however, was another story. To my surprise, the image quality completely rivalled the Icy Box (at least to my untrained eye), and after raising the max pixel clock in CRU, I was indeed able to run the interlaced resolutions the £77 Icy Box couldn't.

However, as other users have reported, past a certain pixel clock the darker areas of the image pick up a rather nasty green outline. This can be mitigated to a certain extent by turning up the brightness of the image in software using Radeon Settings, but once you breach something around 310 MHz, the green outlining is there to stay no matter what.

Except, maybe not. According to reports from other users of this adapter on this forum, setting the colour space to YCbCr instead of RGB resolves this issue instantly. However, no matter what I do, Radeon Settings will not expose any YCbCr colour space to me unless I run at TV-like modes such as 1920x1080 at 59 Hz, which is obviously no good for a 4:3 CRT, and at that resolution and refresh rate the workaround isn't even necessary any more. Running there does prove the adapter is indeed YCbCr-capable, though, suggesting the issue is some sort of arbitrary software limit, compounded by the fact that NVIDIA GPUs apparently have no problem doing this, according to other user reports. I've tried everything I can think of to get the option exposed, including swapping out the monitor's EDID entirely with custom drivers, along with downgrading to the earliest AMD driver available for my Vega 64, but nothing makes it show up.

This is the one thing stopping me from enjoying this monitor at its full potential, and also from getting my £77 back for the Icy Box that won't accept interlaced at all. I've already emailed AMD support about the issue in the hope that they might provide some kind of workaround, but if anyone else here has already figured it out since the last sub-thread about these adapters in particular, I'd be incredibly grateful if you could let me (and anyone else interested in getting this kind of setup working) know. It's a long shot, I know, but I figured if anyone has found a way to solve this problem, it'll likely be one of you fine chaps on this HardForum thread.

Thanks!
 
That reminds me, I found this sole listing of a G90f on ebay: https://www.ebay.com/itm/Serial-339...323481?hash=item1aa475a699:g:BcsAAOSw7Nlfbkfq
I feel luckier by the day about my local snag of a mint-condition G90f for $120 this year! :joyful:
Crazy listings.
I remember ViewSonic having one of the sharpest, clearest images, right up there with Sony and the other champs. Anyone remember what types of tubes they used, and whether they made their own or sourced them?
 

Did you have the external USB power connected to the Icy Box when you tested the daisy-chain with those adapters?

About the YCbCr on the AMD card, try a few things with CRU (see the sketch below for roughly what these steps build):
- open CRU and add an extension block (type CEA-861)
- add a data block inside this extension block (HDMI support with default settings)
- optionally, try adding some interlaced resolutions (in the detailed resolutions of the extension block)
- save it all and restart the drivers

These things should be resolved when an HDMI 2.0 to VGA adapter with the Lontium LT8612UX comes out.
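For the curious, this is roughly what CRU writes into the EDID with those steps: a minimal sketch of the 128-byte CEA-861 extension block, per my reading of the spec (the source physical address bytes are placeholders):

```python
# Minimal sketch of the CEA-861 extension block CRU appends to the EDID
# when you add the extension plus an "HDMI support" data block.
# Offsets follow CEA-861; the source physical address (1.0.0.0) is a placeholder.
def build_cea861_extension() -> bytes:
    blk = bytearray(128)
    blk[0] = 0x02                       # extension tag: CEA-861
    blk[1] = 0x03                       # revision 3
    # HDMI Vendor-Specific Data Block: tag code 3 in the top 3 bits, length 5
    # in the bottom 5, IEEE OUI 00-0C-03 (little-endian), 2-byte phys. address.
    vsdb = bytes([(0x03 << 5) | 0x05, 0x03, 0x0C, 0x00, 0x10, 0x00])
    blk[4:4 + len(vsdb)] = vsdb
    blk[2] = 4 + len(vsdb)              # offset where detailed timings would begin
    blk[3] = 0x30                       # advertise YCbCr 4:4:4 and 4:2:2 support
    blk[127] = (-sum(blk[:127])) % 256  # 1-byte checksum over the whole block
    return bytes(blk)

print(build_cea861_extension().hex())
```

Byte 3 is the interesting one: it's where the sink declares YCbCr support, which is presumably what the driver keys off when deciding whether to expose the option.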
 

I've attempted these changes, however YCbCr (thanks for showing me how to capitalize that right, by the way!) still won't expose itself at any non-TV resolution and refresh rate (1920x1080@60 is as high as I can go).
Interlacing at 60 Hz (for an effective 120) at 1920x1080 doesn't get the colour space to show up, however the included 30 Hz interlacing for an effective 60 does indeed work just fine with it.

I did have USB power hooked up to the Icy Box (it doesn't work at all for me without it). I just took another look at things, and the miniDP to HDMI adapter does work on my laptop, just not in that configuration, so perhaps it's a one-way adapter and I'm just a poopyhead?
 

Something in the DisplayPort output of that adapter is incompatible with the Icy Box, maybe the hotplug signal.
About the YCbCr, you can only try some of those CRU settings and see if something works.
About the interlaced resolutions, I can't test much because with my card (AMD 7950) they work without problems through my DisplayPort to VGA adapter (Delock 62967) on AMD 19.3.3 on Windows 7 (never tested on Windows 10).
So most probably it's a problem with those stupid drivers; they remove useful things and add stupid things every year.
 

Fun to see a fellow Windows 7 user out in the wild! Just tested 19.3.3 on my rig with the Icy Box; interlaced still doesn't appear to work. My display keeps switching on and off continuously, like it's attempting to change resolution, when applying an interlaced mode this way, so perhaps it's actually a hardware issue with the chip in the Icy Box. Damn shame. Would be good to know if anyone here has that StarTech DP to VGA converter to test interlacing on, so we know for sure.

In the meantime, would you happen to know where or when we can expect an HDMI to VGA converter with that Lontium LT8612UX onboard? Checking their website, it seems that chip began production sometime in mid-2019.
 
(like it would be directly plugged into a VGA output)
I just got mine a few weeks ago and it's A LOT sharper than my Delock, like it's blowing my mind. The only thing I've noticed is that my adapter doesn't present resolutions as if it were a VGA output (I get "DP Monitor" in Nvidia Control Panel) and all available resolutions are capped to 60 Hz. But other than that this thing is awesome! I was able to get 1600x1200 without the screen scrambling like it did with the Delock. I had a bit of a problem after clicking 1440x1080 and going back to my normal res, but it was just a slight hiccup; well worth the purchase overall.
 
I remember ViewSonic having one of the sharpest, clearest images, right up there with Sony and the other champs. Anyone remember what types of tubes they used, and whether they made their own or sourced them?
For the PerfectFlat tubes used in the G series, if I'm not mistaken, they were ViewSonic's own make. There weren't many flat shadowmasks out there.
 
However, no matter what I do, Radeon Settings will not expose any YCbCr colour space to me unless I run at TV-like modes such as 1920x1080 at 59 Hz,

Did you say you had a Navi-based GPU? Because I have a 5700 XT, and I have all the same problems.

It seems to just be driver-side stuff: interlaced isn't enabled across the board, and YCbCr is disabled for everything but TV modes. Did you use the built-in bug submission tool?

That said, you're not going to find a better converter than the Icy Box for hitting super high pixel clocks. I can run 2880x2160 @ 60 Hz on my LaCie, which is 545 MHz or so.

My solution for interlaced has been to put a second GPU in my system, an older one with VGA out. You have two options here: either a GPU from the other brand (so Nvidia in your case), or a GPU from the same brand that is still supported by current drivers. So a recent analog-capable AMD GPU like the 260X would work, or an HD 7770.

To use this method, in most games you will have to use borderless windowed mode, otherwise the game will try to run on the less powerful card. This means an extra frame of lag, but for the games I play interlaced, that's not a big deal. I just use interlaced to get higher refresh rates in games where I run sub-60fps. Any game where I can maintain 60 or higher, I run progressive with S-Sync or V-sync.

I just got mine a few weeks ago and it's A LOT sharper than my Delock, like it's blowing my mind. The only thing I've noticed is that my adapter doesn't present resolutions as if it were a VGA output (I get "DP Monitor" in Nvidia Control Panel) and all available resolutions are capped to 60 Hz. But other than that this thing is awesome! I was able to get 1600x1200 without the screen scrambling like it did with the Delock. I had a bit of a problem after clicking 1440x1080 and going back to my normal res, but it was just a slight hiccup; well worth the purchase overall.

Download Custom Resolution Utility. Don't limit yourself to the resolutions Windows gives you. You can basically create any resolution/refresh rate combo you want. I make a new one for almost every game I play.
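If it helps anyone, here's a quick sanity check I'd run before trying a new combo: the two numbers that matter on a CRT are the pixel clock (the adapter's limit) and the horizontal scan rate (the tube's limit). A rough sketch; the blanking fractions and the example limits are assumptions, not exact CVT/GTF timings:

```python
# Sanity-check a custom CRT mode against the adapter's pixel clock limit and
# the tube's horizontal scan limit. Blanking fractions are rough assumptions.
H_BLANK = 1.30   # assumed horizontal total = active width * 1.30
V_BLANK = 1.05   # assumed vertical total = active height * 1.05

def check_mode(w, h, hz, max_pclk_mhz, max_hscan_khz):
    h_total = w * H_BLANK
    v_total = h * V_BLANK
    pclk = h_total * v_total * hz / 1e6   # pixel clock in MHz
    hscan = v_total * hz / 1e3            # horizontal scan rate in kHz
    print(f"{w}x{h}@{hz}: ~{pclk:.0f} MHz pixel clock, ~{hscan:.1f} kHz h-scan")
    return pclk <= max_pclk_mhz and hscan <= max_hscan_khz

# e.g. against a ~345 MHz adapter and a tube rated around 115 kHz (check your manual):
print(check_mode(1920, 1440, 75, 345, 115))   # True: ~283 MHz, ~113.4 kHz
```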
 
Ah, funny. Maybe that model for some reason is detected well on AMD cards and not on Nvidia ones, when it was the contrary with the Delock one.
 
I have decided to go ahead and purchase a Titan X Maxwell, since I needed only a GPU and a PSU to complete a secondary rig I will be running my CRT(s) off of. Question here: is it possible to split the signal from one DVI-I output without lag?

I will have up to 4 CRT monitors, probably running two at a time ideally, but would like to keep the ability to interlace. So I am curious whether using something like a DVI-I splitter (if such a thing exists in the first place) would add any potential lag.

Any help is appreciated, thanks. I should be receiving my Iiyama Vision Master Pro 454 within the next couple of weeks, and I can soon buy some 19-21 inch Sony CRT monitors cheap from a family member too; not sure about the models, but I'll post about them if I need help with adjustments. ;)
 
Yes, tried it and it works, but I guess it will depend on the motherboard's capabilities. The one I tried is an Asus Z87-Pro, which has the option mentioned in the video; tested with a GTX 1080 Ti, and it worked. Performance and latency seem the same as outputting directly from the GeForce's own output, but that was a quick test, not an absolute reference.

I had to select the primary display in the BIOS as "iGPU", otherwise the motherboard's VGA output won't display anything.
I also was not able to access Nvidia Control Panel; launching it just showed a message about not using an Nvidia GPU, and only resolutions available from or created with the integrated GPU's utility were usable. Everything created with ToastyX's Custom Resolution Utility (CRU) was ignored, and the iGPU utility only let me create a few resolutions like 1920x1080 at 60 Hz; I was not able to create something like 1920x1200 at 60 Hz.

Interesting finding, but the inability to use your own discrete graphics utility and the lack of resolution/refresh combos make it rather worthless in my opinion; definitely better to stick with a good digital-to-analog active adapter.
 
Was working some long hours and haven't trusted my eyes yet in terms of an A/B comparison versus native analog out.

From what I can tell, the StarTech works pretty well though. Got it hooked up directly to my FW900's VGA connector with an M/M adapter. Using it at 1600 by 1024 at 100 Hz and at 1880 by 1200 at 85 Hz from my laptop. (Conformed the 1200p resolution more closely to the FW900's aspect ratio.)

(It's just being held on there by pressure, so not sure how long that will hold. Will also try connecting it other ways, like a combination of analog and digital cables. This is actually the first connection I've used in many years that passes the EDID data. That seems to add some extra trouble, I think, in terms of the driver/control panel. Not sure.)
 
Interesting finding, but the inability to use your own discrete graphics utility and the lack of resolution/refresh combos make it rather worthless in my opinion; definitely better to stick with a good digital-to-analog active adapter.
Thanks, it was very helpful.
 
Last couple of days with the StarTech connecting my laptop to the FW900, I've noticed the display disappearing/clicking once in a while. The screen blanks almost like it's changing resolution, but it's not. Anyone else noticed such a thing? I think I've ruled out the monitor itself, hopefully, because on my desktop machine, which still has an analog out, it doesn't do that. It might be my laptop though.
 
I have over 200 hours on mine with no issues.

If it doesn't do it on the PC, then maybe it's a power setting on the laptop?
 

Thank you. Laptop is the leading suspect right now. Not sure what's happening there. Settings seem ok.

I've still not been able to reproduce this on the monitor itself (from the desktop's native analog out). Thankfully. Nor am I seeing it with the StarTech switched to the desktop PC.
 
Fun to see a fellow Windows 7 user out in the wild! Just tested 19.3.3 on my rig with the Icy Box; interlaced still doesn't appear to work. My display keeps switching on and off continuously, like it's attempting to change resolution, when applying an interlaced mode this way, so perhaps it's actually a hardware issue with the chip in the Icy Box. Damn shame. Would be good to know if anyone here has that StarTech DP to VGA converter to test interlacing on, so we know for sure.

In the meantime, would you happen to know where or when we can expect an HDMI to VGA converter with that Lontium LT8612UX onboard? Checking their website, it seems that chip began production sometime in mid-2019.

AMD broke something in the drivers with the new cards (from Vega onward; I'm not sure about Polaris).
About the adapter with the LT8612UX, we can only wait; I'm in contact with Unitek and maybe something will happen.
In the meantime, you can try an HDMI adapter that doesn't require YCbCr to reach a high pixel clock, which is a difficult find because we've tested only a few models.
I can only recommend the Vention AFVHB, which uses the LT8612SX; this chipset is the last one before the LT8612UX and should handle the full HDMI 1.4 bandwidth. No one has tested it, so there's no guarantee it works well without YCbCr, but on paper it seems a good chipset.
Also, it seems that with HDMI the input bandwidth limit can be exceeded a little (over 340 MHz) by using the graphics control panel to force custom resolutions.
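For reference, the arithmetic behind that 340 MHz figure, per standard HDMI deep color rules (the example clocks below are just illustrative):

```python
# TMDS clock needed for a given pixel clock, per standard HDMI deep color rules:
# at 8 bpc the TMDS clock equals the pixel clock; 10/12 bpc multiply it.
HDMI_14_TMDS_LIMIT_MHZ = 340.0

def tmds_clock_mhz(pixel_clock_mhz, bpc=8):
    return pixel_clock_mhz * bpc / 8.0

for pclk, bpc in [(330, 8), (345, 8), (330, 10)]:
    clk = tmds_clock_mhz(pclk, bpc)
    verdict = "OK" if clk <= HDMI_14_TMDS_LIMIT_MHZ else "over the HDMI 1.4 limit"
    print(f"{pclk} MHz at {bpc} bpc -> {clk:.0f} MHz TMDS: {verdict}")
```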

xykreilon, is your J5 Create USB-C to VGA adapter the JCA111?

Now for users with a USB-C output:
- here's the Vention CGMHA with the LT8711X-B; they ship worldwide, though I don't know what this CAFAGO shop is like
- here's the Vention CMFHB with the LT8712X
Also, the Unitek V1126A uses the LT8712EXC, if you can find it somewhere.
It seems the new AMD RDNA 2 cards will have a USB-C output.



 
Last couple of days with the StarTech connecting my laptop to the FW900, I've noticed the display disappearing/clicking once in a while. The screen blanks almost like it's changing resolution, but it's not. Anyone else noticed such a thing? I think I've ruled out the monitor itself, hopefully, because on my desktop machine, which still has an analog out, it doesn't do that. It might be my laptop though.


Which operating system are you running? If you are running Catalina (Mac OS 10.15.x), there have been several reports of external monitor flickering, and some users have suggested fixes for the issues. Please check the internet under "external monitor flickering while running Catalina"... Also, other Mac OS versions including High Sierra may cause flickering issues...

Hope this helps...

Sincerely,

Unkle Vito!
 
AMD broke something in the drivers with the new cards (from Vega onward; I'm not sure about Polaris).
In the meantime, you can try an HDMI adapter that doesn't require YCbCr to reach a high pixel clock, which is a difficult find because we've tested only a few models.
I can only recommend the Vention AFVHB, which uses the LT8612SX; this chipset is the last one before the LT8612UX and should handle the full HDMI 1.4 bandwidth. No one has tested it, so there's no guarantee it works well without YCbCr, but on paper it seems a good chipset.
Also, it seems that with HDMI the input bandwidth limit can be exceeded a little (over 340 MHz) by using the graphics control panel to force custom resolutions.

Thankies for the recommendation; just placed an order on AliExpress for the AFVHB. I'll be sure to post back here with the results once I receive the adapter.
 
Yes, tried it and it works, but I guess it will depend on the motherboard's capabilities. The one I tried is an Asus Z87-Pro, which has the option mentioned in the video; tested with a GTX 1080 Ti, and it worked. Performance and latency seem the same as outputting directly from the GeForce's own output, but that was a quick test, not an absolute reference.

I had to select the primary display in the BIOS as "iGPU", otherwise the motherboard's VGA output won't display anything.
I also was not able to access Nvidia Control Panel; launching it just showed a message about not using an Nvidia GPU, and only resolutions available from or created with the integrated GPU's utility were usable. Everything created with ToastyX's Custom Resolution Utility (CRU) was ignored, and the iGPU utility only let me create a few resolutions like 1920x1080 at 60 Hz; I was not able to create something like 1920x1200 at 60 Hz.

Interesting finding, but the inability to use your own discrete graphics utility and the lack of resolution/refresh combos make it rather worthless in my opinion; definitely better to stick with a good digital-to-analog active adapter.

I do wonder, though: if you used, say, the Ryzen 4650G or 4750G variants, you could output through the integrated Vega graphics, which lets you pick resolutions and refresh rates much better than shintel's trash-tier utilitrash. I'm planning on nabbing one of those as my next upgrade, selling both my Ryzen 3600 and my DPU3000G, which has an annoying random blank every 10 or so minutes, plus overheating issues that cause it to just thermally shut off. There's also the annoying bug where it starts to jitter the screen at certain refresh rates due to resonance.

I'm using the ASRock AB350 Pro4, if anyone can pull up the data sheets to see whether it has a 10-bit DAC onboard for the VGA port; apart from that, I hope it does work. Otherwise, good old StarTech saves the day again.
I'm actually having to resort to cable-tying my DPU3000G to my exhaust fan to keep it cool enough to run my CRT at 1600x1200 79 Hz. An absolutely pitiful adapter, but it does OK when you run 1024x768 156 Hz in CSGO.
My Sony G520 is still going strong after 5 years of use, and probably a decade before that. I've got an identical Dell non-flat Trinitron as a backup, but that image is a little blurry.

I've been unable to tune these CRTs much apart from using magnetic strips. Maybe I'll need to start epoxying some magnets to them? I'm getting tons of screen sag on the left and right, and it seems that ever since a cockroach nested underneath two capacitor terminals and exploded, the CRT has never displayed as clearly or crisply as it used to. Maybe it fried something along the way, huh?
 

If what J5 Create told me is correct, that adapter uses the ITE IT6516BFN chipset plus another VIA chip to handle the USB-C connection.
So the max clock should be 180 MHz with 8 bpc and 240 MHz with 6 bpc. Do you see anywhere in the graphics control panel whether the color depth changes when you are over 180 MHz?
What happens if you go over 240 MHz?

I updated the adapters summary here with the new entries; I'll add more when your tests with the Tendak and J5 Create adapters are finished.
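If I have the math right, those two limits are exactly what a 2-lane HBR DisplayPort link gives you (8b/10b coding leaves 80% of the raw rate for pixel data):

```python
# Why 180 MHz at 8 bpc and 240 MHz at 6 bpc both point to a 2-lane HBR link:
# each HBR lane carries 2.7 Gbit/s raw, and 8b/10b coding leaves 80% for data.
lanes, lane_gbps, coding_efficiency = 2, 2.7, 0.8
payload_gbps = lanes * lane_gbps * coding_efficiency   # 4.32 Gbit/s

for bpc in (8, 6):
    bits_per_pixel = 3 * bpc                           # RGB, three components
    max_pclk_mhz = payload_gbps * 1000 / bits_per_pixel
    print(f"{bpc} bpc -> max ~{max_pclk_mhz:.0f} MHz pixel clock")
# prints: 8 bpc -> max ~180 MHz, 6 bpc -> max ~240 MHz
```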
 
Which operating system are you running? If you are running Catalina (Mac OS 10.15.x), there have been several reports of external monitor flickering, and some users have suggested fixes for the issues. Please check the internet under "external monitor flickering while running Catalina"... Also, other Mac OS versions including High Sierra may cause flickering issues...

Hope this helps...

Sincerely,

Unkle Vito!

Thank you, Unkle Vito!

It's Windows, but I'll cross reference those.

Latest suspect might be hardware. The laptop has been well behaved at 85 Hz for the last few hours. Maybe this machine couldn't cope with 100 Hz. (Though it didn't seem to have a problem with the higher refresh rate initially.)
 
Has anyone succeeded in getting interlaced resolutions to work on Windows 10 with DisplayPort adapters? Not sure if the Windows version affects anything; I may end up using Windows 7 anyway.

I plan on having a 3-CRT-monitor setup (Iiyama Vision Master Pro 454 + 2x Dell P991), but would like to be able to run interlaced on all of my monitors. 6700K + Titan X Maxwell.
Are there any other options, like VGA signal splitters to send one analog output to multiple monitors?

Any help is appreciated, thanks.
 
Has anyone succeeded in getting interlaced resolutions to work on Windows 10 with DisplayPort adapters?
AFAIK not possible at all on DP adapters. Probably possible with HDMI adapters; maybe Derupter can shed some light on that question. Also, wait till Rtas tests out the new Vention HDMI adapter :)
 
If what J5 Create told me is correct, that adapter uses the ITE IT6516BFN chipset plus another VIA chip to handle the USB-C connection.
So the max clock should be 180 MHz with 8 bpc and 240 MHz with 6 bpc. Do you see anywhere in the graphics control panel whether the color depth changes when you are over 180 MHz?
What happens if you go over 240 MHz?

I updated the adapters summary here with the new entries; I'll add more when your tests with the Tendak and J5 Create adapters are finished.
I am still at 8 bits per channel up to 240 MHz; DisplayCal confirms it.

I use xrandr for custom resolutions. When I tell it to output any resolution/refresh rate with a pixel clock higher than 240 MHz, my CRT very briefly flashes off and on (like it's trying to change resolution), and I get "xrandr: Configure crtc 0 failed" in the terminal.
 
That's way weaker than the GPU in the PS5. You're not going to be running any modern games on three monitors with that card.
Of course not. But they're not comparable, and people keep saying this as if older architectures held their value well.
With GPUs, the performance-to-price ratio is much higher with older flagships than with modern low-end models, especially if you shop beyond the big boys like Amazon, Newegg, and eBay.
 
Titan X Maxwell is what I still have. Got it used a few years back from eBay. It came with water cooling, which presumably addresses a design weak spot it had. At the time, it seemed like a nice bookend on the GPU front for my presumably last system anchored by a CRT.

Might be fun to throw an RTX 3080 at it, though, to get a taste of ray tracing before I upgrade from the FW900 to whatever is next.
 
I'm getting a 3080, but I only really crave the extra performance for speedier GPU-based media work. While most games I play need something better than my current GTX 1050 to hold a stable 125 fps at 720p, a 3080 is definitely overkill. Also, the 3080 has a hugely better price-to-performance ratio than the 1080 and 2080 had when they were new, so that's tempting as well.
I'm likely not upgrading from my ViewSonic G90f anytime soon. I just refuse to buy a display that doesn't have both perfect blacks and state-of-the-art motion fluidity. I've already experienced the excellent 4K IPS screen with 155% RGB gamut on my XPS 9560 laptop, yet I'm still not interested in getting a bigger version for the desk. That's how disappointing low motion clarity and imperfect blacks are to me.
 
AFAIK not possible at all on DP adapters. Probably possible with HDMI adapters; maybe Derupter can shed some light on that question. Also, wait till Rtas tests out the new Vention HDMI adapter :)

I did some tests with interlaced resolutions:
- AMD 7950 with the Delock 62967: works with CRU and the AMD custom resolution panel, on both Windows 10 and 7
- Nvidia GTX 1070 with the same adapter: using CRU is like doing nothing, and the Nvidia control panel says not supported; same results on Win 10 and 7

Now, looking at what Rtas and Enhanced Interrogator say, on Vega cards it seems to work, but only with HDMI adapters; on Navi cards it doesn't work even with HDMI.
It remains to be seen if it works with the new Nvidia cards, at least with HDMI adapters.

I am still at 8 bits per channel up to 240 MHz; DisplayCal confirms it.
I use xrandr for custom resolutions. When I tell it to output any resolution/refresh rate with a pixel clock higher than 240 MHz, my CRT very briefly flashes off and on (like it's trying to change resolution), and I get "xrandr: Configure crtc 0 failed" in the terminal.

Are you able to see the DisplayPort link rate?
Something like 2x 2.7 Gbps (HBR) or 2x 5.4 Gbps (HBR2)?
 
That's way weaker than the GPU in the PS5. You're not going to be running any modern games on three monitors with that card.
I only play FPS games; a 6700K + Titan X Maxwell will be more than enough to maintain over 200 fps in most modern FPS titles, so don't worry. I just want one monitor for games and 1-2 side monitors for stuff like Discord/music, but it's not a big deal either way, because I have another PC with a 3700X + 2070S that I can use for other games if I want.

Fortunately, my VMP454 is being shipped tomorrow and should arrive before the end of the week, and I have found two Dell P991s seemingly in good condition to go along with it :) Guess this CPD-520GS will be going to a friend of mine or into my storage room.
 