24" Widescreen CRT (FW900) From Ebay arrived,Comments.

From my understanding, the pixel clock limit is something exclusive to the digital-to-analog converter chipset, whether it's integrated (like your motherboard's VGA output or a video card with native analog output) or inside an external adapter (Rankie, Sunix, Delock, etc.), and is not related to the monitor, whose limits are instead things like its maximum vertical and horizontal frequencies.

According to info found on the internet, your monitor's horizontal and vertical limits are around 130 kHz and 170 Hz respectively. It's hard to tell what the "absolute" limit of a CRT is, since they don't have one specific native resolution and are limited more by things like those mentioned above. So to figure out its "max" limits (better to call them your own desired max limits), you can create custom resolutions and refresh rates inside those limits and inside the adapter's pixel clock limit, and find them yourself when you get your adapter. Something like the sketch below gives a quick sanity check before you try a mode.
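For a quick sanity check, here's a minimal sketch of that kind of limit check in Python. The blanking factors and the 130 kHz / 170 Hz / 330 MHz limits are illustrative assumptions of mine (real CVT timings differ), so treat the output as ballpark only:

```python
# Rough custom-resolution sanity check: estimate horizontal frequency and
# pixel clock for a candidate mode, then compare against the monitor's
# scan limits and the adapter's pixel clock limit.
# The blanking overheads below are generic approximations, not exact CVT.

H_BLANK = 1.20   # assume ~20% horizontal blanking
V_BLANK = 1.05   # assume ~5% vertical blanking

def check_mode(width, height, refresh_hz,
               max_h_khz=130.0, max_v_hz=170.0, max_pclk_mhz=330.0):
    h_freq_khz = height * V_BLANK * refresh_hz / 1000.0
    pclk_mhz = width * H_BLANK * h_freq_khz / 1000.0
    ok = (h_freq_khz <= max_h_khz and refresh_hz <= max_v_hz
          and pclk_mhz <= max_pclk_mhz)
    print(f"{width}x{height}@{refresh_hz}Hz -> ~{h_freq_khz:.1f} kHz, "
          f"~{pclk_mhz:.0f} MHz: {'OK' if ok else 'out of range'}")
    return ok

check_mode(1024, 768, 153)    # ~123 kHz, ~152 MHz: OK
check_mode(1600, 1200, 110)   # ~139 kHz: over the 130 kHz limit
```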

Good luck with your Rankie adapter, hope it suits your needs. Since you haven't decided to buy the Delock yet, in case you consider getting it: I don't remember if you have asked, but I want to let you know that its chipset, as with the Sunix DPU3000 and the Icy Box, has some weird issues with certain resolution and refresh rate combos, so some patience in learning to set those up is needed. For more info about those issues, search this thread for "dpu3000", "Icy Box", and "Delock 87685".
 
Ah, I see.. thank you for the advice!


Yes, I turned it on and.. well (I should've taken pictures) it looked like it had serious G2 drift, but thankfully I waited 30 minutes for it to warm up and did the COLOR RETURN. It showed some grey on screen, flipped on and off a little, and then came back with what seemed like PERFECT G2 levels! I'm surprised that after 10 years in storage (albeit climate controlled and dust-free) it doesn't seem to exhibit any lasting G2 drift at all. I was able to get some REALLY high refresh rates just from the VGA out on my Lenovo ThinkPad X220t: 153 Hz at 1024x768, and up to 2560x1920 @ 50 Hz. It seems the VGA out on my laptop is maxed out at 350 MHz; anything above that was "out of range" for its Intel integrated graphics... I was just testing it before I get my DAC to hook it up to my GTX 1080..

By the way, about the listing I purchased from on Amazon.de: the seller "Hawks Photo Video" emailed me the next day that the Icy Box IB-SPL1031 I ordered was OUT OF STOCK due to being discontinued, and they refunded my order.. :( So I'll have to go for a slightly more expensive Delock if I want the super high refresh rates and resolutions on my P1130... Honestly, I'm enjoying the lower resolutions with the higher refreshes more; the smoothness of motion is just UNREAL after having used VA panels exclusively for 10 years.. so I like lower res and higher refresh more than the higher resolutions.. so I MAY not actually NEED the Delock.. we'll see (I may just get one for shits and giggles, just to see what pixel clock the P1130 will max out at). I ordered a Rankie 1080P Active HDTV HDMI to VGA Adapter as a stop-gap while I consider whether I want to go for the Delock...

Does anyone on here KNOW the maximum pixel clock the Dell P1130 or Sony GDM-C520K can reach? Please let me know; this will REALLY affect my decision between spending $130+ on a Delock or keeping the $8 Rankie, lol..




Oh wow, thank you! How did you know this? And are you familiar with its max pixel clock?.. or any other little things about it?

By the way, how come it's not a rebrand of the GDM-C520K? It looked like they have the same specs? What's the difference between the GDM-C520K and the CPD-G520K?

Also, I'd like to ask you (or anyone who knows..) the following about my P1130..
-What brightness & contrast numbers should I be running this monitor at during and after calibration?
-What COLOR MODE should I be on: Easy, where I just select from 5000K to 11000K; PRESET, which is 5000K, 6500K, or 9300K; or Expert, where I have all the different RGB Gain/Bias levels to edit manually? Which mode should I run during and after calibration?


In short and not going too deep into details... The GDM-C520/520K has a different chassis than both the P1130 and the CPD-G520 (there is no CPD-G520K), which in fact share the same chassis. Besides that, the GDM-C520/520K had a high-grade picture tube with a different phosphor than the CPD-G520 and the P1130, which basically shared the same tube. In addition to this, the GDM-C520/520K came with a pigtail USB/VGA cable that made it possible for the monitor to be color calibrated using the Sony Artisan Color Reference System (the Sony Custom Color Calibration Application along with a color calibrator known as "the puck"). Other Sony monitors were not compatible with the Sony Custom Color Calibration Application.

The Sony Artisan (GDM-C520/520K), when properly calibrated, was color accurate and much better than any monitor I've used for this purpose. That is why it was the standard monitor widely used in service bureaus, color processing houses, and commercial printers (besides the BARCO), and by professional photographers as well as videographers. As of today, it pretty much rivals any LCD/LED display in terms of color accuracy, color rendition and black levels. Again, this is my professional opinion. Remember that ultimately, beauty is in the eye of the beholder...

Hope this helps...

Sincerely,

Unkle Vito!
 
Holy mackerel...
https://old.reddit.com/r/crtgaming/comments/iidwzp/ive_heard_you_guys_like_crts/
 
In short and not going too deep into details... The GDM-C520/520K has a different chassis than both the P1130 and the CPD-G520 (there is no CPD-G520K), which in fact share the same chassis. Besides that, the GDM-C520/520K had a high-grade picture tube with a different phosphor than the CPD-G520 and the P1130, which basically shared the same tube. In addition to this, the GDM-C520/520K came with a pigtail USB/VGA cable that made it possible for the monitor to be color calibrated using the Sony Artisan Color Reference System (the Sony Custom Color Calibration Application along with a color calibrator known as "the puck"). Other Sony monitors were not compatible with the Sony Custom Color Calibration Application.

The Sony Artisan (GDM-C520/520K), when properly calibrated, was color accurate and much better than any monitor I've used for this purpose. That is why it was the standard monitor widely used in service bureaus, color processing houses, and commercial printers (besides the BARCO), and by professional photographers as well as videographers. As of today, it pretty much rivals any LCD/LED display in terms of color accuracy, color rendition and black levels. Again, this is my professional opinion. Remember that ultimately, beauty is in the eye of the beholder...

Hope this helps...

Sincerely,

Unkle Vito!
I can also attest to high-end CRTs being color accurate when calibrated. My Dell XPS 9560 has a killer 4K screen: something to the tune of 99.2% Adobe RGB and 99.5% sRGB coverage, the last time I calibrated it.
But I recently got the ViewSonic G90f. Once calibrated, I don't feel like I'm looking at all that different a screen, as far as color fidelity is concerned.
[Attachment: G90f measured gamut vs. Rec. 709]

[Attachment: G90f color specs]

While the G90f may be missing some of the deep blues and reds of the Dell's screen, it makes up for it with a far superior contrast ratio. Not only are the G90f's blacks deep, they're deep at a very high OSD brightness setting: something like 75%.
Really, the main thing I'm missing is resolution. While a .21 mm dot pitch at 18" viewable (enough for the shadow mask to resolve about 1736x1302) makes the G90f the sharpest 18"-viewable CRT, it's still no 3840x2160.
Argh, if only Applied Nanotech Inc.'s dastardly patent fiasco hadn't caused SED's death! I'd surely have the best of all worlds by now, otherwise.
 
I haven't. But, that reminds me, I should update Deruptor's adapter summary with one I recently found locally that works.
I think it would be cool if we contributed to the list in a post every time we find an adapter that works, and bolded the newly added one.
I also think it would be nice if there were a list of adapters verified to not work well. I have, unfortunately, had to deal with 3 before finding one that works well - one of which is a popular Amazon choice with a MHz spec listed higher than what it's actually capable of. Not only does it give the usual useless "1920x1080@60Hz max" baloney, it specifically says it can do 250 MHz, and it can't! I actually only chose it because it was the only one on Amazon I found that listed a pixel clock - only to find out it was a lie!
So I'll probably start a list like that soon - at least to warn against that one popular option.
Anyway...

Update to Deruptor's cohesive adapter summary:

Adapters capable or potentially able to go over 360 MHz pixel clock:

With Synaptics chipset (DP to VGA)
ICY BOX IB-SPL1031 (tested)
Delock 87685 (tested)
Sunix DPU3000 (tested)
All three models have problems with some resolutions at specific refresh rates.

With Lontium chipset (USB-C to VGA maybe with 10 bit DAC)
Vention CGMHA and CMFHB (not tested)

Adapters capable or potentially able to go over 180 and up to 360 MHz pixel clock:

USB-C to VGA
Delock 62994 with Realtek RTD2169U (not tested)
Delock 62796 with ANX9847 (not tested)
Plugable USBC-VGA with ANX9847 (tested up to 330-335 MHz with no issues)
Sunix C2VC7A0 with ANX9847 (not tested)
Delock 63923 with Chrontel CH7212 (not tested)
Delock 63924-63925 with ITE IT6562 (not tested; 10-bit DAC)
J5 Create USBC-VGA (tested up to 245 MHz with no issues. I'd test higher, but my G90f gives me a "frequency over range" error with any resolution higher than about 2.6 megapixels, even when it's well below the G90f's max horizontal scan frequency. I'll update this when I have access to my Hitachi SuperScan Elite 751, which I'm pretty sure will try to display any resolution I send it, so long as it's under the max horizontal frequency, of course.)

DP to VGA
Delock 62967 with ANX9847 (tested)
With some cards it can't handle HBR2 mode, so no more than 180 MHz; changing the cable should solve the problem. When it works, it's perfect up to 340-350 MHz with any resolution and refresh rate (see the bandwidth sketch after this list).

Cheap adapters from Amazon (HDMI to VGA)
Benfei and Rankie HDMI to VGA: we don't know the model code or the chipset. They should handle up to 330 MHz, but it is necessary to set the output to YCbCr 4:4:4, and this can cause problems for some users because, with some drivers, that option is sometimes not present.
Tendak Female HDMI to Male VGA ~ https://www.amazon.com/dp/B01B7CEOVK/?ref=exp_retrorgb_dp_vv_d ~ (verified to work with no issues up to 210 MHz. I could test it more extensively next time I have access to my SuperScan Elite 751; the adapter isn't compatible with my G90f due to the G90f's fixed cable. Also, there's no need to set the output to YCbCr 4:4:4.)
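On the HBR2 note above: a DisplayPort link's lane count and rate cap the pixel clock before the adapter's DAC even matters. Here's a back-of-envelope sketch of that math; the lane rates are standard DP figures, but the simple 24 bpp division ignores real link overhead, so the results are upper bounds:

```python
# DisplayPort link bandwidth vs. usable pixel clock (upper bounds).
# Raw lane rates in Gbit/s; 8b/10b encoding leaves 80% as payload.

LANE_RATES = {"RBR": 1.62, "HBR": 2.70, "HBR2": 5.40}

def max_pclk_mhz(mode, lanes=4, bpp=24):
    payload_gbps = LANE_RATES[mode] * lanes * 0.8   # 8b/10b overhead
    return payload_gbps * 1e3 / bpp                 # Gbit/s -> Mpixel/s

for mode in LANE_RATES:
    print(f"{mode} x4: {max_pclk_mhz(mode):.0f} MHz max at 24 bpp")
# RBR x4: 216 MHz, HBR x4: 360 MHz, HBR2 x4: 720 MHz
```

Interestingly, HBR over only two lanes works out to exactly 180 MHz at 24 bpp, which would fit the fallback behavior described for the Delock 62967.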
 
.... Remember that ultimately, beauty is in the eye of the beholder...
And for good reason! I read a while back that people have different amounts of these things in their eyes called "cones." Most women tend to have more cones than men, which allows them to see more colors. So there IS a physical reason why the saying "beauty is in the eye of the beholder" is more truthful than most people realize.
 
Hi all, I’m a bit out of my depth here. Was lucky enough to get a GDM FW900 in the past few months but I’m really having trouble finding a cable from USB-C, HDMI, or DisplayPort that’ll let me access the full range of resolutions and refresh rates for the monitor. At the moment I’m limited to 1280x1024 and it looks awful.

Not absolutely essential, but it’d be preferred if there’s any way to have this cable work over 5 metres (16’5”). I don’t mind any cost within reason.

Thank you!
 
Hi all, I’m a bit out of my depth here. Was lucky enough to get a GDM FW900 in the past few months but I’m really having trouble finding a cable from USB-C, HDMI, or DisplayPort that’ll let me access the full range of resolutions and refresh rates for the monitor. At the moment I’m limited to 1280x1024 and it looks awful.

Not absolutely essential, but it’d be preferred if there’s any way to have this cable work over 5 metres (16’5”). I don’t mind any cost within reason.

Thank you!
Well, check the list I just updated a few posts up. I use a J5 Create USBC-VGA that works well. But I recommend something that can do at least 300 MHz of pixel clock for the FW900; I could only verify the J5 Create up to 245 MHz due to the limitations of my current CRT. I'll see what happens with my other one.
Also, as for the length issue, just get a DisplayPort-compatible male-to-female Type-C extender cable. They're pretty common.
 
And for good reason! I read a while back that people have different amounts of these things in their eyes called "cones." Most women tend to have more cones than men, which allows them to see more colors. So there IS a physical reason why the saying "beauty is in the eye of the beholder" is more truthful than most people realize.
True! However, in the case of displays, only men with color blindness would ever see fewer colors than women. You'd have to find some crazy display with something like a 250%+ Adobe RGB gamut for that to make a difference - and I'm likely lowballing the numbers, even then.
 
From my understanding, the pixel clock limit is something exclusive to the digital-to-analog converter chipset, whether it's integrated (like your motherboard's VGA output or a video card with native analog output) or inside an external adapter (Rankie, Sunix, Delock, etc.), and is not related to the monitor, whose limits are instead things like its maximum vertical and horizontal frequencies.

According to info found on the internet, your monitor's horizontal and vertical limits are around 130 kHz and 170 Hz respectively. It's hard to tell what the "absolute" limit of a CRT is, since they don't have one specific native resolution and are limited more by things like those mentioned above. So to figure out its "max" limits (better to call them your own desired max limits), you can create custom resolutions and refresh rates inside those limits and inside the adapter's pixel clock limit, and find them yourself when you get your adapter.

Good luck with your Rankie adapter, hope it suits your needs. Since you haven't decided to buy the Delock yet, in case you consider getting it: I don't remember if you have asked, but I want to let you know that its chipset, as with the Sunix DPU3000 and the Icy Box, has some weird issues with certain resolution and refresh rate combos, so some patience in learning to set those up is needed. For more info about those issues, search this thread for "dpu3000", "Icy Box", and "Delock 87685".
Yeah, I bought a Delock and I'm waiting for it from Germany lol, or was it a Russian site, idk lol. They have my money and I'll report back if I get a working Delock unit :)

This begs the question... what are the HIGHEST vertical and horizontal limits on CRTs? I've heard of the FW900 doing 140 kHz, which is nice.. and I heard of a Nokia CRT that was capable of something like 340 Hz vertical because they didn't lock the signal limit? Ah.. it is the Nokia 445 Pro. I'd LOVE one of those, I can see it now, a CRT at 340 Hz... wow :) Was there ever any OTHER CRT with unlocked vertical and horizontal limits?

EDIT: I found one! The Iiyama Vision Master Pro 514, a 22" CRT monitor:

It seems to be a Diamondtron U2 tube with higher horizontal and vertical limits, at the same size as the Dell P1130 and its Sony original, the CPD-G520 21" models..
  • Max V-Sync Rate: 200 Hz
  • Max H-Sync Rate: 142 kHz
The Nokia 445 Pro seemed to be a Sony Trinitron tube where Nokia just didn't lock the V-sync rate.. would we be able to extract a Nokia's firmware and upload it to a Trinitron with a matching tube, to unlock the V-sync rate? Has anyone tried this?

In short and not going too deep into details... The GDM-C520/520K has a different chassis than both the P1130 and the CPD-G520 (there is no CPD-G520K), which in fact share the same chassis. Besides that, the GDM-C520/520K had a high-grade picture tube with a different phosphor than the CPD-G520 and the P1130, which basically shared the same tube. In addition to this, the GDM-C520/520K came with a pigtail USB/VGA cable that made it possible for the monitor to be color calibrated using the Sony Artisan Color Reference System (the Sony Custom Color Calibration Application along with a color calibrator known as "the puck"). Other Sony monitors were not compatible with the Sony Custom Color Calibration Application.

The Sony Artisan (GDM-C520/520K), when properly calibrated, was color accurate and much better than any monitor I've used for this purpose. That is why it was the standard monitor widely used in service bureaus, color processing houses, and commercial printers (besides the BARCO), and by professional photographers as well as videographers. As of today, it pretty much rivals any LCD/LED display in terms of color accuracy, color rendition and black levels. Again, this is my professional opinion. Remember that ultimately, beauty is in the eye of the beholder...

Hope this helps...

Sincerely,

Unkle Vito!
That does help, thanks!
 
FW900 vertical and horizontal limits, extracted from the manual:
• Horizontal scan range 30 kHz to 121 kHz
• Vertical scan range 48 Hz to 160 Hz
 
Looks like on the new RTX 3000 series graphics cards the USB C port is gone.
 
Looks like on the new RTX 3000 series graphics cards the USB C port is gone.
Could have to do with their miniaturization plan of squishing more things together so that the fans can blow through a few areas of the PCB.
 
To calculate the pixel clock, go here; after entering the resolution and refresh rate, click on the white area of the page, read the calculated pixel clock under CVT timings, and also check the horizontal frequency (H Freq).
488 MHz is for 2304x1440 at 100 Hz; for 80 Hz it's 383 MHz. The FW900 specs are:
-Vertical refresh 48 to 160 Hz
-Horizontal frequency 30 to 121 kHz
You can't go under or over these limits, but you can do whatever you want between them.
1920x1200 at 120 Hz requires a horizontal frequency of 154 kHz, so it's impossible on a FW900.
The Startech DP2VGAHD20 can do 375 MHz on the sample tested by Flybye; other samples may do more or less, so 375 MHz is not a fixed figure for every DP2VGAHD20.
With 375 MHz you can probably get 2304x1440 at 80 Hz by reducing the total blanking; 79 Hz for sure.
2560x1600 at 60 Hz needs a pixel clock of 348 MHz and a horizontal frequency of 99 kHz, so you can do that, but if you want things like 2560x1600 at 73 Hz or higher resolutions with very high pixel clocks, you need an adapter based on the Synaptics chipset (check here).
Take into consideration that all these adapters have a DAC that per spec runs at 200 MHz; the fastest is the ANX9847 with a default speed of 270 MHz, so everything you get beyond that is out of spec.
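If you want to play with these numbers offline, here's a small script that approximates CVT reduced-blanking timings (fixed 160-pixel horizontal blanking, ~460 µs vertical blanking interval). It's my own approximation, not the exact CVT algorithm, so results can differ by a few MHz from a real timing calculator; standard CVT blanking is considerably larger, which is why the figures above (383 MHz for 2304x1440 at 80 Hz) are higher than what reduced blanking gives:

```python
# Approximate CVT reduced-blanking (CVT-RB) pixel clock and horizontal
# frequency for a given mode. Ballpark figures only; verify with a real
# timing calculator before driving the monitor.

RB_H_BLANK_PX = 160      # CVT-RB uses a fixed 160-pixel horizontal blank
RB_V_BLANK_S = 460e-6    # CVT-RB minimum vertical blanking interval

def rb_timing(h_active, v_active, refresh_hz):
    h_total = h_active + RB_H_BLANK_PX
    # vertical blanking occupies a fixed time slice of each frame
    v_total = round(v_active / (1.0 - RB_V_BLANK_S * refresh_hz))
    pclk_mhz = h_total * v_total * refresh_hz / 1e6
    h_freq_khz = v_total * refresh_hz / 1e3
    return pclk_mhz, h_freq_khz

for w, h, hz in [(2304, 1440, 80), (1920, 1200, 96), (2560, 1600, 60)]:
    pclk, hfreq = rb_timing(w, h, hz)
    fits = "OK" if hfreq <= 121 else "over"
    print(f"{w}x{h}@{hz}Hz: ~{pclk:.0f} MHz, ~{hfreq:.1f} kHz "
          f"({fits} for the FW900's 121 kHz limit)")
```

With reduced blanking, 2304x1440 at 80 Hz comes out around 295 MHz, which is how a 375 MHz adapter can reach it.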

3200x2400? Interlaced maybe..

So the Synaptics can definitely do a pixel clock of 429? Also, I meant 3200x2000, and that would be under 121 kHz, sorry for the confusion.

Here is the link to my reply with the testing that I did.

Just keep in mind that I am pushing these converters to their limit, and they all do different and funky things at their limit. The limit of my converter could be a hair different from the limit of one you get.

Sorry for not replying sooner; I was waiting to see what Nvidia announced with the 30 series. And it seems they are doing away with the USB-C port, so getting a DP converter might be the best way to go. If the DP2VGAHD20 can do 375 MHz, then why are people going for the Synaptics? I know I'd want 1440p80 over, say, 75, but it seems a bit excessive: £20 vs £100+. Another option would be to get a 9-series card; now that the 30 series is about to come out, it might be possible to get a Titan X for around £125, which is just a little more than the Delock, or a 980/Ti, since those are easier to come by.

As for the monitor itself, I was just wondering how much effort is required to get the image really good. Or, more to the point, is using WinDAS and something like a SpyderX Pro required, or can one just wing it with the OSD?

Sorry for all the questions; I do know a thing or two, I just like researching everything before spending money and looking into things on another level. I know there is a lot of information in this thread already, but it seems like this thread has become the random CRT thread; it's very hard to find information here. So many different streams of conversation, most about other monitors or just random CRT talk.
 
If the DP2VGAHD20 can do 375 MHz, then why are people going for the Synaptics?

The Synaptics can almost get to 550 MHz, which actually blows Nvidia's analog-capable cards like the 980 Ti out of the water. Those topped out at 400 MHz and couldn't be unlocked like AMD's cards. But if you only have a 19" monitor, 375 MHz is enough for most resolutions you could run.

But on my 21" LaCie, I can run 2880x2160 @ 60 Hz, which is around 535 MHz.
 
I found an adapter that can potentially have the same performance as the DPU3000.
The chipset is the Lontium LT8711X-B, which has the same specs as the Synaptics VMM2322.
I asked Lontium about the DAC precision and it seems to be 10-bit, but that's a thing that must be tested.

Wonder if this means that the Sunix DPU3000 and Delock 87685 may be capable of 10 bit DAC precision.
 
The Nokia 445 Pro seemed to be a Sony Trinitron tube where Nokia just didn't lock the V-sync rate.. would we be able to extract a Nokia's firmware and upload it to a Trinitron with a matching tube, to unlock the V-sync rate? Has anyone tried this?
I am also very interested in this - in unlocking CRT V-sync limits in general. I bet a lot more CRTs than just that Nokia could easily go beyond their written limits, with the right firmware or the like.
 
EDIT: I found one! The Iiyama Vision Master Pro 514, a 22" CRT monitor:

It seems to be a Diamondtron U2 tube with higher horizontal and vertical limits, at the same size as the Dell P1130 and its Sony original, the CPD-G520 21" models..
  • Max V-Sync Rate: 200 Hz
  • Max H-Sync Rate: 142 kHz
Wow, a CRT that can do 992x620@200Hz? Dang, what an awesome monitor. Sure would like to own one of those!

Oh, wait a minute, I can!

https://www.ebay.com/c/55513583


Just gotta sell my car first :^)
 
Wow, a CRT that can do 992x620@200Hz? Dang, what an awesome monitor. Sure would like to own one of those!

Oh, wait a minute, I can!

https://www.ebay.com/c/55513583

Just gotta sell my car first :^)

:ROFLMAO:

I had 3 of these several years ago. I ended up dismantling them for parts and throwing away the casings/tubes. Want to know why? Because for whatever reason the corners of the display become blurry with age, and there's apparently nothing you can do about it. Bad focus in the corners might actually have been somewhat of a problem even when they were new.
So yeah, the specs are impressive but this model is garbage IMO.
 
So the Synaptics can definitely do a pixel clock of 429? Also, I meant 3200x2000, and that would be under 121 kHz, sorry for the confusion.

Sorry for not replying sooner; I was waiting to see what Nvidia announced with the 30 series. And it seems they are doing away with the USB-C port, so getting a DP converter might be the best way to go. If the DP2VGAHD20 can do 375 MHz, then why are people going for the Synaptics? I know I'd want 1440p80 over, say, 75, but it seems a bit excessive: £20 vs £100+. Another option would be to get a 9-series card; now that the 30 series is about to come out, it might be possible to get a Titan X for around £125, which is just a little more than the Delock, or a 980/Ti, since those are easier to come by.

As for the monitor itself, I was just wondering how much effort is required to get the image really good. Or, more to the point, is using WinDAS and something like a SpyderX Pro required, or can one just wing it with the OSD?

Sorry for all the questions; I do know a thing or two, I just like researching everything before spending money and looking into things on another level. I know there is a lot of information in this thread already, but it seems like this thread has become the random CRT thread; it's very hard to find information here. So many different streams of conversation, most about other monitors or just random CRT talk.

Enhanced Interrogator's explanation is one of the reasons. Also, the capabilities of the Synaptics-based chips, especially the DPU3000, were discovered some years ago, while the DP2VGAHD20 has only recently been tested. There are few good quality adapters among too many generic, crappy ones on the internet, with inaccurate or very limited technical specs for the VGA output (especially about pixel clock capabilities), which makes it difficult to know which adapter is ideal. Users like Derupter have made very helpful summaries of good tested adapters and potentially good untested ones; for more info, search his posts. In my country, for example, you mostly see generic unbranded low quality adapters that seem made to output analog signals like 1024x768@60Hz, so I had to import the DPU3000 from the USA when I bought it about a year and a half ago, thanks to info about its pros and cons provided by users like him. Also, if you are in a country where you can return a product due to dissatisfaction, I believe that's the best thing you can do: test for yourself which adapter suits your needs, since manufacturers are not being too helpful with CRT users about their adapters' real specs.

You can get very good results via the OSD, especially via the Expert menu, but to get the best out of the monitor, hardware colorimeter calibration is of course the best thing to do. Search for user spacediver's WinDAS White Point Balance (WPB) guide to see instructions and get an idea of how complex it can be.



Wonder if this means that the Sunix DPU3000 and Delock 87685 may be capable of 10 bit DAC precision.
Excuse my ignorance, but is there a way to check that without a hardware colorimeter?
 
:ROFLMAO:

I had 3 of these several years ago. I ended up dismantling them for parts and throwing away the casings/tubes. Want to know why? Because for whatever reason the corners of the display become blurry with age, and there's apparently nothing you can do about it. Bad focus in the corners might actually have been somewhat of a problem even when they were new.
So yeah, the specs are impressive but this model is garbage IMO.
Huh, bummer.
I wonder if any other Diamondtron U2 based CRTs could actually do 200Hz with some tweaking.
 
Excuse my ignorance, but is there a way to check that without a hardware colorimeter?

Possibly, but you'd probably need some instruments to probe the circuits. I'm not sure there's a simple way to check through software, though I might be wrong.

You could also possibly use a multimeter to measure the voltages coming out of the output, and test the precision that way.
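In that spirit, here's a hypothetical little test-pattern script (tkinter; my own sketch, not an established procedure): it fills the screen with a single grey level that you step with the arrow keys, so you can probe the VGA R/G/B pins (0-0.7 V full scale) with the multimeter at each step. Note it only exercises 8-bit steps; actually confirming 10-bit precision would require a genuine 10-bit output path from the GPU and driver.

```python
# Full-screen grey-level stepper for probing DAC output voltages with a
# multimeter. Left/Right arrows step the 8-bit level by 1; Escape quits.
import tkinter as tk

root = tk.Tk()
root.attributes("-fullscreen", True)
level = 128  # start at mid-grey

def show(delta=0):
    global level
    level = max(0, min(255, level + delta))
    root.configure(background=f"#{level:02x}{level:02x}{level:02x}")

root.bind("<Left>", lambda e: show(-1))
root.bind("<Right>", lambda e: show(+1))
root.bind("<Escape>", lambda e: root.destroy())
show()
root.mainloop()
```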
 
I will be offering two (2) GDM-FW900 from my personal collection. Anyone interested, please PM me... I kindly ask for serious inquiries...

Take care and keep safe!

Unkle Vito!
 
Looks like on the new RTX 3000 series graphics cards the USB C port is gone.

This is from Anandtech:
Meanwhile VirtualLink ports, which were introduced on the RTX 20 series of cards, are on their way out. The industry's attempt to build a port combining video, data, and power in a single cable for VR headsets has fizzled, and none of the big 3 headset manufacturers (Oculus, HTC, Valve) used the port. So you will not find the port returning on RTX 30 series cards.

So the only way to use USB-C adapters with new cards is this or this (and other methods that I don't know)

Wonder if this means that the Sunix DPU3000 and Delock 87685 may be capable of 10 bit DAC precision.

They have the same specs because both have a 17.28 Gbit/s digital input bandwidth and a 200 MHz DAC, but the Synaptics is 8-bit and the Lontium 10-bit; information obtained from the respective manufacturers (could be bullshit).
 
This is from Anandtech:
Meanwhile VirtualLink ports, which were introduced on the RTX 20 series of cards, are on their way out. The industry's attempt to build a port combining video, data, and power in a single cable for VR headsets has fizzled, and none of the big 3 headset manufacturers (Oculus, HTC, Valve) used the port. So you will not find the port returning on RTX 30 series cards.

The situation gets more dire.

I wonder what John Linneman is going to switch to. He was using the Vention USB-C adapter with his FW900. I don't know how many people were using it but I didn't hear any complaints.

HD Fury really needs to bail us out, but they're too busy raking in the cash on their home theater and production focused gadgets.
 
Possibly, but you'd probably need some instruments to probe the circuits. I'm not sure there's a simple way to check through software, though I might be wrong.

You could also possibly use a multimeter to measure the voltages coming out of the output, and test the precision that way.


I have a multimeter with which I can test the VGA output voltage from my DPU3000, if you think it would be helpful for your wondering; just confirm 😉



The situation gets more dire.

I wonder what John Linneman is going to switch to. He was using the Vention USB-C adapter with his FW900. I don't know how many people were using it but I didn't hear any complaints.

HD Fury really needs to bail us out, but they're too busy raking in the cash on their home theater and production focused gadgets.

Wouldn't an inexpensive USB-C to DisplayPort cable be enough, since both carry a digital signal, so no active adapter would be needed for signal conversion? Well, just wondering; I'm no expert on this, it could be nonsense on my part.

Those HD Fury adapters seem very expensive, even much more than the Synaptics-based ones. If the HDF1 Nano GX costs $100 with a max of just 165 MHz on the VGA output (discontinued, by the way; they suggest the X3 or X4, $179 and $289 respectively, and just 225 MHz each 😱), I can't imagine how brutally expensive something from them around 400 MHz would be :hungover:

Also, I don't think the current situation with adapters is as bad as it was when Nvidia released the GTX 1000 series without analog support; at least there are other cheap adapters reported to do the job well up to 330+ MHz, to name some, the Benfei and Rankie brand ones.

I'm personally satisfied with the Synaptics-based one, even with its weird issues, which are definitely workaround-able. It remains to be seen how those will behave on the RTX 30 series.
 
Perhaps if the teachings of how great CRTs are were made more known, then maybe in an alternate universe video card providers would reintroduce analog connectors. ;)
 
Perhaps if the teachings of how great CRTs are were made more known, then maybe in an alternate universe video card providers would reintroduce analog connectors. ;)
Yeah, and perhaps in this same alternate universe, Applied Nanotech Inc's patents never killed SED displays, and thus all would be well in the world ;P
 
Enhanced Interrogator's explanation is one of the reasons. Also, the capabilities of the Synaptics-based chips, especially the DPU3000, were discovered some years ago, while the DP2VGAHD20 has only recently been tested. There are few good quality adapters among too many generic, crappy ones on the internet, with inaccurate or very limited technical specs for the VGA output (especially about pixel clock capabilities), which makes it difficult to know which adapter is ideal. Users like Derupter have made very helpful summaries of good tested adapters and potentially good untested ones; for more info, search his posts. In my country, for example, you mostly see generic unbranded low quality adapters that seem made to output analog signals like 1024x768@60Hz, so I had to import the DPU3000 from the USA when I bought it about a year and a half ago, thanks to info about its pros and cons provided by users like him. Also, if you are in a country where you can return a product due to dissatisfaction, I believe that's the best thing you can do: test for yourself which adapter suits your needs, since manufacturers are not being too helpful with CRT users about their adapters' real specs.

You can get very good results via the OSD, especially via the Expert menu, but to get the best out of the monitor, hardware colorimeter calibration is of course the best thing to do. Search for user spacediver's WinDAS White Point Balance (WPB) guide to see instructions and get an idea of how complex it can be.

Thanks for your reply. Some of the Synaptics ones seemed to be a reasonable price a while ago, but now they are not. I think I'll try the DP2VGAHD20 since it's from and sold by Amazon, so I can send it back if it doesn't work.

Thanks for the info but I do not have a URL/link to that website...

Sincerely,

Unkle Vito!

I think he's referring to this: https://www.reddit.com/r/crtgaming/.

Where are you from, out of interest?
 
What kind of luminance are you guys getting with your calibrated 6500K screens?

My 2060u is getting 73 cd in 'text mode' with the RGB settings cranked to just before color bleeding.

In 'normal mode' it goes up to about 87 cd at 6500K. Normal mode at 77% brightness is only usable for gaming, because it washes out the image to an ugly yellowish color and has to absolutely crush the blacks to achieve that brightness level.

Then if I switch to 'graphics-1 mode' it switches to 9300K and goes up to around 103 cd.
 

What kind of luminance are you guys getting with your calibrated 6500K screens?
Why don't you calibrate it via DisplayCAL calibration (before profiling the screen) using the service menu? Those Mitsubishi tubes are very long lasting; my 20k-hour 22b4 easily calibrated to 100 cd/m² at every color temp.


Scored an F520 with 20k hours on it, and boy, it's a sharp one.
Had some trouble setting the right G2 in WPB, but it calibrates nicely.
Funny how it's considerably smaller than a 22b4; I thought the difference would be barely noticeable.
I've figured out my problems with WPB; I had to lower G2 a bit more. Perfect blacks now.
But there's still a funny issue with this one. When running a high res/Hz combo it starts to blur the picture when contrast is set high enough. At lower res/Hz it does not do this / the blur is very negligible.
1384x1038@120Hz, 255 MHz: blurs at over 60 contrast.
1152x864@145Hz, 216 MHz: blurs a bit at 100 contrast, almost perfect picture.
 
Why don't you calibrate it via DisplayCAL calibration (before profiling the screen) using the service menu? Those Mitsubishi tubes are very long lasting; my 20k-hour 22b4 easily calibrated to 100 cd/m² at every color temp.

I honestly don't know why I haven't gotten that far with profiling; won't I just end up with a more color accurate but lower brightness screen? I want to maximise cd at default Windows settings.

I know your electron blue 22 is the best of the best of the superbright Diamondtron tubes. The 2060u is the best of the non-superbright Diamondtron tubes. What sort of cd and color temps can you get out of the 300 cd superbright mode?




I've figured out my problems with WPB; I had to lower G2 a bit more. Perfect blacks now.
But there's still a funny issue with this one. When running a high res/Hz combo it starts to blur the picture when contrast is set high enough. At lower res/Hz it does not do this / the blur is very negligible.
1384x1038@120Hz, 255 MHz: blurs at over 60 contrast.
1152x864@145Hz, 216 MHz: blurs a bit at 100 contrast, almost perfect picture.
 
Huh... this is interesting. This post from 2003 claims their calibrated 6500K 255,255,255 2070SB was only putting out 72 cd in, I'm assuming, the non-superbright modes.

I'd say I'm doing pretty well with 73 cd on the 2060u's text mode with deep blacks.

What do you guys think?
 

This is from Anandtech:
Meanwhile VirtualLink ports, which were introduced on the RTX 20 series of cards, are on their way out. The industry's attempt to build a port combining video, data, and power in a single cable for VR headsets has fizzled, and none of the big 3 headset manufacturers (Oculus, HTC, Valve) used the port. So you will not find the port returning on RTX 30 series cards.

So the only way to use USB-C adapters with new cards is this or this (and other methods that I don't know)



They have the same specs because both have a 17.28 Gbit/s digital input bandwidth and a 200 MHz DAC, but the Synaptics is 8-bit and the Lontium 10-bit; information obtained from the respective manufacturers (could be bullshit).

Thanks for the clarification. It will be a while before I get around to testing the bit depth of those adapters, but it is something I hope to get done before I die :p
 