24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Check out this post:
https://hardforum.com/threads/24-wi...ived-comments.952788/page-435#post-1044652495
Alongside what it lists, I can personally vouch for this HDMI female to VGA male adapter: https://www.amazon.com/dp/B01B7CEOVK/?ref=exp_retrorgb_dp_vv_d
It's good to at least a 235 MHz pixel clock. I might test it at higher clocks at some point, even though my CRT doesn't support anything higher, just to help contribute to a list of valid adapter choices. Unfortunately, most adapters seldom have the DAC pixel clock listed in the specs, because they're geared more towards crappy old LCD monitors and projectors than anything else. 1920x1080@60p, with its puny 173 MHz pixel clock, is pretty much always the advertised maximum for those things.
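As a rough way to compare adapters, here is a tiny sketch (my own, not from any post) that filters modes by the pixel clocks quoted in this thread:

```python
# Quick filter: which modes fit within an adapter's tested pixel clock
# limit, judging by pixel clock alone. The MHz figures are the CVT values
# quoted in this thread; the helper itself is hypothetical.

MODES_MHZ = {
    "1920x1080@60": 173,   # typical max for cheap HDMI->VGA adapters
    "2304x1440@80": 383,
    "2304x1440@100": 488,
    "2560x1600@60": 348,
}

def modes_within(adapter_max_mhz):
    """Return the mode names this adapter can drive, by pixel clock alone."""
    return [m for m, clk in MODES_MHZ.items() if clk <= adapter_max_mhz]

print(modes_within(235))  # the HDMI adapter above: ['1920x1080@60']
print(modes_within(375))  # ['1920x1080@60', '2560x1600@60']
```

Other constraints (horizontal frequency limits of the monitor, DAC quality) still apply, so this is only a first cut.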

I thought the problem with the 10 series is that it doesn't output an analogue signal, which is why there is no DVI-I or why the VGA to HDMI cable I tried didn't work. I'm presuming then it is a converter rather than just an adaptor.

The USB-C output on motherboards is no good for video adapters: it usually doesn't support DisplayPort alternate mode, and when it is supported it only works with the CPU's integrated graphics, so there's no way to connect it to your GTX 1060.
As for adapters, try the StarTech DP2VGAHD20: it has been tested by Flybye and seems to be a very good DP adapter, with a pixel clock of 375 MHz. If you want more, try one of the Synaptics-based ones.

Why are USB solutions suggested then?

Forgive me for my lack of knowledge (my background isn't electrical engineering), but 2304x1440 at 100 Hz, or anything just over spec (I don't really want to strain my rare and valuable piece of equipment), should be no more than a 350 MHz pixel clock, so as long as the converter can do that pixel clock it should be fine.

Also I'm in the UK and that DP one you suggested is quite expensive compared to some others I have seen.

Thanks.
 
As an Amazon Associate, HardForum may earn from qualifying purchases.

2304x1440 at 100 Hz is not supported by the FW900, and even if it were, it would require 488 MHz.
USB-C adapters are for video cards with that output, like the Nvidia RTX series; you can connect your card to a USB-C dongle, but only with an additional special card that costs 80 euros or more.
Which resolutions do you want to use most?
The StarTech DP2VGAHD20 costs £27.59 on Amazon UK, which is a good price.
If you want to spend less you can try the Delock 62967 or the cheap Benfei and Rankie HDMI to VGA adapters, but with those you may have problems.
 
Have you guys seen the CRT Collective Facebook page lately? This bloke found a 28 inch Intergraph... the legendary John Carmack screen...
 
I'm curious.

In the age of Variable Frame Rate screens (G-Sync / FreeSync, etc.) what are the perceived benefits of continuing to use one of these?

It would seem it would be a lot of tradeoffs.

Absolute input lag would certainly be lower, but to take advantage of it you'd either need to put up with tearing or use vsync which would negate the input lag advantage, wouldn't it?

I miss CRTs from a nostalgic perspective, but I'm not sure I could go back to one, and I don't miss having a warm red face :p
 
In the age of Variable Frame Rate screens (G-Sync / FreeSync, etc.) what are the perceived benefits of continuing to use one of these?

99.5% of G-Sync displays are still sample-and-hold, which means significantly reduced motion clarity compared with strobed displays.


Absolute input lag would certainly be lower, but to take advantage of it you'd either need to put up with tearing or use vsync which would negate the input lag advantage, wouldn't it?

RTSS has a new feature called "scanline sync" that basically gives the user control over the vsync-off tearing line, so they can put it at the edge of the screen, or all the way off-screen in the blanking interval. So this is vsync with vsync-off input lag.

But there are ways to minimize lag with traditional vsync as well, like capping your framerate to 0.002 Hz below your refresh rate, which will cut out the frame buffers, with the side effect of a 16 ms stutter once every few minutes.
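The arithmetic behind that occasional stutter can be sanity-checked with a one-liner (my own sketch, not from the post):

```python
# When the frame cap sits offset_hz below the refresh rate, the two clocks
# drift apart by offset_hz cycles per second, so they realign (one frame
# gets repeated -> a stutter) once every 1/offset_hz seconds.

def stutter_interval_s(offset_hz: float) -> float:
    """Seconds between stutters for a cap set offset_hz below refresh."""
    return 1.0 / offset_hz

# 0.002 Hz below the refresh rate -> one stutter every 500 s,
# i.e. "once every few minutes", matching the claim above.
print(stutter_interval_s(0.002))  # 500.0
```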
 


For me, while I agree CRTs are no longer essential equipment for competitive gaming, and modern monitors are fast enough to enjoy games from a competitive perspective,
from a beautiful, realistic, immersive graphical gaming perspective, modern monitors are still too flawed against CRTs for my taste, especially in the motion handling department. Motion blur has been a disgrace for games on modern displays: it destroys the immersion. Games with nice-looking details such as lighting, high-res textures, landscapes, and colors lose all of that in motion. In real life I don't see the world getting blurred and washed out as I move, the way it does on modern displays, and the brain knows that, so the game world never feels as realistically immersive as it does on a CRT, whose motion handling is as clean as how I perceive real life.

Unfortunately, as said, G-Sync and FreeSync motion is also blurred. I'm aware there are some modern monitors that try to offer CRT-like motion handling, but all the ones I have read about, seen, or heard of have tradeoffs when you enable their improved motion option (strobing): a very dim picture that makes it boringly lifeless and ruins the improvement, with brightness considerably lower than a CRT's and often no way to adjust it (and the few that do allow strobe brightness adjustment decrease motion quality again); crosstalk artifacts (double images at the top and/or bottom of the screen while the image is moving); and more aggressive strobing flicker than a CRT at the same frequency, forcing you to run much higher strobing frequencies to perceive it as flicker-free and to reduce crosstalk. That in turn requires more computing power to match the framerate at those high frequencies, reducing the chances of holding a constant framerate matched to the refresh rate even with high-end hardware, and a constant matched framerate is critical for lifelike motion quality. There are also the poor blacks of modern monitors: IPS has nasty glow and bleed, TN has noticeable backlight that ruins dark games' immersion, and VA seems to have better blacks but still has the motion handling issues mentioned above. OLED comes close, but it is very expensive and its motion handling is still limited, with similar flaws.

Also, as mentioned, things like scanline sync make the lack of G-Sync/FreeSync on CRTs not really a problem.
Well, those are some of my reasons, going further than just nostalgia, why I still prefer CRT monitors over modern ones, even for modern gaming.
 
2304x1440 at 100 Hz is not supported by the FW900, and even if it were, it would require 488 MHz.
USB-C adapters are for video cards with that output, like the Nvidia RTX series; you can connect your card to a USB-C dongle, but only with an additional special card that costs 80 euros or more.
Which resolutions do you want to use most?
The StarTech DP2VGAHD20 costs £27.59 on Amazon UK, which is a good price.
If you want to spend less you can try the Delock 62967 or the cheap Benfei and Rankie HDMI to VGA adapters, but with those you may have problems.
The DP2VGAHD20 is based on the IT6564.

Edit: found that this was already mentioned. Oops.

Thanks for your reply. How are you calculating the pixel clock? If it is 488 MHz, then the DP2VGAHD20 wouldn't be able to do 2304x1440 at 80 Hz. But the monitor should be able to do resolutions and refresh rates just over spec, such as 2560x1600 at 60 Hz or 1920x1200 at 120 Hz; I even saw someone say it could do 3200x2400 a few years ago. I don't want to go massively over spec, but a converter that can go a little over, such as 2304x1440 at 80 Hz + 25%, seems like a sensible choice.
 

To calculate the pixel clock go here; after entering the resolution and refresh rate, click on the white area of the page, look at the calculated pixel clock under CVT timings, and also check the horizontal frequency (H Freq).
488 MHz is for 2304x1440 at 100 Hz; for 80 Hz it is 383 MHz. The FW900 specs are:
-Vertical refresh: 48 to 160 Hz
-Horizontal frequency: 30 to 121 kHz
You can't go under or over these limits, but you can do whatever you want between them.
1920x1200 at 120 Hz requires a horizontal frequency of 154 kHz, so it's impossible on a FW900.
The StarTech DP2VGAHD20 can do 375 MHz on the sample tested by Flybye; other samples may do more or less, so 375 MHz is not a fixed limit for every DP2VGAHD20.
With 375 MHz you can probably get 2304x1440 at 80 Hz by reducing the total blanking; 79 Hz for sure.
2560x1600 at 60 Hz needs a pixel clock of 348 MHz and a horizontal frequency of 99 kHz, so you can do that. But if you want things like 2560x1600 at 73 Hz, or higher resolutions with very high pixel clocks, you need an adapter based on the Synaptics chipset (check here).
Keep in mind that all of these adapters have a DAC rated at 200 MHz on the spec sheet; the fastest is the ANX9847, with a default speed of 270 MHz, so anything beyond that is out of spec.
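For a quick sanity check without the online calculator, the pixel clock and horizontal frequency can be estimated from the mode totals. This is a rough sketch, not the exact CVT algorithm (real CVT blanking is larger, so its pixel clocks come out higher); the blanking fractions here are assumptions:

```python
# Rough timing estimate. Assumption: ~25% horizontal blanking and ~3%
# vertical blanking; the exact CVT formula yields somewhat different totals.

def estimate_timings(h_active, v_active, refresh_hz,
                     h_blank=0.25, v_blank=0.03):
    """Return (pixel_clock_MHz, h_freq_kHz) for a rough mode estimate."""
    h_total = round(h_active * (1 + h_blank))
    v_total = round(v_active * (1 + v_blank))
    pclk_mhz = h_total * v_total * refresh_hz / 1e6
    h_freq_khz = v_total * refresh_hz / 1e3
    return pclk_mhz, h_freq_khz

# 2304x1440 @ 80 Hz: CVT says 383 MHz; this crude estimate lands lower,
# but the horizontal frequency (~119 kHz) shows why 80 Hz sits right at
# the FW900's 121 kHz limit, while 1920x1200 @ 120 Hz (154 kHz by CVT)
# is out of reach.
pclk, hfreq = estimate_timings(2304, 1440, 80)
```

Reducing the blanking (the `h_blank` parameter) is exactly the trick mentioned above for squeezing a mode under an adapter's pixel clock limit.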

3200x2400? Interlaced maybe..
 

One of the key properties of a CRT that many love is the 'pixel softness'. Because of the way the image is physically rendered, the boundaries of the pixels are not sharp transitions, but more Gaussian ones that blend together information from neighbouring pixels. This loss of information, and thus reduced image fidelity, is a weakness of CRTs in applications where being able to see minute details with high precision is at a premium (e.g. medical radiography, where small details contain useful diagnostic information, certain graphic design applications, and perhaps reading text). But many images do well with a smoothing effect, such as those in movies, gaming, and much visual art and photography. This pooling, or integration of information across small spaces, can also enhance the apparent color bit depth of an image, since the range of physically expressed colors exceeds the range of digitally encoded colors. (If you had a CRT that only showed two colors, say two different shades of red, and you rendered an image of a checkerboard using these colors, there would actually be several intermediate shades of red visible to the eye, depending upon the pixel resolution and viewing distance. This is probably true with fixed-pixel LCD displays too, but I think the pooling regions are smaller.)
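The two-shades-of-red checkerboard idea can be illustrated with a toy simulation (my own sketch, not from the post): blur a one-dimensional "scanline" drawn with only two shades and count how many distinct shades come out.

```python
# Toy illustration of CRT-style pixel pooling: a scanline drawn with only
# two shades of red (0.4 and 0.8) passes through a small blur kernel that
# stands in for the Gaussian spot of a CRT beam; intermediate shades
# appear where the pattern alternates at pixel frequency.

def blur_shades(scanline, kernel=(0.25, 0.5, 0.25)):
    """Convolve the scanline with the kernel and return the distinct
    output shades (interior pixels only, rounded to avoid FP noise)."""
    out = []
    for i in range(1, len(scanline) - 1):
        v = (kernel[0] * scanline[i - 1]
             + kernel[1] * scanline[i]
             + kernel[2] * scanline[i + 1])
        out.append(round(v, 3))
    return sorted(set(out))

# Solid dark region, a 1-pixel checkerboard, then solid bright region:
line = [0.4] * 5 + [0.4, 0.8] * 4 + [0.8] * 4
print(blur_shades(line))  # [0.4, 0.5, 0.6, 0.7, 0.8]
```

Two input shades become five distinct output shades, which is the apparent-bit-depth boost described above.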
 
Did you check D206 is OK ?
That could be an issue with IC403 as well

I've checked the IC403 and the D206, they are OK.
So the problem lies in a KG-G1 or KG-H shorted tube. Sad.

I'll try hacking up my unused 2.1 amp to get 5V AC heater voltage and measure the cathodes

upd.
KG-G1 - 10mA
KR/KB-G1 - 6mA

So there's a KG-G1 short indeed. Or is it heater - cathode?
 
Well, I don't really understand what you intended to do or where you got that idea from, but supplying the tube's heater pin with 6.5V AC when it's supposed to run on 4.7V DC is certainly not a safe thing. It may well cause, or have already caused, the damage you thought there might be.

If you suspect tube damage, then use a proper tube tester/regenerator to check it and fix it if it's fixable. It's as simple as that.

Oh, and just to be perfectly clear about IC403: the method I described allows you to find some obvious failures I've encountered on ICs of that type, but passing that rough test doesn't guarantee at all that the IC is good. It just means you didn't find anything wrong. ;)
 

I'm sorry as this was really dumb indeed.

I've isolated the heater voltage from ground using an external power supply, and now the monitor is not flashing bright green on startup. That intermittent short is obviously still there, but it doesn't matter now that I've got a "floating" heater voltage.
The screen is still greenish, so there's probably another short to the G1.

That's a point where I'll need a tube tester to continue troubleshooting.
 
Thanks Derupter, between the icybox at 45€ and the delock at 105€ is it a safe bet to go with the icybox for similar performance?
Cause the 980 Ti is good on a sub 200 dollar budget and a big upgrade over my laptop's GTX 1050.
But yeah, I still might get a Sunix DPU3000 or Delock/ICE's version for upgrades down the road while I still can.
so.. stupid question.. where can i get the Icybox IB-SPL1031 at 45€?

and where can i buy the others (i cant seem to find them anywhere):
Delock 87685
Sunix DPU3000

Please help, thanks everyone!
 
I have the feeling the Sunix DPU3000 is discontinued. About 4 months ago I asked Sunix about availability, and they answered to stay tuned to Amazon for restocking "soon", but I haven't seen any unit anywhere as of today. I asked them again on the 18th of this month, but no answer from them so far.

Here, at the moment I posted this, the Delock 87685 seems to be available on Amazon UK: https://www.amazon.co.uk/Tragant-De...1&keywords=delock+87685&qid=1598267545&sr=8-1
 
Have you guys seen the CRT Collective Facebook page lately? This bloke found a 28 inch Intergraph... the legendary John Carmack screen...
I can't even imagine the grin on that person's face when they picked it up or had it delivered!

And then we have people like this one who has 6 of them...

I have the feeling the Sunix DPU3000 is discontinued. About 4 months ago I asked Sunix about availability, and they answered to stay tuned to Amazon for restocking "soon", but I haven't seen any unit anywhere as of today. I asked them again on the 18th of this month, but no answer from them so far.....
I asked Sunix countless times in the past and never really got a straight answer. I would miss it every time a Sunix popped up on eBay. I finally lost my patience and ended up buying the Delock on eBay, straight from Delock Germany.

Thanks for your reply. How are you calculating the pixel clock? If it is 488 MHz, then the DP2VGAHD20 wouldn't be able to do 2304x1440 at 80 Hz. But the monitor should be able to do resolutions and refresh rates just over spec, such as 2560x1600 at 60 Hz or 1920x1200 at 120 Hz; I even saw someone say it could do 3200x2400 a few years ago. I don't want to go massively over spec, but a converter that can go a little over, such as 2304x1440 at 80 Hz + 25%, seems like a sensible choice.

Here is the link to my reply with the testing I did.

Just keep in mind that I am pushing these converters to their limit, and they all do different and funky things there. The limit of my converter could be a hair different from the limit of the one you get.

Hello everyone. My FW900 is greenish, but the OSD menu color is normal. I have seen many people's Sony FW900s that seem to have the same problem; it seems to be common.
Does anyone know how to solve it? Is it caused by capacitor failure? Would replacing the capacitors solve it?
I know I am a little late replying to this, but I wanted to mention I had a similar problem. It looked like I lost a primary color. The BNC cable ended up going bad 🤷‍♂️. It seriously cracks me up how a cable that is never moved, in a temperature-controlled house, just suddenly fails. I switched from the BNC to a VGA cable and was happy to find the FW900 was fine. Then I bought a few more BNC cables from eBay.
 
I've isolated the heater voltage from ground using an external power supply, and now the monitor is not flashing bright green on startup. That intermittent short is obviously still there, but it doesn't matter now that I've got a "floating" heater voltage.

Well.. I tried to calibrate it using WinDAS again and I'm actually able to calibrate the 9300K cutoff step in WPB procedure.
So happy, haha.
Here are some photos.

So the floating heater voltage trick that CRT guys used back in the day to fix up Trinitron CRT TVs really worked wonders :)

Assembly + full WPB step by step next
 
Well, I'm very sceptical about that. Some 5V DC can't disturb a signal of about 100V (that's the amplitude of the KG one). If there were some leakage between the two lines, it would be the other way around. On top of that, the leakage would remain the same whatever power supply you use; it's a difference of potential combined with an unwanted connection that causes that, not whether some ground is shared or not.

Please think carefully about what happened and when, exactly. The color flashing thing is what happened for some time before my faulty IC403 completely lost one channel. And it is a known thing that intermittent shorts in integrated circuits can sometimes be fixed by soldering heat.
So it would be totally plausible that the problem really was with IC403, and the soldering job somewhat turned a dying IC into a half-broken one. (BTW I hope you did put it back properly with thermal paste on its heatsink; the thing heats up and really needs it ;) )
 
Hello everyone. I encountered a strange phenomenon when using WinDAS for white balance adjustment on a Sony CPD-G520.
When I get to step 48, DAS prompts for a white image. When I display a white pattern on the full screen, the brightness of the entire screen immediately decreases; the white measures about 50 nits on my DTP-94.
I don't know where the problem is. When I calibrate my other Dell P1110, everything works fine.
Has anyone encountered the same problem?

Hey Hike, were you able to find a fix for this? I am also using a Sony CPD-G520. I am unable to finish the white balance "Drive" step: every time I increase brightness to try to reach a certain Y target, the display automatically decreases in brightness...
 
Yes, I found 2 very reliable methods, but the root cause is aging of the tube.
Method 1: There is no need to calibrate. After entering DAS, export the DAT file and modify the G2 value to ensure that black is normal, then warm the machine up for 30 minutes and do a color correction.
Method 2: During calibration, I found that a full white screen makes the brightness decrease, but the brightness holds when the white pattern is windowed and the background colors are dark. So do the WPB calibration that way, save the DAT file, and then solve the problem by modifying one more line of the DAT values.

(The above is translated by Google.)
 

Hey, to add to your problem: I was able to fix it! In the WinDAS event viewer you can see the ABL (auto brightness limiter) values. The ABL values were very low, so I changed them back to the values from my original .dat backup.
Using Notepad++ I changed the values, then checked in the event viewer to confirm that all values are valid (you have to make sure it's in the right format, with the correct spacing). I wasn't sure which one to change specifically, so I changed all 3. So far so good! No more brightness decrease on a full white image.

 
Wow, nice. I did not look at it in the WinDAS event viewer. I found that using the original DAT data does not automatically reduce the brightness, so after the WPB I opened the original DAT and the new DAT file in a text editor to compare the differences, and finally modified a single value. It has worked well so far. So it matches your way.
 
Well, I'm very sceptical about that. Some 5V DC can't disturb a signal of about 100V (that's the amplitude of the KG one). If there were some leakage between the two lines, it would be the other way around. On top of that, the leakage would remain the same whatever power supply you use; it's a difference of potential combined with an unwanted connection that causes that, not whether some ground is shared or not.

Please think carefully about what happened and when, exactly. The color flashing thing is what happened for some time before my faulty IC403 completely lost one channel. And it is a known thing that intermittent shorts in integrated circuits can sometimes be fixed by soldering heat.
So it would be totally plausible that the problem really was with IC403, and the soldering job somewhat turned a dying IC into a half-broken one. (BTW I hope you did put it back properly with thermal paste on its heatsink; the thing heats up and really needs it ;) )

I found out why it was playing nicely yesterday and why I could lower the green signal: the monitor was cold enough. I've just fired it up after a night off, and in HCFR I could see perfectly how the green level climbs right after startup.
Actually it was climbing slowly and then jumped really high just now, with the black pattern going from roughly bluish blacks with G_BKG_MAX at 0 in WinDAS to a full green screen. Now the slider does nothing until it's set to 80, just like with the original issue.
If the pattern is white, the G_BKG_MAX setting kind of does something from 0 to 80.

Here's a video of how the color displayed on the screen affects the green level.

I feel so impulsive with this monitor, it's dumb.
 
NBAPwns13

Congratulations, you just bypassed SAFETY values. :facepalm:

Really, I think you should remove your post before giving stupid ideas to other people ... These values are meant to be set automatically during WPB and for good reasons ...
 
So lots of people are suddenly having problems with that part of the WPB; are these tubes really all shot, or is this a bug in the program?

Tangentially, this seems like a scam, but the description on this recent eBay listing is interesting. The seller has good feedback and all, seems legit, but this is unheard of:

"SONY FW900 GDM-FW9012 (FOR PARTS, NOT WORKING). Condition is For parts or not working. Stock photo - bezel may have scratches and/or cracks. Sony disables the tube after a set number of hours of use. We provide a programming cable, software, and programming manual to reactivate and configure the FW900, but it is up to the buyer to figure it out. Local pickup or buyer-arranged shipping only. We have a custom padded box, but this is an 80+ pound item so we will NOT arrange the shipping. Absolutely no returns. Absolutely no guarantees."

https://www.ebay.com/itm/SONY-FW900...cc9e7e1:g:tSUAAOSwzVxfF-7W#vi__app-cvip-panel
 
As an eBay Associate, HardForum may earn from qualifying purchases.
I'd rather think that the recent buzz around this kind of screen has attracted people who get sold screens in bad condition and want them working at all costs, while also being significantly less informed than the average person who was interested in CRTs before. That combination gives this kind of result, I suppose.

As for the eBay description: there's no time counter on the FW900 or the other screens of that generation, so how could a tube be disabled after a set number of hours? Funny bullshit. :ROFLMAO:
 
I have the feeling the Sunix DPU3000 is discontinued. About 4 months ago I asked Sunix about availability, and they answered to stay tuned to Amazon for restocking "soon", but I haven't seen any unit anywhere as of today. I asked them again on the 18th of this month, but no answer from them so far.

Here, at the moment I posted this, the Delock 87685 seems to be available on Amazon UK: https://www.amazon.co.uk/Tragant-De...1&keywords=delock+87685&qid=1598267545&sr=8-1
Thanks... ouch, that's a pricey adapter.. oh well, I bought it. I'm just about to pull my trusty ol' 21" Dell P1130 out of the closet after 10 years without use. Its specs: 2048x1536@80 Hz, 130 kHz horizontal refresh, 170 Hz vertical refresh, 0.24 mm dot pitch! It seems to be the Dell rebrand of the 21" Sony GDM-C520K! (EDIT: correction, a rebrand of the CPD-G520.) I never truly calibrated it, so wish me luck y'all! If any of you have a Dell P1130 and have any tips or tricks for me, please share! Thank you. EDIT: Also, I DID actually find a nice guide (with pictures!) on setting up WinDAS and a USB-to-TTL cable for my P1130 here.

Also, I found this neat video with a neat trick for reducing the signal degradation from VGA cables while using a Sunix-based DAC. Has anyone tried this? I think I will be trying it.
 
CubanLegend, hope you enjoy your CRT monitor! Yes, those adapters are sadly expensive. I suspect it's because they are not just digital-to-analog converters but also multi-output splitters, and they seem to be among the few, if not the only ones, that support high pixel clocks. (One couldn't care less about the splitting capability, but they do allow using high-end CRT monitors like the FW900 at their high resolutions and refresh rates with modern video cards.)

I tried that trick from that YouTube user with the FW900 and Sunix DPU3000 (which required removing the plastic cover box from the monitor so the Sunix's side cables could fit), but I did not perceive any difference or improvement. I believe that as long as the VGA cable is good quality and in good condition, there is no need to connect the adapter that way.
 
Thanks... ouch, that's a pricey adapter.. oh well, I bought it. I'm just about to pull my trusty ol' 21" Dell P1130 out of the closet after 10 years without use. Its specs: 2048x1536@80 Hz, 130 kHz horizontal refresh, 170 Hz vertical refresh, 0.24 mm dot pitch! It seems to be the Dell rebrand of the 21" Sony GDM-C520K! I never truly calibrated it, so wish me luck y'all! If any of you have a Dell P1130 and have any tips or tricks for me, please share! Thank you. EDIT: Also, I DID actually find a nice guide (with pictures!) on setting up WinDAS and a USB-to-TTL cable for my P1130 here.

Also, I found this neat video with a neat trick for reducing the signal degradation from VGA cables while using a Sunix-based DAC. Has anyone tried this? I think I will be trying it.



The Dell P1130 is not a rebrand of the Sony GDM-C520K... It is a rebrand of the CPD-G520... It has the same exact tube and internal components, except for a few different parameters...

Hope this helps...

Sincerely,

Unkle Vito!
 
For the TTL cable part: the 5V and GND lines can sometimes mess up the connection between WinDAS and the monitor, so you can also try without them, just TX and RX.
Ah i see.. thank you for the advice!

CubanLegend, hope you enjoy your CRT monitor! Yes, those adapters are sadly expensive. I suspect it's because they are not just digital-to-analog converters but also multi-output splitters, and they seem to be among the few, if not the only ones, that support high pixel clocks. (One couldn't care less about the splitting capability, but they do allow using high-end CRT monitors like the FW900 at their high resolutions and refresh rates with modern video cards.)

I tried that trick from that YouTube user with the FW900 and Sunix DPU3000 (which required removing the plastic cover box from the monitor so the Sunix's side cables could fit), but I did not perceive any difference or improvement. I believe that as long as the VGA cable is good quality and in good condition, there is no need to connect the adapter that way.
Yes, I turned it on and.. well (I should've taken pictures) it looked like it had serious G2 drift, but thankfully I waited 30 minutes for it to warm up and did the COLOR RETURN: it showed some grey on screen, flipped on and off a little, and then came back on with what seemed like PERFECT G2 levels! I'm surprised that after 10 years in storage (albeit climate-controlled and dust-free) it didn't seem to exhibit any G2 drift at all. I was able to get some REALLY high refresh rates just from the VGA out on my Lenovo ThinkPad X220t: 153 Hz at 1024x768, and up to 2560x1920 @ 50 Hz. It seems the VGA out on my laptop maxes out at 350 MHz; anything above that was "out of range" for its Intel integrated graphics. I was just testing it before I get my DAC to hook it up to my GTX 1080.

By the way, about the link I purchased from on Amazon.de: the seller 'Hawks Photo Video' emailed me the next day that the Icy Box IB-SPL1031 I ordered was OUT OF STOCK due to it being discontinued, and they refunded my order :( So I'll have to go for a slightly more expensive Delock if I want the super high refresh rates and resolutions on my P1130. Honestly, I'm enjoying the lower resolutions with the higher refreshes more; the smoothness of motion is just UNREAL after having used VA panels exclusively for 10 years. So I like lower res and higher refresh more than the higher resolutions, and I MAY not actually NEED the Delock.. we'll see (I may just get one for shits and giggles, just to see what pixel clock the P1130 maxes out at). I ordered a Rankie 1080P Active HDTV HDMI to VGA Adapter as a stop-gap while I decide whether to go for the Delock...

Does anyone on here KNOW the maximum pixel clock the Dell P1130 or Sony GDM-C520K can reach? Please let me know; this will REALLY affect my decision to either spend $130+ on a Delock or keep the $8 Rankie, lol..



The Dell P1130 is not a rebrand of the Sony GDM-C520K... It is a rebrand of the CPD-G520... It has the same exact tube and internal components, except for a few different parameters...

Hope this helps...

Sincerely,

Unkle Vito!
Oh wow, thank you! How did you know this? And are you familiar with its max pixel clock?.. or any other little things about it?

By the way, how come it's not a rebrand of the GDM-C520K? It looked like they have the same specs. What's the difference between the GDM-C520K and the CPD-G520?

Also, I'd like to ask you (or anyone who knows) the following about my P1130:
-What brightness & contrast numbers should I be running this monitor at during and after calibration?
-Which COLOR MODE should I be in: Easy, where I just select from 5000K to 11000K; PRESET, which is 5000K, 6500K, 9300K; or Expert, where I can edit all the different RGB gain/bias levels manually? Which mode should I run during and after calibration?
 