24" Widescreen CRT (FW900) From Ebay arrived,Comments.

datspike

n00b
Joined
Jul 24, 2017
Messages
11
That would require 389 MHz (GTF). 360 MHz is the max pixel clock rate (8 bpc) for 2 lane DisplayPort 1.2 used by the ANX9833, ANX6211, and ANX6212.
Do these support 6 bpc input for higher pixel clocks (480 MHz) using Nvidia control panel? Does AMD control panel have a 6 bpc option?
Nope. All I've managed to squeeze from it at 6bpc is ~367 MHz.
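For anyone wondering where the 360 MHz and 480 MHz figures come from, here is a rough back-of-the-envelope check. It assumes a plain 2-lane HBR2 link with 8b/10b coding and ignores secondary-data and other link overhead, so treat it as an approximation rather than exact DP bandwidth accounting.

[CODE]
# Rough sanity check of the figures above, assuming a 2-lane DisplayPort 1.2
# (HBR2) link with 8b/10b coding and no other link overhead.
LANE_RATE_GBPS = 5.4      # HBR2 raw rate per lane
LANES = 2
EFFICIENCY = 8 / 10       # 8b/10b line coding

payload_gbps = LANE_RATE_GBPS * LANES * EFFICIENCY   # ~8.64 Gbit/s

for bpc in (8, 6):
    bits_per_pixel = 3 * bpc                          # RGB
    max_pclk_mhz = payload_gbps * 1000 / bits_per_pixel
    print(f"{bpc} bpc -> ~{max_pclk_mhz:.0f} MHz max pixel clock")
# 8 bpc -> ~360 MHz, 6 bpc -> ~480 MHz, matching the numbers quoted above.
[/CODE]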
 

datspike

n00b
Joined
Jul 24, 2017
Messages
11
The driver accepts 480 MHz timing, but the adapter outputs black from the DACs or no syncs at all?
For some reason I lost EDID detection today while swapping between my FP955 and 22b4, so I will only be able to test that when I get back home from my country house in 2 days, on my other 22b4.
 

Derupter

Limp Gawd
Joined
Jun 25, 2016
Messages
180
I got the Delock 62967 (ANX9847) today and I'm really pleased for the price. The pixel clock is more than high enough; around 1800x1350 is where the pixel clock limit starts to overtake the horizontal scan limit of my DP2070SB, and that's already the point where it becomes kind of pointless to trade refresh rate for more resolution due to the blur and dot pitch limit. 1600x1200 @ 110 Hz is the sweet spot.
I can't directly compare, but I think this adapter actually handles higher scan rates far better than my GTX 750 Ti's DAC did, despite the lower pixel clock limit.
There were more pronounced ringing and noise artifacts at higher rates with the 750 Ti; the image is much cleaner now.
I'm planning on extending the DP cable and using a female-to-male adapter to connect the converter's VGA port directly into the CRT's port for the best possible signal integrity. Whether or not it makes any difference compared to a standard VGA cable will be interesting.

Also, I've seen 10 bit support with CRTs mentioned before; are there any options for 10 bit adapters? Searched the thread and found the answer.
Have the HDFury X4 or the 10 bit Delock USB-C adapters been tested with 10 bit output to a CRT by anyone?
Extending the DP cable may cause performance degradation with that adapter, like no stable HBR2 mode, so be warned.
About 10 bit adapters, read here.
Also, chipsets like the LT8711X-B or LT8712X of the Vention adapters may have a 10 bit DAC.
No one has done a 10 bit test with these adapters.

That's a Vention HBFBB with the ANX9833 swapped to an ANX6212. Will test it in ~7 hours and edit the message :)
I've also got another sample of the HBFBB on hand and 2 ANX9847 coming from AliExpress; it will be quite interesting if that one doesn't work.

Upd. IT'S WORKING! 360 MHz 8 bpc
A very nice job, 360 MHz with 8 bit is the max with that chipset. The best sample I have with the ANX9847 can do 355 MHz; the average is 345-347 MHz.
I'd be curious to see the results with the ANX9847. Look at the PCB of the Delock 62967:
[attached photo: DSC05806.JPG]
Very similar, isn't it?
 

jt55

n00b
Joined
Feb 28, 2020
Messages
2
Extending the DP cable may cause performance degradation with that adapter, like no stable HBR2 mode, so be warned.
About 10 bit adapters, read here.
Also, chipsets like the LT8711X-B or LT8712X of the Vention adapters may have a 10 bit DAC.
No one has done a 10 bit test with these adapters.
The DP cable on it actually seems to be decent quality, so I'm not going to bother with that.

I still got the VGA gender changer though; I basically had to set up my PC behind the CRT, with its GPU outputs facing it, to be able to plug it in with the existing short DP cable on the converter.

The improvement to image quality was quite pronounced with this direct connection, and that's compared to a Profigold VGA cable too, not a cheap VGA cable.
Getting almost LCD levels of sharpness at 60-70 Hz and realising I've totally underestimated high resolutions on the CRT.
The incredible thing is how you don't lose the near-perfect motion clarity at lower refresh rates. A sharp, high-res image along with this level of motion clarity is even more impressive; on top of that you still have the excellent contrast and colours, and you can appreciate the contrast far more with better defined edges.

Hopefully they eventually release a cheap HDMI or DP version with that chipset. Even at around 1800x1350 @ 60 Hz with a ~200 MHz pixel clock it still seems usable.
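To put rough numbers on the pixel clock and horizontal scan figures being discussed, here is a crude estimator. The blanking factors are guesses standing in for real GTF/CVT timing math, so expect the output to differ from a proper timing calculator by a few percent.

[CODE]
# Very rough mode estimator. The blanking factors below are crude guesses,
# not real GTF/CVT math, so treat the results as ballpark only.
def estimate(h_active, v_active, refresh_hz,
             h_blank_factor=1.35, v_blank_factor=1.04):
    h_total = h_active * h_blank_factor        # active width plus assumed blanking
    v_total = v_active * v_blank_factor        # active height plus assumed blanking
    pclk_mhz = h_total * v_total * refresh_hz / 1e6
    h_scan_khz = v_total * refresh_hz / 1e3
    return pclk_mhz, h_scan_khz

for w, h, hz in [(1600, 1200, 110), (1800, 1350, 60)]:
    pclk, hscan = estimate(w, h, hz)
    print(f"{w}x{h} @ {hz} Hz: ~{pclk:.0f} MHz pixel clock, ~{hscan:.0f} kHz h-scan")
# ~297 MHz / ~137 kHz for 1600x1200 @ 110 Hz, and ~205 MHz / ~84 kHz for
# 1800x1350 @ 60 Hz, roughly in line with the figures discussed above.
[/CODE]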
 

XoR_

Gawd
Joined
Jan 18, 2016
Messages
862
That would require 389 MHz (GTF). 360 MHz is the max pixel clock rate (8 bpc) for 2 lane DisplayPort 1.2 used by the ANX9833, ANX6211, and ANX6212.
Do these support 6 bpc input for higher pixel clocks (480 MHz) using Nvidia control panel? Does AMD control panel have a 6 bpc option?
Why would anyone want to use 6bit?
With good dithering it looks OK, but it is still additional noise, and all you get in return is a higher resolution that is still way beyond what these displays can resolve as a pixel-perfect image.

Getting almost LCD levels of sharpness at 60-70 Hz and realising I've totally underestimated high resolutions on the CRT.
The incredible thing is how you don't lose the near-perfect motion clarity at lower refresh rates. A sharp, high-res image along with this level of motion clarity is even more impressive; on top of that you still have the excellent contrast and colours, and you can appreciate the contrast far more with better defined edges.
60Hz on CRT? You mad bro?
Motion will be sharp, but you need v-sync, input lag will be horrendous, and on top of that it's bad for the eyes because it flickers like hell.

The only reason to use a CRT these days is gaming, and for that it is best to keep a high refresh rate of about 100 Hz, preferably higher. So no crazy high resolutions anyway, regardless of the adapter/output used.
 

joevt

Weaksauce
Joined
Jul 28, 2017
Messages
79
Why would anyone want to use 6bit?
Curiosity. We have these adapters that can do 330, 340, 360 MHz and I wonder if they can go faster using 6 bit.

With good dithering it looks OK, but it is still additional noise, and all you get in return is a higher resolution that is still way beyond what these displays can resolve as a pixel-perfect image.
I understand that while the adapters can accept pixels at 360 MHz, their DACs may only give good results up to 270 MHz or whatever. Of course CRTs have a limit on their resolution based on the dot pitch of the phosphors or shadow mask or aperture grille, but we can use the bandwidth to increase refresh rate instead. If the DACs were perfect, would increasing the refresh rate produce a less perfect image?
 

3dfan

Weaksauce
Joined
Jun 2, 2016
Messages
108
60Hz on CRT? You mad bro?
Motion will be sharp, but you need v-sync, input lag will be horrendous, and on top of that it's bad for the eyes because it flickers like hell.
When it's a personal point of view, that's respectable, but 60 Hz CRT gaming is far from the generalized "flickers like hell" experience you describe. As a matter of fact, one of the strong points of CRTs against the motion clarity techniques used by modern monitors (strobing backlights or software black frame insertion) is precisely their rolling-scan image generation, versus the all-at-once strobe or black-frame periods of modern monitors, which produce noticeably worse flicker than CRTs. I have personally witnessed it: you need higher refresh rates than on a CRT to achieve the same flicker-free perception (and so more expensive GPUs), to the point that manufacturers are locking out refresh rates below 75-85 Hz on modern monitors when the strobing backlight is enabled, because the "flickers like hell" feeling on those is worse than CRT flicker. Also, I have been playing a lot of games at 60 Hz on CRTs for over 20 years and still do on my FW900 (even some at 55 Hz in MAME), and I have never noticed the flicker nor had eye strain issues.

I'm not pretending to be some special flicker-immune man here; I know there are many gamers who enjoy 60 Hz CRT gaming with no issues (even 60 Hz strobing on modern monitors) as well as others who don't. No offence, but I just find it senseless to generalize this the way you do.

As for the input lag, for single player it is quite decent and enjoyable with v-sync on a CRT even at 60 Hz for me, so no need to generalize that either. There are also ways to reduce it and preserve a tear-free v-sync experience, like fast sync, low latency modes (Nvidia's or AMD's equivalent), scanline sync (RivaTuner Statistics Server software), etc. ;)
 
Joined
Nov 4, 2010
Messages
56
As for the input lag, for single player it is quite decent and enjoyable with v-sync on a CRT even at 60 Hz for me, so no need to generalize that either. There are also ways to reduce it and preserve a tear-free v-sync experience, like fast sync, low latency modes (Nvidia's or AMD's equivalent), scanline sync (RivaTuner Statistics Server software), etc. ;)
Scanline Sync is godlike on a CRT ;)
 

XoR_

Gawd
Joined
Jan 18, 2016
Messages
862
3dfan
It could even be said that for 60 fps locked games it makes sense to run them on a 60 Hz monitor, e.g. all emulators, console ports, some id Software games, etc., and yes, a CRT is much, much better for this than trying to run 60 Hz strobed on an LCD.

My comment was rather that instead of prioritizing resolution while sacrificing refresh rate, one should find a middle ground. For example, with a 200 MHz clock budget I would want e.g. 1280x960 with as high a refresh rate as possible, which would be close to 120 Hz, rather than e.g. 1800x1350 @ 60 Hz, which seems unusable both in terms of actually being able to resolve each pixel clearly and because it is very flickery, bad for the eyes, and has inferior input lag and motion fluidity. Because yes, even though 60 Hz on a CRT with proper synchronization is very clear and seems fluid, 120 Hz on a CRT is so much better.
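A quick ballpark check of those two options, using the same crude blanking assumptions as the estimator earlier in the thread, so the exact numbers will differ somewhat from real GTF timings:

[CODE]
# Given a pixel clock budget, roughly how fast can each mode refresh?
# Same crude blanking assumptions as before; ballpark numbers only.
def max_refresh(h_active, v_active, pclk_budget_mhz,
                h_blank_factor=1.35, v_blank_factor=1.04):
    total_pixels = (h_active * h_blank_factor) * (v_active * v_blank_factor)
    return pclk_budget_mhz * 1e6 / total_pixels

for w, h in [(1280, 960), (1800, 1350)]:
    print(f"{w}x{h} on a 200 MHz budget: ~{max_refresh(w, h, 200):.0f} Hz")
# ~116 Hz at 1280x960 vs ~59 Hz at 1800x1350: the trade-off described above.
[/CODE]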

Scanline Sync is godlike on a CRT ;)
True
 

XoR_

Gawd
Joined
Jan 18, 2016
Messages
862
Curiosity. We have these adapters that can do 330, 340, 360 MHz and I wonder if they can go faster using 6 bit.
For curiosity it makes sense.
For actual usage it could also make some sense if the dithering is good enough; e.g. Radeons do have great dithering. On Nvidia there are ways to enable dithering and e.g. get proper banding-free gamma control on an 8 bit connection/display, but unfortunately it is not enabled by default, and the last time I tried it, it was very finicky and got disabled after e.g. putting the computer to sleep.

I understand that while the adapters can accept pixels at 360 MHz, their DACs may only give good results up to 270 MHz or whatever. Of course CRTs have a limit on their resolution based on the dot pitch of the phosphors or shadow mask or aperture grille, but we can use the bandwidth to increase refresh rate instead. If the DACs were perfect, would increasing the refresh rate produce a less perfect image?
My Delock does ~340-350 MHz at 8 bit and image quality is almost the same as on native outputs from graphics cards such as the GTX 780 or Radeon 7950.
Increasing the refresh rate does in fact decrease sharpness, but I believe that is due more to how the CRT works than to DAC quality. A very long time ago, when I used CRTs for desktop, I would use resolutions such as 1280x960 @ 85 Hz instead of 100 or 120 Hz because it gave a much sharper image, even though I could do something like 1600x1200 @ 100 or even 120 Hz.

Today it does not matter though. For games it is much less important and for desktop we have LCDs which are superior in every single aspect for this usage scenario.
 
Joined
Mar 23, 2013
Messages
941
Games are fine at 60 Hz; after all, we all watched 60 Hz CRT TVs during our childhood. 60 Hz only bothers me when browsing the web and stuff like that.

And besides scanline sync, v-sync input lag can also be reduced by capping the frame rate to a few thousandths of a frame below the refresh rate, and also by using Nvidia's and AMD's new "low latency" modes that reduce CPU buffer time. Not saying 60 Hz is the way to go for competitive games, but I played all of Control at 1792x1344 and it was great.

And in DX9 games, you can use GeDoSaTo to delay CPU buffer time to a fraction of the overall frame time.
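As a rough illustration of why that slight cap helps, here is a small worked example with made-up numbers (a 60.000 Hz mode and a limiter set to 59.995 fps; the exact values on real hardware will differ):

[CODE]
# Why capping a hair below the refresh rate reduces v-sync input lag.
# Hypothetical numbers: a 60.000 Hz mode and a 59.995 fps frame limiter.
refresh_hz = 60.000
cap_fps = 59.995

drift_per_frame_s = 1 / cap_fps - 1 / refresh_hz          # ~1.4 microseconds
frames_per_full_slip = (1 / refresh_hz) / drift_per_frame_s
seconds_per_full_slip = frames_per_full_slip / cap_fps

print(f"phase drift per frame: {drift_per_frame_s * 1e6:.2f} us")
print(f"one full frame of slip roughly every {seconds_per_full_slip:.0f} s")
# Because the game never outruns the display, the pre-rendered frame queue
# stays empty almost all the time, so v-sync adds far less input lag than an
# uncapped, fully buffered pipeline would.
[/CODE]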
 

datspike

n00b
Joined
Jul 24, 2017
Messages
11
we all watched 60 Hz CRT TVs during our childhood.
I was under the assumption that TV CRTs use a much more "long lasting" phosphor, making them hold the picture for a longer period and eliminating flicker that way.
I get headaches and my eyes hurt if I run my 22b4 under 90 Hz :/

BTW, I got the Delock 62967 and it's capable of just 340 MHz, and it also has a short DP connector without a lock. Worse than an ANX6212-swapped Vention in every way, eh. Guess I'll try to sell it.
And my ANX9847 shipment got canceled. Seems like I won't be able to test a Vention with the ANX9847, sorry Derupter.

I've got 6 Vention HBFBB and 6 ANX6212 shipped to me, so testing them will be interesting at least.
 

jbltecnicspro

Supreme [H]ardness
Joined
Aug 18, 2006
Messages
5,649
I was under the assumption that TV CRTs use a much more "long lasting" phosphor, making them hold the picture for a longer period and eliminating flicker that way.
I get headaches and my eyes hurt if I run my 22b4 under 90 Hz :/

BTW, I got the Delock 62967 and it's capable of just 340 MHz, and it also has a short DP connector without a lock. Worse than an ANX6212-swapped Vention in every way, eh. Guess I'll try to sell it.
And my ANX9847 shipment got canceled. Seems like I won't be able to test a Vention with the ANX9847, sorry Derupter.

I've got 6 Vention HBFBB and 6 ANX6212 shipped to me, so testing them will be interesting at least.
As far as I can tell, CRT TVs use the same phosphors as the professional monitors. I think that TVs don't flicker as much because:

1. The resolution was far lower than on PC monitors, so you don't need to draw as many dots in a single pass.
2. Most of the time you'd never see a true 60 Hz progressive signal unless running at half the vertical resolution of 60i (otherwise known as 240p). At that low a resolution you would hardly see any flicker, even on large screens.

Just my thoughts. I've owned all of the high-end GDM monitors and have a PVM right now, so I'm speculating based on those experiences.
 
Joined
Mar 23, 2013
Messages
941
I get headaches and my eyes hurt if I run my 22b4 under 90 Hz :/
So to be clear, you're not talking only about typical desktop usage; 60 Hz in video games gives you a headache too?

Because there are so many games that are capped at 60 fps (Rayman Legends, Mega Man 11, Street Fighter 5) that I would hate not being able to play at 60 Hz and get the 1:1 frame rate to refresh rate smoothness.
 

datspike

n00b
Joined
Jul 24, 2017
Messages
11
60 Hz in video games gives you a headache too?
I have not tested 60 Hz on a CRT in games, as I rarely play anything other than Apex, but I once accidentally ran 60 Hz strobed on my LG 24GM79B in Apex and it was just as bad as the desktop at < 85 Hz on a CRT.
Maybe I'm just weird, I don't want to start a war here :)

I think that TVs don't flicker as much because:

1. The resolution was far lower than on PC monitors, so you don't need to draw as many dots in a single pass.
2. Most of the time you'd never see a true 60 Hz progressive signal unless running at half the vertical resolution of 60i (otherwise known as 240p). At that low a resolution you would hardly see any flicker, even on large screens.
I really like those arguments, may very well be the truth.
 

Derupter

Limp Gawd
Joined
Jun 25, 2016
Messages
180
BTW, I got the Delock 62967 and it's capable of just 340 MHz, and it also has a short DP connector without a lock. Worse than an ANX6212-swapped Vention in every way, eh. Guess I'll try to sell it.
And my ANX9847 shipment got canceled. Seems like I won't be able to test a Vention with the ANX9847, sorry Derupter.

I've got 6 Vention HBFBB and 6 ANX6212 shipped to me, so testing them will be interesting at least.
The first prototype of the 62967 was better, both the connector and the adapter housing; 340 MHz is also its speed.
For the ANX9847 test, you can try to swap the chip from the 62967 to the HBFBB.
I think the electrical diagram of the ANX9833 is like that of the 9847/6211/6212, but the PCB designs are different, and I think the one on your Vention HBFBB is better than the Delock 62967's (wider traces on the main lanes and probably a better ground and power plane).
Wow, 6 Vention HBFBB and 6 ANX6212; maybe some users might be interested in having one of your "special" modded adapters capable of 360 MHz.
 

XoR_

Gawd
Joined
Jan 18, 2016
Messages
862
Games are fine at 60 Hz; after all, we all watched 60 Hz CRT TVs during our childhood. 60 Hz only bothers me when browsing the web and stuff like that.
And besides scanline sync, v-sync input lag can also be reduced by capping the frame rate to a few thousandths of a frame below the refresh rate, and also by using Nvidia's and AMD's new "low latency" modes that reduce CPU buffer time. Not saying 60 Hz is the way to go for competitive games, but I played all of Control at 1792x1344 and it was great.
And in DX9 games, you can use GeDoSaTo to delay CPU buffer time to a fraction of the overall frame time.
I am not saying 60 Hz is unusable, but that it is not worth it, and it is better to throw your budget at a higher refresh rate at the expense of resolution, no matter whether the limit is pixel clock, horizontal sync rate, or even GPU performance.
There is a sweet spot for these CRTs and it is imho around 900-1000 lines.
 

datspike

n00b
Joined
Jul 24, 2017
Messages
11
Wow, 6 Vention HBFBB and 6 ANX6212; maybe some users might be interested in having one of your "special" modded adapters capable of 360 MHz.
3 of them are going to my friends, but the other ones will probably be up on eBay for something like $20 plus shipping.
I'm replacing the chips now by the way, will test them in the evening.
 
Joined
Mar 23, 2013
Messages
941
I am not saying 60 Hz is unusable, but that it is not worth it, and it is better to throw your budget at a higher refresh rate at the expense of resolution, no matter whether the limit is pixel clock, horizontal sync rate, or even GPU performance.
There is a sweet spot for these CRTs and it is imho around 900-1000 lines.
Some games are capped at 60 fps, so you gotta run at 60 Hz if you want judder-free motion.
 

datspike

n00b
Joined
Jul 24, 2017
Messages
11
3 of them are going to my friends, but the other ones will probably be up on eBay for something like $20 plus shipping.
I'm replacing the chips now by the way, will test them in the evening.
Wow, 6 Vention HBFBB and 6 ANX6212; maybe some users might be interested in having one of your "special" modded adapters capable of 360 MHz.
So... Guess I was just lucky with my first 360 MHz capable modded adapter:
I've sent one adapter to one of my friends and I don't even know if it's working, ouch.
2 of the adapters are not working. I will reflow the chips sometime soon.
The 3 other adapters are each different, capable of 355, 340 and 335 MHz.

So the only benefits of such a mod over getting the Delock are a questionably better PCB, the ability to bin the ANX chips (330 MHz is already plenty imo, so that's not actually a pro), and a locking DP connector.
Plus the obvious ability to get a high quality adapter for cheap if the Delock is not available in your country.
 

Derupter

Limp Gawd
Joined
Jun 25, 2016
Messages
180
So it's like the ANX9847: some chips are better than others. Max pixel clock of my eight samples:
340-342-345-346-346-347-354-355
Another thing is that the 62967 doesn't work with all video cards, and when this happens the DP cable must be replaced.
It would be interesting to know if the same thing happens with the HBFBB; an adapter that works on your computer but stops at 180 MHz on another would prove this.
 

datspike

n00b
Joined
Jul 24, 2017
Messages
11
So it's like the ANX9847: some chips are better than others. Max pixel clock of my eight samples:
340-342-345-346-346-347-354-355
Another thing is that the 62967 doesn't work with all video cards, and when this happens the DP cable must be replaced.
It would be interesting to know if the same thing happens with the HBFBB; an adapter that works on your computer but stops at 180 MHz on another would prove this.
The 62967 connector is short; it's barely working on my RX 5700, I have to plug it in VERY hard and snug into the port. But the HBFBB DP connector is standard length and has a lock, so I assume it would work properly on every card. I will test it on my friend's PC with a 1080 Ti tomorrow. Well, more like the 3 of them which I have working.
 
Joined
Apr 2, 2020
Messages
5
Interesting. So far I've been able to successfully use WinDAS on a sony FW900, the HP rebrand, an IBM P275, and a Dell P1130. So I'm guessing mine isn't barebones by default (tho I understand u made yours barebones intentionally).
I know this post is 6 years old, but can you please run me through the process of WinDASing your IBM P275? Mine doesn't seem to like it. When I hook it up to the USB to RS232 TTL UART (PL2303HX), the monitor stays on standby. If I hook it up while it's displaying an image, WinDAS still makes no connection. I tried 3 different WinDAS copies. I tried XP and 7. I put the .ocx in system/system32/syswow64 and reg'd it. Selected the correct COM port. SG manual. It just won't connect. It's a shame because the monitor looks great besides the high G2 voltage.
 

spacediver

2[H]4U
Joined
Mar 14, 2013
Messages
2,570
I know this post is 6 years old, but can you please run me through the process of WinDASing your IBM P275? Mine doesn't seem to like it. When I hook it up to the USB to RS232 TTL UART (PL2303HX), the monitor stays on standby. If I hook it up while it's displaying an image, WinDAS still makes no connection. I tried 3 different WinDAS copies. I tried XP and 7. I put the .ocx in system/system32/syswow64 and reg'd it. Selected the correct COM port. SG manual. It just won't connect. It's a shame because the monitor looks great besides the high G2 voltage.
Hm, have you connected the wires in the correct order?

My notes say: black red green white (top to bottom)

And did you get the drivers for the PL2303HX?
 
Joined
Apr 2, 2020
Messages
5
Hm, have you connected the wires in the correct order?

My notes say: black red green white (top to bottom)

And did you get the drivers for the PL2303HX?
Yes, I used PL2303_CheckChipVersion_v1006 to verify the chip, then used PL2303_Prolific_DriverInstaller_v1200 (supposedly usable on all Windows versions besides 10). I also have PL2303-W10RS3RS4-DCHU-DriverSetup_v1192_20180503, but apparently it's for Windows 10 only. In terms of the wires, I tried black, red, white, green and black, red, green, white. Of course the first is exactly how they are labeled (even inside the USB). I also tried the default Windows 7 drivers for it. I'm really at a loss.
 

spacediver

2[H]4U
Joined
Mar 14, 2013
Messages
2,570
You said the monitor stays on standby.

Is it being fed a video signal by your main PC, or are the only connections to the monitor the main power cable and the TTL connector?
 
Joined
Apr 2, 2020
Messages
5
You said the monitor stays on standby.

Is it being fed a video signal by your main PC, or are the only connections to the monitor the main power cable and the TTL connector?
I tried 3 combos: an Xbox 360 via VGA, the same PC I am working from (dual monitor), and no inputs.
 
Joined
Apr 2, 2020
Messages
5
So at what point does the monitor go into standby?
Basically, if it's hooked up to the USB to RS232 TTL UART (PL2303HX) before being turned on, it doesn't come out of standby, and the power switch seems to do nothing. I have gotten some weird behavior by fucking around: I hooked it up while it was on (dual monitor mode), shut the monitor off, and turned it back on. When it turned back on it had a severe rainbow thing going on, but it was uniform. Also, sometimes the second I unplug the USB, the monitor's light shuts off, or sometimes turns green, etc. If you are at a loss don't sweat it. Maybe I could try a different cable.
 

spacediver

2[H]4U
Joined
Mar 14, 2013
Messages
2,570
Just do this.

Hook up your monitor as you normally would, turn on your PC and the monitor (i.e. so you can see windows desktop on it).

Then connect the USB-TTL cable to a laptop, or your main pc.

Then run WinDAS from the device that is connected to the monitor via the USB-TTL cable.

btw have you seen my WinDAS WPB guide? If not, take a look at the workflow diagram that shows how things should be set up (although if you just want to adjust G2 voltage, and aren't concerned about properly calibrating it, you won't need a colorimeter or a laptop).

Also try a different USB port (so long as it maps from somewhere between COM1 and COM4), and then make sure you choose the correct port in WinDAS.

Also, in WinDAS, make sure you select the right model for your monitor.
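If you want to rule out the COM port mapping itself, a quick sketch like this can confirm which port Windows gave the cable. It assumes you have Python with pyserial installed; it only checks that the PL2303 enumerates and opens, it does not speak WinDAS's protocol or touch the monitor.

[CODE]
# Quick check that the PL2303 cable enumerated and which COM port it got
# (WinDAS only seems to like COM1-COM4). Requires: pip install pyserial
# This only verifies the port exists and opens; it does not talk to WinDAS
# or the monitor.
import serial
import serial.tools.list_ports

for port in serial.tools.list_ports.comports():
    print(port.device, "-", port.description)
    if "Prolific" in port.description or "PL2303" in port.description:
        try:
            with serial.Serial(port.device, 9600, timeout=1):
                print(f"  {port.device} opened OK")
        except serial.SerialException as exc:
            print(f"  could not open {port.device}: {exc}")
[/CODE]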
 
Joined
Apr 2, 2020
Messages
5
Just do this.

Hook up your monitor as you normally would, turn on your PC and the monitor (i.e. so you can see windows desktop on it).

Then connect the USB-TTL cable to a laptop, or your main pc.

Then run WinDAS from the device that is connected to the monitor via the USB-TTL cable.

btw have you seen my WinDAS WPB guide? If not, take a look at the workflow diagram that shows how things should be set up (although if you just want to adjust G2 voltage, and aren't concerned about properly calibrating it, you won't need a colorimeter or a laptop).

Also try a different USB port (so long as it maps from somewhere between COM1 and COM4), and then make sure you choose the correct port in WinDAS.

Also, in WinDAS, make sure you select the right model for your monitor.

Thank you for trying to help me. I did try all that; it didn't work, sadly. I have a theory that it might be locked out until you reach the warm-up period. However, just now I was fucking with contrast, brightness, and color return and got a picture I'm happy with, so I'll have to tackle this in the future. :( (When I reached the warm-up period I had already unhooked everything.)
 