
24" Widescreen CRT (FW900) From Ebay arrived,Comments.

So I figured I'd throw this out there: I bought a PG27UQ and have been using it for a couple of months now; my old PC still has the FW900.

The FW900 blows the PG27UQ out of the water when it comes to blacks and contrast.

If modern GPUs had DACs, I'd probably use it on my new rig despite 4K, HDR, etc.
 
Regarding the pixel clock limits, I was able to get anything up to 500 MHz, pretty much exactly: anything below 499 MHz works and anything above 500 MHz does not.

I run 2880x2160 on my monitor sometimes; that's a 523 MHz pixel clock. This is off a GTX 1080. Though it wouldn't register in CRU, probably due to a Windows-imposed limitation related to my EDID, I did get it to work in Nvidia's custom resolution tool.
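If anyone wants to sanity-check figures like that: the pixel clock is just the total frame size (active pixels plus blanking on both axes) multiplied by the refresh rate. Here's a rough Python sketch; the blanking values and the 72 Hz refresh in the example are placeholder assumptions, not the actual timings used above:

```python
# Rough sketch: pixel clock = (h_active + h_blank) * (v_active + v_blank) * refresh.
# The blanking values and refresh rate below are placeholders, not real FW900 timings.

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank, v_blank):
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

# Example: 2880x2160 with guessed blanking lands in the same ballpark (~520 MHz).
print(round(pixel_clock_mhz(2880, 2160, 72, h_blank=384, v_blank=60)))
```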

And my trick to getting the adapter to work is to shut down the PC, then plug in the DisplayPort cable before turning it back on. It seems the Sunix doesn't like hot-swapping.

If modern GPUs had DACs, I'd probably use it on my new rig despite 4K, HDR, etc.

Plenty of adapters out there that get the job done. I'm running a GTX 1080 with a 21" LaCie CRT.
 
What is the internal pixel clock limit inside the FW900? How could one find this number for other CRTs too?
 
Holy CRT subculture thread Batman!

[Image: "I Love CRT" Steins;Gate screenshot]


http://steins-gate.wikia.com/wiki/Yuugo_Tennouji
 
By the way, I just purchased a Samsung CFG-73 gaming VA monitor. So far it's pretty good. I need to calibrate it. The most annoying thing about it, though, has got to be the fact that with strobing on you CANNOT adjust the brightness. Lame!
 
I've been thinking about a VA for a secondary screen in portrait mode. I'm also hoping the RTX cards come out in MXM form, hopefully with the "N" series, which is desktop parts slapped on an MXM card. Had I known that, I'd have gotten the 1070N instead of the 980M, which is like 40-45% slower than the 1070N.
 
What is the internal pixel clock limit inside the FW900? How could one find this number for other CRTs too?

CRTs don't have pixel clock limitations, only horizontal and vertical frequency limits. You're basically unlimited in the number of pixels along a horizontal scan line. This is why 1920x1200 and 1600x1200 run at the same frequencies.
 
I have a CHG90 as an alternate monitor; I wanted something to complement a CRT, so it's of course huge (49 inches vs. a 22-inch CRT) and has around 350 nits vs. 110 on the CRT. That's about it. If you want to engross yourself in a game it works really well given its width (120 cm), or if you want to see far-away targets in some shooters. It's fun switching them every now and then: when I switch to the CRT I am blown away by the image quality, and the other way around it's like, wow, this thing is huge and bright.

The VA panel on the CHG90 provides nice contrast, but the black levels are really bad. The backlight bleeds from the sides so much it really washes out the colors there (no, I don't have a faulty unit), plus the fact that the curvature is too slight (unless you are sitting 1+ meter away) means you are not looking at the sides head-on but rather at an angle, and VA has poor viewing angles. It's a really fun (second) monitor, but I could see some people being disappointed by some of the negative aspects. Also, the pixels up close remind me of a TV rather than a monitor (could be the QLED doing that, dunno).


CRTs don't have pixel clock limitations, only horizontal and vertical frequency limits. You're basically unlimited in the number of pixels along a horizontal scan line. This is why 1920x1200 and 1600x1200 run at the same frequencies.

Yeah, that makes sense. Maybe I should have asked differently; it's a little confusing to me. The horizontal scan rate tells how long it takes to deflect the beam from left to right and back. The vertical refresh rate tells you how many full screens it can paint per second, but that depends heavily on how many lines are being drawn, correct? So if I want to draw too many lines, I will in return get only a very low vertical refresh rate? So the overall limiting factor is still the horizontal scan rate?
 
So the overall limiting factor is still the horizontal scan rate?

Like if you could remove all software limitations and just let a CRT run at crazy high rates until it overheats, then yeah, horizontal rate would probably be the limiting factor.

But the firmware on CRT monitors sets an upper limit on both vertical and horizontal frequencies. It's commonly around 170 or 180 Hz vertical.
 
Ah, I think I got it now. So the maximum number of lines is defined by a dynamic formula that has to take the vertical refresh rate into account as well. Roughly (not counting blanking, etc.):

121,000 / 40 Hz vertical = a maximum of approx. 3,025 lines

121,000 / 160 Hz vertical = a maximum of approx. 756 lines

There are also the internal limits (possibly at the firmware level, as you said), such as the maximum 160 Hz vertical refresh rate for the FW900 and, IIRC, a minimum vertical refresh rate of around 40-50 Hz.

In other words, the horizontal scan rate is the maximum number of lines it can draw per second, no matter what the vertical refresh rate is. So you can draw 121,000 lines per second, and if you want to do that 60 times per second (the vertical refresh rate), you get about 2,016 vertical "pixels".

I think I was confused mainly by the name "horizontal refresh rate", which in my opinion does not describe it very well.
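And just to spell that relationship out in one place, here's a quick Python sketch. The 121 kHz figure is the FW900's maximum horizontal scan rate, and blanking is ignored here just like in the rough numbers above:

```python
# Max drawable lines per frame is roughly horizontal scan rate / vertical refresh
# (blanking ignored, as in the rough numbers above).

H_SCAN_MAX_HZ = 121_000  # FW900 maximum horizontal scan rate, ~121 kHz

def max_lines(v_refresh_hz, h_scan_hz=H_SCAN_MAX_HZ):
    return h_scan_hz // v_refresh_hz

print(max_lines(40))   # ~3025 lines
print(max_lines(60))   # ~2016 lines
print(max_lines(160))  # ~756 lines

# Note the horizontal pixel count never enters into it: 1920x1200 and 1600x1200 at the
# same refresh rate need the same horizontal scan frequency, since both are 1200 lines tall.
```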
 
Holy CRT subculture thread Batman!

It won't be as much of a thing after MicroLED screens with some kind of built-in flicker/strobing become common. The things CRTs have over LCDs are viewing angles, color quality, and low persistence. Those come with tradeoffs in geometry, size, and resolution, though.

Low persistence is pretty much a solved problem: all you have to do is flicker or strobe the display. The Achilles' heel of LCDs is the terrible backlight. OLED solves it but has burn-in, so it's useless for computer monitors. MicroLED solves it without burn-in issues, and it fixes viewing angles and color quality without other compromises while keeping the advantages of LCDs (geometry, size). It really is what people have been waiting for.

As of today, though, there are legitimate reasons that CRTs are attractive to people.
 
Unless the original Duck Hunt starts working with OLED, there will also be nostalgia and posterity reasons as well. Also, if it's not an HD CRT, the price is often just right, at the low price of free.
 
CRTs don't have pixel clock limitations, only horizontal and vertical frequency limits. You're basically unlimited in the number of pixels along a horizontal scan line. This is why 1920x1200 and 1600x1200 run at the same frequencies.

It should be noted that even though the horizontal resolution is near unlimited, the dot pitch is what matters in terms of actually seeing a difference. Meaning 3200 pixels horizontal might be doable, but pointless, as the dot pitch isn't small enough to actually resolve all 3200 pixels.

Edit: Relevant video:
 
Today (or tomorrow, actually...) the biggest advantage of a CRT is the ability to run a lower resolution for ray tracing,
and it might force me to transport my FW900 to where I currently live.

FW900 + RT = Glorious :cool:
 
Today (or tomorrow, actually...) the biggest advantage of a CRT is the ability to run a lower resolution for ray tracing,
and it might force me to transport my FW900 to where I currently live.

FW900 + RT = Glorious :cool:

I'd love to see some video of this whenever Tomb Raider or Battlefield 5 patch in ray tracing.

It should be noted that even though the horizontal resolution is near unlimited, the dot pitch is what matters in terms of actually seeing a difference. Meaning 3200 pixels horizontal might be doable, but pointless, as the dot pitch isn't small enough to actually resolve all 3200 pixels.

Edit: Relevant video

Yeah, for sure. In my understanding, the electron beam would be fast enough to modulate 3200 separate pixels, but they will ultimately be blended together by the aperture grille.

Still, in certain situations, like in Street Fighter 5 where you're hard locked at 60fps, you might as well run these higher resolutions, since it basically acts like analog super-sampling. And it appears slightly sharper, since the aperture grille allows for more lines on the vertical axis. Though I'm sure with enough research you could find a point where you start getting diminishing returns there as well.
 
I've been thinking about a VA for a secondary screen in portrait mode. I'm also hoping the RTX cards come out in MXM form, hopefully with the "N" series, which is desktop parts slapped on an MXM card. Had I known that, I'd have gotten the 1070N instead of the 980M, which is like 40-45% slower than the 1070N.

I'm pretty happy about mine so far. It definitely has its drawbacks, but I find it a pretty decent substitute for a CRT. Now that I no longer have my own house, space is an issue.

In a nutshell:

+ Blacks are decent (not CRT-level, but better than TN or IPS)
+ sRGB mode seems to be good (a Godsend because native color gamut is way overdone)
+ Motion clarity mode looks great
+ Gamma correction!

- Why in the holy f--- is it curved?
- Viewing angles
- Blacks still aren't CRT
- Cannot adjust backlight brightness in strobe mode
- Uniformity isn't as good as IPS

Other than that, I'm pretty happy with this. I remember the old Eizo Foris FG2421 and not jumping on one. I regret it, and so I figured I'd jump on a decent gaming VA monitor. I'm very picky when it comes to image quality, and I still think it's a keeper. Give it a shot - you may like it.
 
- Why in the holy f--- is it curved?
Because after a long struggle to manufacture flat CRT screens that avoid the geometric distortions caused by a curved display surface, in late 201* some marketing asshole jumped in and decided the flat display was obsolete. We, the customers, needed something new to increase sales and replace the screens we already own.
And here came the CURVED screen. It's new, it's hype, it's useless. But buy it and shut up. Next step is the RGB curved screen, with multicolored blinking LEDs all over the casing. Brace yourselves, it's coming... :ROFLMAO:
 
It should be noted that even though the horizontal resolution is near unlimited, the dot pitch is what matters in terms of actually seeing a difference. Meaning 3200 pixels horizontal might be doable, but pointless, as the dot pitch isn't small enough to actually resolve all 3200 pixels.

Edit: Relevant video:


There was, or is, something to make your games render at 2K or 4K even if you have a 1080p monitor; they call it downsampling? Supposedly it makes things look better? The screenshots I saw looked decent. Wouldn't it be a similar concept, but at the hardware/monitor level?

Because after a long struggle to manufacture flat CRT screens that avoid the geometric distortions caused by a curved display surface, in late 201* some marketing asshole jumped in and decided the flat display was obsolete. We, the customers, needed something new to increase sales and replace the screens we already own.
And here came the CURVED screen. It's new, it's hype, it's useless. But buy it and shut up. Next step is the RGB curved screen, with multicolored blinking LEDs all over the casing. Brace yourselves, it's coming... :ROFLMAO:

Yeah, and are they just bending flat screens? I hear the glass is pretty flexible when run through rollers in the factory. Or does it require more money for special tooling compared to normal 1080p and 4K panels cut from the same glass?
 
There was, or is, something to make your games render at 2K or 4K even if you have a 1080p monitor; they call it downsampling? Supposedly it makes things look better? The screenshots I saw looked decent. Wouldn't it be a similar concept, but at the hardware/monitor level?

Not exactly, since the shadow mask physically blocks the light. Take, for example, the part of the video where they were talking about the tiny CRT and the text is unreadable. The tiny CRT was being fed the same resolution, but due to the shadow mask, you couldn't read the text. What you're talking about is rendering higher and then downscaling the image. That looks better because you are taking more information and then averaging the extra information to create a better image while still displaying the same number of pixels. There is a limit to how much better it will look, of course. Downscaling from 8K to 320x240 would probably look the same as downscaling from 4K to 320x240, for example. Basically, you can only average so much before it all looks the same, lol.
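For what it's worth, the averaging part is easy to picture in code. Here's a minimal NumPy sketch of a plain box-filter downscale; actual drivers and games almost certainly use fancier filters, so treat this purely as an illustration of the idea:

```python
import numpy as np

def box_downscale(img, factor):
    """Average each factor x factor block of pixels into one output pixel.
    This is the 'render high, average down' idea in its simplest form."""
    h, w, c = img.shape
    h2, w2 = h // factor, w // factor
    img = img[:h2 * factor, :w2 * factor]            # crop to a multiple of the factor
    blocks = img.reshape(h2, factor, w2, factor, c)  # group pixels into blocks
    return blocks.mean(axis=(1, 3))                  # one averaged pixel per block

# e.g. a fake 4K render averaged down to 1080p:
hi_res = np.random.rand(2160, 3840, 3)
print(box_downscale(hi_res, 2).shape)  # (1080, 1920, 3)
```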
 
I'm pretty happy about mine so far. It definitely has its drawbacks, but I find it a pretty decent substitute for a CRT. Now that I no longer have my own house, space is an issue.
FW900 is not that huge :ROFLMAO:
You forgot to mention the monitor model.


There was, or is, something to make your games render at 2K or 4K even if you have a 1080p monitor; they call it downsampling? Supposedly it makes things look better? The screenshots I saw looked decent. Wouldn't it be a similar concept, but at the hardware/monitor level?
On NV cards it is called DSR. It basically allows you to set a higher resolution to do super-sampling anti-aliasing without needing the game to implement normal SSAA, which usually doesn't work.

It is not entirely clear whether DSR uses sub-pixel rendering for scaling or not. Depending on that, it might be more or less comparable to a CRT image when going past the dot-pitch limit.

For a Trinitron, the vertical resolution is kind of unlimited... Its usefulness for producing a sharper image is still limited by beam height (which depends on its brightness, or rather emission) IMHO, but at least the beam position is unlimited. Horizontally we are guaranteed to get sub-pixel-like rendering, even more precise than the size of one colored phosphor spot on the screen. It is nicely visible in the close-ups in the linked video, on the monitor with low color resolution, where some of the spots that looked like pixels were obviously not uniformly lit up. This effect, however, like the vertical resolution, is limited by beam width, and frankly it amounts to so little that it becomes pointless to increase the resolution just for these effects.

It would be pointless even with a 1 GHz pixel clock adapter for these monitors, because the H-sync and V-sync limits make it actually preferable to use DSR and a lower resolution to get higher refresh rates.
I also do not really like how those very high resolutions look and have always found lower resolutions with SSAA to look better.

On the FW900, a resolution of 1920x1200 produces no visible gaps between lines, whereas going lower does start to produce that effect. Because of this, I assume it is the optimal resolution. Going to a lower resolution and getting "scanlines" plus higher refresh rates makes more sense than going higher, especially since the scanline effect looks great.
 
Not exactly, since the shadow mask physically blocks the light. Take, for example, the part of the video where they were talking about the tiny CRT and the text is unreadable. The tiny CRT was being fed the same resolution, but due to the shadow mask, you couldn't read the text. What you're talking about is rendering higher and then downscaling the image. That looks better because you are taking more information and then averaging the extra information to create a better image while still displaying the same number of pixels. There is a limit to how much better it will look, of course.

It's tough, the example of green text he was showing was on a really low-res TV, I'm guessing less than 300 TVL. That's under half the information that a line at 480i could actually carry. For a typical 21" 4:3 CRT, you'd have to be running between 3500 and 4000 pixels to get twice the pixel pitch in a similar fashion, right?

And since we're talking about mostly Trinitrons and Diamondtrons, you gotta remember we're not quite as limited on the vertical plane. I haven't extensively tested it, but so far "analog" downsampling looks better to me than digital downsampling.
 
It's tough, the example of green text he was showing was on a really low-res TV, I'm guessing less than 300 TVL. That's under half the information that a line at 480i could actually carry. For a typical 21" 4:3 CRT, you'd have to be running between 3500 and 4000 pixels to get twice the pixel pitch in a similar fashion, right?

And since we're talking about mostly Trinitrons and Diamondtrons, you gotta remember we're not quite as limited on the vertical plane. I haven't extensively tested it, but so far "analog" downsampling looks better to me than digital downsampling.

Yeah, but you don't want to get to that point. So 3500-4000 is unusable, but what is the max usable range then? My 21-inch CRTs can do 2048x1536 according to their specs, but at that resolution I already don't find the text sharp anymore. This is assuming the default 96 DPI from Windows (aka 100% scaling). It's why I stick with 1600x1200. I guess for games sharpness is less of a concern, so a slight blur is okay at a higher resolution as long as the UI scales fine.

Edit: Actually, I just went and calculated roughly what resolution my monitor should be able to show given my dot pitch (0.2 mm horizontal, 0.14 mm vertical on a shadow mask ViewSonic G225f), and it turns out I should be able to do 2048x1536 fine! Thanks for making me rethink this. I'll try it out for a bit. My Sony G500 Trinitron has a bigger dot pitch, so I can't do the same resolution there =(.

Edit 2: I miscalculated, it won't do 2048x1536 fine, but 1920x1440 works =D.
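If anyone else wants to run the same back-of-the-envelope check, divide the viewable phosphor area by the dot pitch to get a rough ceiling on how many pixels can actually be resolved. The ~400 x 300 mm viewable area in this Python sketch is an assumption for a typical 21-inch 4:3 tube, not an exact G225f spec, and beam spot size and bandwidth still matter on top of this:

```python
# Rough ceiling on resolvable pixels: viewable size divided by dot pitch.
# The 400 x 300 mm viewable area is an assumed value for a ~21" 4:3 CRT, not an exact spec.

def max_resolvable(view_w_mm, view_h_mm, pitch_h_mm, pitch_v_mm):
    return int(view_w_mm / pitch_h_mm), int(view_h_mm / pitch_v_mm)

print(max_resolvable(400, 300, 0.20, 0.14))  # ~(2000, 2142): 1920x1440 fits, 2048x1536 is marginal
print(max_resolvable(400, 300, 0.25, 0.25))  # ~(1600, 1200) if the 0.25 mm diagonal pitch is the limit
```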
 
What is the dot pitch of the FW900, assuming that's what you took it off of?

0.23 mm in the middle and 0.27 mm at the edge, I think.

EDIT: But remember that dot pitch isn't the only thing that determines whether a display is sharp or blurry. CRT beam spot size and bandwidth play a huge role as well.
 
What is the dot pitch of the FW900, assuming that's what you took it off of?

I'm actually on a LaCie electron22blueIV. 2560x1920 is 4:3.

It's actually not the sharpest monitor sometimes. It will be blurry at one resolution/refresh rate, but then pretty sharp at another. Like JBL said, beam size and bandwidth come into play. And then my LaCie seems to have aggressive moire cancellation, which affects some combos worse than others.
 
Just got in that HDMI adapter CommandoSnake mentioned earlier, and it's pretty awful.

While it'll display an image at a 345 MHz pixel clock, the quality is terrible. In fact, anything past 181 MHz shows green lines and dots on the dark parts of the image. The auxiliary USB power made no difference. I'm not sure why we're getting such different results, but for now, even though it's cheap, don't bother with this one.
 
Just got in that HDMI adapter CommandoSnake mentioned earlier, and it's pretty awful.

While it'll display an image at a 345 MHz pixel clock, the quality is terrible. In fact, anything past 181 MHz shows green lines and dots on the dark parts of the image. The auxiliary USB power made no difference. I'm not sure why we're getting such different results, but for now, even though it's cheap, don't bother with this one.

Can you try importing the HDMI.dat file with CRU?

You'll see it on the first post here:

https://www.monitortests.com/forum/Thread-Custom-Resolution-Utility-CRU
 
So after spending some time at 1920x1440 and actually taking pictures of my monitor, I settled back at 1600x1200 since it wasn't as clear as I liked at 1920x1440. Oh well, it was fun trying it out, and I was really tempted to stay at 1920x1440 since the extra space is so nice.
 
I run 2304x1440 all the time; I don't sit 5 feet away from the monitor, though.

I sit about 2 feet away from my monitor. I don't think it's the distance; I think I understood dot pitch for shadow masks wrong, in which case the optimal resolution is still 1600x1200. After all, the diagonal dot pitch for my shadow mask monitor is actually 0.25 mm, which is quite different from the 0.2 mm horizontal and the 0.14 mm vertical.
 
Maybe Windows 10's UI scaling is helping me out then?

Nah, I think what jbltecnicspro said also factors in. Even if the dot pitch allows all the pixels to show fine, the beam spot size may still make it blurry. I'm content keeping it at 1600x1200, though I do have these "what ifs" about 1920x1440, lol.
 
It's a pretty simple calculation for me. When on the desktop, I'll use a lower resolution to make text bigger and more readable. But for games, where you're not trying to read text made of individual pixels, I can use higher resolutions.

Just like with 1440p LCD monitors, depending on the size, many people don't use 1:1 text scaling in Windows because the text is too tiny. This is especially true for 4K monitors. I doubt many people at all use 1:1 text scaling at 4K unless they have like a 40-inch monitor and sit a couple of feet away.
 
It's a pretty simple calculation for me. When on the desktop, I'll use a lower resolution to make text bigger and more readable. But for games, where you're not trying to read text made of individual pixels, I can use higher resolutions.

Just like with 1440p LCD monitors, depending on the size, many people don't use 1:1 text scaling in Windows because the text is too tiny. This is especially true for 4K monitors. I doubt many people at all use 1:1 text scaling at 4K unless they have like a 40-inch monitor and sit a couple of feet away.

Why not just use the higher resolution and then use Windows 10 scaling for text? Windows 10 finally has native support for it.
 
Why not just use the higher resolution and then use Windows 10 scaling for text? Windows 10 finally has native support for it.

It's not consistent across all programs; it looks good in new Windows apps but less so in old ones.

And my monitor probably uses a little less electricity at 1600x1200 @ 85 Hz than at 1920x1440 @ 85 Hz. Probably a fraction of a watt, but who knows.
 
Enhanced Interrogator: The adapter's default extension block reports a max HDMI bandwidth of 225 MHz, but that seems to be ignored by the Nvidia driver. Even with no modification, it'll display... an "image" at 345 MHz. I tried changing the max to 400 or even 600 MHz, but nothing changed with the image; it's still distorted and green. I also tried the CRU hdmi.dat and the Nvidia pixel clock patcher. Neither did anything.

Let me show you what I mean about the green dots/lines. Here's the adapter at 1600x1200x85 (235.14 MHz):

[Image: screenshot of Quake DM4 on the adapter, showing the green distortion]


I don't think DM4 is supposed to look like that. That green distortion isn't there at 181 MHz, but it starts creeping in as the pixel clock increases.


Related to the conversation here about high resolutions: I'm in the camp that likes the look of the "analog" super-sampling, but I thought I should actually test it. At the character select screen in Quake Champions you can sit and compare fine details on the models, so I tried two different resolutions: 2496x1872 at 100% render scale, and 1600x1200 at 200%. The latter should have an advantage, but they were almost indistinguishable. 2496x1872 looked a tad better only because there was less of a gap between horizontal beam traces.

Anyhow, I got out my macro lens to see what can actually be resolved at various resolutions. This is from a Dell P1130:

[Image: macro-lens comparison of what the Dell P1130 resolves at various resolutions]


So I guess we should stick with 1024x768!
 