24" Widescreen CRT (FW900) From Ebay arrived,Comments.

Unfortunately not. It's the coating coming off. Be very gentle if you clean these areas to avoid this getting worse.
 
I'm gonna slowly dip my toe into beginning the process that'll eventually get me to colour correct my FW900, and also correct the G2 voltage when needed.

So, my first questions are: will a virtual Windows XP machine work? Does the colorimeter need to be a specific one? And where can I find a WinDAS cable?

Also, can WinDAS be downloaded onto Windows XP, or do I need to find a CD?
 
Btw, I have both the Delock DisplayPort-to-VGA converter and the splitter that goes above 375 MHz, and today, after getting somewhat fed up with the bugs of the splitter, I decided to try the other one and see if there's a difference to the image, besides the bugs.
I think I notice that the image is brighter, but it's hard to tell when I don't have two monitors to compare.

Do any of you notice a difference in brightness too? And also, is this the fastest bug-free converter on the market right now (including very expensive ones)?

And I remember someone saying there was a converter that goes up to 375 MHz. My Delock goes up to 340-350 MHz, and I thought it would be the one that goes up to 375 MHz, so let me know if I got unlucky.

Also, to whoever I told it didn't have bugs before: I didn't notice the jittering at first, and the inverted splitting that happens vertically didn't happen until after we had our conversation.
So it's not a unicorn, haha.
 
Something weird: in Final Fantasy VII Intergrade, I can set the resolution up to 3840x2160, despite my Delock not being able to go above 340-350 MHz.
 
The game runs in borderless windowed, I'm pretty sure, so it would just be downsampling.
You mean upscaling? If so, then it's working very well (looks great)... I wonder what method they're using, because as far as I can tell, Nvidia's DSR doesn't look as good unless it's at 4x.
 
You mean upscaling?

Upscaling is low internal resolution > high output resolution.

You are doing high internal resolution > low output resolution. So that's downsampling/downscaling.

Just bring up your monitor's OSD. It'll say the same kHz as your desktop resolution, most likely.
 
Upscaling is low internal resolution > high output resolution.

You are doing high internal resolution > low output resolution. So that's downsampling/downscaling.

Just bring up your monitor's OSD. It'll say the same kHz as your desktop resolution, most likely.
Internal is 2304x1440 and output is 3840x2160 in this case, so it's upscaling.
 
Your GPU is sending a 2304x1440 signal to your monitor. That is the output resolution.

3840x2160 is the resolution the game is rendering at internally, before being downsampled to 2304x1440 for output to your monitor.

Like I said, bring up your monitor's menu and tell me what the horizontal sync kHz reading is. I guarantee it's the same as 1440p.
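(Quick back-of-the-envelope on why the OSD reading gives it away: horizontal frequency only depends on the output timing, i.e. total scanlines per frame times refresh rate. A rough sketch below; the ~5% vertical blanking figure is just an assumed illustrative value, not the monitor's actual timing.)

```python
# Horizontal scan frequency depends only on the signal the GPU outputs,
# not on whatever resolution the game renders internally.
# The 5% vertical blanking overhead is an assumed, illustrative figure.
def hsync_khz(active_lines: int, refresh_hz: float, blanking_fraction: float = 0.05) -> float:
    total_lines = active_lines * (1.0 + blanking_fraction)
    return total_lines * refresh_hz / 1000.0

for refresh in (60, 80):
    print(f"2304x1440 @ {refresh} Hz -> ~{hsync_khz(1440, refresh):.0f} kHz")
# The internal 3840x2160 render never enters this calculation.
```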
 
Your GPU is sending a 2304x1440 signal to your monitor. That is the output resolution.

3840x2160 is the resolution the game is rendering at internally, before being downsampled to 2304x1440 for output to your monitor.
Ah, I thought internal was the native resolution. Good to know.
 
Downsampling seems paradoxical to me... how does 3840x2160 sampled down to 2304x1440 look better than native?
And is there a program I could use to run games with downsampling?

Since I'm losing refresh rate with the bug-free converter, it would be nice to run a game at 2560x1440 downsampled to whatever I can run at at least 80 Hz.
Your GPU is sending a 2304x1440 signal to your monitor. That is the output resolution.

3840x2160 is the resolution the game is rendering at internally, before being downsampled to 2304x1440 for output to your monitor.

Like I said, bring up your monitor's menu and tell me what the horizontal sync kHz reading is. I guarantee it's the same as 1440p.
 
Your GPU is sending a 2304x1440 signal to your monitor. That is the output resolution.

3840x2160 is the resolution the game is rendering at internally, before being downsampled to 2304x1440 for output to your monitor.

Like I said, bring up your monitor's menu and tell me what the horizontal sync kHz reading is. I guarantee it's the same as 1440p.
Nevermind, I just learned that that's what DSR does.
 
Your GPU is sending a 2304x1440 signal to your monitor. That is the output resolution.

3840x2160 is the resolution the game is rendering at internally, before being downsampled to 2304x1440 for output to your monitor.

Like I said, bring up your monitor's menu and tell me what the horizontal sync kHz reading is. I guarantee it's the same as 1440p.
Came full circle on this. I used to think upscaling meant supersampling, so this helped clear up a lot of confusion. Thanks.
 
Downsampling seems paradoxical to me... how does 3840x2160 sampled down to 2304x1440 look better than native?
And is there a program I could use to run games with downsampling?

Since I'm losing refresh rate with the bug-free converter, it would be nice to run a game at 2560x1440 downsampled to whatever I can run at at least 80 Hz.
Nevermind, with 3D rendering it makes sense that it'd look better.
 
Looks like setting a custom resolution in Nvidia's control panel, while keeping a lower resolution in the timing options, works well. 2560x1600 and 3840x2400 downsampled to 1920x1200 look great in game. In 2D content, though, there is a difference, but not as much.
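If it helps to picture what the driver is doing with that custom resolution, here's a rough offline sketch of the same downsampling step using Pillow. The file name and the Lanczos filter are just illustrative assumptions; the driver uses its own scaler.

```python
from PIL import Image

# Supersampling/downsampling in a nutshell: take a frame rendered at a higher
# resolution and filter it down to the resolution the monitor actually receives.
# "frame_3840x2400.png" is a hypothetical example file.
hi_res = Image.open("frame_3840x2400.png")            # internal render, e.g. 3840x2400
lo_res = hi_res.resize((1920, 1200), Image.LANCZOS)   # what actually gets sent to the display
lo_res.save("frame_1920x1200.png")

# Each output pixel is averaged from several rendered pixels, which is why edges
# and fine detail look cleaner than a native 1920x1200 render.
```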
 
Looks like setting a custom resolution in Nvidia's control panel, while keeping a lower resolution in the timing options, works well. 2560x1600 and 3840x2400 downsampled to 1920x1200 look great in game. In 2D content, though, there is a difference, but not as much.
Do you know about DLDSR yet?



Basically saves a bunch of GPU resources by making 2.25x look like 4x DSR. And for games that support DLSS, you can get even better performance by using DLSS to do the internal resolution.
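(The DSR/DLDSR factors apply to total pixel count, so each axis scales by the square root of the factor. Quick check with a 1920x1200 desktop as an example:)

```python
# DSR/DLDSR factors are in total pixels, so each axis scales by sqrt(factor).
base_w, base_h = 1920, 1200  # example desktop resolution
for factor in (2.25, 4.0):
    w, h = round(base_w * factor ** 0.5), round(base_h * factor ** 0.5)
    print(f"{factor}x -> {w}x{h}")
# 2.25x -> 2880x1800, 4x -> 3840x2400
```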
 
Do you know about DLDSR yet?



Basically saves a bunch of GPU resources by making 2.25x look like 4x DSR. And for games that support DLSS, you can get even better performance by using DLSS to do the internal resolution.

Yes, although I have a 1080 Ti, so no tensor/AI cores for either of those.

If I have the money, I will upgrade once the 4000 series comes out.
 
Did anyone encounter an issue where, after you calibrate the monitor and set up G2, the black level eventually turns grey instead?
Even if I set up G2 to pitch black, where you can barely see anything, at the end of the calibration everything goes up and is way too bright relative to the value I set.
I also do not have the 3rd step (green gun adjustment), but the model is different, so maybe that's fine.
 
There is no such thing as "pitch black" on a CRT.
You need to raise the black level to a certain value to get a proper gamma of 2.4.

Unless you want to play with ICC profiles, which I personally do not recommend, as there are too many issues with them. And realistically, all you can get is a visible difference in black level between a fully black screen and one with anything displayed on it. It is IMHO better to have low-level glow and a more consistent black level and gamma, without having to deal with anything related to gamma sliders and especially ICC profiles. It is pretty much how CRTs were always calibrated: if you set brightness/contrast by eye until the image looked correct (note: correct != best black level on a fully black screen), you would end up with a higher black level and a gamma close to 2.4.
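(For what it's worth, here is a minimal sketch of the kind of target curve being described: a plain 2.4 power law with the raised black level added on top. The white and black values are placeholders, not measurements.)

```python
# Minimal sketch of a gamma-2.4 target with a non-zero (raised) black level.
# The white/black luminances below are placeholder values, not measurements.
white_nits = 100.0
black_nits = 0.05   # the "raised" black level being discussed

def target_luminance(signal: float, gamma: float = 2.4) -> float:
    """Target luminance in nits for a normalized 0..1 video signal."""
    return black_nits + (white_nits - black_nits) * signal ** gamma

for v in (0.0, 0.1, 0.25, 0.5, 1.0):
    print(f"signal {v:4.2f} -> {target_luminance(v):8.3f} nits")
```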
 
There is - no need to discriminate against CRTs. They can produce some stunning results, maybe not as deep as OLED goes, but still pretty close. And that wasn't my question.
I experience issues with raised blacks: they are clearly LIGHT GRAY after the WPB is finished, no matter the settings I select - did you even read? It doesn't matter which settings I use, or whether I use ICC or not; the result will be the same - raised black levels and incorrect gamma, and I don't have that issue on my good ol' 900.
 
Did anyone encounter an issue where, after you calibrate the monitor and set up G2, the black level eventually turns grey instead?
Even if I set up G2 to pitch black, where you can barely see anything, at the end of the calibration everything goes up and is way too bright relative to the value I set.
I also do not have the 3rd step (green gun adjustment), but the model is different, so maybe that's fine.
When you set G2, the monitor might cycle, and not give you the final consequence of that change. Try setting the G2 even lower, then wait and see what happens after you get out of the menus.

What does your colorimeter measure as the monitor's black level after you've set G2 and waited half an hour?
 
I noticed today that circles aren't entirely round at 16:10 resolutions. I stretch the image as close to the size of the screen as possible, but it seems that the screen itself is not exactly 16:10.

Do any of you happen to know what the aspect ratio is, exactly?

Edit: I'm speaking of the FW900.
 
Maybe I'm being nitpicky, nevermind.
I use:

1600 by 1024 at 100 Hz
1880 by 1200 at 85 Hz

No reason not to tweak the resolutions to accord with the screen's actual dimensions. Somewhat uniquely, these devices are true multi-scan. Why not live it up? :)
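(For what it's worth, here's the aspect ratio those timings imply, versus exact 16:10; it lines up with the circles-not-quite-round observation above.)

```python
# Aspect ratios implied by the resolutions above, versus exact 16:10.
for w, h in ((1600, 1024), (1880, 1200), (1920, 1200)):
    print(f"{w}x{h}: {w / h:.4f}")
print(f"exact 16:10: {16 / 10:.4f}")
# 1600x1024 -> 1.5625, 1880x1200 -> 1.5667, 1920x1200 -> 1.6000
```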
 
Morning all. So I thought I'd let you know that I bit the bullet and ordered a ViewSonic XG2431. It's Blur Busters 2.0 approved, meaning it gives the ability to tweak the backlight strobe to perfection (or as close to perfect as you can get), and it single-strobes all the way down to 60 Hz, giving it compatibility with game consoles.

It's supposed to arrive Monday. I'll share my thoughts on it after I give it a good ole test drive. :)
 
I use:

1600 by 1024 at 100 Hz
1880 by 1200 at 85 Hz

No reason not to tweak the resolutions to accord with the screen's actual dimensions. Somewhat uniquely, these devices are true multi-scan. Why not live it up? :)
Well, my reason is that I don't want to change the resolution to something other than 16:10 or 16:9 because I want to use the native resolution of whatever video is playing on screen.
 
There is no such thing as "pitch black" on a CRT.
You need to raise the black level to a certain value to get a proper gamma of 2.4.

Unless you want to play with ICC profiles, which I personally do not recommend, as there are too many issues with them. And realistically, all you can get is a visible difference in black level between a fully black screen and one with anything displayed on it. It is IMHO better to have low-level glow and a more consistent black level and gamma, without having to deal with anything related to gamma sliders and especially ICC profiles. It is pretty much how CRTs were always calibrated: if you set brightness/contrast by eye until the image looked correct (note: correct != best black level on a fully black screen), you would end up with a higher black level and a gamma close to 2.4.

The Gamma Crush problem can be corrected with a 3DLut for the majority of PC games and movie playback.

Raising the black level to compensate for wonky gamma tracking is really bad, because you're clipping the CRT's main super power: great dark-scene contrast.

The only time you need to compromise is with game consoles. If you're made of money, you can also fix that with an inline LUT box on digital consoles.

NOT using a 3DLut with a CRT is being a filthy casual. :D
 
The Gamma Crush problem can be corrected with a 3DLut for the majority of PC games and movie playback.

Raising the black level to compensate for wonky gamma tracking is really bad, because you're clipping the CRT's main super power: great dark-scene contrast.

The only time you need to compromise is with game consoles. If you're made of money, you can also fix that with an inline LUT box on digital consoles.

NOT using a 3DLut with a CRT is being a filthy casual. :D
I mean… 2.4 gamma on a GDM monitor, assuming no issues, should be at least 5000:1 contrast. Can someone confirm? My Viewsonic (low to mid range) gets about 3500:1 with 2.3 gamma.
 
I mean… 2.4 gamma on a GDM monitor, assuming no issues, should be at least 5000:1 contrast. Can someone confirm? My Viewsonic (low to mid range) gets about 3500:1 with 2.3 gamma.
There is no _AT LEAST_ when it comes to CRTs. Their contrast fluctuation is similar to that of projectors.

VA LCD monitors have an ANSI-checkerboard contrast of 3000:1; VA televisions have 5000 to 7000:1.

IPS of all types tops out at around 1500:1.

CRTs with an ANSI checkerboard go as low as 75:1, which is just trash. CRT does not produce a very vivid-looking bright scene in general.

CRT will only beat LCD in contrast on very dark and very sparse scenes. It can go well above 5000:1. For example, your basic CRT using the crush gamma trick can go below 0.003 nits, the limit of an i1d3, the cheapest usable colorimeter. It can also do ~100 nits at these settings, so that's already 33,333:1. But throw in some bright elements, and it will tank.

The color purity also drops off when the screen gets bright and busy because of light pollution. Reds turn brown.

It's also not typically possible to achieve balanced high-contrast performance on a CRT without 3DLut compensation. If you run a CRT's default gamma tracking, it typically follows sRGB, which gives you greyish blacks and pale-looking dark colors.
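(Back-of-the-envelope for the on/off figure above:)

```python
# On/off contrast from the numbers quoted above: ~100 nits white over a black
# pinned at the i1d3's 0.003 nit measurement floor.
white_nits = 100.0
black_nits = 0.003
print(f"on/off contrast ~ {white_nits / black_nits:,.0f}:1")  # ~33,333:1
```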
 
There is no _AT LEAST_ when it comes to CRTs. Their contrast fluctuation is similar to that of projectors.

VA LCD monitors have an ANSI-checkerboard contrast of 3000:1; VA televisions have 5000 to 7000:1.

IPS of all types tops out at around 1500:1.

CRTs with an ANSI checkerboard go as low as 75:1, which is just trash. CRT does not produce a very vivid-looking bright scene in general.

CRT will only beat LCD in contrast on very dark and very sparse scenes.

The color purity also drops off when the screen gets bright and busy because of light pollution. Reds turn brown.

It's also not typically possible to achieve balanced high-contrast performance on a CRT without 3DLut compensation. If you run a CRT's default gamma tracking, it typically follows sRGB, which gives you greyish blacks and pale-looking dark colors.
I wasn’t talking ANSI contrast. We all know that CRT sucks in this realm due to the glass reflections. I was talking full on / full off. Sorry if I wasn’t clear.
 
I wasn’t talking ANSI contrast. We all know that CRT sucks in this realm due to the glass reflections. I was talking full on / full off. Sorry if I wasn’t clear.
I edited my post to remark on that. CRTs can go well below 0.003 nits by using the crush trick: lower contrast (sometimes brightness) all the way down. The dark colors will be crushed, but then you use the VCGT (gamma table) to uncrush them while preserving the deep blacks.
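Here's a minimal sketch of the uncrush idea, just to make it concrete: assume the crushed display now tracks a steeper power law than the 2.4 target and build a 1D ramp that pre-compensates the signal. The measured gamma value is a made-up assumption for illustration; actually loading the ramp would go through the OS gamma APIs or an ICC profile's vcgt tag (DisplayCAL/ArgyllCMS handle that part).

```python
import numpy as np

# Assumption for illustration: after crushing, the display tracks roughly a 3.0
# power law, while we want 2.4 tracking. Real values come from measurements.
measured_gamma = 3.0
target_gamma = 2.4

levels = np.linspace(0.0, 1.0, 256)                 # normalized 8-bit input levels
target_out = levels ** target_gamma                 # relative luminance we want per level
correction = target_out ** (1.0 / measured_gamma)   # signal that produces it on the crushed display

vcgt_ramp = np.round(correction * 65535).astype(np.uint16)  # 16-bit ramp, one channel
print(vcgt_ramp[:5], "...", vcgt_ramp[-1])
# Loading this ramp into the video card LUT is left to the calibration software.
```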
 
I edited my post to remark on that. CRTs can go well below 0.003 nits by using the crush trick: lower contrast (sometimes brightness) all the way down. The dark colors will be crushed, but then you use the VCGT (gamma table) to uncrush them while preserving the deep blacks.
I see. So to reiterate, when I calibrate my Viewsonic monitor I think I have it at 80 nits white. I lower the blacks to be inky but I make sure I don’t crush it. Usually I settle on a measured gamma of 2.3. I don’t like 2.4 unless I’m in a blacked-out room - my projector, in other words. :) I don’t remember the exact black level but I think it brought me to 3400:1 contrast or slightly above that.

Edit - the following reflects my personal preference and is not meant to imply that I’m right or that others should emulate me:

My personal opinion is that above 5000:1 contrast, for most viewing environments, it’s very diminishing returns. For me it’s pointless to hit those contrast levels or above unless I’m in a totally blacked-out room. Because unless I lower the white levels to compensate, it hurts my eyes.

I only care about extremely high contrast when I’m watching a movie on a projector in a theater room.

But again - this is personal preference! You’re certainly free to squeeze as much performance out of your screens as you please. :).
 
I see. So to reiterate, when I calibrate my Viewsonic monitor I think I have it at 80 nits white. I lower the blacks to be inky but I make sure I don’t crush it. Usually I settle on a measured gamma of 2.3. I don’t like 2.4 unless I’m in a blacked-out room - my projector, in other words. :) I don’t remember the exact black level but I think it brought me to 3400:1 contrast or slightly above that.

My personal opinion is that above 5000:1 contrast, for most viewing environments, it’s very diminishing returns. For me it’s pointless to hit those contrast levels or above unless I’m in a totally blacked-out room. Because unless I lower the white levels to compensate, it hurts my eyes.
For CRT, just turn contrast (or sometimes brightness) all the way down. Let it crush as long as it improves the black level. The software can uncrush it; it's analog, so it's fine.

The meter must be an X-Rite meter; the Spyders can't read below 0.05-ish nits. People who have tried Spyders got measurement results that do not make any sense. Even the newest SpyderX can't read reliably below 0.05.

The X-Rite Studio and Pro are good down to 0.003 nits.

Also make sure you've set the measurement window as small as possible, with a black background, for accuracy and to deal with the ABL in CRTs.

The CRT should read Infinite:1 (on/off) contrast ratio if everything is set up properly.
 
I've always returned to CRT. In real life the ANSI contrast seems fine, and the difference in dynamic range and ability to resolve black is very noticeable between the two technologies to me.

I do like LCD with FALD, though. I probably need to replace my old TV. It will be between OLED and FALD LCD, and I'm a bit concerned about the burn-in aspect with the former. I want a TV I can just leave on sometimes and not have to worry about station logos and such. With my monitors I've been pretty aggressive with a screen saver, but that's not what I want for a TV.
 
I've always returned to CRT. In real life the ANSI contrast seems fine, and the difference in dynamic range and ability to resolve black is very noticeable between the two technologies to me.

I do like LCD with FALD, though. I probably need to replace my old TV. It will be between OLED and FALD LCD, and I'm a bit concerned about the burn-in aspect with the former. I want a TV I can just leave on sometimes and not have to worry about station logos and such. With my monitors I've been pretty aggressive with a screen saver, but that's not what I want for a TV.
Plasma is a good alternative. Decent used TVs can still be had, and they have phosphors, so they look like a CRT.
 
I've always returned to CRT. In real life the ANSI contrast seems fine, and the difference in dynamic range and ability to resolve black is very noticeable between the two technologies to me.

I do like LCD with FALD, though. I probably need to replace my old TV. It will be between OLED and FALD LCD, and I'm a bit concerned about the burn-in aspect with the former. I want a TV I can just leave on sometimes and not have to worry about station logos and such. With my monitors I've been pretty aggressive with a screen saver, but that's not what I want for a TV.
There's no perfect display device.

OLED is reasonably burn-in proof as long as it stays around 100 nits. If you can keep all your desktop use within that, it's OK as a monitor.

FALD has haloing; anytime you push above 300 nits, it gets glowy.

CRT has a lot more problems, really. It's great for games in terms of motion clarity, but nearly every other aspect of modern videography is compromised, especially color.
 