24" Widescreen CRT (FW900) From Ebay arrived,Comments.

As a side note I dug back through some old posts of Xor, and despite his polarizer not being a suitable solution IMO, it's still interesting with some perspective. In particular, the result he obtained was probably not as problematic as one would expect regarding transmittance, because he used a low-quality polarizer with polarization efficiency in the 92-96% range. That allowed a transmittance of 48%, which is higher than most if not all quality polarizer sheets.
I'm not sure of the polarization level of the film I used.
My film was bad because it had adhesive and I had no way (or rather the tools and skill...) to apply it properly.

Optically, image quality is very good. The image is sharp, as are colors. My guess would be that optically, polarizer films will be better than any other films...
Reflection reduction with the polarizer is very good. There is vastly less glare than on other Trinitron CRTs. I would say every single CRT I've seen has terrible glare and reflection reduction; in this regard the polarizer is in a class of its own.
There also seems to be a reduction in the electrostatic effect with the polarizer. I have not really tested it, but the very aggressive effects that appeared after removing the original coating are gone, making it behave more like other CRTs.

And of course, and this is the main point: the FW900 with a polarizer can be used with light in the room, and the contrast ratio is perceptually improved, just like on LCDs. Even with the main ceiling lights on, black is black.

A circular polarizer might be hard to find in the proper size to fit the FW900. Someone in this thread (I do not remember the name) bought the sheet I recommended to him, but it had some diagonal lines on parts of the film, making it unsuitable to put on the screen. A linear polarizer is, it seems, our only option. But given the already low transmittance and great glare reduction, going circular might be overkill...
 
I pulled off the original plastic sheet from the screen of my FW900 the other day. (I had inadvertently damaged one of the corners of it.)

With overhead lights on, I agree it's unusable now. With my normal lower ambient light, it's just kind of more of the same. More reflective than I'd like, but it already was.

The raw glass is kind of beautiful I would say. And still a beautiful picture.

Got some polarizer film from place XoR_ mentioned, which I will probably play around with.
 
Reflection reduction with the polarizer is very good. There is vastly less glare than on other Trinitron CRTs. I would say every single CRT I've seen has terrible glare and reflection reduction; in this regard the polarizer is in a class of its own.
There also seems to be a reduction in the electrostatic effect with the polarizer. I have not really tested it, but the very aggressive effects that appeared after removing the original coating are gone, making it behave more like other CRTs.
Well, I'm a bit sceptical about that. A polarizer will reduce (linear) or get rid entirely (circular) of light reflected inside the screen, but there's nothing to prevent reflection on the external surface of the polarizer. Polarizers are usually built with a film of polyvinyl alcohol sandwiched between two films of cellulose triacetate, and AFAIK PVA might be electrically conductive, but CTA is not, so there shouldn't be any way to ground that kind of film. And without grounding, there's no shielding, and static electricity builds up.

I pulled off the original plastic sheet from the screen of my FW900 the other day. (I had inadvertently damaged one of the corners of it.)

With overhead lights on, I agree it's unusable now. With my normal lower ambient light, it's just kind of more of the same. More reflective than I'd like, but it already was.

The raw glass is kind of beautiful I would say. And still a beautiful picture.

Got some polarizer film from place XoR_ mentioned, which I will probably play around with.
How is your screen regarding electric fields with the film off? Does it behave like I described, or differently? I unfortunately didn't check that extensively before putting a film on, so I need someone else to tell me.
 
Well, I'm a bit sceptical about that. A polarizer will reduce (linear) or get rid entirely (circular) of light reflected inside the screen, but there's nothing to prevent reflection on the external surface of the polarizer. Polarizers are usually built with a film of polyvinyl alcohol sandwiched between two films of cellulose triacetate, and AFAIK PVA might be electrically conductive, but CTA is not, so there shouldn't be any way to ground that kind of film. And without grounding, there's no shielding, and static electricity builds up.


How is your screen regarding electric fields with the film off? Does it behave like I described, or differently? I unfortunately didn't check that extensively before putting a film on, so I need someone else to tell me.

Via the back-of-hand test: very strong when I first turn it on. This diminishes rapidly within the first minute to something much less, but still easily discernible. About 40 minutes on now and there's still, I think, a very slight effect, but much less still. Just barely discernible, maybe.


(I should note, conductive tape, glass to metal chassis, is removed on mine.)
 
OK, thanks!

Don't worry about the conductive tape, it's useful only when there is a conductive film on, to connect it to the chassis. ;)
 
So I picked up a GTX 1080, here's my early impressions of compatibility with the Sunix DPU 3000 (or should I say Synaptics VM2322).

First off: hook up your Sunix before you boot your PC. It seems hot plugging doesn't work well here. When I hot plugged the Sunix, I was getting tons of artifacts and would lose sync randomly for short and long periods. But I was able to restore the picture by rebooting.

With custom resolutions, I'm getting the same issue as on AMD in Windows, where I can't get very high pixel clocks from CRU because I can't override the max pixel clock in the monitor's EDID. But on Nvidia cards, thankfully, it won't restrict you as long as you use Nvidia's custom resolution tool. No limits there so far (with progressive scan, that is).

But one issue I've run into is that at very high resolutions, like 2304x1728@60hz, the Sunix can sometimes misfire on the horizontal blanking. Meaning the left side of the image is suddenly shifted right, and now the right side of the image is on the left side of the monitor. I'm usually able to fix this by alt-tabbing out of the game and coming back.

I had no luck enabling interlaced resolutions. If you add them in CRU, they don't show up. If you try to make them with Nvidia CP, it says "display is not compatible". Hopefully we can figure out some workaround here.
 
So I picked up a GTX 1080, here's my early impressions of compatibility with the Sunix DPU 3000 (or should I say Synaptics VM2322).

First off: hook up your Sunix before you boot your PC. It seems hot plugging doesn't work well here. When I hot plugged the Sunix, I was getting tons of artifacts and would lose sync randomly for short and long periods. But I was able to restore the picture by rebooting.

With custom resolutions, I'm getting the same issue as on AMD in Windows, where I can't get very high pixel clocks from CRU because I can't override the max pixel clock in the monitor's EDID. But on Nvidia cards, thankfully, it won't restrict you as long as you use Nvidia's custom resolution tool. No limits there so far (with progressive scan, that is).

But one issue I've run into is that at very high resolutions, like 2304x1728@60hz, the Sunix can sometimes misfire on the horizontal blanking. Meaning the left side of the image is suddenly shifted right, and now the right side of the image is on the left side of the monitor. I'm usually able to fix this by alt-tabbing out of the game and coming back.

I had no luck enabling interlaced resolutions. If you add them in CRU, they don't show up. If you try to make them with Nvidia CP, it says "display is not compatible". Hopefully we can figure out some workaround here.


Thanks for the update! Hopefully it will help out the NVidia users.
 
I had no luck enabling interlaced resolutions. If you add them in CRU, they don't show up. If you try to make them with Nvidia CP, it says "display is not compatible". Hopefully we can figure out some workaround here.
Hopefully we do not :droid:

Well, I'm a bit sceptical about that. A polarizer will reduce (linear) or get rid entirely (circular) of light reflected inside the screen, but there's nothing to prevent reflection on the external surface of the polarizer. Polarizers are usually built with a film of polyvinyl alcohol sandwiched between two films of cellulose triacetate, and AFAIK PVA might be electrically conductive, but CTA is not, so there shouldn't be any way to ground that kind of film. And without grounding, there's no shielding, and static electricity builds up.
On a normal CRT, sitting in front of the monitor in normal lighting conditions (ambient light somewhat illuminating my face), I could always see my face. With the polarizer it is almost not there.
As for the static charge build-up reduction, I am not sure what causes it. Maybe the adhesive is somewhat conductive? I just noticed it is somewhat less than on the bare screen, that is all. This charge is the least important factor in a monitor. Image quality is most important, and this is where the polarizer excels.
 
just picked up a sunix. probably time to update my video drivers as well, especially after x number of windows 10 updates. As far as I know the OS shouldn't care about the sunix adapter, or any adapter for that matter, since it's at the signal level after the video card, if i'm not mistaken.


EDIT: heard one guy got black and white interlaced 4k to work on a FW once.
 
As far as I know the OS shouldn't care about the sunix adapter, or any adapter for that matter, since it's at the signal level after the video card, if i'm not mistaken

The issue I've run into is that custom resolutions created in CRU for displayport are restricted somewhere near the max pixel clock read from the display EDID. This is on both AMD and Nvidia. Nvidia can override this via its own custom resolution program, but I haven't found a workaround for AMD.

The best solution I can think of at the moment is to run the Sunix to a VGA splitter or processor (like an Extron RGB) that can have a spoof EDID running into one of the outputs.
 
i thought someone mentioned an adapter in here that was kickstarted that could spoof EDID? what OS is this on? What is a processor? is the adapter still worth the money vs a cheapy? i just ordered a second sunix as i have CRTs on 2 newer PCs, should i send them back?
 
i thought someone mentioned an adapter in here that was kickstarted that could spoof EDID? what OS is this on? What is a processor? is the adapter still worth the money vs a cheapy? i just ordered a second sunix as i have CRTs on 2 newer PCs, should i send them back?

No, don't send them back, you probably won't even be interested in some of the super high resolutions I'm talking about. Like did you have your heart set on 2560x1920@70hz? No? Then you'll probably be ok.

Really, the issue is mostly that Windows and/or the drivers are setting limitations on CRU overrides, and the limitation seems to be based on the max pixel clock reported by the monitor when using DisplayPort. So open CRU, hit "Edit" next to your monitor's name to open the display properties, and you'll see the max pixel clock in the EDID. Changing it in CRU doesn't seem to have an effect.

That said, I think I remember that AMD lets you create custom resolutions in their app up to 100 MHz over the reported max. And Nvidia's app ignores it altogether, which means there really isn't a limit on Nvidia cards, though you miss out on some of the convenience of using CRU.

And like I said, I haven't reprogrammed my spoof adapter yet to increase the pixel clock (either via PowerStrip if you can find a working version, or edid-rw in Linux) because it's a little convoluted, but I think splicing in the EDID from the spoof adapter will remove all these limitations, since it's a piece of hardware that will always be read by the GPU via the Sunix.
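For reference, the splice itself is not much code once you have a plain 128-byte dump of the base EDID block (edid-rw can read one out on Linux). A rough sketch, with a made-up file name and a 450 MHz example target: it just bumps the max-pixel-clock byte in the range limits descriptor and fixes the checksum so the GPU doesn't reject the block.

```cpp
// Hypothetical helper: raise the "max pixel clock" field in a 128-byte EDID dump
// and recompute the checksum before flashing it into a spoof adapter's EEPROM.
// File names and the 450 MHz target are placeholders, not from any real monitor.
#include <cstdint>
#include <cstdio>
#include <vector>

int main() {
    FILE* in = std::fopen("edid.bin", "rb");           // assumed dump of the base block
    if (!in) { std::perror("edid.bin"); return 1; }
    std::vector<uint8_t> edid(128);
    if (std::fread(edid.data(), 1, 128, in) != 128) { std::fclose(in); return 1; }
    std::fclose(in);

    // The four 18-byte descriptors live at offsets 54, 72, 90 and 108. A display
    // range limits descriptor is tagged 0x00 0x00 0x00 0xFD, and its byte 9 holds
    // the maximum pixel clock in units of 10 MHz.
    const int offsets[] = {54, 72, 90, 108};
    for (int off : offsets) {
        if (edid[off] == 0x00 && edid[off + 1] == 0x00 && edid[off + 3] == 0xFD) {
            std::printf("old max pixel clock: %d MHz\n", edid[off + 9] * 10);
            edid[off + 9] = 45;                         // 45 * 10 MHz = 450 MHz (example)
        }
    }

    // All 128 bytes of the block must sum to 0 mod 256.
    uint8_t sum = 0;
    for (int i = 0; i < 127; ++i) sum += edid[i];
    edid[127] = static_cast<uint8_t>(256 - sum);

    FILE* out = std::fopen("edid_patched.bin", "wb");
    if (!out) { std::perror("edid_patched.bin"); return 1; }
    std::fwrite(edid.data(), 1, 128, out);
    std::fclose(out);
    return 0;
}
```

Whether the driver actually honors the higher value over DisplayPort is a separate question, but that's really all an EDID-level override amounts to.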
 
oh heck i was hoping for 2304x1440 above 70/75 hz, which is either EDID or RAMDAC limited. i was hoping that since it cost a little more it didn't rob too many frames or add much latency.

i'm also slowly moving over to *buntu based distributions; nvidia tends to just work and from what i hear AMD is catching up in that regard.

guess a spoof adapter doesn't really have any effect on latency or otherwise. is that a home made / home brew type of device? that could be handy for more than just pushing CRTs.


EDIT: until windows 10 updated i think i got 70 or 75 hz, i think the fw900 can do 80 maybe even 85 @ 2304x1440p if my memory isn't too bad.
 
I have a picture a few pages back of how I have the spoof adapter set up. I made it a little more complex than it needs to be, but it should get the job done once I have the proper EDID programmed in the spoof plug.

A simpler way would be to just make a VGA coupler with two wires going to the ID chip in the spoof plug instead of passing through to the ID chip in the display.

But just to be clear, I'm unsure if the FW900 even needs help in this area. It may have a high enough pixel clock listed in its EDID. My LaCie monitor has a 430 MHz pixel clock listed, so it only has crazy high resolutions banned when I plug in via the Sunix.
 
i thought someone mentioned an adapter in here that was kickstarted that could spoof EDID? what OS is this on? What is a processor? is the adapter still worth the money vs a cheapy? i just ordered a second sunix as i have CRTs on 2 newer PCs, should i send them back?
EDID data is read by the GPU over the I2C bus, just like you would interface with a standard I2C EEPROM. In fact most monitors simply have an I2C EEPROM IC inside. To make an adapter that lets you have any arbitrary EDID data, all you need is an I2C EEPROM IC and really nothing else.

I made such a 'spoof adapter' in a day. I used an analog DVI-VGA adapter that I treated with a knife and soldering iron. To program the EEPROM I used an Arduino, as it was the most readily available device with an I2C interface. A Raspberry Pi can also be used, or some USB-to-I2C converter.
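In case anyone wants to replicate it, the programming side really is only a handful of lines. A rough Arduino sketch, assuming a small 24C02-type EEPROM on the I2C pins at the usual DDC address 0x50 with its write-protect pin tied low; the EDID[] array is a placeholder you would fill with your own 128 bytes:

```cpp
// Minimal Arduino sketch to program a 24C02-style I2C EEPROM with an EDID block.
// Assumes the EEPROM answers at the standard DDC address 0x50 and uses
// single-byte addressing; EDID[] below only shows the fixed 8-byte header.
#include <Wire.h>

const uint8_t EEPROM_ADDR = 0x50;     // same address the GPU polls for DDC/EDID
uint8_t EDID[128] = { 0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00 /* ...rest of the block... */ };

void setup() {
  Wire.begin();
  Serial.begin(9600);
  for (int i = 0; i < 128; i++) {
    Wire.beginTransmission(EEPROM_ADDR);
    Wire.write((uint8_t)i);           // memory address inside the EEPROM
    Wire.write(EDID[i]);              // one data byte
    Wire.endTransmission();
    delay(10);                        // let the EEPROM finish its internal write cycle
  }
  Serial.println("done - read the plug back with an EDID tool to verify");
}

void loop() {}
```

After that the GPU reads the plug over DDC exactly as if it were the monitor, which is the whole point.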

In my plans there was a 5xBNC cable with the EDID hidden inside (those chips can be pretty small). I ended up using a VGA cable and the 'hacked' passive DVI-VGA adapter, as it was just 'good enough'.

I used this EDID emulator to correct the CIE color information for the FW900 when connected to a Radeon card, to be able to use the 'hidden' sRGB gamut emulation functionality. Just like the max clock, this color information is one of those things that seem impossible to change in Windows. I added my custom resolutions, such as 1920x1200@95.904Hz, while at it.

The GDM-FW900 supports 2304x1440@80Hz natively, so I guess it should work out of the box if it was something to do with the pixel clock, as it should already have the required 400MHz limit set. Though how it is set I do not know, because I have never checked it.

4K is impossible, as there is no way to draw 2160 lines 60 times a second. With interlacing (which sucks and is increasingly unsupported by GPUs) it should be possible, though this would not be true 4K UHD but a flickering interlaced mess. I generally fail to see the purpose of any of these high resolutions. Refresh rate is more important, and the limiting factor here is the monitor's horizontal sync limit of 131kHz.
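For anyone who wants to play with the numbers themselves, the arithmetic is simple enough to sketch out. The blanking fractions below are round guesses (real GTF/CVT blanking differs), but it shows why 2160p60 can't fit under a 131kHz horizontal limit while 2304x1440@80 does:

```cpp
// Rough CRT mode-feasibility check: does a candidate mode fit under the
// monitor's horizontal scan limit and the DAC's pixel clock limit?
// Blanking fractions are rough assumptions, not exact GTF/CVT timings.
#include <cstdio>

void check_mode(int h_active, int v_active, double refresh_hz, bool interlaced) {
    const double h_blank = 1.25;          // assume ~25% horizontal blanking
    const double v_blank = 1.04;          // assume ~4% vertical blanking
    const double max_hsync_khz = 131.0;   // FW900 horizontal sync limit
    const double max_clock_mhz = 400.0;   // typical analog DAC limit

    double v_total = v_active * v_blank;
    if (interlaced) v_total /= 2.0;       // each field scans half the lines
    double field_rate = interlaced ? refresh_hz * 2.0 : refresh_hz;

    double hsync_khz = v_total * field_rate / 1000.0;
    double clock_mhz = h_active * h_blank * hsync_khz / 1000.0;

    std::printf("%dx%d@%.1f%s -> hsync %.1f kHz (%s), pixel clock %.0f MHz (%s)\n",
                h_active, v_active, refresh_hz, interlaced ? "i" : "p",
                hsync_khz, hsync_khz <= max_hsync_khz ? "ok" : "too high",
                clock_mhz, clock_mhz <= max_clock_mhz ? "ok" : "needs a faster DAC");
}

int main() {
    check_mode(2304, 1440, 80.0, false);  // FW900's native top mode: fits
    check_mode(3840, 2160, 60.0, false);  // 4K progressive: over 131 kHz, impossible
    check_mode(3840, 2160, 48.0, true);   // 4K interlaced, 96 fields/s: scan rate ok,
                                          // but needs well over 400 MHz of pixel clock
}
```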
 
I generally fail to see purpose of any of these high resolutions.

Well, on Battlefield 1 my overclocked CPU can only maintain 60fps without periodic drops. That leaves my GTX 1080 with too much headroom at lower resolutions. So in this instance I crank the resolution as much as my monitor can handle, and it does give a noticeable increase in quality: a little sharper and a lot less aliased. It's sort of an analog supersampling.
 
EDID data is read by the GPU over the I2C bus, just like you would interface with a standard I2C EEPROM. In fact most monitors simply have an I2C EEPROM IC inside. To make an adapter that lets you have any arbitrary EDID data, all you need is an I2C EEPROM IC and really nothing else.

I made such a 'spoof adapter' in a day. I used an analog DVI-VGA adapter that I treated with a knife and soldering iron. To program the EEPROM I used an Arduino, as it was the most readily available device with an I2C interface. A Raspberry Pi can also be used, or some USB-to-I2C converter.

In my plans there was a 5xBNC cable with the EDID hidden inside (those chips can be pretty small). I ended up using a VGA cable and the 'hacked' passive DVI-VGA adapter, as it was just 'good enough'.

I used this EDID emulator to correct the CIE color information for the FW900 when connected to a Radeon card, to be able to use the 'hidden' sRGB gamut emulation functionality. Just like the max clock, this color information is one of those things that seem impossible to change in Windows. I added my custom resolutions, such as 1920x1200@95.904Hz, while at it.

The GDM-FW900 supports 2304x1440@80Hz natively, so I guess it should work out of the box if it was something to do with the pixel clock, as it should already have the required 400MHz limit set. Though how it is set I do not know, because I have never checked it.

4K is impossible, as there is no way to draw 2160 lines 60 times a second. With interlacing (which sucks and is increasingly unsupported by GPUs) it should be possible, though this would not be true 4K UHD but a flickering interlaced mess. I generally fail to see the purpose of any of these high resolutions. Refresh rate is more important, and the limiting factor here is the monitor's horizontal sync limit of 131kHz.


60 times a second!? OH HELL NO, the guy said it was like 24 or 30 hz or some shit. then again my brain is getting more and more fried these days but, hey some people claim bs. ANYHOW, am i wrong in assuming that IF the OS and drivers played ball the EDID emulator wouldn't really be needed?

Yeah, high res tends to make a difference; there is even a thing to render at 4k and downsample to your monitor's resolution, and i remember the screenshots looking pretty good at the time of the article / demo where i read about it.

EDIT: maybe large photo editing on super hi res pics?
 
EDIT: maybe large photo editing on super hi res pics?

The issue is that you won't be seeing the full detail anyway due to the finite dot-pitch on the tube. So no good for critical work, but for games, you can see the results for yourself.

And you can do 4k interlaced at 96hz. Windows will report it as "48hz interlaced", but your monitor is in fact refreshing 96 times. Unfortunately, since Nvidia and AMD are buggy with interlaced resolutions over displayport under Windows 10, I haven't been able to try it out to see how it actually looks. Should work in Ubuntu though. It also won't work over an older Nvidia with an analog connection, since that caps out at a 400 MHz pixel clock on VGA.
 
i heard the pixel clock on these sunix can do over 400mhz RAMDAC? not that i'd ever run 4k but, would be cool to see what's available @ 2304x1440. IDK if you all have any opinions on running over 2304 x 1440?
 
i heard the pixel clock on these sunix can do over 400mhz RAMDAC? not that i'd ever run 4k but, would be cool to see what's available @ 2304x1440. IDK if you all have any opinions on running over 2304 x 1440?

From my limited testing in Ubuntu, yeah, they seem to be able to run resolutions and refresh rates that require higher than 400 MHz. I only tested those modes temporarily, like for 10 mins, so I don't know how stable it is.
 
60 times a second!? OH HELL NO, the guy said it was like 24 or 30 hz or some shit. then again my brain is getting more and more fried these days but, hey some people claim bs.
No need to go that low, as you can do 2160 lines at 54Hz on the FW900, or 55Hz with some tinkering with the blanking.
Obviously watching Blu-ray movies at 47.952Hz should be possible.
Since you do not need that many lines, for watching movies you could aim at 3840x1645 @ 71.928Hz for the ultimate FW900 cinema experience... as long as you have a device which can output a >600MHz pixel rate.

I tested how 2880x2160 @ 50Hz looks on a Dell P1110 and of course everything is blurry, because it is way past the dot pitch size.
The FW900 @ 3840x2160 would be even more blurry, obviously...

Yeah, high res tends to make a difference; there is even a thing to render at 4k and downsample to your monitor's resolution, and i remember the screenshots looking pretty good at the time of the article / demo where i read about it.
It is called DSR and you can enable it in the Nvidia Control Panel. AMD cards probably have a similar feature but I am not sure.
While using DSR it is useful to also enable FXAA for the best AA effect.

EDIT: maybe large photo editing on super hi res pics?
IPS LCDs are far superior for this purpose:
- better color accuracy (gamut, gamma)
- no halos around every damn bright object
- screens looking pretty much like printed photos
- bigger size
- higher resolution
- resolve each pixel perfectly

The only use for a CRT these days is playing games, especially older games and emulators, or hooking it up to something like the Open Source Scan Converter to use instead of a 15kHz CRT for older consoles and computers.

IDK if you all have an opinions on running over 2304 x 1440?
Games play better with a higher refresh rate than with a higher resolution.
Imho the highest resolution that makes any sense on the FW900 is 1920x1200@96Hz, and it is already past this monitor's ability to resolve perfectly. It works fine on the Delock 62967, which seems to be the best DP-VGA converter available today.
 
From my limited testing in Ubuntu, yeah, they seem to be able to run resolutions and refresh rates that require higher than 400 MHz. I only tested those modes temporarily, like for 10 mins, so I don't know how stable it is.
Unfortunately Linux is not the best OS to play games...
 
Unfortunately Linux is not the best OS to play games...

Yea, I know. It's one of my biggest reasons for not moving to Linux permanently. On the flip side, I think console emulation in Linux is pretty good, lol.
 
IPS LCDs are far superior for this purpose:
- better color accuracy (gamut, gamma)
- no halos around every damn bright object
- screens looking pretty much like printed photos
- bigger size
- higher resolution
- resolve each pixel perfectly
Piss poor contrast and smeared milk over the screen, aka IPS panel glow, ruin any advantages an IPS LCD might have.
 
i heard the pixel clock on these sunix can do over 400mhz RAMDAC?

It can, but we haven't done enough testing to see exactly how stable it is. I'll give an update over the next couple weeks since I'll be playing many games in excess of 400 MHz thanks to the GTX 1080.

Not that i'd ever run 4k but, would be cool to see what's available @ 2304x1440. IDK if you all have any opinions on running over 2304 x 1440?

In games, at the same refresh rate, 2304x1440 will look better than 1920x1200. Mainly because of less aliasing, and it will look slightly sharper (very slightly since you're already over the max horizontal resolution).

BUT you can run a higher refresh rate at 1200p, so that's the best way to go if your CPU and GPU can handle it. But in situations where you're CPU limited below 90fps, and your GPU has headroom, then you should try 1440p.

That's where I'm at in BF1 right now. My i5 is holding the game back, but my GTX 1080 isn't, so I'm just running 2560x1920@60hz to keep the 1080 close to 100% utilization (and the image incredibly sharp and vivid). Once I upgrade my CPU though, I'm going to drop that resolution back and go higher with the refresh rate, maybe 1920x1440@90hz if the i7 can handle it (1920x1440 is probably close to the max fully resolvable resolution on my LaCie)

What is most important when you have a CRT is to match your frame rate to your refresh rate, via either vsync or a frame cap. Adaptive vsync on Nvidia is the best option, since it stays vsynced until your GPU can't keep up, then turns off and allows tearing.
 
I'll give an update over the next couple weeks since I'll be playing many games in excess of 400 MHz thanks to the GTX 1080

man, can you please test 1920x1200 @ 90hz on your GTX 1080 + FW900? ;)


the highest resolution that makes any sense on the FW900 is 1920x1200@96Hz, and it is already past this monitor's ability to resolve perfectly. It works fine on the Delock 62967, which seems to be the best DP-VGA converter available today.

excuse me, but are you sure about that? have you really tested a Delock adapter on the FW900 yourself? maybe i missed something (hopefully), but from what i remember from past reading about users testing the Delock adapter, its current revisions are still flawed with stability issues and poor quality internal materials (wiring, cables, pins, etc.)
 
Piss poor contrast and smeared milk over the screen, aka IPS panel glow, ruin any advantages an IPS LCD might have.
LCDs have a higher ANSI contrast ratio, and good IPS panels do not have any 'IPS glow'.

Every time I see a CRT turned on I have two impressions: one is that the contrast ratio is very high, giving OLED-like impressions, but the second is the opposite, and I cannot help noticing that blacks are actually pretty washed out; most of the time I get irritated by the black level and lack of contrast. After all, CRTs have something like a 100-150:1 ANSI contrast ratio, so nothing more can be expected... obviously a monitor with a native 1000:1 ANSI contrast ratio will be better in a lot of images.

I did a direct comparison of my LCD (HP DreamColor LP2480zx) to CRTs (an FW900 with polarizer and a Dell P1110, both much better than a stock FW900 when it comes to contrast ratio) and it is like comparing hardware from a different decade... XD
On the LCD side the image is toned down but never washed out. Colors are always right on the spot. Dark details never hide behind bright halos around bright objects. Also, with properly set ambient light the LCD has perceptually better blacks than the CRT in any lighting conditions. Black is not breathtaking, but it is also never noticeable. Black things are just black. Black scenes are how you see in the dark.
The CRT can show some good contrast ratio, but black quickly gets washed out in brighter images. Black basically does not exist near bright objects, and these halos seem to mask dark details when the black level is set very low; to correct that it has to be raised. Black is actually a big problem on a CRT... it is inconsistent.

You use cheap panels on cheap monitors and compare them to the pinnacle of CRT technology, and wonder why images look better on the latter... go figure :astronaut:
 
You use cheap panels on cheap monitors and compare them to the pinnacle of CRT technology, and wonder why images look better on the latter... go figure :astronaut:
Price has nothing to do with it and I've used all kinds of IPS panels, from top to bottom price points. It's a technological thing. Yes, halos and degraded blacks around bright objects do exist on CRTs, but in real life use I've found IPS not just bad, but unusable. If there is any panel that can compete with static content, it's VA, but then it gets destroyed in motion. CRTs are far from perfect, the FW900 especially so, but they still blow away any LCD when it comes to motion clarity and perceived contrast in most scenarios. A CRT will never out-halo the IPS glow, and whatever it displays will be more consistent across the screen.
 
Price has nothing to do with it and I've used all kinds of IPS panels, from top to bottom price points. It's a technological thing. Yes, halos and degraded blacks around bright objects do exist on CRTs, but in real life use I've found IPS not just bad, but unusable. If there is any panel that can compete with static content, it's VA, but then it gets destroyed in motion. CRTs are far from perfect, the FW900 especially so, but they still blow away any LCD when it comes to motion clarity and perceived contrast in most scenarios. A CRT will never out-halo the IPS glow, and whatever it displays will be more consistent across the screen.
On a CRT, black objects on white backgrounds completely lose all detail, and black bars when watching bright scenes are far more illuminated than the IPS glow that is visible in the corners of large IPS panels.

CRTs are good for games, videos, etc., but for photo editing none of their strong points is of any use. Just display a dark photo next to something white like a toolbar and you get a large halo around it... What are we even talking about here? A CRT is terrible compared to even a cheapo IPS for any image editing task, especially photo editing for print, where no CRT has the gamut volume to display all the colors. Also, no one uses CRTs anymore, so seeing how images designed for the web look on them is pretty useless.

CRT motion is awesome when the video frame rate exactly matches the monitor refresh rate; otherwise it isn't that great. Actually I dislike the motion performance of CRTs on 24/1.001, 25 and 30/1.001fps videos compared to the monitor I use daily for that. The excellent motion performance of CRTs is imho only useful for games, and that is the only use I can find for them. No matter how good some aspects of CRT contrast ratio are, for e.g. watching movies you are better off with an OLED or even a cheap used PDP... and those are easily 50" and not 22"...

But if you prefer the presentation of VA panels over IPS, then I guess further discussion about image quality is completely pointless :hungover:

my IPS lights up the whole dang room on an all black screen.
Are you one of these people who use monitors at 100% brightness in pitch-black rooms? :confused:
You should use monitors at 100 cd/m2 peak white, not at 100% brightness... just sayin'...
 
On a CRT, black objects on white backgrounds completely lose all detail, and black bars when watching bright scenes are far more illuminated than the IPS glow that is visible in the corners of large IPS panels.
I once tried to judge video processing quality by looking at movie screenshots from darker scenes on an IPS panel. I laughed and then cried, and never tried such a foolish thing again.

CRTs are good for games, videos, etc., but for photo editing none of their strong points is of any use. Just display a dark photo next to something white like a toolbar and you get a large halo around it... What are we even talking about here? A CRT is terrible compared to even a cheapo IPS for any image editing task, especially photo editing for print, where no CRT has the gamut volume to display all the colors. Also, no one uses CRTs anymore, so seeing how images designed for the web look on them is pretty useless.
At worst, they're as useless as IPS. You can sometimes get around creating halos; IPS glow is constant and ruins both the dark and brighter parts. For pure consumption, as in viewing fullscreen content of various brightness and DR, it's not even close.

CRT motion is awesome when the video frame rate exactly matches the monitor refresh rate; otherwise it isn't that great. Actually I dislike the motion performance of CRTs on 24/1.001, 25 and 30/1.001fps videos compared to the monitor I use daily for that.
You can perfectly sync the refresh to any video format. 60 Hz LCDs are a non-starter here. Higher refresh ones fare better, but are still held back by their sample-and-hold panels.

The excellent motion performance of CRTs is imho only useful for games, and that is the only use I can find for them. No matter how good some aspects of CRT contrast ratio are, for e.g. watching movies you are better off with an OLED or even a cheap used PDP... and those are easily 50" and not 22"...
There aren't useful OLED monitors on the market, and they are still inferior in motion resolution. Overall they are finally better than CRTs, though.

But if you prefer the presentation of VA panels over IPS, then I guess further discussion about image quality is completely pointless :hungover:
Yeah, I like my blacks black and not grey, and the areas around the center not milky. Not really a shocking preference.


Are you one of these people who use monitors at 100% brightness in pitch-black rooms? :confused:
You should use monitors at 100 cd/m2 peak white, not at 100% brightness... just sayin'...
Whatever an IPS outputs at 0 brightness is enough for a blank screen to glow disappointingly grey.
 
so what's the max refresh rate for 2304x1440? is it a limitation of the fw or the sunix/adapter/RAMDAC?
You can overclock it even on a 400MHz DAC, but not by much, as CRTs require A LOT of blanking; reducing it too much will lead first to geometry issues on the sides, and if you overdo it you will be unable to display the whole desktop area. Some reduction from the standards is possible and worth pursuing.
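To put rough numbers on the 2304x1440 question (the blanking percentages below are made-up round figures, not Sony's factory timings), you can solve for the refresh rate from both limits and take the lower one:

```cpp
// What actually caps the refresh rate at 2304x1440: the 400 MHz DAC clock or
// the FW900's 131 kHz horizontal limit? Blanking percentages are rough guesses.
#include <algorithm>
#include <cstdio>

int main() {
    const double max_clock_hz = 400e6, max_hsync_hz = 131e3;
    const int h_active = 2304, v_active = 1440;
    const double blanking[][2] = {
        {1.30, 1.05},   // generous, GTF-like blanking
        {1.20, 1.03},   // trimmed blanking (risking geometry issues at the edges)
    };
    for (const auto& b : blanking) {
        double h_total = h_active * b[0], v_total = v_active * b[1];
        double by_clock = max_clock_hz / (h_total * v_total);  // DAC-limited refresh
        double by_hsync = max_hsync_hz / v_total;              // scan-rate-limited refresh
        std::printf("%.0f%%/%.0f%% blanking -> %.1f Hz (clock limit) vs %.1f Hz (hsync limit), usable ~%.1f Hz\n",
                    (b[0] - 1.0) * 100, (b[1] - 1.0) * 100,
                    by_clock, by_hsync, std::min(by_clock, by_hsync));
    }
}
```

At least with these rough figures, the 131kHz scan limit rather than the DAC ends up being the wall at this resolution, so mid-80s Hz is about all trimming the blanking will buy you.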

The simplest solution for a higher refresh rate is of course using a lower resolution...

Technically the FW900 tube would most probably have no issues displaying 4K at an even higher refresh rate than 60Hz. With improvements to the technology it could also have been possible to make a CRT with a small enough dot pitch to display it with adequate sharpness. Unfortunately everyone abandoned the sinking ship; the market wanted slim displays with small bezels.

Meeho
Did you forget we were discussing 'photo editing' and not general image quality?

VA panels have tons of gamma shift and cannot really be calibrated because of that: you calibrate a single point for each eye and everything else has whatever colors it wants. The worst image-critical monitor you can imagine, especially since the viewing angle quality of this type of panel has declined drastically since S-PVA times. I cannot stand using them, as I see compression artifacts everywhere. I have trauma from my VA times. Bought them for the contrast ratio and it was a life choice.

For all my displays I use madVR with automatic resolution/refresh-rate switching, so stuttering or losing sync is never an issue. CRT is very good but actually slightly worse in perceived sharpness of low refresh rate videos than some of my other displays. Likewise, 30fps console games look better on some other displays than on a CRT.

The CRT is this peculiarity from the past which surpasses most modern products in some areas, and every other tech in others, but it also sucks at some things, and in contrast ratio it is beaten by both plasma and OLED. It is certainly not a better choice for editing images than dedicated professional monitors with IPS panels, or for watching movies than a plasma or OLED.

Hopefully 120Hz OLEDs with no input lag and VRR will bring us closer to perfection. Until then the best display is... using a few displays for different things.
 
Meeho
Did you forget we were discussing 'photo editing' and not general image quality?

VA panels have tons of gamma shift and cannot really be calibrated because of that: you calibrate a single point for each eye and everything else has whatever colors it wants. The worst image-critical monitor you can imagine, especially since the viewing angle quality of this type of panel has declined drastically since S-PVA times. I cannot stand using them, as I see compression artifacts everywhere. I have trauma from my VA times. Bought them for the contrast ratio and it was a life choice.
I understand and agree with those VA limitations, I just find them a lesser evil than IPSes. Even for photo editing, it mostly matters when color is important. When it comes to dynamic range and detail examination, IPS is much worse. Even with color you're dealing with a different picture in the center compared to the corners. I just can't stand the IPS glow on larger screens. It ruins almost every usage scenario, while VA ruins some.

Where overall picture quality is concerned, I would score IPS 1/10, VA 2/10, and CRT 7/10.
 
OLED for photo editing; who gives a crap about motion resolution or latency. we need to hit up Wendell and have him do a NEW latency shootout with OLED!



and if we're gonna talk about making NEW CRTs i'm gonna quote myself from another thread here.

When you look at 3d printing and custom PCBs and open source software and the crazy things folks are making these days, i'm kind of wondering, if we give it enough time, whether some young guns with the help of a wise old man won't have some sort of crowdfunding going on one day?

https://hardforum.com/threads/highest-contrast-displays-currently.1963740/#post-1043754805


so what is the general refresh limit at 2304x1440? 80hz? 85hz tops?
 
With comments like this, how the fuck can any reasonable discussion be had?
We are talking about these panels here
[attached image: ilmtkvb7ecfrvibs1oeeuop.jpg]


Source picture
[attached image: stripes0lkay.png]


On every IPS panel on which I displayed this image, it looks how it was intended.
IPS glow looks as if there were some light in front of the monitor and it were just a reflection. A minor annoyance when working with it, and after years of using CRTs I am well used to far worse reflections, glare and ambient light totally ruining blacks.

VA on the other hand makes the level of dark details rise quickly when viewed off-angle, and it turns dark details which should gently fade to black and be barely seen into being far too visible. In videos, compression artifacts are clearly visible. Places where there are black patches on screen always end up being seen as 'there is black, there is something drawn'. Of course this happens at the sub-pixel level, so colors are also affected.

On IPS I have no idea what is still part of the displayed image and what is black. This is how a display should display an image! It provides better immersion than VA despite the worse contrast ratio. The actual color is always preserved. There is some degradation of colors with subpixels set to 0 on the sides, but it is to a far lesser degree than on VA and looks like a reflection. A VA panel simply changes colors just like TN panels do (and TN does it even more horizontally than VA).

The contrast ratio on IPS is good enough to provide an adequate preview of how printed photos will look even under the best print technology and lighting conditions possible. Neither CRT nor VA can even show how printed material will look, as they have quirks that are far more degrading than some mostly reflection-like IPS glow.
 
and if we're gonna talk about making NEW CRTs i'm gonna quote myself from another thread here.
3d printing could help you design and make new cases, nothing more.
Making a new custom CRT tube... FORGET IT.
There are probably no factories left in operation making these, and it is quite an expensive, complicated and labor-intensive process...

I would like to see custom boards to drive CRT tubes, with support for a wider range of frequencies, like 15kHz - 180kHz, and with high-bandwidth digital inputs.

But no one even cared enough to make a high-speed HDMI/DP converter. All you needed was to slap together an HDMI/DP input IC and a VGA DAC IC. These ICs have been available for a long time, and the complexity level is such that it required only some effort and not extraordinary skills. Yet no one cared to make these... :dead:
 
Well, I agree with what you said about 3d printers; my intent was to explain that as manufacturing technologies continue to advance and even come to home hobbyists, more opportunities open up.

If you watch AVE on YouTube and see the videos with cool experiments, you know what I mean. Especially if you consider what tools were used in the How It's Made videos for CRTs. Far, far more advanced than the comeback of tubes and radio tubes, indeed for sure!

but, it makes one wonder, can a glass mould for a crt be machined out on a 5 axis cnc etc etc
 