24" Widescreen CRT (FW900) From Ebay arrived,Comments.

I'm not sure what you are trying to say with that, but the simple fact is that OLEDs currently work as sample-and-hold devices and are poor at displaying motion (compared to CRT/plasma). They also have less accurate colors. This is based on TVs, though; I'm not sure whether they've managed to make monitors impulse-driven or implement a form of BFI.

the fact is that it's not technically difficult to strobe OLED displays to ~1ms persistence, and the lack of such monitors on the market right now in no way implies the converse
 
Oh, I'm intrigued. It's like the mini LaserVue I always wanted :D

Can you use it with a computer?

You can use the older model Showwx and Showwx+ with a VGA port, but it looks like the newer Pico-P projectors may or may not have external inputs. If you read further in the thread, Light23 is working on a monitor-sized rear projection style display with Pico-P style scanning laser tech, so something to watch. He is getting ready to have his custom speckle-free laser projection screens available for purchase pretty soon, though.
 
How can there possibly still be active support for CRT in 2015

what do you mean by "active support"? So long as video cards still have a built-in RAMDAC with support for analog out, and the general scanning paradigm for video signals remains, CRTs will work fine.
 
How can there possibly still be active support for CRT in 2015

Because despite being in the minority, there are (thankfully) still people who appreciate the most important aspects of image quality that LCD has so nonchalantly dumped in favor of thinness, light weight and size.

You know, there's a comment I read some 4 or 5 years ago on a YouTube video which IMO perfectly sums up the monitor situation, it's just as valid now as it was then, here it is:

"Yes, IPS is ridiculous... ridiculously overrated and overpriced. It's not much better than TN considering the viewing angles are still shit. Not in the same way, but it's still shit. A different flavor of shit. It's like, do you want horse shit or bird shit - which do you prefer? IPS's (and VA's) are slower than TN's, too. LCD, itself, is an inherently broken technology. Can't be fixed. For something broken and shitty you better pay as little as possible. "

That's exactly how I feel. Why would I pay $600 and STILL get a monitor with glowing corners, bad black levels, and only marginally better colors and viewing angles that still don't match up to a CRT? I'd rather pick up a used TN for $80 from the ads and be happy I didn't blow a hole in my wallet on something that's still not going to be up to par with my 2003 Trinitron tube.

You can use the older model Showwx and Showwx+ with a VGA port, but it looks like the newer Pico-P projectors may or may not have external inputs. If you read further in the thread, Light23 is working on a monitor-sized rear projection style display with Pico-P style scanning laser tech, so something to watch. He is getting ready to have his custom speckle-free laser projection screens available for purchase pretty soon, though.

Really cool. I wonder if it's a huge requirement to use these in a dark room. If you got one of those really black, matte projecting surfaces, wouldn't that work just as well? I mean, as long as the surface that the image lands on is as black as possible the contrast should be great, even in daylight.
 
the fact is that it's not technically difficult to strobe OLED displays to ~1ms persistence, and the lack of such monitors on the market right now in no way implies the converse

Hence, "not there yet". I don't think it is trivial, though, as they would have to compensate with much higher light output. I hope we will finally get a decent technology with OLED to replace the CRT (SED was my hope, ah, still can't get over that one), I really hate LCD with a passion and cry a little whenever I sit in front it.
 
what do you mean by "active support"? So long as video cards still have a built in RAMDAC with support for analog out, and the general scanning paradigm for video signals remains, CRTs will work fine.

Even without an integrated RAMDAC, external active adapters can be used.
 
Because despite being in the minority, there are (thankfully) still people who appreciate the most important aspects of image quality that LCD has so nonchalantly dumped in favor of thinness, light weight and size.

I guess I just don't get it. IQ on all of my IPS devices ranging from smartphone to tablet to laptop/desktop panels seems way more than adequate. I read diagnostic medical images all day and we use LCD monitors (very high end). No way I'd ever consider going back to an ugly, heavy, power hungry, comparatively low-res, massive monitor with huge bezels and small usable image space.

Props to you guys though for being able to appreciate the differences in IQ. I'll remain blissfully ignorant in Plato's cave.
 
what do you mean by "active support"? So long as video cards still have a built in RAMDAC with support for analog out, and the general scanning paradigm for video signals remains, CRTs will work fine.

I meant support from people who want to use them in 2015.
 
I guess I just don't get it. IQ on all of my IPS devices ranging from smartphone to tablet to laptop/desktop panels seems way more than adequate. I read diagnostic medical images all day and we use LCD monitors (very high end). No way I'd ever consider going back to an ugly, heavy, power hungry, comparatively low-res, massive monitor with huge bezels and small usable image space.

Props to you guys though for being able to appreciate the differences in IQ. I'll remain blissfully ignorant in plato's cave

The difference is hardly subtle. There is a massive and immediately evident difference in contrast, black level, motion, viewing angles, color reproduction and multiple resolution support, none of which are important in your use case where size and power take preference.
 
I guess I just don't get it. IQ on all of my IPS devices ranging from smartphone to tablet to laptop/desktop panels seems way more than adequate. I read diagnostic medical images all day and we use LCD monitors (very high end). No way I'd ever consider going back to an ugly, heavy, power hungry, comparatively low-res, massive monitor with huge bezels and small usable image space.

Props to you guys though for being able to appreciate the differences in IQ. I'll remain blissfully ignorant in plato's cave

Yea, for medical imaging high-end LCDs would be superior, since deep blacks and fast response times are not at all required. The important thing in medical imaging is a clear distinction between pixels and gray levels. The raster addressability ratio of a regular CRT would also be too high, especially at high resolutions (the spot size is slightly larger than the addressable pixel size, which is fine for photos and videos but not for diagnostic imaging).

But a well calibrated studio grade CRT has to be seen in person to be appreciated.
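To put rough numbers on the addressability point, here's a toy estimate of the raster addressability ratio; the spot size and tube width below are made-up but plausible figures, not measurements of any particular monitor:

```python
# Toy estimate of the raster addressability ratio (RAR): beam spot size
# divided by the addressable pixel pitch. Near 1.0 looks sharp; well above
# 1.0 means adjacent pixels smear together. All numbers are illustrative.
def rar(spot_size_mm, screen_width_mm, h_pixels):
    pixel_pitch_mm = screen_width_mm / h_pixels
    return spot_size_mm / pixel_pitch_mm

# ~0.25 mm spot on a tube with ~470 mm of visible width (rough guesses)
print(round(rar(0.25, 470, 1920), 2))  # 1920-wide: spot exceeds pixel pitch
print(round(rar(0.25, 470, 1280), 2))  # 1280-wide: spot fits within a pixel
```

So at high resolutions the beam spot covers more than one addressable pixel, which is exactly why fine gray-level distinctions get softened.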
 
I guess I just don't get it. IQ on all of my IPS devices ranging from smartphone to tablet to laptop/desktop panels seems way more than adequate. I read diagnostic medical images all day and we use LCD monitors (very high end). No way I'd ever consider going back to an ugly, heavy, power hungry, comparatively low-res, massive monitor with huge bezels and small usable image space.

Props to you guys though for being able to appreciate the differences in IQ. I'll remain blissfully ignorant in plato's cave

Nobody would use a CRT for that...
We use CRTs for gaming, videos and things that require accurate color reproduction. And in those things CRTs are superior to LCD.
 
I guess I just don't get it. IQ on all of my IPS devices ranging from smartphone to tablet to laptop/desktop panels seems way more than adequate. I read diagnostic medical images all day and we use LCD monitors (very high end). No way I'd ever consider going back to an ugly, heavy, power hungry, comparatively low-res, massive monitor with huge bezels and small usable image space.

Props to you guys though for being able to appreciate the differences in IQ. I'll remain blissfully ignorant in plato's cave

don't be so patronizing. You have no clue what you're talking about.

I agree that for work conditions LCDs are fine, but for gaming CRT is still best. Period. Try playing Counter-Strike on an LCD at 800x600@160hz. FAIL. A CRT can do that. That's just one example.

You are thinking of low res 14 inch 60hz CRT right?

This thread is about the pinnacle of CRT tech, the Sony GDM-FW900: a 24 inch widescreen CRT with a native resolution of 1920x1200@85hz that can be boosted upwards even to 2340x1400@80hz, or downgraded to 800x600@160hz or so for an edge in online shooters.

Do your homework in Plato's cave before you frown on your parents for still listening to Elvis. Parents are usually right, in the end. (sorry for the sarcasm, i can be patronizing too)
 
New tech does not equal superior tech.

Take a look at the slew of million+ DPI mice with three dozen weight customizations and 12 different glowing patterns that have come out in the past decade.

Yet progamers, if given the choice, would stick with the classic MS Intellimice with raw 1:1 tracking. None of the new mice of the past decade offers the level of precision and latency of a 10 dollar WMO (made for office users) from 2001.

Same thing with all the garbage Creative Labs have put out in the last 15 years. They buried A3D, a technology so superior that it is still unsurpassed to this very day.

Even in construction, many of the apartment high rises of today are put together like Lego blocks.

CRT vs LCD is no different.

I just feel sorry for those who think newer = better.
 
don't be so patronizing. You got no clue what you talk about.

I agree that for work conditions lcd are fine, but for gaming CRT is still best. Period. Try playing counterstrike on a lcd at 800x600@160hz. FAIL. CRT can do that. Thats just one example.

You are thinking of low res 14 inch 60hz CRT right?

This thread is about the Pinnacle of CRT tech, the sony gdm-fw900. 24 inch widescreen CRT, Native resolution of 1920x1200@85hz that can be boosted upwards even to 2340x1400@80hz or downgraded to 800x600@160hz or so for an edge in online shooters.

Do your homework in plato cave before you frown on your parents for still listening to elvis. Parents are usually right, in the end. (sorry for sarcasm, i can be patronizing too)

Your panties got twisted up pretty fast there. Chill out bro.

24" 800x600 makes me cringe.
 
Your panties got twisted up pretty fast there. Chill out bro.

24" 800x600 makes me cringe.

Don't know why you're picking an argument here when you don't know what you're talking about.

You're wrong; you should have read the thread; get over it.

Don't start digging holes.
 
Don't know why you're picking an argument here when you don't know what you're talking about.

You're wrong; you should have read the thread; get over it.

Don't start digging holes.

Chill out. There is nothing to be "wrong" about. You clearly value an IQ factor that I can't appreciate while I value size, resolution, energy efficiency, portability, aesthetics etc. We can both appreciate different things.

Have followed this thread on and off for years. Don't be so offended that I find it surprising that some people are actually still so interested in CRT.

Wish I would have paid more attention earlier this year. Just trashed 2 very "nice" CRT monitors because I was reasonably sure that I wouldn't be able to find a buyer willing to pay more than the cost of shipping.
 
Chill out. There is nothing to be "wrong" about. You clearly value an IQ factor that I can't appreciate while I value size, resolution, energy efficiency, portability, aesthetics etc. We can both appreciate different things.

Have followed this thread on and off for years. Don't be so offended that I find it surprising that some people are actually still so interested in CRT.

Wish I would have paid more attention earlier this year. Just trashed 2 very "nice" CRT monitors because I was reasonably sure that I wouldn't be able to find a buyer willing to pay more than the cost of shipping.

You know you're wrong, hence the damage control. Don't worry, I'm not offended at all. It's pretty funny actually. "Oh, I threw away two of your precious CRTs because I don't give a fuck hurr-durr". Could you be a little more passive-aggressive? :p
 
in my opinion, all things considered, oled/future technologies > crt > lcd

lcd advantages over crt:
they are actually being sold
(potentially) wider gamuts
brighter
darker blacks when there is significant ambient light
higher ansi (checkerboard) contrast
has a native resolution that is perfectly sharp with perfect geometry
capable of high refresh rates at high resolutions e.g. 1920x1080@144hz
capable of higher pixel density

crt advantages over lcd:
too many to list

oled advantages over crt:
wider gamut
brighter (or at least they will be in the future)
higher contrast ratio in all circumstances
has a native resolution that's sharp and has perfect geometry
no phosphor trail
capable of higher pixel density

crt advantages over oled:
heats up room
you can actually buy them used right now in 2015 for a reasonable price
guaranteed 0 input lag (possible with oleds but dependent on the controller)
scales lower resolutions more pleasantly than most forms of digital scaling
 
why do LCDs have darker blacks when there is significant ambient light? Is it to do with the fact that CRTs have a glass plate (as a dielectric), which means they reflect more than the materials used to coat LCD screens?
 
You know you're wrong, hence the damage control. Don't worry, I'm not offended at all. It's pretty funny actually. "Oh, I threw away two of your precious CRTs because I don't give a fuck hurr-durr". Could you be a little more passive-aggressive? :p

Seriously bro, relax. Why do you so desperately need for me to be "wrong" (about what, anyway)? The world isn't black and white. What I value in a monitor is different from what you value, and it doesn't make either of us absolutely "wrong".
 
http://i.imgur.com/Yg0SFHC.jpg
cpd-g520p with antiglare vs my thinkpad

the phosphor layer is quite reflective (diffuse reflections). by using an antiglare film that's partially opaque, these reflections are cut down by quite a bit, though the luminance of the monitor itself drops as well.

idk much about how lcds work but i think the stack of polarizers ensures that the light that goes through the air-monitor interface doesn't get reflected significantly
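a toy model of why reflections hurt the crt's blacks so much: reflected ambient light adds to both the black and the white luminance, and the crt's diffuse phosphor reflects more of it. all the numbers below are illustrative guesses, not measurements of any real display:

```python
# toy model of effective on-screen contrast under ambient light: reflected
# light adds to both the black and the white luminance. all numbers are
# illustrative guesses, not measurements of any real display.
def effective_contrast(white_nits, black_nits, reflected_nits):
    return (white_nits + reflected_nits) / (black_nits + reflected_nits)

# crt: near-zero native black, but a diffusely reflective phosphor layer
print(round(effective_contrast(100, 0.01, 5.0), 1))
# lcd: worse native black, but the polarizer stack reflects less
print(round(effective_contrast(250, 0.25, 1.0), 1))
```

so even with a vastly better native black, the crt's effective contrast collapses once the reflected term dominates the denominator.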
 
Seriously bro, relax. Why do you so desperately need for me to be "wrong" (about what, anyway)? The world isn't black and white. What I value in a monitor is different than what you value and it doesnt make either of us absolutely "wrong".

OK, bro. Getting serious about relaxing, huh? :D
 
http://i.imgur.com/Yg0SFHC.jpg
cpd-g520p with antiglare vs my thinkpad

the phosphor layer is quite reflective (diffuse reflections). by using an antiglare film that's partially opaque, these reflections are cut down by quite a bit, though the luminance of the monitor itself drops as well.

idk much about how lcds work but i think the stack of polarizers ensures that the light that goes through the air-monitor interface doesn't get reflected significantly

ah right, damn phosphor layer.

and yea, the ANSI contrast of CRTs has always been a challenge - there's a tradeoff between the longevity of a thicker phosphor layer and the amount of light scattering that occurs as the light travels through this layer; and then there's internal reflections off the glass, and electron backscatter. Would have been interesting to see, in a parallel universe, any potential engineering solutions to these problems :)
 
imo ansi contrast isn't really a big deal... i wonder how high the ansi contrast would have to be for it to be completely indistinguishable from infinity:1. comparing checkerboard patterns on my iphone (~1000:1) and my galaxy s5 (infinite contrast) in a dark room, i couldn't really tell a difference.

another thing crts may be better at than oleds: viewing angles.
idk about other oled displays but on my galaxy s5 there is some iridescence at oblique angles. i think it's due to the circular polarizer layer in there (which is necessary for reducing the screen's reflectivity)
 
yea, viewing angles is an interesting one.

Here are a couple of papers, the first of which measured viewing angles (but they didn't compare it to their CRT). See figure 13 for the data.

They used a minolta CS100A (which is a spot colorimeter) - I know a lab that I can probably borrow one from - if I have the motivation, I may take similar measurements to compare, though with some care I could probably do the same with my i1 pro or even i1 display pro.

One thing is that those OLED measurements were taken on the first generation of trimaster PVM's - the second generation improves the viewing angles.

Also, here is another tangentially related paper
 
Your panties got twisted up pretty fast there. Chill out bro.

24" 800x600 makes me cringe.

Ok sorry for reacting in the way i did.

But, to be honest, this ain't true for a CRT.

On an lcd, playing old games or newer ones at lower resolution and higher refresh rates, looks bad and blurred.

Not on CRT.

Playing 800x600 or 960x(xxx)@160hz gives a good image on a CRT.

In fact, CRT are also best monitors to play old games on, that have max resolutions way below that of native LCD resolutions.

Scaling on an LCD screen (done by the Nvidia or ATI control panel) gives a really bad image.

On a CRT you don't need scaling, and it gives an ideal image.

I really hope you can see a GDM-FW900 in action at a high resolution one day.

I'll give an example.

I play skyrim with modded 4k textures. So, everything i see gets a layer of 4k textures.

This looks awesome on a CRT. You get the feeling you are looking through a glass window at a crystal clear world. I tried same on a 27 inch 2560x1440@120hz overclocked flat screen (which isn't even the standard quality of general lcd gamers) and i quickly went back to my trusty gdm-fw900.

The difference was noticeable; didn't like the blacks one bit (omg, caves looked like crap, blacks were all washed out), for example.

I agree we all set different priorities in choosing a gaming monitor. You chose size, weight, utility etc. I'm just glad i didn't.

Buying this GDM-FW900 on eBay in 2007 was the best money spent in the last decade for me.

I mean, what LCD has a longevity of 10+ years? All stuff these days is made to fail in three years, so to speak.
 
in my opinion, all things considered, oled/future technologies > crt > lcd

lcd advantages over crt:
they are actually being sold
(potentially) wider gamuts

I think this has more to do with demand at the respective times than technology itself. Not sure if there is an inherent phosphor limit.


brighter

Maybe. I've always found all my displays capable of more than required brightness, regardless of technology.

higher ansi (checkerboard) contrast

Only a plus on paper.

capable of high refresh rates at high resolutions e.g. 1920x1080@144hz

This is not an advantage. They are still slower than CRTs with much lower refresh rates.



oled advantages over crt:
wider gamut
brighter (or at least they will be in the future)
See above.


crt advantages over oled:
heats up room
you can actually buy them used right now in 2015 for a reasonable price
guaranteed 0 input lag (possible with oleds but dependent on the controller)
scales lower resolutions more pleasantly than most forms of digital scaling

Motion resolution at this time.
More resistant to IR and burn in.
 
I think this has more to do with demand at the respective times than technology itself. Not sure if there is an inherent phosphor limit.
no it's not due to some inherent limit
i think originally the ntsc gamut was really wide but the phosphors weren't really bright/efficient, so people decided to go with less spectrally peaked phosphors that were brighter

Maybe. I've always found all my displays capable of more than required brightness, regardless of technology.
same for me but not for some people

Only a plus on paper.
yes but crt halation is undeniable. not really a big problem imo though

This is not an advantage. They are still slower than CRTs with much lower refresh rates.
yes it is. at high vertical resolutions, crts are limited by the horizontal scan rate, which is 120-140khz at most. with 1200 lines you're limited to ~100hz
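quick back-of-the-envelope arithmetic for that ceiling (the ~5% vertical blanking overhead is an assumption; real gtf/cvt timings vary a bit):

```python
# rough ceiling on vertical refresh given a crt's max horizontal scan rate.
# the 5% vertical blanking overhead is an assumption (real timings vary).
def max_refresh_hz(h_scan_khz, active_lines, vblank_overhead=1.05):
    return (h_scan_khz * 1000) / (active_lines * vblank_overhead)

print(round(max_refresh_hz(121, 1200)))  # 1200 lines at 121 kHz -> ~96 Hz
print(round(max_refresh_hz(121, 600)))   # 600 lines -> ~192 Hz
```

which is why dropping the vertical resolution buys you so much refresh headroom on a crt.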

More resistant to IR and burn in.

what do you mean by ir? infrared light?
 
yes but crt halation is undeniable. not really a big problem imo though

True. FW900 is on the worse side when it comes to that and residual phosphor glow, but far from a deal breaker.

yes it is. at high vertical resolutions, crts are limited by the horizontal scan rate which is 120-140khz at most. with 1200 lines youre limited to ~100hz

Yes, but a CRT at 85 Hz has a much clearer, sharper, higher resolution motion picture than a LCD at 144 Hz, so higher refresh rate of the LCD isn't an advantage.

what do you mean by ir? infrared light?

Image Retention. Actually, I don't think I've ever seen an IR or burn-in problem on a CRT. I think it was more of a problem in the earlier days of the technology.
 
low persistence 85hz is worse than full persistence 144hz when there is no eye tracking involved.

really the only dealbreaker for crts is the size (for some). other than that different aspects of the image are basically perfect or slightly imperfect, but with no major flaws like lcds have
 
True. FW900 is on the worse side when it comes to that and residual phosphor glow, but far from a deal breaker.



Yes, but a CRT at 85 Hz has a much clearer, sharper, higher resolution motion picture than a LCD at 144 Hz, so higher refresh rate of the LCD isn't an advantage.



Image Retention. Actually, I don't think I've ever seen an IR or burn-in problem on a CRT. I think it was more of a problem in the earlier days of the technology.

Only saw burn-in on a CRT once, long ago at a copy shop: an extreme case of a logo on screen, probably there for hours or days on end. I normally use a screen saver, but have inadvertently fallen asleep several times (or more) to find that a Windows update or a crash had disabled it; thankfully it's never been an issue personally.

Hope the same can be established for OLED. Saw the LG again tonight with the demo...oh...those lovely pure blacks.
 
I think that if your monitor is properly calibrated, then burn-in shouldn't occur, as the monitor would never get bright enough for that to happen.
 
Oh and by the way - I just got an AMD video card for one of my LAN boxes and HOLY FUCK!!! The analog out is stupid clear! I'm going to have to do some side-by-sides, but let's just say that 1920x1440 on the Artisan is USEABLE (for text)! Guess my next card is going to be a used HD-7970 or R9-280x or something like that.

flod - I owe you my Artisan measurements. I'll get them to you today. Sorry for the wait! To give you a quick preview - after I calibrate the monitor for sRGB with a contrast of 300:1, my HCFR measurements using the DTP-94 aren't super hot. For some reason the full white balance isn't right (according to the 94). Bumping up the green drive a little nails the full white, and my average Delta E goes from being 1.x to <1.

And for the record, calibrating the Artisan using the reference system =/= White Balance adjust in WinDAS. All it appears to do is set the "Expert" level settings in the color menu. So to calibrate the Artisan, one must go through the White Balance adjust.
 