Monitors -- Why have LCDs sucked for so long??

I miss my KDS VS-19sn. Damn. It wasn't expensive or fancy but it got the job done. I gave it away to a friend who became addicted to Counter-Strike and Day of Defeat (GoldSrc). 1280 x 1024 / 85 Hz or 1600 x 1200 / 75 Hz were my usual resolutions, but it was also great at low res for arcade emulation (CPS and MVS looked amazing).

I do have some faith in LG OLED (and soon competition begins) because input lag is being addressed on newer models, sometimes with a firmware update if you are lucky. But I do want as close to real-time as possible. I guess I am spoiled because of the CRT days. In 2004, another friend gave me a hard time because I was complaining about response times on LCD, input lag, and image quality during fast scenes. Twelve years later and I still can't wait for LCD to retire.
 
op has a lot of weird, outdated, and just plain ignorant points. that's not to say that i don't respect the sentiment you have expressed, but you need to get some things straight. i sought out an old CRT when i was in the market for a new monitor a year or two ago and was able to try a few and yes, they really are great, but high-end gaming monitors are more or less on par in almost every aspect other than contrast ratio. if you check my post history you can see what an advocate i am for VA so it's not like i don't understand the power of high contrast. anyway...

I'm extremely sensitive to input lag, and anyone who claims they can't tell the difference between a 60hz refresh and a 120hz refresh must be brain damaged. I've gone out of my way to hold on to my CRT monitor, as they have 0 input lag, better refresh rates, deeper blacks, and way higher resolutions. What I can't understand is how everyone has for YEARS, uncontested, accepted the horrible refresh rates, input lag, and color depth of LCD monitors. Even on this forum, which is primarily gamers and performance enthusiasts. People claim "oh CRTs are so heavy!" ... REALLY? Are you spending the majority of your time carrying yours around? Because mine sits on my desk...

Then to add to the BS, these companies advertise things like G-Sync and 1080p as if they are awesome features. But it's really just a brand name for something that has been around for years: V-SYNC, which, mind you, adds to your overall input lag! Wow, 1080p eh? That's pretty impressive, my 16 year old monitor only does 2048x1536 at 100hz ... but yeah, 1080p, that's REALLY GREAT! Manufacturers flood the market with made-up terminology and misleading specs and names for those specs to confuse and trick buyers. But in actuality they are putting lipstick on a pig. Not to mention that current LCDs are moving away from the sample-and-hold method, which they have actually learned causes more strain on your eyes than the dark phase of a CRT.

Anyway, my point is, and I expect a bunch of unintelligent flames: it's been 20 years. Why the hell haven't we gotten away from 60hz being the norm and horrendous input lag? I expect answers like "it's a different technology!" OK, that's true, but that's really no excuse. And perhaps with my question I'm revealing my own answer, and probably the relevant answer to this: the industry is money driven and they will milk every "feature" until the last drop. Meaning they purposely progress slowly because there is no money in jumping advancements. It's more profitable to progress slowly. And everyone eats it up. Consumers don't always drive the best product to being the most popular.

no one is touting 1080p as an awesome feature anymore; in fact no one has been doing that for several years now. most high-end monitors are at least 2560x1440 and can be had alongside refresh rates of 165 Hz. find me a CRT that can do that.

gsync is an awesome feature. it is absolutely NOT the same thing as vsync, and it absolutely does NOT add to input lag if used properly. what it is is a feature that allows a monitor to refresh specifically when the GPU sends it a frame, rather than refreshing at a constant rate no matter what the GPU is sending it. this results in no tearing and perfect smoothness, and the best part is that so long as the framerate remains under the monitor's maximum refresh rate, there is no additional input lag whatsoever, and this scenario can easily be created no matter what by using a framerate cap.

your CRT might accept and display something it thinks is 2048x1536 at 100 Hz but it's probably not actually resolving all that resolution. sample and hold does not cause more eye strain than the scanning of a CRT. it's not possible for it to do such a thing. what does cause eye strain is PWM, which is used to reduce the brightness of the backlight by... flickering. not all LCDs do this.

we have started to get away from 60 Hz being the norm; there are tons of televisions that have true 120 Hz panels in addition to UHD or DCI 4K resolutions, and tons of desktop monitors that have 120, 144, 165, 180, 200, and even 240 Hz refresh rates at 1920x1080 or higher, something a CRT owner could never dream of. input lag is being worked on every generation. the fastest TVs are now into the sub-20 ms range and desktop monitors have been below 4 ms of total input lag for years now. i would be inclined to believe that i'm more sensitive to input lag than just about anyone on this forum and i cannot discern between a no input lag CRT and a 4 ms input lag gaming monitor. it's just not doable unless you're superman.
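To put rough numbers on the "cap the framerate under the refresh ceiling" point, here's a quick back-of-the-envelope sketch (my own illustration, not from the post; the 144 Hz panel and the cap values are assumed):

```python
# Worst-case wait for the next frame under VRR is just the frame interval at the cap.
# Illustrative numbers only: an assumed 144 Hz panel with a few possible framerate caps.

def frame_interval_ms(fps: float) -> float:
    """Milliseconds between consecutive frames at a given framerate."""
    return 1000.0 / fps

MAX_REFRESH_HZ = 144  # assumed panel maximum

for cap in (60, 120, 138, 144):
    inside_vrr = cap < MAX_REFRESH_HZ
    note = "stays inside the VRR window" if inside_vrr else "hits the ceiling (vsync-like wait)"
    print(f"cap {cap:3d} fps -> {frame_interval_ms(cap):5.2f} ms between frames ({note})")
```

At any cap under the ceiling, the added wait tops out at a handful of milliseconds, which is the point being made above.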

your last few sentences are dead-on and i wish it wasn't that way. we've had several technologies come and die that would have been better than CRT like SED and FED (i actually had to check a speech i wrote for a college course to remember those two). OLED is coming and i love LG for the initiative they've taken but it sure is taking its sweet time. Dell has a 30" 4K 120 Hz OLED panel in the works but it has been delayed to who knows when. oh well.

You say sharper, I say BLOCKIER. Try playing an older game like Mario on an LCD and then play it on a CRT. You will see what I mean. Now, with a newer game like CS:GO I would say the same thing, but it's less noticeable, like if you were to use 0 anti-aliasing vs AA @ 16x. A CRT would look more like a video with 16x AA and an LCD would look more like a display with NO anti-aliasing turned on. Interpolation is another factor I forgot to mention in the original post that adds to input lag. "Clear motion" and all those similar brand terms saturate the market to trick buyers into purchasing their crappy display. Hence the soap opera effect.

this is silly. aliasing doesn't magically disappear on a CRT and appear on an LCD. interpolation is a great feature if you like higher framerates. no one is using it when playing games or doing anything else, its sole purpose is to increase the framerate of low framerate content like movies and TV shows. clear motion can also refer to backlight strobing in order to reduce motion blur. these are both features that from your posting history would apparently be something you'd like.

I've owned a few decent quality Samsung TVs and monitors, and comparing them to my crappiest CRT monitor, the picture is just worse. CRT just has deeper blacks, better frame rates, smoother edges... honestly the only thing the LCD has going for it is the brightness. But with a 5ms response time, who gives a crap. I still laugh about how popular the GoPro camera was and yet no one probably uses a display that can even appreciate the FPS that camera can capture. What a joke. Yeah, enjoy your 240fps on your 60hz display. Oh wait... you can't. The point of this is the fact that we have literally gone backwards in technology for the sake of ???? convenience? thinness? size?? It's been 20 years and we can't surpass 60hz? It's become the awful standard. Will we ever get away from it? 4k at 60hz?? What is the point?

GoPros record at high framerates for slow motion.
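(Spelling out the arithmetic behind that one-liner, with made-up example numbers: high-framerate footage is conformed to a slower timeline, so the display's 60 Hz is not the bottleneck.)

```python
# Slow-motion factor when high-framerate capture is conformed to a normal playback timeline.
capture_fps = 240   # example capture rate
playback_fps = 60   # playback timeline / display refresh
print(f"{capture_fps} fps conformed to {playback_fps} fps -> {capture_fps // playback_fps}x slow motion")
```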

Do me a favor, load up CS:GO, type in console, "fps_override 1" & "fps_max 24" then see how great that looks. That is literally what movies are displayed at. 24 frames per second. and people were actually complaining about how the hobbit looked awful with 60fps... this is just too funny to me.

film and game framerates are two entirely different things. film looks like it does because it has natural motion blur to blend the frames together. games like CSGO don't have this. the two are not comparable 1:1 at all. the high framerate version of the Hobbit was 48 fps, by the way.
 
Comparing high-end old CRTs with similarly priced newer LCDs, I agree with anyone who said that current LCDs destroy CRTs in resolution and refresh rate. What the OP failed to address is that what drives the monitor market is not the desires of gamers, but the needs of office monitor and TV buyers. For large companies, changing CRTs to LCDs was a no-brainer: less energy wasted, desk space saved, improved multi-monitor setups. For TVs, CRTs were too small.
Things look good with HDR coming, because TV market forces will drive monitor tech in the right direction for gamers.
 
My post history and plenty of threads on this site show what modern high hz LCD gaming monitors can do. They definitely fail for motion clarity compared to CRT, which is a major tradeoff, but they are greatly superior to 60fps-hz capped LCD monitors, to high response time monitors lacking modern gaming overdrive implementations, and to monitors without variable hz or backlight strobing.

I think the HDR premium label's black depth standard of 0.05 nit will push LCD monitors, but it will probably just mean more VA LCD panels. For me, I'm hoping a good Samsung VA 144hz+ 1440p (soon to be released) should hold me over for 3-5 yrs until OLED gaming monitors are hopefully ubiquitous, with the wrinkles ironed out (clouding/banding, input lag, response time/blurring, color fade concerns, etc.), and perhaps we could hope they eventually include some sort of low persistence/frame blanking blur elimination tech. That, and of course pricing will hopefully come down to very expensive rather than astronomical. The OLED HDR premium black depth standard is 0.0005 nit, by the way!
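For a sense of scale, here's what those black-level floors imply for static contrast; the 1,000-nit peak white is my own assumed number, not part of the spec quoted above:

```python
# Static contrast implied by a black-level floor, for an assumed peak white level.
PEAK_WHITE_NITS = 1000.0  # assumption for illustration only

for label, black_floor_nits in (("HDR premium LCD floor", 0.05),
                                ("HDR premium OLED floor", 0.0005)):
    contrast = PEAK_WHITE_NITS / black_floor_nits
    print(f"{label}: {PEAK_WHITE_NITS:.0f} / {black_floor_nits} nit -> {contrast:,.0f}:1")
# 20,000:1 for the 0.05 nit floor vs 2,000,000:1 for the 0.0005 nit floor
```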

High hz LCD gaming monitors at high resolutions will be out this year and into next. 3440 x 1440 21:9's with 144hz - 200hz refresh rates and 4k with 120hz. VR went as high as they could for now at 90hz and 2160 x 1200 which is only 1080 x 1200 per eye. PC VR will require much higher resolutions and will push their refresh rates higher when possible in future generations.

Which brings me to my next point - I'd like to point out to people championing high-hz, high and very high resolution gaming monitors that the push toward single-gpu setups, through waning multi-gpu support (of SLI by nvidia in particular) and lack of inclusion by many developers in games (lazy console ports, not willing to pay for the hours to add it for niche users, whatever), is not looking good for actually supplying those hz with more current frames of game world state/action at the most demanding games' graphics ceilings. The high-hz benefits and blur reduction are lost unless you supply very high frame rates. A lot of people seem to feel they can use variable hz as a "fix" for running low frame rate bands. Technologies like g-sync will prevent screen aberrations caused by fluctuating frame rates, which is a good thing, but having a high-hz monitor without supplying high frame rates doesn't get anything out of the high-hz benefits the monitor is capable of.

100fps-hz average or so would be the lowest I'd go, and even then, using variable hz, the result would be a "vibration" blur graph all over the place.
For a few examples of high demand games at 2560 x 1440 from the last year not even considering games yet to be released:
I think witcher 3 gets 82 fps-hz average on ultra on a single gtx 1080 with hairworks disabled. Far Cry Primal on very high gets 80 fps-hz average.

blur reduction/motion clarity increase 100fps-hz ave:
0% <- 20%(~80fps-hz) <-- <<40% (100fps-hz)>> -->50%(120fps-hz)->60% (144fps-hz)
and motion definition/path articulation/smoothness wise
1:1 -- 1.x:1 (~80+ f-hz) <--- << 5:3 (100fps-hz)>> --> 2:1 (120fps-hz) -> 2.4:1 (144fps-hz)
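Those percentages and ratios fall out of treating sample-and-hold persistence as 1/framerate and using 60fps-hz as the baseline; a small sketch of that arithmetic (my framing of the figures above, not an official formula):

```python
# Rough persistence-blur reduction and motion-definition ratios vs a 60 fps-Hz baseline,
# treating sample-and-hold persistence as 1/framerate.
BASELINE_FPS = 60.0

def persistence_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (80, 100, 120, 144):
    blur_reduction = 1.0 - persistence_ms(fps) / persistence_ms(BASELINE_FPS)
    motion_ratio = fps / BASELINE_FPS
    print(f"{fps:>3} fps-hz: ~{blur_reduction:.0%} less persistence blur, "
          f"motion definition {motion_ratio:.2f}:1 vs 60")
# ~25% / 40% / 50% / 58%, and 1.33:1 / 1.67:1 (i.e. 5:3) / 2:1 / 2.4:1
```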

I know there are backlight strobing proponents. Perhaps running an HDR LCD monitor, which would have a really high max nit brightness, in non-HDR mode with backlight strobing at a very high brightness would finally fix the dim backlight strobing mode issue in the future. The dimming of the screen in that mode at current brightness maximums is a big tradeoff to some people. Again, the frame rates come into play with single gpus as well. Once you strobe at 100hz or less you are going to cause eyestrain and annoyance for a lot of people. One strobe per frame means you'd again be struggling even harder to keep your frame rate minimums at 100, let alone averages, on the most demanding games.
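A rough sketch of why strobing dims the image, and why a very bright HDR-class backlight could buy that headroom back (pulse width and brightness figures are assumptions for illustration):

```python
# Average brightness under backlight strobing scales with duty cycle:
# the backlight is lit only for a short pulse each refresh.
def strobed_brightness_nits(steady_nits: float, pulse_ms: float, refresh_hz: float) -> float:
    duty_cycle = pulse_ms * refresh_hz / 1000.0  # fraction of each refresh the light is on
    return steady_nits * duty_cycle

for steady in (350.0, 1000.0):  # typical SDR backlight vs an assumed HDR-class backlight
    avg = strobed_brightness_nits(steady, pulse_ms=2.0, refresh_hz=100.0)  # 2 ms pulse at 100 Hz = 20% duty
    print(f"{steady:4.0f} nit steady -> ~{avg:3.0f} nit average when strobed")
```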
 
I just had to make sure I wasn't in AVSForum.

Yes, LCDs are plagued with issues. You have panel types (IPS, TN, VA), color depth (6bit+FRC, 8bit, 8bit+FRC, 10bit), refresh rates, input lag, color profiles, etc. That's not including backlight bleed, dead pixels, or anything else. Like a vinyl recording, newer doesn't necessarily mean better. But like an MP3, LCDs are more versatile than their older brother.
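For anyone unsure what those bit-depth labels amount to in raw numbers, a quick sketch (FRC is temporal dithering used to approximate the next step up, which this ignores):

```python
# Shades per channel and total colours for common native panel bit depths.
for bits in (6, 8, 10):
    shades = 2 ** bits          # per-channel grey levels
    colours = shades ** 3       # total RGB combinations
    print(f"{bits:>2}-bit: {shades:>4} shades/channel, {colours:,} colours")
# 6-bit: 64 shades (~262k colours); 8-bit: 256 (~16.7M); 10-bit: 1024 (~1.07B)
```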

I had a plasma--I know what you mean regarding LCDs. My plasma was a Samsung F8500. Loved it. Talking in past tense... it got too hot because Samsung wanted it to be slim and quiet (so they removed the active cooling from it--the morons). After a power supply and a main board failure in 4 months of ownership, my TV had sat idle almost as much as I had used it. I sold it locally for a very small loss.

The good news is: OLED monitors are coming. Blacks darker than a CRT, no perceived response lag, infinite contrast. Oh, and burn in potential--just like the good ol' days of the CRT.
 
You're saying that your $1200 monitor in the year 2016 is capable of producing similar results to a monitor from 1997? THAT'S REALLY IMPRESSIVE. This is my whole point, guys. We have accepted crap for so long and no one here is expressing that, except me. This is a step backwards in technology. For the sake of what? Cheapness for the manufacturer? Then they have to convince you with flashy terms and made-up specifications and benchmarks to sell it to you?

Nah, I'm saying it beats your CRT in every way.


The Dell 3014 came out in 2013 and yes it was expensive. So was your CRT when it came out.

The HP Omen 32" came out this year, and it's not expensive. $400 street price. Looks fantastic. Trades blows with the Dell 3014 -- in fact, I ended up recently selling the Dell 3014, because I preferred the freesync at 75hz, and slightly larger 16x9 size over the Dell 3014.

This is a fairly reasonable suggestion.

Buy a HP Omen 32" for $400.
Buy a Fury X refurb for $320 on eBay.
Turn on Freesync.

You'll then understand why CRT monitors can't even be given away these days.

 
It's clear that many of the people replying haven't used a CRT recently.
Motion on CRT is so much better than anything else if you can keep the framerate equal to the refresh rate. Not even the strobing LCDs are comparable.
G-Sync and FreeSync are about smoothing out unstable framerates, not improving motion clarity.
OLED doesn't fix this because it doesn't flicker like CRTs did. CRTs are still better for motion and viewing angles. OLED could be as good or better than CRT, but isn't yet.

I would love to have a 4K 120Hz G-Sync OLED though - because lots of new games have unstable framerates even on high-end systems.
But I wouldn't be replacing a CRT with one for anything where latency or motion clarity mattered.
Hopefully one day we'll get OLED or QLED displays built for gaming, but until then CRT is still the best we have.
 
What premium LCDs have you used?
What CRT do you own?
 
Have you even tried an LCD? I used to be the same way until LED-backlit LCDs started to hit the market; the CCFL ones would make me black out visually because of the horrid flicker.
 
What premium LCDs have you used?
What CRT do you own?
Well my current TV is a Sony that uses a larger version of the panel used in the FG2421.
So it's a 10-bit VA panel with 5000:1 native contrast and a 70-zone local dimming system, which does backlight scanning rather than strobing.
I've recently used the current 1440p IPS G-Sync displays and the original PG278Q.

Current CRT is a cheap brand monitor that uses a Samsung tube. It's far from being a high-end display.
But motion on it is still far better than anything LCD, Plasma, or OLED can offer.
Viewing angles are better than just about anything else too.
Black level is basically perfect, though as with all CRTs contrast could be a lot better.
 
I haven't met a TN panel I liked yet. As to motion - I must not be as sensitive to it as some people. I immediately liked even the old LCD panels better than my CRT for daily use (eye strain considerations), and after only a few generations of LCD the motion blur wasn't much more of a nuisance in everyday use/gaming than CRT, IMO.

VA or IPS are the only way to go for me. I don't think I'd go back to a CRT ever. I had a pretty decent 21" Hitachi CRT, and haven't missed it for probably 10 years as compared to the LCDs that followed it.
 
I used to have a 50lb 19 inch flatscreen CRT monitor on my desk with a 100lb 27 inch flatscreen CRT TV next to it and I don't miss them one bit. I'm glad they're gone.
 
I used to use a GDM-FW900 as my main monitor, at least until it died over a year ago. Now I have to fall back to a GDM-5410, which means no widescreen in 2016. Ouch.

Still, there's a reason I'm still using CRTs for main monitors, and this thread explains it all quite nicely. LCDs need to just die already now that we have OLED; once they solve the burn-in issues, there won't be a reason to keep a fundamentally flawed technology like LCD around.

TN panels in particular are just awful and need to die off; I can't stand the vertical shifting, and dark screens look especially bad on some of the panels I've seen, particularly on laptops. IPS glow doesn't bother me much, but black levels don't even come close to CRTs, even if you rule out the glow. VA isn't used on fast-refresh gaming monitors, barring the Eizo FG2421.

However, I'm in the fortunate position of knowing what high-end FD Trinitron monitors look like, and being able to get them for dirt cheap prices at that. I don't have to trade off motion clarity and input lag for image quality; I can have both at once, and it won't cost me $650-800 like a 144-165 Hz AHVA 2560x1440 G-SYNC panel would, which doesn't even guarantee that you're going to get a panel that's completely devoid of dirt, dust, dead pixels, backlight bleed, clouding, or even weird firmware bugs that you can't just reflash yourself.

Recent events may force my hand, however, specifically the fact that NVIDIA finally followed AMD and Intel in dropping VGA support from their Pascal cards, and DisplayPort to VGA adapters currently suck compared to a native 400 MHz RAMDAC, with the ones that don't suck being likely to have inflated HDFury-level prices that could just buy you a new monitor. Getting a new graphics card is effectively going to require getting a new monitor (or being constrained to my 13" Cintiq Companion Hybrid, which is too small and only 60 Hz), and I might as well make the G-SYNC jump if that's the case. My wallet is going to hate me for that, but it's the only way to get a remotely passable LCD.

CRTs aren't getting any newer, either. I'm gonna have to learn to repair that FW900 myself, because no one else is even going to bother with it.

I also use a 24" WEGA Trinitron SDTV for retro consoles, but that's a different usage case entirely. Somehow, that thing manages to make composite video look watchable instead of absolute garbage, it has component video just begging for an RGB transcoder, and it passes the GunCon compatibility test. It's a keeper, even if I'm still aware of its flaws.
 
Did you see the new "gaming mode" from LG? They have a backlight strobe just like CRTs. They call it "Dynamic Action Sync & Black Stabilizer". Works at 120hz.
 
literally everything you posted was either wrong, misinterpreted, or both.. it's seriously cringeworthy. and then the people liking the comment makes me cringe as well. i honestly expect more from the people on this forum.
 
literally everything you posted was either wrong, misinterpreted or both.. its seriously cringe worthy. and then the people liking the comment makes me cringe as well. i honestly expect more from people from this forum.
Still better than just simply saying "you are wrong" and not offering any reasons as to why.
 
literally everything you posted was either wrong, misinterpreted or both.. its seriously cringe worthy. and then the people liking the comment makes me cringe as well. i honestly expect more from people from this forum.
good post, A+.
Still better than just simply saying "you are wrong" and not offering any reasons as to why.
what do you mean "still better"? everything i posted is factually correct.
 
good post, A+.

what do you mean "still better"? everything i posted is factually correct.

I wasn't referring to your post, or the facts it stated.

I was referring to the quality of the post I quoted directly: even if your post were factually INcorrect, its overall quality would still be better than OP's "you are wrong" post, since that one doesn't offer any reason as to why.
 
I miss CRTs.. Used to have some nice ones. Ultra flat screen trinitron type with the 2 teeny black lines going across lmao. Lots of Viewsonics. SD content looked amazing and used to be able to get really high refresh rates.

However, just something about a cathode ray being aimed directly at your skull for many hours a day. I know... probably just my own paranoia.. (though there was some mention of dust particles found near CRT TVs being mildly radioactive? hahaha.)

Dunno... Never really had the same feels of a nice screen since the old days of having CRTs. My newish Samsung KS8500 is decent. Not perfect, but wasn't too expensive, and good for my purposes.
 
I agree with the overall motion of LCDs being worse. But by and large, short of competitive shooters, I've never noticed a difference between 60hz, or the 75hz my CRT provided, and the LCD. I did notice the color depth drop from CRT to TN panel, but the IPS panel fixed that. I don't miss CRTs when I have a nice glossy screen.

 
The two black lines are wires that support the aperture grille. I too had those in my 21".
And, yeah, they do emit X-rays. X-rays are ionizing, but they don't come from nuclear events - just electrons.
Aside from the constant 'drifting' of settings in and out of the right range, I was kind of annoyed by the high pitched whine at certain refresh rates.
But, yeah, miss them!
 
Oh yeah! That whine! hahaha. I could always tell if someone was watching TV or not by hearing it from a nearby room in the house I grew up in. XD
 
It is a reasonable suggestion, but let's break down the first and last statements of that Omen 32 recommendation.

First, on some of the most basic levels, even a cheap decent CRT is better than that monitor. And as far as a quality CRT goes? An FW900 destroys that monitor, hilariously.

To put it in true perspective, that monitor isn't really that much better in the grand scheme of things than the 10 year old 20" 5ms 900p TN panel I have sitting here, which can overclock to 75 Hz without frame skipping and has the same PPI as your monitor. And it could even have better pixel response in more color transitions, since every VA made thus far has some very slow transitions in certain colors.

What's more, my shitty TN is also undoubtedly better for greyscale use than yours: though it does have the TN top-to-bottom gamma shift, it's nowhere near as crippling as VA off-angle gamma shift, which crushes blacks right in front of you and loses detail, making it useless for such work in Photoshop and other situations like dark grey on dark backgrounds. That's one of the reasons VA is not only bad, but entirely useless for certain things. Any mid-grade color CRT can do that most basic task.

The contrast indeed blows the TN away, but crt? No. Same or worse, but close enough, draw.

Freesync, while great, is pretty much the standout feature, but it is useless to anyone without an AMD card, same as G-Sync without an Nvidia card, and I could take them or leave them for the games I play, where I try to attain a solid non-variable framerate and stick to it. No one needs it playing CS or Quake etc. at 300 fps. Some games it's the bees knees. I get that. Still, I could go without and not miss it. And G-Sync is not without issues anyway; go look at all the threads and posts about it not working, having stuttering, etc. Fast sync works on CRT and it's smooth as shit. Vsync on CRT is lag free. It's basically a draw. I played every major FPS for a decade with vsync, no problem. Never had a more connected-feeling display. Shits on all IPS gaming monitors and is only matched by a few 144Hz TN panels, and only when running at 120-144fps.

So that's really how much better your monitor is than even a merely mediocre CRT that can do at least 75 Hz.
The CRT still likely has better color, better response, better viewing angles, better PPI. Everything except Freesync, and we'll give it the resolution but then take it away because of the total shit PPI of it anyway. The PPI is no better than a 20" 900p panel and worse than a 1280x1024 17" CRT. So you need to push it back, yeah, 6 feet, to even make it look worth a shit, so it's a TV. Actually, the more I lay it all out, that monitor is crap, lol.

Omen 32" 91.79 PPI
20" 900p 91.79 PPI
1280x1024 17" 96.42 PPI
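Those figures check out with the usual diagonal formula (using nominal sizes, as the list above does); a quick sketch:

```python
# Pixels per inch from resolution and nominal diagonal: sqrt(w^2 + h^2) / diagonal.
from math import hypot

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return hypot(width_px, height_px) / diagonal_in

for name, w, h, d in (('Omen 32" 2560x1440', 2560, 1440, 32.0),
                      ('20" 1600x900',        1600,  900, 20.0),
                      ('17" 1280x1024',       1280, 1024, 17.0)):
    print(f"{name}: {ppi(w, h, d):.2f} PPI")
# 91.79, 91.79, 96.42 -- matching the list above (a CRT's viewable area is a bit
# smaller than its nominal tube size, so its real PPI would be slightly higher)
```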


"there's a reason" the fw900 thread has more hits and pages than any other monitor on this site, over 3 million, hundreds of pages of posts and has basically been on the front page of this sub forum for like 9 years straight.

If they were still made today i guarantee if you could pop down to your local Frys and buy one this forum would be filled with people that own them and the fw900 thread would have 9 million hits and would be 900+ pages by now, and the rest would be wishing they had one.

So to rephrase your statement: your monitor is better in every way "to you", for your use. But in reality, when we look at it overall, it's actually pretty mediocre.

I'll take a straight FW900 any fucking day and fart in the rest's general direction. Once they get OLED at 120Hz, below 1ms, with no blur, smearing, or image retention, at PC monitor sizes (below 32"), and put it on the store shelf, game over; I'll piss on a CRT. Until then, anyone trying to sell you the notion that a cheap-ass VA panel with shit PPI is "better in every way" than any decent CRT is straight smoking crack.
 
my 16 year old monitor only does 2048x1536 at 100hz
Interlaced

CRTs have some big cons and little (but unique) pros. I do understand why the vast majority of PC users (bear in mind that not everyone is a gamer) prefer the LCD: pixel perfect, bigger screens with smaller bezels, lighter, cheaper, less heat...
A fellow Mitsubishi Diamond Pro 2020 user
 
I own both a 1440p, 144 Hz G-Sync LCD and an old TV-studio-quality CRT monitor. I use the CRT for retro gaming, and for that purpose it's absolutely fantastic, as it helps smooth out the low-res pixels and has very good colors. Much better than the LCD for that. What I don't miss are the image geometry and convergence issues and being at the mercy of graphics card DAC quality. I remember suffering from issues like blurry edges on some CRTs and GPUs that just didn't put out a good analog signal. It was enough for me to jump to LCDs fairly early on.

For general computing and modern games I could not be happier with my current LCD. Especially with the ULMB strobe mode you get motion clarity that is, to my eye, just as good as anything on a CRT. Even in G-Sync mode it is good enough for me not to be bothered at all. Input lag on this model can be considered non-existent. That said, it has taken a long time for LCDs to get there, and I feel that the next leap, high refresh rate 4K displays, will be the next big push in image quality.

Monitor and especially TV manufacturers often concentrate on completely wrong things. I am amazed that we still have no gaming TVs with the amount of consoles out there. If a certain TV or monitor model ends up having low input lag and good motion clarity it seems almost like a fluke rather than deliberate design. I guess curved screens and smart TV features are just easier to sell to a wider audience but now it's hard to find any features that separate one TV from another. I don't care about how deep the blacks are on some higher end model if it does a ton of processing and ends up with several frames of lag.
 
Still better than just simply saying "you are wrong" and not offering any reasons as to why.
lol I don't have the energy to pick apart everything you say because there is literally just too much. The amount of stuff you got wrong and literally misread is outstanding. Re-read what you posted, then re-read my original post, and actually take the time to read the information in the article that I linked.

Also, what CRT monitor is shooting an image at the screen interlaced? We're not talking about television sets here. Can you link to an article talking about PC CRT monitors using interlacing?

G-Sync, processing that takes place inside the monitor to limit or hold the frames the GPU is sending the display so that they do not exceed the refresh rate of the display, directly adds to input lag, or perceived lag (lag that is felt), not lag that is seen. How is this different from vsync? Motion blur and input lag are two different things, and this gets confused so much. Anytime I ever turned on vsync the input lag was felt immediately. Sure, the image is seen as very clear, but the input lag still seems high on these monitors. I never had screen tearing either. Maybe it's just the games I play, I dunno? But I think even having the feature of G-Sync on a monitor with a refresh rate < 100hz is kind of a waste. Why does 60 or even 85hz still exist?
 
I'm trying to think of what is worse... the light bleed from IPS or my CRT's poor contrast and brightness... It seems that the only decent monitors out there are like $800, but I would still want to test them against the CRT for input lag to see if there is a noticeable difference.
 
I, too, feel as though LCD took a few steps backward. CRT has its problems, but you don't ever have to worry about display lag issues while twitch gaming, and it's taken a long time for LCDs to catch up to CRT in terms of resolution and refresh rate as well.

I have a 24" Sony FW-900 from a zillion years ago. This is actually my second one. I still love it to this day. I tried out the 48" JS9000 for a month or so. It was nothing short of amazing, but the display "felt" much slower than my CRT, and the curve, while kind of neat and maybe a little more immersive, introduced several issues on its own that I couldn't see myself living with. I'm too damned picky. I exchanged the first JS9000 due to clouding and uniformity issues. The next one was better in that regard, but still not great, AND it had a dead pixel right in the middle which I noticed constantly. :/ I sent that back too. The curve also exaggerates reflections a lot. Now I'm waiting until next year to see what kinds of displays are available.

I'd love a 40-50" 4k FALD display, but OLED isn't even an option right now. It's got too many issues like motion blur, input lag, and really bad burn-in problems. I'd love to see something like micro-LED make its way to PC displays, because that sounds like the best of CRT and plasma in a smaller form factor without the drawbacks of OLED (each pixel is a tiny LED). But who knows? It's too new to tell. I'm just hoping Samsung makes something as good as the KS8000, but with FALD backlighting or something.
 
lol i dont have the energy to pick apart everything you say because there is literally just too much. the amount of stuff you got wrong and literally misread is outstanding. re-read what you posted and then re-read my original post and actually take the time to read the information in that article that i linked. Also, what CRT monitor is shooting an image at the screen interlaced? Were not talking about television sets here. Can you link to an article talking about PC CRT monitors using interlaced? Gsync, processing that takes place inside the monitor, to limit, or hold the frames that the GPU is sending the display, so that they do not exceed the refresh rate of the display, directly adds to the input lag or perceived lag (lag that is felt) not lag that is seen. How is this different from vsync? Motion blur and input lag are two different things and this has been confused so much. Anytime I ever turned on Vsync the input lag was felt immediately. Sure the image is seen as very clear but the input lags still seem high on the monitors. I never had screen tearing either. Maybe its just the games I play, I dunno? But I think even having the feature of G-sync on a monitor with a refresh rate < 100hz is kinda a waste. Why does 60 or even 85hz still exist?
it doesn't hold or limit anything unless you have vsync turned on in the control panel. if you're getting a framerate below the maximum refresh rate of the monitor it doesn't add any extra input lag because the monitor can draw all of the frames that your computer is rendering as soon as they're rendered. if you limit the framerate like i said your maximum input lag will be 7.24 ms, which is the exact amount of delay between frames at that framerate. if you want minimum input lag you can have it disable itself if the framerate exceeds the maximum refresh rate of the monitor and then it's the exact same thing as vsync off. you don't know what you're talking about and it's ok that you don't want to learn or explain why you think you're right, doesn't affect me. do your thing, man.
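That 7.24 ms figure is just the frame interval at the cap being described; the exact cap value isn't stated above, but roughly 138 fps on a 144 Hz panel works out to it (my inference):

```python
# Worst-case added delay with a framerate cap is the interval between capped frames.
def frame_interval_ms(fps: float) -> float:
    return 1000.0 / fps

assumed_cap_fps = 138  # inferred, not stated above; 1000 / 138 is ~7.25 ms, close to the 7.24 ms quoted
print(f"{assumed_cap_fps} fps cap -> {frame_interval_ms(assumed_cap_fps):.2f} ms between frames")
```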
 
G-Sync is a remarketed V-Sync, which causes a very noticeable input lag. Which is why most serious FPS gamers turn that crap off.

I'm thinking not quite.

Before the days of adaptive synchronization you had two choices, V-Sync ON or OFF.

V-Sync ON tells your display card to send complete frames to the monitor in sync with the monitor's refresh rate; V-Sync OFF tells the card to just send whatever is in the frame buffer instead of sending only complete frames.

Adaptive synchronization adjusts the monitor's refresh rate to a setting that the video card can support with fully rendered frames. It adjusts the monitor to the card given the demand instead of forcing the card to meet the monitor's demand.
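A toy sketch of the difference being described, for a single frame that finishes rendering partway between fixed refreshes (all timings are made up for illustration):

```python
# Toy comparison: when does a finished frame reach the screen under each model?
import math

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms between fixed refreshes
frame_ready_ms = 21.0                      # the frame finishes rendering 21 ms into the timeline

# V-Sync ON: hold the complete frame until the next fixed refresh boundary.
next_refresh = math.ceil(frame_ready_ms / REFRESH_INTERVAL_MS) * REFRESH_INTERVAL_MS
print(f"V-Sync ON : shown at {next_refresh:.2f} ms (waited {next_refresh - frame_ready_ms:.2f} ms)")

# V-Sync OFF: scanout switches to the new frame immediately, mid-refresh, hence tearing.
print(f"V-Sync OFF: shown at {frame_ready_ms:.2f} ms, with a tear line")

# Adaptive sync: the monitor starts a new refresh when the frame arrives, no fixed boundary.
print(f"Adaptive  : shown at {frame_ready_ms:.2f} ms, no tearing")
```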
 
Preview of NVIDIA G-SYNC, Part #2 (Input Lag) | Blur Busters

As a fast-twitch game with a fairly high tick rate (Up to 128Hz, configurable), whole input lag in CS:GO is very low compared to both Battlefield 4 and Crysis 3. Sometimes, total whole-chain input lag from button-to-pixels even went below 20ms, which is quite low!

At first, it was pretty clear that G-SYNC had significantly more input lag than VSYNC OFF. It was observed that VSYNC OFF at 300fps versus 143fps had fairly insignificant differences in input lag (22ms/26ms at 300fps, versus 24ms/26ms at 143fps). When I began testing G-SYNC, it immediately became apparent that input lag suddenly spiked (40ms/39ms for 300fps cap, 38ms/35ms for 143fps cap). During fps_max=300, G-SYNC ran at only 144 frames per second, since that is the frame rate limit. The behavior felt like VSYNC ON suddenly got turned on.

The good news now comes: As a last-ditch, I lowered fps_max more significantly to 120, and got an immediate, sudden reduction in input lag (27ms/24ms for G-SYNC). I could no longer tell the difference in latency between G-SYNC and VSYNC OFF in Counterstrike: GO! Except there was no tearing, and no stutters anymore, the full benefits of G-SYNC without the lag of VSYNC ON.

Why is there less lag in CS:GO at 120fps than 143fps for G-SYNC?
We currently suspect that fps_max 143 is frequently colliding near the G-SYNC frame rate cap, possibly having something to do with NVIDIA’s technique in polling the monitor whether the monitor is ready for the next refresh. I did hear they are working on eliminating polling behavior, so that eventually G-SYNC frames can begin delivering immediately upon monitor readiness, even if it means simply waiting a fraction of a millisecond in situations where the monitor is nearly finished with its previous refresh.

I did not test other fps_max settings such as fps_max 130, fps_max 140, which might get closer to the G-SYNC cap without triggering the G-SYNC capped-out slow down behavior. Normally, G-SYNC eliminates waiting for the monitor’s next refresh interval:

G-SYNC Not Capped Out:
Input Read -> Render Frame -> Display Refresh Immediately

When G-SYNC is capped out at maximum refresh rate, the behavior is identical to VSYNC ON, where the game ends up waiting for the refresh.

G-SYNC Capped Out
Input Read -> Render Frame -> Wait For Monitor Refresh Cycle -> Display Refresh

This is still low-latency territory
Even when capped out, the total-chain input lag of 40ms is still extremely low for button-to-pixels latency. This includes game engine, drivers, CPU, GPU, cable lag, not just the display itself. Consider this: Some old displays had more input lag than this, in the display alone! (Especially HDTV displays, and some older 60Hz VA monitors).

In an extreme case scenario, photodiode oscilloscope tests show that a blank Direct3D buffer (alternating white/black), shows a 2ms to 4ms latency between Direct3D Present() and the first LCD pixels illuminating at the top edge of the screen. This covers mostly cable transmission latency and pixel transition latency. Currently, all current models of ASUS/BENQ 120Hz and 144Hz monitors are capable of zero-buffered real-time scanout, resulting in sub-frame latencies (including in G-SYNC mode).

Game Developer Recommendations
It is highly recommended that Game Options include a fully adjustable frame-rate capping capability, with the ability to turn it on/off. The gold standard is the fps_max setting found in the Source Engine, which throttles a game’s frame rate to a specific maximum.

These frame rate limiters hugely benefit G-SYNC because the game now controls the timing of monitor refreshes during G-SYNC. By allowing users to configure a frame rate cap somewhat below G-SYNC’s maximum refresh rate, the monitor can begin scanning the refresh immediately after rendering, with no waiting for blanking interval.

So, running a frame rate graph that never, or hardly ever, crosses the max refresh rate should avoid that. That might be a benefit of having a 165hz or 200hz max refresh rate monitor: it allows for overages on actual frame rates (your "frame rate" is just an average of a roller-coaster/noise graph of frame rate fluctuation).
The other option, on games with very high maximum frame rates, would be to cap the frame rate artificially just below the max refresh rate of the monitor. Above the max refresh rate, the g-sync solution essentially switches back to v-sync mode until the frame rate drops below that threshold. G-sync also utilizes frame doubling, or more than doubling, to "fill in" duplicate frames as necessary if you go to 30fps and less.

You could probably dial down the graphics eye candy to a 100fps-hz to 120fps-hz average on more demanding games, which usually means a frame rate graph of 70fps-hz (65 min) to 140fps-hz (166 max), or just cap the game's frame rate. You shouldn't have any issue as long as your frame rate graph is not consistently going over or under the max/min g-sync hz range. (If you are running 30fps or less in your frame rate graph, I think that is the least of your problems though. :b )
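The "frame doubling at 30fps and less" behaviour mentioned above is the idea behind AMD's low framerate compensation, and G-Sync modules do the equivalent; a rough sketch of that logic (the 30 Hz panel minimum is an assumed figure):

```python
# When the game framerate falls below the panel's minimum variable refresh rate,
# each frame is repeated enough times to keep the effective refresh in range.
import math

PANEL_MIN_HZ = 30.0  # assumed lower bound of the panel's VRR window

def repeats_needed(fps: float) -> int:
    """How many times each frame must be shown to stay at or above the panel minimum."""
    if fps >= PANEL_MIN_HZ:
        return 1
    return math.ceil(PANEL_MIN_HZ / fps)

for fps in (45, 28, 20, 12):
    n = repeats_needed(fps)
    print(f"{fps:2d} fps -> each frame shown {n}x -> effective {fps * n} Hz refresh")
```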

 
CRT is obsolete for good reason. They emit X-rays and flicker. Eyestrain is inevitable. You'd have to be crazy to risk eye damage and migraines when plasma and OLED are available if you must have perfect blacks.
The Dell U3014 is vastly superior to most LCDs too. It has a 3.2ms gaming mode, insane for IPS with a scaler, plus the 10-bit GB-r colours are the best I've seen on an LCD.
 
Only ooold CRTs emit X-rays; any from the 90s or 2000s should have metal housing and leaded glass to prevent that. In regards to harmful radiation in general, using a CRT for a year is the equivalent of eating a couple of bananas.

Having owned an X-Star DP2710 with atrociously low-frequency, perceivable PWM dimming, I can tell you that LED PWM is nothing like CRT scanning.
Having the entire backlight blink with the LED's quick rise/fall is the offender, especially when it's out of sync with the refresh rate.
edit: Of course, you'll have to use the CRT at a refresh rate where you don't perceive the scanning. I don't mind lower hz in games, but I use 100+hz for desktop work.

CRTs are obsolete because of logistics and consumers that are content with the level of quality LCDs offer. Which is also why plasma is dying or dead.
 
I've used nothing but CRTs for 23 years, until like 4 months ago, and never had any eye strain problems. Plasma is a TV thing; I don't care about a 40-50" TV for a computer monitor, so it's irrelevant. Happy to welcome OLED to the party, though there aren't any OLED PC monitors on store shelves yet, so again, irrelevant. I hate IPS glow. I owned an IPS a decade ago, hated it, and got rid of it immediately, and that was just a 20", so a 30" oversaturated wide-gamut IPS running in sRGB mode? No thanks.
I have to ask, as I see people talking about 10-bit panels and it's obvious they aren't quite sure what they are talking about: in what scenario are you using 10-bit? And are you running wide gamut 24/7, and why?
 
The fact that CRTs flicker is the reason why motion handling on them is so much better than LCD.
You have to make an LCD or OLED flicker again in order to have good motion handling, so being "flicker free" is hardly an advantage.
And this is not the same as "PWM flicker" as seen on an LCD display or Plasma TV which causes eyestrain as a result of the flicker being several times higher than the refresh rate.

The front of a CRT is made of thick leaded glass, so x-rays are not a concern so long as your nose is not literally touching the screen while you use it. That myth was debunked decades ago.

CRTs, being analog devices, display whatever bit depth your DAC provides. Typically GPUs would have a 10-bit RAMDAC. 16-bit DACs are available for scientific applications.
 
I get what everyone is saying about the "step backwards", but you have to realize what happens when you switch technologies from CRT to LCD (CRTs have been around a lot longer than the current form of LCD). The reason for the switch was lighter, bigger, and more power-efficient displays. I do agree, I think it's crappy that we don't have higher refresh rates for LCD monitors, but manufacturers haven't focused on that area of the technology to further its development. They are more stuck on pushing high resolutions and packing more pixels per inch than on getting better performance.
Not sure why everyone is bickering so much about opinions. Both have pros and both have cons... same with every other argument over new tech versus old tech. CRT is older, more refined, but limited. LCD is newer and limited by research at this point. Look at the microprocessor's lifetime: Pentium processors were really fast when they were first released, but it turned out later they were flawed because of the FDIV division bug. Now they are packing any number of cores on a single die at 1/4 the size and 1000x the speed. Once someone starts to push the research, LCD technology will catch up, but right now the best LCDs that are comparable to CRT performance are going to come at a premium price. That is just the truth.
 