IPS is overrated!

IPS is overrated!
if anything IPS monitors are underrated...

Bull.

All TN screens have strong vertical color shift. TN has the worst color shift of any LCD type. This is a straight-on picture of my TN screen:

tnshift.jpg


The background color should be the same at the top and bottom. You can learn to tolerate this if you aren't very picky and you don't do any graphics/image touch-up work, but it isn't negligible by any stretch of the imagination. I have looked at dozens of TN screens; they all do this. It is pretty much what defines a TN screen.
I see the same thing with virtually all TNs, and it is the main reason I never owned one myself and never will. Some monitors have it worse and some better. Too bad those fancy 120Hz LightBoost monitors are not the "better" kind. I mean, for my friend, because personally, screw LCD for gaming; I use good ol' CRT :D

You wanna read or watch movies - get a CRT
You wanna do photo work - get a CRT
You wanna game professionally - get a CRT
You wanna save money - get a CRT
You wanna pose as a gamer - waste your money on these "gaming TNs"
agree with everything except "wanna read - CRT" part :rolleyes:
I can use CRTs for reading, but I don't see any reason for doing so. A good LCD is much easier on the eyes, so why bother with a blurry and flickery CRT? :confused:
CRTs don't tire my eyes in games, movies, viewing photos, etc., but when I start reading text it's another story and the CRT shows its dark side...

The web involves a lot of scrolling, tab switching, and moving windows around. On a CRT everything is butter smooth and you get through work in a snap, because you let everything come to your gaze at the centre and don't move your eyes around (smoothness is a 100% requirement for this). On any LCD this approach quickly turns aggravatingly jerky and dirty. LCDs will simply slow you down in any kind of work (you have to move your eyes around a lot more, so you are much slower, especially as the resolution increases). At 1024x768, text is nice and big, though not as sharp as on an LCD, but easily acceptable. On a CRT I pick things up even while scrolling rapidly, especially headlines and thumbnails, but on an LCD I have to scroll a page, then pause, then scroll, then pause to see what has loaded. It is not smooth; you can't get an idea of what's on the screen on the fly. (Try the readability test of PixPerAn.)
The web is stationary text + pictures most of the time, plus 30fps (at most) videos on YT (and other tubes :p), so there is not really much "need for speed"...

A slightly blurry enemy in Quake Live may be a problem, but a slightly blurry moving window... come on man... :rolleyes:

and one more thing:
You wanna do photo work - get a CRT
CRTs are the cheapest photo-work monitors out there (especially since they support 10-bit), but they still need to be calibrated and have some flaws even then, like flaring, and they don't have wide-gamut modes (they can't reproduce colors good printers can). So nothing beats a good, expensive-as-hell, professional-grade IPS monitor :)

So all in all, IMHO, CRTs are the best for everything except reading and professional photo work :rolleyes:
 
CRT doesn't have any processing lag, it provides full motion detail without any motion-related artifacts, and it can be driven above 120Hz (which is why CRTs are ideal for pro gamers). For everything else it is useless (except for those custom-built ones).

As for color and medical use, LCD is the only viable tech that provides >90% Adobe RGB, 10-14-bit color (dithered), and high PPI. CRT simply cannot match LCD in this field.
 
Can't believe people are still debating IPS vs TN in 2012.


PS: If anyone thinks TN is superior to IPS, or that IPS is overrated, they are seriously deluded or have no appreciation for image quality.
 
Can't believe people are still debating IPS vs TN in 2012.


PS: If anyone thinks TN is superior to IPS, or that IPS is overrated, they are seriously deluded or have no appreciation for image quality.

Can't believe people are still debating IPS vs TN in 2012.


PS: If anyone thinks IPS is superior to TN, or that TN is overrated, they are seriously deluded or have no appreciation for fluidity or image quality in motion.

Just kidding, but the point is that it's all about priorities, since sadly, neither technology is good at everything.
 
CRT doesn't have any processing lag, it provides full motion detail without any motion-related artifacts, and it can be driven above 120Hz (which is why CRTs are ideal for pro gamers). For everything else it is useless (except for those custom-built ones).
More than 120Hz doesn't matter that much for CRTs, does it?
As for color and medical use, LCD is the only viable tech that provides >90% Adobe RGB, 10-14-bit color (dithered), and high PPI. CRT simply cannot match LCD in this field.

Isn't dithering bad for color and medical use?
 
I am very happy with my 27" Samsung 120Hz TN panel. Best PC gaming upgrade I've made in a long time, and that was coming from a 30" 1600p IPS. :)
 
I am very happy with my 27" Samsung 120Hz TN panel. Best PC gaming upgrade I've made in a long time, and that was coming from a 30" 1600p IPS. :)
What is really UNDERRATED are those 1920x1080 27" monitors. And 2560x1440 at the same size is OVERRATED, to take the opposite position and fit the thread name more exactly.
 
TN panels are complete crap. Terrible viewing angles, color shift, and you can't even tell which part of the screen has the most accurate colors. Anyone who thinks TN panels are as good as or better than IPS panels clearly just doesn't give enough of a shit about screen quality, nor do they know what to look for. This thread is insane.
 
CRTs are not only about gaming
CRTs also have insanely high dynamic range => good blacks

try watching Blade Runner on an LCD... or playing Doom 3 on an LCD... or Silent Hill...

you get the basic idea :rolleyes:
 
TN panels are complete crap. Terrible viewing angles, color shift, and you can't even tell which part of the screen has the most accurate colors. Anyone who thinks TN panels are as good as or better than IPS panels clearly just doesn't give enough of a shit about screen quality, nor do they know what to look for. This thread is insane.

I'm not here to argue one side or another, but isn't "you can't even tell which part of the screen has the most accurate colors" a good thing? (ignorance is bliss unless you're a designer).
 
I'm not here to argue one side or another, but isn't "you can't even tell which part of the screen has the most accurate colors" a good thing? (ignorance is bliss unless you're a designer).
DOES NOT COMPUTE
 
What is really UNDERRATED are those 1920x1080 27" monitors. And 2560x1440 at the same size is OVERRATED, to take the opposite position and fit the thread name more exactly.

Perhaps my sarcasm detector is broken, but I hated my 27" 1080p monitor. I went from a 24" 1080p to 27" and hated it enough to sell it and buy another 24".
 
Can't believe people are still debating IPS vs TN in 2012.


PS: If anyone thinks IPS is superior to TN, or that TN is overrated, they are seriously deluded or have no appreciation for fluidity or image quality in motion.

Just kidding, but the point is that it's all about priorities, since sadly, neither technology is good at everything.

The only three advantages TN has are efficiency, fast pixel response (which only benefits 120Hz displays), and low cost (which also means a higher profit margin).
 
CRTs are not only about gaming
CRTs also have insanely high dynamic range => good blacks

try watching Blade Runner on an LCD... or playing Doom 3 on an LCD... or Silent Hill...

you get the basic idea :rolleyes:

But they have a poor ANSI contrast ratio, and ANSI contrast is far more important than dynamic range.
 
CRTs are not only about gaming
CRTs also have insanely high dynamic range => good blacks
try watching Blade Runner on an LCD... or playing Doom 3 on an LCD... or Silent Hill...
you get the basic idea :rolleyes:
What we need is local dimming in computer monitors.

Local dimming uses a full-array LED backlight, which dims and turns off the LEDs behind darker/black parts of the image. That greatly improves the contrast ratio, bringing it closer to CRT.

This is done in high-end HDTVs such as Sharp's luxury brand, the Elite(tm) LCD HDTV (www.elitelcdtv.com).
However, this technology is not in consumer computer monitors. (Yet.)
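
To make the idea concrete, here is a toy sketch of the zone logic (a hypothetical 8x12 zone grid, purely for illustration; not any actual TV's algorithm): each zone's LED only needs to be as bright as the brightest pixel it covers, so zones over black content can dim toward zero.

[code]
import numpy as np

def local_dimming_levels(frame, zones=(8, 12)):
    """Per-zone LED levels for a grayscale frame (pixel values 0.0-1.0).
    Each LED is driven to the brightest pixel in its zone, so zones
    covering only dark content dim toward zero, raising effective contrast.
    The 8x12 grid is illustrative only."""
    rows, cols = zones
    h, w = frame.shape
    levels = np.zeros(zones)
    for r in range(rows):
        for c in range(cols):
            block = frame[r * h // rows:(r + 1) * h // rows,
                          c * w // cols:(c + 1) * w // cols]
            levels[r, c] = block.max()  # LED must cover the brightest pixel
    return levels

# A mostly black frame with one bright patch: most zones switch off entirely.
frame = np.zeros((1080, 1920))
frame[100:200, 300:400] = 1.0
print(local_dimming_levels(frame))
[/code]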
 
More than 120Hz doesn't matter that much for CRTs, does it?
Yes it does, as it reduces frame lag. The GPU also has to send out a new frame to match the vertical refresh rate; recycling existing frames won't yield any gains.

30fps - 60Hz: 33.3ms lag (console games)
60fps - 60Hz: 16.7ms lag
120fps - 120Hz: 8.3ms lag
240fps - 240Hz: 4.2ms lag
480fps - 480Hz: 2.1ms lag
...
1000fps - 1000Hz: 1ms lag
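
(Those figures are just the duration of one frame at each rate; a quick sanity check:)

[code]
# The frame-lag floor is simply one frame's duration: lag_ms = 1000 / Hz
for hz in (60, 120, 240, 480, 1000):
    print(f"{hz}fps - {hz}Hz: {1000 / hz:.1f}ms lag")
# 30fps on a 60Hz display shows each frame twice, hence 2/60 s = 33.3ms.
[/code]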


Isn't dithering bad for color and medical use?
No

Today's dithering algorithms can match the native display (provided the image is static).
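
For illustration only, here is a toy temporal-dithering sketch (not any vendor's actual FRC algorithm) showing how a 10-bit shade can be approximated on an 8-bit panel by alternating between the two nearest levels:

[code]
def temporal_dither(value10, n_frames=4):
    """Approximate a 10-bit value (0-1023) on an 8-bit panel (0-255) by
    alternating the two nearest 8-bit levels; the eye averages the result."""
    lo = value10 // 4              # nearest 8-bit level below
    frac = (value10 % 4) / 4       # leftover 10-bit fraction
    return [lo + (1 if i / n_frames < frac else 0) for i in range(n_frames)]

# 10-bit value 514 sits between 8-bit levels 128 and 129:
print(temporal_dither(514))  # [129, 129, 128, 128] -> averages 128.5 = 514/4
[/code]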
 
What we need is local dimming in computer monitors.

Local dimming uses a full-array LED backlight, which dims and turns off the LEDs behind darker/black parts of the image. That greatly improves the contrast ratio, bringing it closer to CRT.

This is done in high-end HDTVs such as Sharp's luxury brand, the Elite(tm) LCD HDTV (www.elitelcdtv.com).
However, this technology is not in consumer computer monitors. (Yet.)

Monitors are too small for this to work, and IPS/PLS native black level is too high for it to work effectively. It will also add lag, destroy color accuracy, and screw up the luma curve and gamma.

Also, most manufacturers have dumped full-array local dimming in favor of much cheaper (and far worse) edge-lit dimming.

So you won't see this tech in LCD monitors.
 
Monitors are too small for this to economically work, and IPS/PLS native black level is too high for it to work effectively. It will also potentially add lag...
Fixed a couple of phrases for you. Although there's a legitimate argument there, I should point out there's no technical reason it couldn't work on small LCDs. The problem is that for small displays designed for close viewing distances, you need more LEDs, and you need the LEDs behind the LCD to be closer to the LCD glass to minimize diffusion (the halo problem) between on-segments and off-segments. That raises costs dramatically, which makes it difficult to do economically.

As for input lag, there's a way to pull off local dimming without adding any appreciable input lag, but you need a high-speed signal processor that can do local dimming in a real-time, line-based manner. Manufacturers do it the lazy way and framebuffer the refresh in order to do the local dimming calculations, but nothing prevents it from being done in real time on a row basis, if you use the right custom chips. Having worked in electronics and in the home theater industry, some calculations: for a 64-row local dimming backlight, doing it in real time on a row basis would add only 250 microseconds of lag at 60Hz (1/64th of 1/60 sec), or 125 microseconds at 120Hz (1/64th of 1/120 sec) -- 0.25 or 0.125 milliseconds respectively -- quite insignificant. For more segments, like a 128-row or 256-row LED local dimming backlight that scans in real time with the LCD refresh, the difference becomes even smaller (e.g. barely more than 30 microseconds of added input lag -- 0.031ms). Alas, all an added expense, too. In addition, you can strobe the rows for the zero-motion-blur effect (a scanning backlight) without adding any input lag beyond the existing pixel persistence lag.
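
In code, the arithmetic is just one row's share of the refresh period (a back-of-the-envelope sketch, not any shipping controller's firmware):

[code]
# Added lag of real-time, row-based local dimming: the processor buffers
# only one backlight row instead of a whole frame.
def added_lag_ms(rows, refresh_hz):
    return 1000 / refresh_hz / rows  # one row's share of the refresh period

for rows, hz in [(64, 60), (64, 120), (256, 120)]:
    print(f"{rows} rows @ {hz}Hz: {added_lag_ms(rows, hz):.3f} ms")
# 64 rows @ 60Hz -> ~0.26 ms; 256 rows @ 120Hz -> ~0.033 ms,
# matching the "barely more than 30 microseconds" figure above.
[/code]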

That said, it does affect luma/gamma accuracy to an extent, but the picture does look quite a lot better if you use enough LEDs (high enough resolution), the halo effect isn't annoying at all, and with properly processed local dimming (not always done), greyscale can remain accurate at the macro level (just not at the adjacent-pixel level, due to the bleed between adjacent LEDs).

I'm just saying what's possible, not what's economical.
Alas, the race to thin monitors at the bottom dollar dictates edge lighting.
 
Fixed a couple of phrases for you. Although there's a legitimate argument there, I should point out there's no technical reason it couldn't work on small LCDs. The problem is that for small displays designed for close viewing distances, you need more LEDs, and you need the LEDs behind the LCD to be closer to the LCD glass to minimize diffusion (the halo problem) between on-segments and off-segments. That raises costs dramatically, which makes it difficult to do economically.

Simply increasing the number of zones doesn't help, due to cross-contamination. Samsung tried this approach in their labs and saw no significant improvements... I met one of the engineers at Samsung House (south of London) back in 2010.

Also, due to the close viewing distance, the user will experience blooming (especially in areas close to the edges).

As for input lag, there's a way to pull off local dimming without adding any appreciable input lag, but you need a high-speed signal processor that can do local dimming in a real-time, line-based manner. Manufacturers do it the lazy way and framebuffer the refresh in order to do the local dimming calculations, but nothing prevents it from being done in real time on a row basis, if you use the right custom chips. Having worked in electronics and in the home theater industry, some calculations: for a 64-row local dimming backlight, doing it in real time on a row basis would add only 250 microseconds of lag at 60Hz (1/64th of 1/60 sec), or 125 microseconds at 120Hz (1/64th of 1/120 sec) -- 0.25 or 0.125 milliseconds respectively -- quite insignificant. For more segments, like a 128-row or 256-row LED local dimming backlight that scans in real time with the LCD refresh, the difference becomes even smaller (e.g. barely more than 30 microseconds of added input lag -- 0.031ms). Alas, all an added expense, too. In addition, you can strobe the rows for the zero-motion-blur effect (a scanning backlight) without adding any input lag beyond the existing pixel persistence lag.

How would you sync the backlight to the LCD to ensure the backlight doesn't cut during pixel transition?

That said, it does affect luma/gamma accuracy to an extent, but the picture does look quite a lot better if you use enough LEDs (high enough resolution), the halo effect isn't annoying at all, and with properly processed local dimming (not always done), greyscale can remain accurate at the macro level (just not at the adjacent-pixel level, due to the bleed between adjacent LEDs).

I'm just saying what's possible, not what's economical.
Alas, the race to thin monitors at the bottom dollar dictates edge lighting.

Sadly, yes, it always comes down to cost (not to mention yield).
 
IPS is not overrated; e-IPS is overrated. I've been on the hunt for a new monitor and I tried out the LG IPS235V. Bad glow and a 14ms GtG response time, which was way too slow. The blacks were worse than on my current LCD; I had to adjust the monitor's tilt just so in order for black to be black without any glow. The 14ms response time was also worse than my Dell 2007WFP's GtG response time of about 8ms. CS:S was unplayable, as was TrackMania Canyon. My 2007WFP handles both well. (CRT rocks the hardest in both games, and it's not even close.)

What really got me was when I played an HD music video on my 2007WFP as a test, while I was still considering the LG. The viewing angle was far better, the colors just as vibrant, and the motion better on the older LCD. Seven-year-old LCD tech, btw...

e-IPS is overrated. S-IPS (and its variants) is not: better viewing angles, better motion, and true 8-bit color.
 
What is really UNDERRATED are those 1920x1080 27" monitors. And 2560x1440 at the same size is OVERRATED, to take the opposite position and fit the thread name more exactly.

lol who would rather have 2560x1440 at 27" than 1920x1080? probably just someone who doesn't have a graphics card that can cut it at the higher res :D
 
lol who would rather have 2560x1440 at 27" than 1920x1080? probably just someone who doesn't have a graphics card that can cut it at the higher res :D

No, really, there are some very good 27" 1080p monitors, like the MVA-based display from Philips, for example.
And I'm talking about gaming and simple internet/Windows use. A 27" 1080p puts less stress on your eyes.
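
If you want numbers, the pixel-density gap behind that is easy to quantify with the standard PPI formula:

[code]
import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels along the diagonal, divided by the diagonal length in inches
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 1920x1080: {ppi(1920, 1080, 27):.0f} PPI')  # ~82 PPI, larger pixels
print(f'27" 2560x1440: {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
[/code]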
 
Simply increasing the number of zones doesn't help, due to cross-contamination. Samsung tried this approach in their labs and saw no significant improvements... I met one of the engineers at Samsung House (south of London) back in 2010.
Also, due to the close viewing distance, the user will experience blooming (especially in areas close to the edges).
From the horse's mouth; I believe you, then... There is certainly a point of diminishing returns.
Backlight diffusion is very hard to control, both for local dimming and for scanning backlights (both situations have the problem of "on" segments leaking into "off" segments).

How would you sync the backlight to the LCD to ensure the backlight doesn't cut during pixel transition?
For motion blur reduction, you do want to turn off the backlight _during_ the pixel transition. There's a lot of existing technology that already does this; see existing scanning-backlight technology, including Samsung's own, used in HDTVs.

I have created high-speed videos (480fps and 1000fps), including a high-speed video of LCD refresh on an old 2007-era 60Hz LCD, and a high-speed video of a LightBoost strobe backlight on a 2012-era 120Hz LCD. As the high-speed footage clearly shows, strobe backlights make it possible to bypass pixel persistence as the limiting factor in motion blur. The video proves this too (not just the research papers, which agree as well). You can strobe the backlight for a shorter time period (<2ms) in a different part of the refresh cycle than the pixel persistence part (>=2ms), and the motion blur is then dictated by the strobe length rather than by the pixel persistence. In the video, the backlight is clearly turned off while pixels are transitioning. The stroboscopic effect (as on a CRT) eliminates the motion blur caused by eye tracking on a sample-and-hold display. (Photodiode oscilloscope measurement.)

Strobing the backlight only after the pixel transition has finished can eliminate both pixel-persistence motion blur (only a minor cause of motion blur on modern LCDs -- about 2ms out of 8ms @ 120Hz) and eye-tracking motion blur (the majority of the motion blur on modern LCDs). See Science & References, which explains the difference between pixel-persistence-related and eye-tracking-related motion blur, including some papers by display manufacturers. (Scroll down halfway for the academic links.)

Assuming LCDs are designed to refresh quickly enough (3D LCDs have to be; they need clean, fully refreshed frames for shutter-glasses operation, and both shutters are closed while waiting for the LCD pixel persistence to finish disappearing), they become compatible with strobe backlights (e.g. LightBoost) that are able to completely bypass pixel persistence for the CRT-style zero-motion-blur effect -- a better-than-Motionflow, better-than-plasma effect, but without the added input lag of the interpolation often done in HDTVs. The inter-frame pixel persistence bleed is gradually approaching zero (it's now under 1% for the best panels, and the pressure is on manufacturers to reduce crosstalk between eyes), as new panels make it possible for the previous frame's pixel transitions to be practically finished before the next refresh begins.

Admittedly, not everyone likes stroboscopic backlights, and they do not benefit all applications (the CRT flicker disadvantage, although no worse than a 120Hz CRT if you're running at 120Hz), but they are very good for fast-twitch FPS games (e.g. Quake Live, TF2, etc.), with some rave reviews from CRT players ("It's like playing on my old CRT"...). On the ASUS VG278H, using the zero-motion-blur CRT hack to force-enable LightBoost in 2D instead of 3D, the ~2ms strobes lead to a true measured 85% motion blur reduction relative to a 60Hz LCD. I'm also hearing from one beta tester (Vega, who now has a BenQ XL2411T to compare with his FW900 CRTs and Catleap 2Bs) that its strobes are about ~1ms, which allows a true measurable 94% motion blur reduction relative to a regular 60Hz LCD (he is one of my beta testers of a motion test pattern that I am currently programming, my modernized PixPerAn-like replacement).
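
The arithmetic behind those percentages is simple: on a sample-and-hold display, eye tracking smears each frame across the whole time it stays lit, so perceived blur scales with visible frame time. A rough sketch (idealized model, ignoring pixel-response tails):

[code]
# Eye-tracking motion blur scales with how long each frame stays lit.
# A 60Hz sample-and-hold frame is lit ~16.7ms; a strobe backlight lights
# the frame only for the strobe length.
def blur_reduction_vs_60hz(strobe_ms):
    return 1 - strobe_ms / (1000 / 60)

print(f"{blur_reduction_vs_60hz(2.0):.0%}")  # ~88% ideal; ~85% measured (VG278H)
print(f"{blur_reduction_vs_60hz(1.0):.0%}")  # ~94%, matching the XL2411T figure
[/code]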

Strobe backlights have a big advantage over scanning backlights -- they don't have to worry about backlight diffusion between adjacent zones (which would limit the amount of achievable motion blur reduction). For more information, I also have a Scanning Backlight FAQ as well as a Science & References section.

Although CRT reigns supreme for many attributes, the single attribute of motion blur is finally being conquered by certain LightBoost LCDs, only in recent months, and there are many further technological improvements possible, e.g. combining a strobed high-CRI backlight with an IPS LCD. (Strobe backlights are theoretically inexpensive to do; they can be built as an edge light without needing a full-array LED backlight. What you need is simply extra wattage in the backlight to prevent a dim picture during strobing, and precision-modified PWM control that is well synchronized to the LCD refresh. This will no doubt become more common in enthusiast displays as prices of LEDs fall and LED brightness keeps going up.) And obviously an easy way to turn strobe mode on and off, for motion vs. non-motion moments.

Thanks,
Mark Rejhon
BlurBusters.com Blog
 
That said, it does affect luma/gamma accuracy to an extent
For non-color-critical work those implementations can be acceptable (depending on the user), although the impact on the TRC is quite high (like the ABL in plasma TVs). This is something that has always annoyed me when using an LCD TV with local dimming; a content-dependent TRC is a no-go for me. Other side effects (halo etc.) are, in comparison, no big problem; the increased contrast range compensates for them.
 
From the horse's mouth; I believe you, then... There is certainly a point of diminishing returns.
Backlight diffusion is very hard to control, both for local dimming and for scanning backlights (both situations have the problem of "on" segments leaking into "off" segments).

For motion blur reduction, you do want to turn off the backlight _during_ the pixel transition. There's a lot of existing technology that already does this; see existing scanning-backlight technology, including Samsung's own, used in HDTVs.

I have created high-speed videos (480fps and 1000fps), including a high-speed video of LCD refresh on an old 2007-era 60Hz LCD, and a high-speed video of a LightBoost strobe backlight on a 2012-era 120Hz LCD. As the high-speed footage clearly shows, strobe backlights make it possible to bypass pixel persistence as the limiting factor in motion blur. The video proves this too (not just the research papers, which agree as well). You can strobe the backlight for a shorter time period (<2ms) in a different part of the refresh cycle than the pixel persistence part (>=2ms), and the motion blur is then dictated by the strobe length rather than by the pixel persistence. In the video, the backlight is clearly turned off while pixels are transitioning. The stroboscopic effect (as on a CRT) eliminates the motion blur caused by eye tracking on a sample-and-hold display. (Photodiode oscilloscope measurement.)

Strobing the backlight only after the pixel transition has finished can eliminate both pixel-persistence motion blur (only a minor cause of motion blur on modern LCDs -- about 2ms out of 8ms @ 120Hz) and eye-tracking motion blur (the majority of the motion blur on modern LCDs). See Science & References, which explains the difference between pixel-persistence-related and eye-tracking-related motion blur, including some papers by display manufacturers. (Scroll down halfway for the academic links.)

Assuming LCDs are designed to refresh quickly enough (3D LCDs have to be; they need clean, fully refreshed frames for shutter-glasses operation, and both shutters are closed while waiting for the LCD pixel persistence to finish disappearing), they become compatible with strobe backlights (e.g. LightBoost) that are able to completely bypass pixel persistence for the CRT-style zero-motion-blur effect -- a better-than-Motionflow, better-than-plasma effect, but without the added input lag of the interpolation often done in HDTVs. The inter-frame pixel persistence bleed is gradually approaching zero (it's now under 1% for the best panels, and the pressure is on manufacturers to reduce crosstalk between eyes), as new panels make it possible for the previous frame's pixel transitions to be practically finished before the next refresh begins.

Admittedly, not everyone likes stroboscopic backlights, and they do not benefit all applications (the CRT flicker disadvantage, although no worse than a 120Hz CRT if you're running at 120Hz), but they are very good for fast-twitch FPS games (e.g. Quake Live, TF2, etc.), with some rave reviews from CRT players ("It's like playing on my old CRT"...). On the ASUS VG278H, using the zero-motion-blur CRT hack to force-enable LightBoost in 2D instead of 3D, the ~2ms strobes lead to a true measured 85% motion blur reduction relative to a 60Hz LCD. I'm also hearing from one beta tester (Vega, who now has a BenQ XL2411T to compare with his FW900 CRTs and Catleap 2Bs) that its strobes are about ~1ms, which allows a true measurable 94% motion blur reduction relative to a regular 60Hz LCD (he is one of my beta testers of a motion test pattern that I am currently programming, my modernized PixPerAn-like replacement).

Strobe backlights have a big advantage over scanning backlights -- they don't have to worry about backlight diffusion between adjacent zones (which would limit the amount of achievable motion blur reduction). For more information, I also have a Scanning Backlight FAQ as well as a Science & References section.

Although CRT reigns supreme for many attributes, the single attribute of motion blur is finally being conquered by certain LightBoost LCDs, only in recent months, and there are many further technological improvements possible, e.g. combining a strobed high-CRI backlight with an IPS LCD. (Strobe backlights are theoretically inexpensive to do; they can be built as an edge light without needing a full-array LED backlight. What you need is simply extra wattage in the backlight to prevent a dim picture during strobing, and precision-modified PWM control that is well synchronized to the LCD refresh. This will no doubt become more common in enthusiast displays as prices of LEDs fall and LED brightness keeps going up.) And obviously an easy way to turn strobe mode on and off, for motion vs. non-motion moments.

Thanks,
Mark Rejhon
BlurBusters.com Blog

Very impressive. I need some time to go through all the links; didn't want you to think I've ignored everything you've written :).
 
I know, but how significant is a reduction of 2ms?

It helps to reduce the total* (overall) lag, and every millisecond helps. If we shave off 2ms here and 4ms there, we can improve the chances of winning. But this only concerns professionals who compete at LAN parties.

*: Frame lag + Display lag + Mouse/Controller lag + Processing lag + Reaction Time (human lag) = Total lag
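
In other words, the stages just add up; a trivial sketch with illustrative (made-up) numbers shows why reaction time dominates:

[code]
# Total lag is the sum of every stage in the chain (numbers are illustrative).
lag_ms = {
    "frame (60fps @ 60Hz)": 16.7,
    "display processing":   10.0,
    "mouse/controller":      4.0,
    "game processing":       8.0,
    "human reaction":      200.0,
}
print(f"total: {sum(lag_ms.values()):.1f} ms")
# Shaving 2ms off a ~240ms chain only matters when players are evenly matched.
[/code]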
 
IPS all the way. I bought my first NEC 20WMGX2 in 2006 and my second a year or two later when they were being discontinued. I put the 2nd into service last year. I dread looking for a new monitor, because I would just want another 20WMGX2, and there doesn't seem to be anything like it still available. I need:

1. IPS
2. Glossy screen
3. 16:10 aspect ratio, no larger than 24" 1920x1200, preferably 20" 1680x1050
4. VESA mount

If that were an equation, the solution would be, "You're screwed."
 
I need:

1. IPS
2. Glossy screen
3. 16:10 aspect ratio, no larger than 24" 1920x1200, preferably 20" 1680x1050
4. VESA mount

If that were an equation, the solution would be, "You're screwed."

It would require a more creative solution. You could buy a used 2490WUXi, preferably with a busted screen, and replace the panel with one from a 24" glossy iMac.
 
But they have a poor ANSI contrast ratio, and ANSI contrast is far more important than dynamic range.

That's extremely debatable, especially with regard to video and graphical animation.

On my GDM-F520 CRT, the ANSI is still good enough to deliver a brilliant picture. And the dynamic range and blacks, by today's standards, are remarkable. (Which, BTW, is ridiculous and sad considering we're talking about a screen released in 2001 versus ones in 2012...)

SED? OLED? What happened to the future? It's almost 2013 -- where's my upgrade path?! :)
 
That's extremely debatable, especially with regard to video and graphical animation.

On my GDM-F520 CRT, the ANSI is still good enough to deliver a brilliant picture. And the dynamic range and blacks, by today's standards, are remarkable. (Which, BTW, is ridiculous and sad considering we're talking about a screen released in 2001 versus ones in 2012...)

SED? OLED? What happened to the future? It's almost 2013 -- where's my upgrade path?! :)



Personally I prefer detail over crushed blacks caused by internal reflection.

PS: The majority of 2012 Panasonic PDPs produce blacks of 0.009 cd/m2 (in both dynamic range and ANSI tests). The 2013 models are expected to reach as low as 0.005 cd/m2. So we don't need SED or FED, and both are dead in the water anyway.
 
IPS all the way. I bought my first NEC 20WMGX2 in 2006 and my second a year or two later when they were being discontinued. I put the 2nd into service last year. I dread looking for a new monitor, because I would just want another 20WMGX2, and there doesn't seem to be anything like it still available. I need:

1. IPS
2. Glossy screen
3. 16:10 aspect ratio, no larger than 24" 1920x1200, preferably 20" 1680x1050
4. VESA mount

If that were an equation, the solution would be, "You're screwed."

Buy a 24" IPS from Dell and remove the anti-glare coating. Done and done. http://hardforum.com/showthread.php?t=1674033
 
if anything IPS monitors are underrated...

agree with everything except "wanna read - CRT" part :rolleyes:
I can use CRTs for reading, but I don't see any reason for doing so. A good LCD is much easier on the eyes, so why bother with a blurry and flickery CRT? :confused:
CRTs don't tire my eyes in games, movies, viewing photos, etc., but when I start reading text it's another story and the CRT shows its dark side...


The web is stationary text + pictures most of the time, plus 30fps (at most) videos on YT (and other tubes :p), so there is not really much "need for speed"...

You wanna do photo work - get a CRT
CRTs are the cheapest photo-work monitors out there (especially since they support 10-bit), but they still need to be calibrated and have some flaws even then, like flaring, and they don't have wide-gamut modes (they can't reproduce colors good printers can). So nothing beats a good, expensive-as-hell, professional-grade IPS monitor :)

So all in all, IMHO, CRTs are the best for everything except reading and professional photo work :rolleyes:


I did mention 1024x768 as personally the best res for gaming, and if you don't change the res, text reading is comfortable at it too. LCD has better text, especially at higher res, but it just burns my eyes quickly even without PWM, so I stick with CRT. Emission standards are strictest for Sony-clone CRTs especially; not so for LCDs.

Actually, CRT is 10-bit ready. Wide gamut is generally a problem rather than useful; sRGB is still the main standard and is not going away in a hurry.

Therefore, if you don't want to keep switching monitors (one for one kind of work and another for something else), CRT is the best ATM, even for text/photo work.

Actually, I don't mind CRT text even at higher resolutions like 1792x1344@85Hz, which I use regularly for the web.
 
Personally I prefer detail over crushed blacks caused by internal reflection.

PS: The majority of 2012 Panasonic PDPs produce blacks of 0.009 cd/m2 (in both dynamic range and ANSI tests). The 2013 models are expected to reach as low as 0.005 cd/m2. So we don't need SED or FED, and both are dead in the water anyway.

That goes back to the debate over whether details at that low a level were actually intended to be seen by the director. However, yes, I am willing to accept the crushing of the very lowest greyscale to feed my blacks obsession. And I really did notice the dynamic range limitations of the LCDs I tried. That, the inability to produce blacks, and the unevenness of the image puncture the illusion of looking through a window, to my eyes. I remember reading an interview with a Sony executive who facilitated Sony's transition from CRT to LCD displays and who appeared to take a certain pride in that. But then he went on to talk about how much better emissive displays were, and wait until you see those in the future. Ignoring the irony in his comments, I'll say that I am indeed trying to hold on until an emissive computer-display successor finally arrives.

I agree on the Panasonics -- very impressive what they've done and are doing. Though with TVs, as opposed to computer displays, I place a higher premium on size versus other factors. My issue with their sets is that if I sit close, to get a more home-theater experience, the picture looks rough and un-movie-like, because I can see the individual pixels. So I guess some kind of projection for me. Fewer choices now with Mitsubishi's exit -- only front projection now, I guess. Of those, the JVCs look most appealing to me. Kind of a tricky and expensive setup, though... looks like...
 
It helps to reduce the total* (overall) lag, and every millisecond helps. If we shave off 2ms here and 4ms there, we can improve the chances of winning. But this only concerns professionals who compete at LAN parties.

*: Frame lag + Display lag + Mouse/Controller lag + Processing lag + Reaction Time (human lag) = Total lag
Agreed.

Interesting point to consider: LightBoost strobe backlights can add a tiny +1 to 2ms of lag, but improve reaction time by 100-200ms in fast-twitch games in _certain_ cases (Quake Live, TF2), especially if you have the "shoot-while-moving-fast" habit of long-time CRT players. On LCDs, you must stop turning/strafing before shooting far-away snipers, because tiny objects are too blurred while moving rapidly on LCDs, but are crystal clear on CRT and LightBoost. As many fast-twitch gamers know, not having to stop keeps you from becoming a sitting duck.

Being able to use all your CRT skillz (e.g. strafing at high speed while shooting faraway enemies, circle strafing, and anything that creates panning motion blur that is difficult on LCDs) on LightBoost monitors improves eye-tracking ability dramatically and gives the same reaction-time advantage CRT players have, and in certain video games this factor is observed to outweigh the insignificant added input lag. (Inu said "OMG I can play Scout so much better now in TF2, this is borderline cheating." -- a player with a CRT-style move-while-shooting habit.) Although a CRT may still be able to "shoot first" compared to LightBoost (when triggers are pressed simultaneously), the total lag, including human lag, of CRT-vs-LightBoost is far more similar (due to reaction time often being the dominating factor in TF2/Quake Live) than that of LCD-vs-CRT or LCD-vs-LightBoost.

For some people, reaction time is improved massively (e.g. playing Scout in TF2 in large open spaces, if you are a "shoot-while-running-and-strafing" CRT player), although not nearly as much for close-combat video games (where input lag matters a lot more). Some long-time CRT players are affected more by the reaction time than by the +1/2ms input lag caused by keeping pixel persistence in total darkness before strobing. Also, testimonials in a couple of forums mention that eyes get more tired tracking blurred moving objects than sharp moving objects, which additionally affects reaction time. Although strobing can tire the eyes on static images (see people hating PWM dimming), at least one PWM-hater noticed that his eyestrain went down when staring at motion all the time (compared to a non-LightBoost LCD), outweighing any eyestrain caused by strobing. And he could turn strobing off when he was just surfing the web instead of playing games.

To be fair, Vega still prefers IPS, but IIRC he often plays battleground games, which are less dictated by motion blur than, say, Quake Live, TF2, and similar fast-movement games -- the type of games garnering rave reviews for the CRT-style zero motion blur made possible by LightBoost LCDs. So obviously, zero motion blur benefits certain games and gameplay styles more than others. In the XL2411T example, Vega confirmed a motion-test measurement showing nearly 20 times less motion blur than a regular 60Hz LCD -- motion blur below perceptible levels.

We can't wait for strobe backlights to arrive on IPS displays; it's been done before in HDTVs, just not in computer monitors, and it's somewhat more challenging to engineer. Sadly, the amazing motion blur elimination of computer-monitor strobe backlights exists only with TN technology at this time.
 
Agreed.

Interesting point to consider: Strobe backlights can add a tiny +1 to 2ms of lag, but improve reaction time by 100-200ms in fast-twitch games in _certain_ cases (Quake Live, TF2), especially if you have the "shoot-while-moving-fast" habit of long-time CRT players. On LCDs, you must stop turning/strafing before shooting far-away snipers, because tiny objects are too blurred while moving rapidly on LCDs, but are crystal clear on CRT and LightBoost. As many fast-twitch gamers know, not having to stop keeps you from becoming a sitting duck.

Being able to use all your CRT skillz (e.g. strafing at high speed while shooting faraway enemies, circle strafing, and anything that creates panning motion blur that is difficult on LCDs) on LightBoost monitors improves eye-tracking ability dramatically and gives the same reaction-time advantage CRT players have, and in certain video games this factor is observed to outweigh the insignificant added input lag. Although a CRT may still be able to "shoot first" compared to LightBoost (when triggers are pressed simultaneously), the total lag, including human lag, of CRT-vs-LightBoost is far more similar (due to reaction time often being the dominating factor in TF2/Quake Live) than that of LCD-vs-CRT or LCD-vs-LightBoost.

For some people, reaction time is improved massively (e.g. playing Scout in TF2 in large open spaces, if you are a "shoot-while-running-and-strafing" CRT player), although not nearly as much for close-combat video games (where input lag matters a lot more). Some long-time CRT players are affected more by the reaction time than by the +1/2ms input lag caused by keeping pixel persistence in total darkness before strobing. Also, testimonials in a couple of forums mention that eyes get more tired tracking blurred moving objects than sharp moving objects, which additionally affects reaction time. Although strobing can tire the eyes on static images (see people hating PWM dimming), at least one PWM-hater noticed that his eyestrain went down when staring at motion all the time (compared to a non-LightBoost LCD), outweighing any eyestrain caused by strobing. And he could turn strobing off when he was just surfing the web instead of playing games.

To be fair, Vega still prefers IPS, but IIRC he often plays battleground games, which are less dictated by motion blur than, say, Quake Live, TF2, and similar fast-movement games -- the type of games garnering rave reviews for the CRT-style zero motion blur made possible by LightBoost LCDs. So obviously, zero motion blur benefits certain games more than others.

I have not actually tried LightBoost, but from past experience I'm not convinced LightBoost would be beneficial.

I did some extensive testing playing Unreal Tournament 2004 instagib against godlike bots on a special map I made that gave very consistent gameplay, where I got around 300 frames per second. I would play several games at each setting with a 10-minute time limit, then compare how many kills I got at each.
I originally did the tests to see which resolution and refresh rate were ideal on my CRT. I had a 20" ViewSonic CRT that was pretty high-end, capable of 2048x1536 and very high refresh rates at lower resolutions. I found that the ideal combination was something like 1024x768@220Hz.
In 2005 I bought the world's first-ever 2ms response time LCD, which did 1280x1024@75Hz. I did my UT2004 instagib test comparing the LCD to my CRT. At 220Hz the CRT was better, but when I set the CRT to the same refresh rate and resolution as the LCD, I did better on the LCD by about 2%. I figured this was because of the larger screen size.

To sum it up: I tested myself and found no real difference in reaction time between a CRT and an LCD with no input lag and no LightBoost.
 