Why are the black levels of IPS displays so bad?

Doesn't LightBoost introduce flicker? At 60 Hz it sounds like it could be a problem. My previous NEC had something similar built in, and it wasn't pretty, even if it did reduce blur. And the loss of color and brightness seems like a big tradeoff.

Oh, I hope my FW900 never dies :) How sad that SED didn't make it, I had really high hopes for that one.
 
That being said, no LCD panel can match the perfect color reproduction (including blacks) of a CRT. And yes, a decent, well-maintained and configured CRT will completely destroy an LCD/IPS/LED in all ways except size and weight. Not until LCDs STOP being backlit and actually reproduce brightness/darkness per pixel will we have that type of reproduction. We'll see what OLED tech brings to the table....
I have a good-condition SONY GDM-FW900 (CRT) and a professional LG W2420R (IPS with RGB-LED and A-TW) on the desk next to each other, and I can't agree with you that the CRT is better :rolleyes:

Sure, CRTs are great and unbeatable in some aspects, but color reproduction actually sucks on them compared to my LCD. Let's make a small comparison:

Dynamic range:
No comparison really, as the SONY destroys the LG. But only at night, without any light in the room.

ANSI contrast:
Similar story to dynamic range: no comparison, but this time the LCD is the better one. No "but"s here; in bright scenes it will always have better blacks. (See the sketch below for the difference between the two measurements.)
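For readers unfamiliar with the two measurements: dynamic (on/off) contrast compares a full-white screen to a full-black screen, while ANSI contrast compares the white and black patches of a checkerboard shown simultaneously. A minimal sketch, with purely illustrative luminance numbers (not measurements of the FW900 or W2420R):

```python
# Illustrative sketch: dynamic (on/off) contrast vs. ANSI contrast.
# All luminance values below are made-up assumptions for demonstration.

def dynamic_contrast(full_white_nits, full_black_nits):
    """On/off contrast: full-white screen vs. full-black screen."""
    return full_white_nits / full_black_nits

def ansi_contrast(white_patch_nits, black_patch_nits):
    """ANSI contrast: average white vs. average black patch of a
    4x4 checkerboard displayed simultaneously."""
    return white_patch_nits / black_patch_nits

# A CRT can drive a full-black field to near zero, so on/off contrast is huge...
print(dynamic_contrast(100, 0.01))   # ~10000:1
# ...but bright patches on screen lift the black patches (glass reflections,
# halation), so the CRT's ANSI contrast collapses.
print(ansi_contrast(100, 0.25))      # ~400:1
# An LCD leaks backlight even on black, capping its on/off contrast...
print(dynamic_contrast(100, 0.1))    # ~1000:1
# ...but that leakage barely depends on picture content, so its ANSI
# contrast stays close to the on/off figure.
print(ansi_contrast(100, 0.11))      # ~900:1
```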

Viewing angles:
CRT is better, but not by much; it's just perfect, and the LCD is almost perfect.

Motion performance:
The old flickery tube technology destroys IPS, especially at 96 Hz (@1200p) vs 60 Hz, and it can even do judder-free movie playback (96 Hz is an exact 4x multiple of 24 fps film).

Geometry and sharpness:
I always have this feeling that something is either too wide or too narrow on the CRT. Sharpness is adequate at lower resolutions, OK at 1920x1200, and slightly too low at 2304x1440. No such issues on the LCD, obviously.

Color reproduction:
When I turn on the CRT I always have this feeling the colors are off, especially skin tones. And there is this "low end" feeling from the really bad ANSI contrast ratio, like all the time. Colors get somewhat better with the contrast cranked up, though I still find the LCD better.

Overall black level and contrast ratio:
Obviously CRTs are far too dependent on ambient light. Their best performance is in a pitch-black room, and that isn't a normal daytime condition. I use ambient light most of the time (I don't like to stress my eyes more than they have to be), and the CRT looks like crap in those conditions. In a bright room, black on the LCD seems almost perfect!

But the LCD can't have both good black and a bright picture, and obviously not in a dark room, so the CRT shines here, with brightness that can burn your eyes and still good performance in dark scenes.

Calibration:
I have hardware calibration with gamut remapping on the LCD, so it's a sure win. The CRT is good in that there is no banding from calibration, thanks to the 10-bit LUT, but otherwise there are a lot of problems with such calibration, like needing gamma loaders, and games disable it, so it's kind of pointless to calibrate beyond "setting the calibration" in the first place... (A sketch of why LUT bit depth matters for banding follows.)
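A minimal sketch of the banding point, assuming a simple gamma-correction curve (the curve and bit depths are illustrative, not the actual calibration pipeline of either display):

```python
import numpy as np

# Applying a correction curve through an 8-bit LUT collapses distinct input
# levels onto the same output level, producing visible steps (banding).
# A 10-bit LUT has 4x the output resolution, so all 256 inputs stay distinct.

def gamma_lut(levels_in, levels_out, gamma=1.1):
    x = np.linspace(0.0, 1.0, levels_in)
    return np.round(x ** gamma * (levels_out - 1)).astype(int)

lut_8bit = gamma_lut(256, 256)     # e.g. an 8-bit video-card gamma ramp
lut_10bit = gamma_lut(256, 1024)   # e.g. a 10-bit LUT in the display

print(len(np.unique(lut_8bit)))    # < 256: some levels merged -> banding
print(len(np.unique(lut_10bit)))   # 256: every input level preserved
```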

----------
I use the CRT for games, and sometimes for movies when the movie is mostly dark. Bright movies simply look better on the LCD, and you don't have to wait 30 minutes for the monitor to warm up. For desktop usage the CRT is much worse in most aspects.

So all in all CRTs are far from perfect :(

PS: as for OLEDs, they will bring a whole set of their own problems :eek:
 
You are right about lighting conditions required for FW900, it is especially bad since I've removed the AG filter. Terrible during the day without shades. But it isn't such a big problem for me since I'm gaming with dim room lighting and watching movies/series in the dark. I disagree on some of your points, though. While "synthetic" ANSI contrast is better on LCDs, I find that doesn't mean much in real life use. Whether I am gaming or watching movies/series, FW900 is far superior in all conditions and there is just no comparison in darker scenes. I have both calibrated, so I don't have problems with color reproduction, and FW900 can be hardware calibrated too.

LCD is superior in geometry, text sharpness, and daylight usage when contrast and color matter. The FW900 is superior in blacks, contrast, motion, and viewing angles (I don't have an A-TW polarizer on my IPS, unfortunately), all of which are my no. 1 requirements for a display (hence the plasma in the living room).
 
You have to run LightBoost 2's 1ms strobing with the monitor running 100 Hz and 100 fps+, or better yet 120 Hz and 120 fps+, to get the zero-blur benefit. To get the most out of any 120-144 Hz LCD monitor, even without LightBoost 2, you have to surpass the refresh rate, so that each refresh is filled with a more recent slice of action data; otherwise you are just duplicating frames and not changing the pixels between refreshes (see the sketch below). Although a 120 Hz monitor starts to show improvement around 90-100 fps, that's not the best it can show, and in the case of LightBoost 2, dropping below 100 fps at 100 Hz or below 120 fps at 120 Hz will start to blur again, as well as reduce how recent the action shown per refresh is (motion smoothness/accuracy), as in any scenario where a higher-Hz monitor is run at low fps. Another consideration is that tearing without vsync is lessened when you run fps higher than the refresh rate.
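A minimal sketch of the frame-duplication arithmetic, ignoring vsync timing details (the fps values are arbitrary examples):

```python
# When fps < Hz, some refreshes just repeat the previous frame instead of
# showing a newer slice of the action. Simplified model, no vsync details.

def refresh_breakdown(fps: int, hz: int):
    unique = min(fps, hz)          # refreshes carrying a new frame
    duplicated = max(hz - fps, 0)  # refreshes repeating an old frame
    return unique, duplicated

for fps in (60, 90, 120):
    unique, dup = refresh_breakdown(fps, hz=120)
    print(f"{fps} fps @ 120 Hz -> {unique} unique refreshes/s, {dup} duplicated")
# 60 fps @ 120 Hz -> 60 unique refreshes/s, 60 duplicated
# 90 fps @ 120 Hz -> 90 unique refreshes/s, 30 duplicated
# 120 fps @ 120 Hz -> 120 unique refreshes/s, 0 duplicated
```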
Skyrim on high with 4xAA should get 120 fps on a GTX 680. BF3 on medium hits 118 fps or so. Titans are out now, but of course very expensive. Graphics ceilings are arbitrary in any given generation; they could easily be set higher until every card would max out at 10 fps, or even 1 frame per hour like some CGI movie scene lol. Maxing everything out isn't always necessary to get a great-looking game, especially if you get zero (or a 50% reduction in) motion blur out of the deal and greatly increased motion tracking. Everything is about tradeoffs. Older Source games like L4D2 and TF2 have no problem either, obviously.
Edit: I've also owned a few FW900 CRTs and would always play at 85 Hz or higher (they can hit 90 to 100 Hz at certain resolutions). 60 Hz doesn't look good on any strobing monitor; it's too slow. There is more to it than that, since a CRT looks different from (better than) an LCD at a given Hz in regard to redraw/strobing, but I'll leave it at that for now. 60 Hz is below a certain threshold even on a CRT for people with "fast" eyesight.
 
But doesn't LightBoost on a 120 Hz monitor result in 60 Hz flicker? I have problems from PWM flicker, and that is much faster than LightBoost.
 
It might bother you if CRTs at higher Hz bother you. The best case is running it at 120 Hz and maintaining over 120 fps most of the time. The way LightBoost 2 works is like this: it creates a 1ms backlight flash "snapshot" during the pristine part of every single frame (100 Hz & 100 fps, or 120 Hz & 120 fps). Conversely, it blanks out the transition parts of each frame. The nature of the strobing also eliminates the overwhelming cause of FoV-movement blur: retinal retention blur. (They say only 15% of LCD blur is due to response times now, especially on the modern, very-low-response-time TNs.) The only realistic way to eliminate retinal retention blur without refresh rates over 1000 Hz is by using backlight strobing or backlight scanning of some sort.

LightBoost 2 cycle:

<-----transition/persistence/retention blur part of frame ----> <----1ms FLASH!----> <------- trans/persist./retention blur part of frame-------->

It's a common misconception that LightBoost 2 is somehow halving the frame rate; this is probably due to confusing 3D shutter glasses with the raw full-field strobing of LightBoost 2, which does not use any glasses at all. It just enables the synchronized 1ms backlight strobing during 2D gaming. At 100 Hz & 100 fps you are getting 100 frames shown at just over 1ms each; at 120 Hz & 120 fps you are getting 120 frames of unique action shown at just over 1ms "lit" time each, with the result of zero blur in both cases. A rough sketch of the persistence math is below.
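A minimal model of why shorter "lit" time means less eye-tracking blur, assuming perceived smear is roughly (time the pixel stays visible) x (panning speed); the panning speed is an arbitrary example:

```python
# Eye-tracking (retinal retention) blur: while the eye pans smoothly but the
# image is held static, the image smears across the retina for as long as
# the pixel stays lit. Illustrative numbers only.

def perceived_smear_px(visible_time_s: float, pan_speed_px_per_s: float) -> float:
    return visible_time_s * pan_speed_px_per_s

speed = 1000.0  # hypothetical pan speed, pixels per second

print(perceived_smear_px(1 / 60, speed))   # 60 Hz sample-and-hold: ~16.7 px smear
print(perceived_smear_px(1 / 120, speed))  # 120 Hz sample-and-hold: ~8.3 px smear
print(perceived_smear_px(0.0014, speed))   # 120 Hz + 1.4 ms strobe: ~1.4 px smear
```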
 
I have a good-condition SONY GDM-FW900 (CRT) and a professional LG W2420R (IPS with RGB-LED and A-TW) on the desk next to each other, and I can't agree with you that the CRT is better :rolleyes: [...]

The CRT is certainly closer to the IPS's ANSI contrast ratio than the IPS is to the CRT's dynamic range.

And IPS black level just doesn't look black...

And surely it's not your CRT's fault that you don't calibrate it to sort out the color?

How the display industry could offer good dynamic range and blacks for decades and then yank them away without howls of protest is beyond me... well, actually, I was howling... but a bunch of other folks were just struck by how amazingly light and thin them LCDs were... and we could clearly see the end coming, were dreading it, and to this day it's still rotten... but OLED gives a sense of hope...

(Yes, admittedly, CRTs only shine in controlled ambient light. And LCDs achieve straighter lines, greater size, and larger clear resolutions. And I'm real happy we have LCDs in the office rather than CRTs. That said, for the sake of my picture-quality-obsessive side, I'm glad my CRTs live on for my home machine...)
 
My CRT (Samsung 700NF) looks horrible when the lights are turned on: the ANSI contrast is around 400:1 in those conditions, and it displays blacks as very, very light greys. The phosphor trailing on my CRT is also pretty bad. Once set up properly the text looks great, it supports judder-free playback @72 Hz (an exact 3x multiple of 24 fps film), and it is obviously awesome for watching content in the dark.
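A rough sketch of how room light lifts the effective black level, assuming diffuse (Lambertian) screen reflection; the reflectance and illuminance figures are illustrative, not measurements of the 700NF:

```python
import math

# Ambient light reflected off the screen adds to both white and black,
# crushing the effective contrast. Diffuse reflection approximation:
# reflected luminance (cd/m^2) = illuminance (lux) * reflectance / pi.

def effective_contrast(white_nits, black_nits, lux, reflectance):
    reflected = lux * reflectance / math.pi
    return (white_nits + reflected) / (black_nits + reflected)

# Pitch-black room: the CRT's near-zero black yields enormous contrast.
print(effective_contrast(100, 0.01, lux=0, reflectance=0.05))    # ~10000:1
# Lit room (~200 lux): reflected light swamps the black level.
print(effective_contrast(100, 0.01, lux=200, reflectance=0.05))  # ~32:1
```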

What baffles me the most is how people switched from clear-glass CRTs to extremely grainy matte IPS and didn't notice.
 
What baffles me the most is how people switched from clear-glass CRTs to extremely grainy matte IPS and didn't notice.

I don't get this either. People whine about glossy these days when before all we had were glass CRTs, but despite their lack of brightness I don't remember anyone calling for matte coatings then.
 
I don't get this either. People whine about glossy these days when before all we had were glass CRTs, but despite their lack of brightness I don't remember anyone calling for matte coatings then.

Most CRTs had an anti-glare coating; it just wasn't particularly obvious.
 
Plasma is better than 60 Hz LCD blur, and of course plasma has great black levels, but there is much better motion available now for PC gaming. Plasma being limited to 60 Hz input makes it an inferior choice for modern PC gaming on motion tracking and smoothness alone, imo. Anything compared to zero blur is a poor prescription for eyesight/FoV-moving viewports, in my opinion, too. Plasmas' screen sizes and ppi, among other possible concerns/tradeoffs, also make them a poor choice.

I see how big of an impact proper black levels make every time I open the Nvidia control panel and switch on the Plasma.

The zero blur thing sounds nice but you're still dealing with inferior black levels.

To each their own.
 
Yes, everything is tradeoffs. Once you see 120 Hz or higher motion infused with more recent action frame slices (instead of faking it, in some cases perverted by interpolated frames in a system that introduces input lag besides), it's hard to go back.

Combine this with blur reduction of 40% (120-133 Hz IPS), 50% (120 Hz TN), 60% (144 Hz TN), 85% (120 Hz TN, LightBoost active at 100%), or 92% (120 Hz TN, LightBoost active at 10%; for practical purposes, zero blur/smear of object/texture detail, no afterimage shadow).

Greatly increased, real/new/more-current action data slices increase the accuracy of motion and feel smoother (more dots per dotted-line length, strobing/opening many more apertures into what is currently happening in the game world per second). Achieving what is essentially zero blur on top of that increases accuracy further, and both improve the game world aesthetically and focally (avoiding even the softening blur effect across the entire viewport that the lesser blur reductions result in).

It's too bad you can't have it all on one monitor, but that has been the complaint for years now. This thread focuses mostly on the mediocre black levels of both IPS and TN (less obnoxious on glossy imo, but I would never choose it over watching an HD movie on my VA TV or a plasma). At least you can get a 2560x1440 60 Hz 27" IPS for $350 or less now, for gorgeous color, uniformity, ppi/rez and real estate... and you have 50-60% blur reduction, or essentially "zero blur" options, plus greatly increased motion tracking and smoothness on TN gaming LCDs for under $300 (GPU/supporting-system considerations aside). The landscape has been much worse imo.

1440p is the way to go for graphics, programming, and desktop use (and slower action gaming).
However, LightBoost is the way to go if you primarily do faster-action video gaming on your monitor. More competitive advantage!
Also, the S27A950D has an undocumented strobe backlight mode (though Samsungs are not good for input lag for this):
Samsung Zero Motion Blur HOWTO -- Works on 700D / 750D / 950D

PixPerAn Tests on LightBoost monitors (I own both BENQ XL2411T and ASUS VG278H)

baseline - 60 Hz mode (16.7ms frame samples)
50% less motion blur than 60 Hz (2x clearer motion) - TN 120 Hz mode (8.33ms frame samples)
60% less motion blur than 60 Hz (2.4x clearer motion) - TN 144 Hz mode (6.94ms frame samples)
85% less motion blur than 60 Hz (7x clearer motion) - TN 120 Hz mode with LightBoost set at 100% (2.4ms frame strobe flashes)
92% less motion blur than 60 Hz (12x clearer motion) - TN 120 Hz mode with LightBoost set at 10% (1.4ms frame strobe flashes)
Versus:
40% less motion blur than 60 Hz (1.7x clearer motion) - IPS overclocked to 120Hz (8.33ms + excess pixel persistence) -- Test done by Vega

This really clearly shows that not all 120 Hz is made the same: 7x clearer motion than 60 Hz on a LightBoost-enabled 120 Hz TN, versus only 1.7x on an overclocked 120 Hz IPS. Stroboscopically shortening the individual refreshes, without increasing the refresh rate, makes a massive difference in motion-blur elimination for people who are sensitive to motion blur and want the "CRT silky smooth" effect. As long as your eyes are comfortable with a CRT above 85 Hz. (The arithmetic behind these figures is sketched below.)
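A quick check of the arithmetic, assuming perceived motion blur scales linearly with how long each frame stays lit (the sample-and-hold time, or the strobe length with LightBoost); figures taken from the list above:

```python
# Blur reduction vs. 60 Hz = 1 - (lit time / 16.7 ms); clarity = 16.7 / lit time.

BASELINE_MS = 16.7  # one full frame at 60 Hz sample-and-hold

modes = {
    "120 Hz TN (8.33 ms sample)":       8.33,
    "144 Hz TN (6.94 ms sample)":       6.94,
    "120 Hz LightBoost 100% (2.4 ms)":  2.4,
    "120 Hz LightBoost 10% (1.4 ms)":   1.4,
}

for name, lit_ms in modes.items():
    reduction = (1 - lit_ms / BASELINE_MS) * 100
    clearer = BASELINE_MS / lit_ms
    print(f"{name}: {reduction:.0f}% less blur, {clearer:.1f}x clearer")
# -> 50%/2.0x, 58%/2.4x, 86%/7.0x, 92%/11.9x, matching the list above
```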
 
Just want to let everyone know, even though I posted the link to the zero-blur LightBoost thread, that most of the hard data is from Mark Rejhon of http://www.blurbusters.com/, who discovered the zero-blur capability. Vega also did a lot of follow-up work, helping with testing and ironing out wrinkles/figuring things out in regard to LightBoost zero blur, along with others.
I infuse my posts with a lot of insights, realizations, creative analogies and conceptualizations from my own experiences with and learning about different monitors and effects. I also quote figures and benchmarks from various sites and from reputable posts/threads here on hardforum. In regard to hard data on the LightBoost 2 tech itself, it's mainly blurbusters.com and Mark's brain dump of it into the LightBoost thread here on hardforum.
 
VA uses a different class of LC than TN & IPS: one with negative dielectric anisotropy. Applied in a modern vertical-alignment cell, it produces excellent contrast.

IPS doesn't have this benefit.
 
I have an SM215TW, 6 years and counting; the Lagom black test is a walk in the park. I am pretty happy with it. I think it is S-PVA, and I play FPS games too. I do not know if it has PWM (I do not feel anything), but since it is not LED-backlit maybe it is not such a problem.
I want to replace it with a bigger display with a higher resolution, but reading all the negatives about current monitors is making it virtually impossible, regardless of price.
 