Are any current VA panels capable of sub-0.05/0.03 nit black levels?

tybert7

The reason I ask is that VA panels seem like our best candidate for HDR computer monitors, given their better black levels compared to TN and IPS. But if the only way to get that low is local dimming (which I've never seen in a computer monitor), then the only way to achieve HDR on a standard desktop monitor will be that crazy expensive OLED display (not counting notebook OLEDs).

http://www.forbes.com/sites/johnarc...d-you-can-just-about-understand/#1d23b8874dcb



The relevant section is the following:

For High Dynamic Range playback a TV must support the SMPTE ST2084 EOTF (electrical optical transfer function – the way a screen turns digital data into visible light) and achieve one of two combinations of peak brightness and black level depth. Either more than 1000 nits peak brightness and less than 0.05 nits black level, or more than 540 nits peak brightness and less than 0.0005 nits black level.




The second standard is tailored for OLED, which I expect to be out of reach price-wise for most people this year. An HDR VA panel could be a lot more reasonable if they can actually get below 0.05 nits... so can they? Or are we doomed to wait until OLED monitors come down to earth, or until some other savior comes along?
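For reference, here is a minimal sketch (my own illustration, not from the article) of those two tiers, including the native contrast ratio each one implies if a panel has to hit its peak white and its black floor at the same time:

Code:
# The two HDR tiers quoted from the Forbes article.
TIERS = [
    {"name": "LCD-oriented", "min_peak_nits": 1000, "max_black_nits": 0.05},
    {"name": "OLED-oriented", "min_peak_nits": 540, "max_black_nits": 0.0005},
]

def meets_hdr_tier(peak_nits, black_nits):
    """Return the names of the tiers a display's measurements satisfy."""
    return [
        t["name"]
        for t in TIERS
        if peak_nits >= t["min_peak_nits"] and black_nits <= t["max_black_nits"]
    ]

for t in TIERS:
    # Contrast a panel needs to hit both limits simultaneously (no local dimming).
    print(t["name"], "implies at least", t["min_peak_nits"] / t["max_black_nits"], ": 1")
    # -> 20,000:1 for the first tier, 1,080,000:1 for the second.

# Example: a typical VA monitor without local dimming fails both tiers.
print(meets_hdr_tier(peak_nits=350, black_nits=0.09))  # -> []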
 
I'm currently using a BenQ EW2750ZL. I decided to see if it, like the FG2421 above, could reach those levels. In short, it can:

https://pcmonitors.info/reviews/benq-ew2750zl/

At 20% brightness it hits 0.02 nits and 4,000:1 contrast ratio (80 nits for white). At 40% brightness, it's 0.04 nits, 3,425:1 contrast (137 nits for white).

Hitting 0.03 nits for VA isn't the problem. The problem is hitting 0.03 nits for black while simultaneously hitting 1,000 nits for white. That cannot be achieved in current monitors without local dimming, which, as you mentioned, isn't used in PC monitors. It is likely only going to happen with OLED and/or quantum dot. Philips made a big deal last summer, and again in October, about launching the first QD monitor, but it never happened. If it did, they're REALLY hiding its availability.
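To put rough numbers on the local dimming point (a sketch of my own, assuming blacks and whites scale together with the backlight on a panel without local dimming):

Code:
# Without local dimming, the black level scales with the backlight, so the
# native contrast ratio fixes the black you get at any peak white.
def black_at_white(native_contrast, white_nits):
    """Black level implied by a given native contrast at a given peak white."""
    return white_nits / native_contrast

# BenQ EW2750ZL-style numbers from the pcmonitors.info review linked above.
print(black_at_white(4000, 80))    # 0.02 nits at 80 nits white
print(black_at_white(3425, 137))   # 0.04 nits at 137 nits white

# The same ~4,000:1 panel pushed to the 1,000-nit tier:
print(black_at_white(4000, 1000))  # 0.25 nits, five times over the 0.05-nit limit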
 
About 6,300:1 native contrast is where the best VA panels have capped out so far, which is way off the 20,000:1 (1,000 / 0.05) that the first standard effectively requires.
Better ratios won't be achievable without full-array local dimming, and that is always prone to blooming. Unless the pixels are self-illuminating (i.e. effectively millions of local dimming zones), it can never be perfect.
Until there is an OLED that can get extremely bright, or a completely different display technology, the HDR standard will only ever be approximately reached.
 
VA contrast ratio is pretty much fake, not to mention the gamma shift looks pretty bad. Black at 0.05 nits with 100 nits white means black at 0.5 nits with white at 1,000 nits.

HDR is not suitable for current technology, especially for LCDs that cannot hit proper blacks even at a mere 100 nits white. So what are we even talking about here? OLEDs, on the other hand, lack longevity: at 500, and especially 1,000 nits, they will burn in unusably fast.

And for real-life viewing, 100 nits is plenty in a dark room and 250 nits in a well-lit room. HDR showcase displays use wide gamut and other technologies, while the non-HDR comparison sets use Rec. 709 and deliberately badly mastered material, to confuse people about what HDR actually does. Same old, same old...

Totally not excited.
 

There is nothing fake about VA panel contrast ratio. Blacks stay very black even at high brightness, and using a gamma of 2.3-2.4 with the BT.1886 curve can make things truly pop on the screen.

But I'll give you that the gamma shift can be bad, especially on monitor-grade screens, I think. The FG2421 looks stunning with its near-4,000:1 contrast ratio at 120-nit whites, as long as you keep your eyes on the center region. Compare that to my 32" TV viewed at roughly the same distance where, I shit you not, you really have to look for it in a checkerboard test image to see the shift. But it only has a 2,000:1 contrast ratio and shallow blacks (compared to the Eizo): just a crappy 0.07 nits.

TS: no, VA maxes out in the 0.03-0.05 region without local dimming, and proper local dimming with a full LED backlight is more costly and brings some issues of its own, mainly haloing of bright objects on a dark background.
 
We do not need 500 or 1,000 nits for PC monitors. Nor would we be able to distinguish the 0.02 nits that current-generation VAs reach at the relevant ~100-nit range from the spec requirement of 0.0005 nits black.
 
Nor would we be able to distinguish the 0.02 nits that current-generation VAs reach at the relevant ~100-nit range from the spec requirement of 0.0005 nits black.

We very much would! 0.02 nits is pretty bad and far from black.
 
Once the human eye is fully adapted to the dark, it can detect luminance down to about 0.000001 nits.
On the other hand, when you are blinded by a 1,000-nit HDR sun, your pupils constrict, and then even 0.1-nit blacks will look black.
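For a sense of scale, here is a small sketch (my own framing, not a claim from anyone in this thread) that compares those luminance levels in stops, i.e. doublings of luminance:

Code:
import math

def stops_between(a_nits, b_nits):
    """Difference between two luminance levels in stops (doublings)."""
    return math.log2(max(a_nits, b_nits) / min(a_nits, b_nits))

print(stops_between(0.02, 0.0005))   # ~5.3 stops between a VA black and the 0.0005-nit tier
print(stops_between(100, 0.02))      # ~12.3 stops of on-screen contrast at 100-nit white
print(stops_between(0.1, 0.000001))  # ~16.6 stops between "looks black next to a bright
                                     # highlight" and the fully dark-adapted threshold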
 
FALD is assumed here, guys. No panel is going to survive for long with 1,000 nits coming out of the backlight when the image is supposed to be fully dark. Gamma shift can be ameliorated by curving the display: whereas blacks used to be crushed in the center to avoid the appearance of a washed-out screen, with a sufficient curve gamma can be preserved across the entire screen while still keeping the center at a level capable of showing fine detail.

The problem is whether manufacturers are going to build these displays for the consumer desktop market. The answer to that one isn't as promising, i.e. no, they won't.
 
This is why I like gaming on my Plasma

0.001 cd/m2 :D

ANSI Contrast Ratio is over 20,000:1.
 
The problem is whether manufacturers are going to build these displays for the consumer desktop market. The answer to that one isn't as promising, i.e. no, they won't.

LCD is a shitty technology that should just die already. No matter how much they tweak it, it will always be crap. Here's hoping that OLED advances enough, and comes down enough in price, to finally give us a worthy CRT replacement.
 
What a depressing thread I've created. It seems our only hope for an HDR monitor lies in televisions, and in the hope that someone makes a 40-43" 4K HDR panel with 120 Hz VRR, a FreeSync range of 40-120 Hz, and DisplayPort 1.3... with low input lag...

There is at least a chance that a 4K TV will have local dimming, but this all seems so far off aside from OLED, and that is way too expensive at smaller display sizes.
 
What a depressing thread I've created. It seems our only hope for an HDR monitor lies in televisions,

Take everything said in this thread with a grain of salt. We're supposedly getting HDR-capable monitors sometime by the end of calendar year 2016.

http://www.pcworld.com/article/3012...-support-for-hdmi-mobile-displayport-hdr.html

AMD claims 1080p with HDR will look better than a 4K image with SDR. The missing component is HDR monitors, but the company said it expects HDR monitors for the masses to be available by the end of next year.

That quote was from late 2015.
 
Why in the world would anyone want to be blinded by 500-1000 nits from their monitor? :confused:


AMD claims 1080p with HDR will look better than a 4K image with SDR.

AMD tends to claim a lot of things. On a smaller screen from further away, maybe.
 
Why in the world would anyone want to be blinded by 500-1000 nits from their monitor? :confused:

AMD tends to claim a lot of things. On a smaller screen from further away, maybe.
500 nits is not that bright; a sunny day outside is brighter. But for a monitor that is obviously too much, because every pure white would blare at that brightness into your eyes. HDR is a different thing from that kind of static brightness: objects that should be bright are bright (like a welder on TV) without affecting the brightness of the other pure whites the TV shows, like the clean white shirt of a news reader.

Think of HDR as dynamic contrast that actually bloody works: it is driven entirely by the content delivered to the screen, not by a bad, slow-reacting guess made by your TV, which doesn't even know the difference between the welder's welding gun and the news reader's white shirt. It also plays a big role in how vivid and lifelike the colors are, because we finally move on from the 8-bit color range.
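The absolute-luminance encoding that makes this possible is the SMPTE ST 2084 (PQ) EOTF mentioned in the Forbes quote at the top. A minimal sketch of it (published constants, my own quick implementation) maps a code value straight to nits, so a roughly 1,000-nit welder and a ~100-nit white shirt are authored as distinct luminance targets rather than as fractions of whatever the display happens to output:

Code:
# Minimal sketch of the SMPTE ST 2084 / PQ EOTF: normalized signal (0..1) -> nits.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal):
    """ST 2084 EOTF: normalized code value in [0, 1] -> luminance in cd/m^2 (nits)."""
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

print(round(pq_eotf(0.508), 1))  # ~100 nits: the news reader's shirt
print(round(pq_eotf(0.752), 1))  # ~1,000 nits: the welder
print(round(pq_eotf(1.0)))       # 10,000 nits: the format's absolute ceiling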
 