New VESA Fully Open DisplayHDR Standards

FrgMstr (Staff member) -- Joined May 18, 1997
VESA has stepped in to define new standards to help speed the adoption of High Dynamic Range technology in computer displays. The DisplayHDR specifications are fully open and include a fully transparent testing methodology. At least now maybe we can all agree on what actually is HDR and, probably more importantly, what is not. There is no doubt plenty of confusion about what should be marketed as HDR.


The new VESA High-Performance Monitor and Display Compliance Test Specification (DisplayHDR) initially addresses the needs of laptop displays and PC desktop monitors that use liquid crystal display (LCD) panels. The first release of the specification, DisplayHDR version 1.0, establishes three distinct levels of HDR system performance to facilitate adoption of HDR throughout the PC market. HDR provides better contrast and color accuracy as well as more vibrant colors compared to Standard Dynamic Range (SDR) displays, and is gaining interest for a wide range of applications, including movie viewing, gaming, and creation of photo and video content.
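The three performance levels aren't named in the excerpt above; per VESA's announcement they are DisplayHDR 400, 600, and 1000, each named after its minimum peak luminance in cd/m² (nits). A minimal sketch of tier qualification on that single criterion -- the real Compliance Test Specification also checks black level, gamut, local dimming, and bit depth:

```python
# Sketch: classify a panel by the one DisplayHDR criterion encoded in
# the tier names -- minimum peak luminance in cd/m2 (nits). The actual
# spec tests far more than this single number.
TIERS = [("DisplayHDR 1000", 1000), ("DisplayHDR 600", 600), ("DisplayHDR 400", 400)]

def displayhdr_tier(peak_nits: float) -> str:
    """Return the highest tier whose minimum peak luminance is met."""
    for name, min_nits in TIERS:
        if peak_nits >= min_nits:
            return name
    return "not DisplayHDR certified"

print(displayhdr_tier(650))   # a 650-nit panel clears the 600 tier
```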
 
We need HDR just like we needed 3D TVs and monitors. There have been wide-gamut monitors for a long time, but sRGB still reigns supreme. Now everybody wants HDR -- how about just using RGB Full vs. RGB Limited as a first step for TV and movies? Your typical consumer monitors and TVs have so many production-quality defects in the basics -- dead pixels, color banding, clouding, backlight bleed, poor contrast -- that we really don't need to add to the issue with things like HDR. I doubt most people would even notice a difference between a properly calibrated high-quality TV and an HDR TV.

Almost all TVs sold now are 4K, yet cable companies like Comcast are delivering everything in heavily compressed 1280x720p. The ATSC 3.0 standard will ultimately decide what happens with HDR, but I suspect most broadcasters won't do much at all with 4K video since they aren't doing much with even 1920x1080. It's all about how much they can compress so that they can carry more -- it's not about the quality but the quantity.
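For anyone unclear on the "RGB Full vs RGB Limited" point above: limited-range ("studio swing") 8-bit video reserves 16 for black and 235 for white, while full range uses 0-255. A quick sketch of the expansion, assuming 8-bit values:

```python
def limited_to_full(v: int) -> int:
    """Expand an 8-bit limited-range (16-235) video value to full
    range (0-255). Values outside 16-235 are clipped first."""
    v = min(max(v, 16), 235)
    return round((v - 16) * 255 / 219)

# Limited-range black (16) and white (235) map to full-range 0 and 255.
print(limited_to_full(16), limited_to_full(235))   # 0 255
```

The mismatch people actually see is when a display interprets limited-range input as full range (washed-out blacks) or vice versa (crushed shadows).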
 
how about just using RGB full vs RGB Limited as a first step for TV and Movies

TVs don't use RGB; that's problem #1.

Your typical consumer monitors and TVs have so many production-quality defects in the basics -- dead pixels, color banding, clouding, backlight bleed, poor contrast -- that we really don't need to add to the issue with things like HDR.

I've been using TVs as my main PC monitor for ages now; I'd argue modern TVs are higher quality than most PC displays at this point. And yes, mine has HDR.

I doubt most people would even notice a difference in a properly calibrated high quality TV and an HDR TV.

In HDR content, the difference is noticeable.

Almost all TV's sold now are 4K but yet the cable companies like Comcast are displaying everything in very compressed 1280x720p The ATSC 3.0 standard will ultimately decide what happens with HDR but I suspect that most broadcasters won't do much at all with 4K video since they aren't doing much at all with even 1920x1080. It's all about how much they can compress things so that they can display more -- it's not about the quality but the quantity.

If the cable companies had their way, we'd still be at 480i. Remember, the HD specification was completed back in the '80s, and it took 20 years for it to actually catch on.
 
TVs don't use RGB; that's problem #1.

I've been using TVs as my main PC monitor for ages now; I'd argue modern TVs are higher quality than most PC displays at this point. And yes, mine has HDR.

In HDR content, the difference is noticeable.

If the cable companies had their way, we'd still be at 480i. Remember, the HD specification was completed back in the '80s, and it took 20 years for it to actually catch on.

What do you mean, TVs don't use RGB? It's the content that does not use RGB; TVs convert all incoming signals to RGB. You only have three coloured subpixel sets on the screen. Guess what colours they are?

But yes, computer monitors are all shit these days, almost without exception. I have no idea if you can even buy a quality small-screen (28" to 40") 4K HDR (non-OLED) TV, though, as that would be better than virtually any monitor on sale now.

But seriously, these VESA standards are utter shit and look like specs from three years ago -- obviously nothing but a ruse to keep desktop monitors cheap and easy to produce (regardless of the price the consumer ends up paying). Desktop monitors should be at least 10-bit by now, with much better contrast ratios, but the reality is we have shit 6-bit panels and a handful of 8-bit panels, all with terrible 1000:1 contrast ratios.
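The 6-bit banding complaint can be made concrete. With a simple gamma curve, the relative luminance jump between adjacent gray codes at mid-gray is several percent on a 6-bit panel, which tends to be visible as banding. A rough sketch -- the 2.2 gamma and the ~1% visibility threshold are ballpark assumptions, not spec values:

```python
# Rough sketch: relative luminance step between adjacent gray codes at
# mid-gray, assuming a plain 2.2 gamma curve. Steps well above ~1%
# tend to show as visible banding (both figures are ballpark).
def midgray_step_percent(bits: int, gamma: float = 2.2) -> float:
    n = 2 ** bits                          # number of code values
    v = n // 2                             # mid-gray code
    lum = lambda c: (c / (n - 1)) ** gamma
    return 100 * (lum(v + 1) - lum(v)) / lum(v)

for bits in (6, 8, 10):
    print(f"{bits}-bit: ~{midgray_step_percent(bits):.1f}% per step")
```

On these assumptions a 6-bit panel steps about 7% per code at mid-gray, 8-bit about 1.7%, and 10-bit under 0.5% -- which is roughly why 10-bit matters for HDR gradients.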

And regarding HDR, anyone who says HDR is crap either has no idea what HDR is, or is blind. Dynamic HDR is the future, but I think it will be five years or more before we get a monitor good enough to display it properly, if ever. Dolby Labs masters HDR content for display at 4000 nits peak brightness, so it will be a few years before any consumer-level devices can match that reference and show HDR to its full effect. But some of the latest LCDs are nearly at 2000 nits and look stunning when displaying HDR content.
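The 4000-nit (and 10000-nit) figures come from the PQ curve (SMPTE ST 2084) that HDR10 and Dolby Vision encode with, which maps a normalized signal value to absolute luminance. A sketch of the published EOTF:

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal E' in [0, 1] ->
# absolute luminance in cd/m2 (nits). Constants are from the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    p = signal ** (1 / M2)
    num = max(p - C1, 0.0)
    den = C2 - C3 * p
    return 10000 * (num / den) ** (1 / M1)

print(pq_eotf(1.0))   # full-scale PQ is 10000 nits
print(pq_eotf(0.5))   # half-scale is only ~92 nits -- the curve is
                      # steep at the top, so extra peak nits go almost
                      # entirely into the brightest highlights
```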
 
More accurate colors, when no HDR TV hits anywhere near the Rec. 2020 targets? You know what would improve contrast? Better display technology. Current HDR sets still have the same basic problems as SDR displays -- nothing but Band-Aids over bigger problems. An SDR display with perfect contrast would easily beat out current HDR TVs with current LED display problems. SDR TVs also already hit pretty much 100% of the Rec. 709/sRGB gamut.

And yes, TVs work in RGB. YCbCr video is converted internally to RGB, and they will accept RGB input.
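That internal conversion, for the usual 8-bit limited-range BT.709 HD signal, is a fixed matrix plus offsets. A sketch using coefficients derived from the published BT.709 luma weights (rounded to four places here):

```python
def ycbcr709_to_rgb(y: int, cb: int, cr: int) -> tuple:
    """8-bit limited-range BT.709 YCbCr -> full-range 8-bit RGB."""
    yd, cbd, crd = y - 16, cb - 128, cr - 128
    r = 1.1644 * yd + 1.7927 * crd
    g = 1.1644 * yd - 0.2133 * cbd - 0.5329 * crd
    b = 1.1644 * yd + 2.1124 * cbd
    clip = lambda x: min(max(round(x), 0), 255)
    return clip(r), clip(g), clip(b)

# Video black (16, 128, 128) and video white (235, 128, 128):
print(ycbcr709_to_rgb(16, 128, 128))    # (0, 0, 0)
print(ycbcr709_to_rgb(235, 128, 128))   # (255, 255, 255)
```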
 