Can HDR monitors hurt our eyes?

HDR monitors will come out in 2016. Are you waiting for them?
 
It won't be until around 2018 before we see HDR-capable monitors for a consumer-level price. Most of the first ones will be professional grade screens and probably cost a small fortune.
 
Actually, I see phones and cameras being the first applications, then gaming monitors and graphic-design monitors.
 
Most current monitors don't even display full RGB, so I don't see this tech going anywhere. I honestly don't think anyone cares about how many colours their screen displays, partly because the way human eyes see colour adjusts for deficiencies and flaws anyway, so it's less obvious, particularly if you don't have both screens next to each other.

I can't see high-contrast displays (which is realistically what "HDR" is) being any different. The marketing buzz-wording is getting pretty bad already; that's how you know they're desperate.
 
I have no interest in HDR on an edge lit display. Until OLED monitors arrive with it, it's pretty much a gimmick in my opinion. There's no way to produce accurate HDR on anything else since you don't have precise control over each pixel with LCD. Unless they start using full array backlights with a lot of zones like some high end TVs.
 
I have no interest in HDR on an edge lit display. Until OLED monitors arrive with it, it's pretty much a gimmick in my opinion. There's no way to produce accurate HDR on anything else since you don't have precise control over each pixel with LCD. Unless they start using full array backlights with a lot of zones like some high end TVs.

Like I said, this will come to tablets, phones, and DSLR cameras first. Then to enthusiasts (I mean gamers, or people with too much money), likely alongside industry use, and finally to the TV and everyone-else market.
 
I've got a CRT, so I already have an "HDR" monitor. It definitely doesn't "hurt eyes", unless you use it below 75Hz, and that's because of flicker, not HDR.
It seems a bit pointless to try to make an HDR LCD, as the color and contrast inherently suck.
 
I've got a CRT, so I already have an "HDR" monitor. It definitely doesn't "hurt eyes", unless you use it below 75Hz, and that's because of flicker, not HDR.
It seems a bit pointless to try to make an HDR LCD, as the color and contrast inherently suck.

Local dimming zones should help. Vizio apparently demoed a 384-zone local dimming television.
 
I've got a CRT, so I already have an "HDR" monitor. It definitely doesn't "hurt eyes", unless you use it below 75Hz, and that's because of flicker, not HDR.
It seems a bit pointless to try to make an HDR LCD, as the color and contrast inherently suck.
HDR is about high brightness and the way that brightness information is encoded, not contrast ratio.
Your CRT will be pushing <200 nits, probably <100 nits. HDR displays are going to have peaks of 800+ nits.
 
On topic: nope. The things people come up with, sigh... is the OP some kind of troll?

To many posters: you clearly don't know what HDR video is. CRT? LOL!

I bought my HDR TV in April; the panel is ~1000 nits, 10/12(?)-bit, 120 Hz. The current interface controller only has HDMI 2.0a for now, and even that only came with a later firmware update.

Unfortunately the interface standards aren't even out there yet: no GPUs with HDMI 2.0a or DisplayPort 1.3, and even the hardware players aren't out yet.
 
The nits are what I'm worried about, especially at night, when we're close to the monitor.
HDR is fundamentally different from SDR. Just because there can be peaks in excess of 1000 nits, does not mean that it will look anything like an SDR display running at 1000 nits.
 
HDR is fundamentally different from SDR. Just because there can be peaks in excess of 1000 nits, does not mean that it will look anything like an SDR display running at 1000 nits.

Think about what you just said...
 
You have to understand that HDR content isn't just scaled-up SDR content. A lot of material won't look dramatically different.

Watch 16:30 to 18:15 in the video I linked to.

I get how the tech works, though I feel a couple of issues will ground this...

-Say everyone had a monitor that could hit 1K nits. Then I think content would be tailored to only use the high end when needed, and people should be able to set that max wherever they want (see the sketch below). As it is now, when a scene switches to an all-white, max-brightness display after a dark scene, that can be a pain in the ass for night viewing.

-For any outdoor scene, it could be downright bothersome if the whole scene used the upper end of 1K nits, like it probably should. A lot of people don't run their display at max brightness as it is, due to eye strain.
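
To make the "set that max to where they want it" idea concrete, here's a rough Python sketch of a user-set brightness ceiling. The knee and roll-off are purely illustrative, not any vendor's actual tone-mapping curve:

Code:
# Hypothetical user-set brightness ceiling for HDR content:
# everything below a knee point passes through untouched, and only
# highlights above it get compressed into the remaining headroom.
# (A real display would use a proper tone-mapping curve; this is a sketch.)

def apply_user_ceiling(scene_nits, user_max_nits, knee=0.8):
    knee_point = user_max_nits * knee
    if scene_nits <= knee_point:
        return scene_nits  # dark and midtone content is left as mastered
    excess = scene_nits - knee_point
    headroom = user_max_nits - knee_point
    # Soft roll-off that approaches, but never exceeds, the user's max.
    return knee_point + headroom * (excess / (excess + headroom))

# Someone who caps their panel at 600 nits for night viewing still sees
# content below 480 nits exactly as mastered; only highlights shrink.
for nits in (200, 480, 1000, 4000):
    print(nits, "->", round(apply_user_ceiling(nits, 600)))

The output never exceeds the chosen 600 nits no matter how bright the mastered highlight is, which is the whole point of a night-viewing cap.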
 
Good video right there. I'd probably suggest starting it at 14:30 or so. Anyway, I'm eager for HDR myself. And if SpectraCal isn't worried about brightness, then neither am I.
 
-For any outdoor scene, it could be downright bothersome if the whole scene used the upper end of 1K nits, like it probably should. A lot of people don't run their display at max brightness as it is, due to eye strain.

Why do you think it "probably should"? The goal of content mastering in video is not to recreate the scene luminance on the target display.
 
Think about what you just said...
SDR is mastered for display at 100 nits.
It only looks correct on a display where the peak brightness is calibrated to 100 nits.
If you set an SDR display to 1000 nits, everything will be displayed 10x brighter than it is supposed to be.

HDR codes brightness differently from SDR.
An HDR source can be mastered to any pre-determined peak brightness level. This is included in the HDR video's metadata.

If you have two HDR displays, one which is capable of a peak brightness of 400 nits (which is very low for HDR) and one capable of 1200 nits, for any scene coded to ≤400 nits the image should look exactly the same on both.
Unlike SDR, the 1200 nit display won't just display everything 3x brighter.

If a scene were coded to include peaks of 1200 nits - and you're not going to get full white screens at 1200 nits, only highlights within a scene - then the 1200 nit display will show things accurately, while the 400 nits display will have to use highlight compression to display the image.
A 400 nit HDR display should still look a lot more vibrant than SDR, but not nearly as vibrant as the 1200 nit HDR display.

If you had a display which could theoretically do 10,000 nits - and the specification can handle that - it would be the same there.
Most content is not going to be displayed 100x brighter, as it would be with an SDR source.
I don't know if 10,000 nit HDR displays will ever actually happen, but if they do, I imagine they would be at least 80" in size.
10,000 nits would have a high enough brightness that the television starts to look like a window, instead of a display.

And you will still be able to limit your display's peak brightness.
Just because you have a 1200 nit capable HDR display does not mean that you are required to run it at 1200 nits.
But you probably won't be dropping it down to 100 nits either.
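
If it helps, here's a minimal Python sketch of the difference I'm describing. The function names are made up, and hard clipping stands in for the highlight compression a real display would apply:

Code:
# SDR is relative: the same signal scales with whatever peak brightness
# the display is set to (roughly gamma 2.4 per BT.1886).
def sdr_output_nits(signal, display_peak_nits):
    return display_peak_nits * (signal ** 2.4)

# HDR is absolute: the coded value *is* a luminance in nits. The display
# reproduces it exactly up to its own peak, then compresses/clips.
def hdr_output_nits(coded_nits, display_peak_nits):
    return min(coded_nits, display_peak_nits)

# SDR full white: 10x brighter on a 1000 nit panel than on a 100 nit one.
print(sdr_output_nits(1.0, 100), sdr_output_nits(1.0, 1000))  # 100.0 1000.0

# HDR: a 300 nit highlight is identical on a 400 nit and a 1200 nit panel;
# only a 1200 nit highlight separates them.
for peak in (400, 1200):
    print(hdr_output_nits(300, peak), hdr_output_nits(1200, peak))
# -> 300 400
# -> 300 1200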
 
I could just raise the beam current on my CRT a ton and make it really bright. Sending an HDR signal over VGA would be the hard part.

Probably wouldn't want to run the tube that high permanently, but it would be interesting to try.
 
That still wouldn't be HDR though.

In effect you'd just be cranking the brightness on your tube massively. HDR will only do super-bright light where you need it, and will allow for better color saturation instead of a washed-out look when displaying high-intensity content.

On your CRT, everything would just look washed out. It'd also burn images into the phosphors fairly quickly. So even if you're half serious, don't do it.
 
I have an LED TV screen from a few years back that has over 1000 local dimming zones. It was the only model to feature such a thing and was discontinued quickly after.

It is a nice screen and the picture quality rocks for movies. However, that said, it would be horrible as a monitor, mainly because you can still tell there are zones, especially with a black background and white text. The text has an aura, for lack of a better word, that glows around it.

So what should be sharp text and contrast looks more like neon white light with an aura. I do have my HTPC connected to it and it doesn't do so well there. I'd still prefer a normal crappy IPS over it.

We really need a new technology like OLED to really fix the issue.
 
Localized dimming issues have nothing to do with HDR though.

And HDR OLED PC monitors with a high enough nit output to be worthwhile (you'll want over 1,000 nits at a minimum; around 4,000 nits would be good from an image quality standpoint, but I don't think anything can do that right now except some super-high-end demo units), as well as somewhat affordable, are probably not coming for a long time.

The closest thing to an HDR OLED screen I know of right now only outputs around 800 nits and costs around $5K. It's a TV screen and not a PC monitor though.

You might see non-stupidly priced HDR LCD PC monitors in a year or two, and they might be worth it. Or not. We'll have to wait and see.
 
Isn't HDR just another attempt to try to make LCD tech just a little less shitty by using workarounds? They still talk about backlight zoning, etc. being used with it. Isn't OLED a much better option than sinking money into the inferior LCD tech for yet another generation? OLED doesn't use a backlight and seems superior in every way.

There are already 2560x1440 OLED laptops coming to market. We need OLED computer monitors next. Competition in OLED would be good too. For TVs, it's all LG so far. On laptops, it might be Samsung atm (I'm pretty sure the laptop I mentioned had a Samsung panel).
 
Isn't HDR just another attempt to try to make LCD tech just a little less shitty by using workarounds? They still talk about backlight zoning, etc. being used with it. Isn't OLED a much better option than sinking money into the inferior LCD tech for yet another generation? OLED doesn't use a backlight and seems superior in every way.

There are already 2560x1440 OLED laptops coming to market. We need OLED computer monitors next. Competition in OLED would be good too. For TVs, it's all LG so far. On laptops, it might be Samsung atm (I'm pretty sure the laptop I mentioned had a Samsung panel).

No, HDR is a completely different way of coding brightness.
It replaces gamma with a scheme far better suited to flat-panel displays, which respond linearly (or close to it), unlike CRTs, which naturally have a non-linear response that gamma was near enough the inverse of.

HDR benefits OLED displays just as much as it benefits LCD displays - though OLEDs are less suited to the high brightness capabilities that HDR brings.

SDR video was designed to be viewed at 100 nits brightness.
HDR video does not place any real restrictions on brightness, so content can be mastered up to 10,000 nits.

Realistically, most content will be mastered at 1,200-2,000 nits.
But that is the absolute peak brightness that highlights will reach; the brightness of the scene as a whole is meant to stay below 400 nits.

If you viewed SDR content on a 1,000 nit display, everything would be displayed 10x brighter than intended.
If you viewed that same SDR content on a 500 nit display, everything would be half the brightness of the 1,000 nit display.

If you viewed HDR content on a 1000 nit display or a 500 nit display, any scene where the peaks are lower than 500 nits should look exactly the same on both displays, due to the way that brightness is now coded. It represents an exact value, not a relative value.
For scenes which have highlights exceeding 500 nits, you will have highlight clipping on the lower brightness display (everything above 500 nits turns white) which would not be present on the higher brightness display.

HDR enables far more vivid, saturated, and realistic images than SDR ever could.
High-brightness SDR displays are not at all the same thing as HDR displays.
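
For the curious, the "completely different way of coding brightness" is the PQ curve (SMPTE ST 2084) used by HDR10 and Dolby Vision. Here's a rough Python sketch of its EOTF next to plain SDR gamma, showing why a code value means an absolute luminance rather than a fraction of the display's peak; the constants are the published ones, and the comparison values are just illustrative:

Code:
# PQ (SMPTE ST 2084) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(code):
    """PQ code value (0..1) -> absolute luminance in nits (0..10,000)."""
    p = code ** (1 / M2)
    return 10000 * (max(p - C1, 0) / (C2 - C3 * p)) ** (1 / M1)

def sdr_gamma(code, display_peak_nits=100):
    """SDR: code value -> a fraction of whatever the display's peak is."""
    return display_peak_nits * code ** 2.4

# A PQ code of 1.0 always means 10,000 nits and ~0.75 means roughly 1000
# nits, regardless of which display the signal is sent to.
for code in (0.5, 0.75, 1.0):
    print(f"code {code}: PQ {pq_eotf(code):7.0f} nits | "
          f"SDR@100 {sdr_gamma(code):5.1f} nits")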
 
I get that on the content end, and that it is welcome as a broader spectrum, so to speak. I was talking about the implementation being served up on another generation of LCD tech and touting its contrast benefits, when IMO OLED (or HDR content on OLED, as you said) is where the dev money on displays should go.
 
LCD still has a few benefits over OLED: practically no retention and burn-in, and higher brightness. It's also cheaper, for now.

I'm sure more manufacturers would try OLED if they could and it proved viable financially, but for the time being most are limited to LCD whether they like it or not.

Now that OLED PC monitors are starting to come out we'll have a chance to see if it's viable as a general purpose monitor technology. It will take a few years to see where it goes. Hopefully OLED improves to the point that it makes LCD completely obsolete. If not, next candidate: QDLED. :) Maybe also something will come out of blue phase LCD, but I've no idea if it can help with viewing angles and black performance.
 
Isn't HDR just another attempt to try to make LCD tech just a little less shitty by using workarounds?

There are people who have seen HDR tech demos who have told me that HDR is simply fucking incredible, and has to be seen to be understood. It's been described as being as revolutionary as the jump from SD to HD. This isn't just a gimmick.
 
I was talking about the implementation being served up on another generation of LCD tech and touting its contrast benefits
HDR isn't about contrast per se, though it will help there by default.

Localized increased brightness + better colors at high brightness (which is what HDR is) isn't the same as localized dimming.

There was a really good video posted on the previous page that talks about this stuff and maybe it'd help you understand better what HDR is all about.
This isn't just a gimmick.
This right here.

A good HDR display will be a significantly bigger step up in visual quality than getting a 4K resolution monitor.
 
I don't see how it could hurt anyone's eyes, as long as it's flicker-free?
It won't go as bright as the sun, so no worries there :p

The closest thing to an HDR screen was some of the gen 2 Samsung AMOLED screens, like on the Galaxy Note 2. Very saturated, with perfect black and contrast. I had that phone for some time and it was so weird going to an LG G3 after it.
 
Correct me if I'm wrong, but:

You need content mastered in HDR to use it, right?

So that means a tiny minority of current Blu-ray movies.
If you have RAW files from a camera, can they be converted to HDR?

Games will also need to be programmed to use it, so nothing that exists now will benefit?

So we will be looking at a slow increase in content over the next few years?
 