What Does HDR Mean For Games And Movies?

Megalith

Honestly, I’m still not sure what to think about HDR. A lot of the comparison images I’ve seen really don’t help, as they suggest HDR is all about over-saturated colors and blown-out highlights.

Essentially, HDR allows a display to show a wider range of colours, with brighter highlights and deeper perceived blacks, while preserving detail across the mid-tones when all three elements are on screen at once. This allows for a more 'true to life' presentation that more closely replicates what the human eye is capable of seeing. On top of that, more of the detail in the source material is preserved compared to current standards, where colour compression cuts down the number of different tones rendered on screen.
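For the brightness half of that, the HDR10 and Dolby Vision formats both use the SMPTE ST 2084 "PQ" transfer function, which maps a 10-bit code value to an absolute luminance of up to 10,000 nits. Here is a minimal sketch in Python using the constants from the published spec (the specific code values printed are just arbitrary samples):

Code:
# SMPTE ST 2084 (PQ) EOTF: 10-bit code value -> absolute luminance in nits
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(code, bits=10):
    """Map an integer code value to the luminance (cd/m^2) it calls for."""
    e = code / (2 ** bits - 1)          # normalised signal, 0..1
    p = e ** (1 / M2)
    y = max(p - C1, 0.0) / (C2 - C3 * p)
    return 10000 * y ** (1 / M1)        # the PQ curve tops out at 10,000 nits

for code in (0, 128, 256, 512, 768, 1023):
    print(f"code {code:4d} -> {pq_eotf(code):10.3f} nits")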
 
I feel like it's probably one of those things like how you couldn't really see how HD looks on an SD screen, you have to see it in person.

I have not seen it in person, and until everyone is like "OMG! IT'S THE BALLS!", or I've seen it in action and become one of those people, it joins the long list of "TV/Monitors have pretty much peaked, how do we get people to buy new ones" features I ignore.

Could be awesome, could be this year's "quantum dots".
 
Honestly, I’m still not sure what to think about HDR. A lot of the comparison images I’ve seen really don’t help, as they suggest HDR is all about over-saturated colors and blown-out highlights.

If you believe the marketing blather that's because you're attempting to view HDR content on a monitor/TV not designed for HDR output.

Done correctly HDR brings subtleties to shadows and stops whites from blowing out (i.e. you can see what's outside the windows in a scene shot inside). But A) it's hard to do correctly and I have a feeling most movie studios won't put in the time to do it correctly for most content and B) there aren't a lot of cameras out there right now that can handle shooting HDR video let alone consumer monitors and TVs to view it on. My bet is that it goes the way of 3D TV and no one is talking about it anymore in 2 years.
 
Okay, the Dolby Vision vs. Standard comparison is not more "realistic" as stated in the article, unless you are the Predator and have abnormal light sensitivity. The standard picture on the right is closer to what that place really looks like. HDR is an artistic choice, not a means of enhancing realism. I used to shoot a lot of HDR on DSLRs because I like the look and was fascinated by the technical aspects of it. There are a lot of really interesting things you can do with an HDR sequence of shots in post composition that let you selectively bring out highlights that the lighting at the time didn't allow, among other things.
 
There isn't a big push towards HDR on the encoding side. We'll actually be using H.265 on all our devices years before HDR is a thing any of us use for the first time in streaming video.
 
A) it's hard to do correctly and I have a feeling most movie studios won't put in the time to do it correctly for most content and B) there aren't a lot of cameras out there right now that can handle shooting HDR video let alone consumer monitors and TVs to view it on. My bet is that it goes the way of 3D TV and no one is talking about it anymore in 2 years.
As far as I can tell, what Dolby is calling HDR is really just 10-bit video. When setting up shots for recording you do not have to do anything special to get a 10-bit image to look good; in fact, it makes it easier. All of the digital cinema cameras can shoot in 10-bit and some of the ultra-high-end ones record 12-bit. The biggest issue I can see with this is that one of the first things compression does to an image is throw away bit depth. I would be willing to bet that it will be more than 10 years before you see 10-bit broadcast over cable, even if they go to a 4K standard, which I also doubt will happen.
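To put a rough number on the bit-depth point, here's a minimal sketch (it just quantises an idealised smooth ramp, so it ignores noise, dithering and everything a real codec does) counting how many distinct levels survive in the darkest 5% of the signal, which is where banding tends to show first:

Code:
import numpy as np

# A smooth 0..1 ramp standing in for a subtle gradient, e.g. a darkening sky
ramp = np.linspace(0.0, 1.0, 100_000)

def shadow_levels(signal, bits, shadow_cutoff=0.05):
    """Quantise to the given bit depth and count unique levels below the cutoff."""
    levels = 2 ** bits - 1
    quantised = np.round(signal * levels) / levels
    return np.unique(quantised[quantised <= shadow_cutoff]).size

for bits in (8, 10, 12):
    print(f"{bits}-bit: {shadow_levels(ramp, bits)} distinct levels in the bottom 5%")
# Each extra 2 bits gives roughly 4x the gradations, hence smoother near-black ramps.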
 
They must really be doing a bad job on the marketing front if people still don't understand what display-level HDR means.

Contrast, baby. The real world isn't lit up in gradations of 1000:1.
 
It's definitely saturation. I just got an LG OLED and a UHD Blu-ray player with Deadpool, and the HDR 4K Deadpool does look more saturated than the regular Blu-ray version.

As for games, when I plugged in my PC and tried The Witcher 3, the sky looks bluer, the grass greener, etc. The best way I can describe it is that it looks more like a cartoon compared to my dreary, dull IPS monitor. I do like the TV, though it has a dead pixel, which I can't stand, but I'm afraid to exchange it with Amazon again due to their banning of accounts (I'm already at 5 returns this year, all of them high-priced items, and this is my second OLED because the first one was defective).
 
Okay, the Dolby Vision vs. Standard comparison is not more "realistic" as stated in the article, unless you are the Predator and have abnormal light sensitivity. The standard picture on the right is closer to what that place really looks like. HDR is an artistic choice, not a means of enhancing realism. I used to shoot a lot of HDR on DSLRs because I like the look and was fascinated by the technical aspects of it. There are a lot of really interesting things you can do with an HDR sequence of shots in post composition that let you selectively bring out highlights that the lighting at the time didn't allow, among other things.

Awwww yeah. Making declarations without actually viewing the material on an HDR screen.
 
I don't care what it means, as long as it doesn't mean this bullshit

[Image: 11-hdr-photography.jpg]


That is an abomination. Luckily, that does not appear to be what we are going for; they actually mean HDR in a technically truthful way.
 
With that ice picture, the shot on the left is what is wrong with movies: totally unrealistic blue saturation. When will they learn that water is not a blue slushie? The "standard" version is the realistic one.
 
I'm dealing with HDR at work. Here's what we've observed:

If you watch a properly calibrated SDR display by itself, you'll think it looks great.
If you watch a properly calibrated HDR display by itself, you'll think it looks great.

If you place them side by side your brain will tell you the SDR set is horribly washed out.

If you see the demo without first observing them in isolation you'll think it's rigged to favor the HDR display.

HDR is cool, but it is hard to sell when your brain is fucking with you. Any demo that involves screenshots on a typical computer monitor is pointless, and even really high-fidelity "prosumer" monitors like Dell's 32" 4K or the HP DreamColor line don't really convey it.

Dolby Vision: gets you nothing over vanilla HDR10, practically speaking. Given that DV is "optional" on UHD Blu-ray, it's having a hard time gaining a foothold in the consumer space. It is cool, but the downstream toolchain is a mess. Nobody wants to bend over for Dolby just to put DV content on their service.

My advice: do not buy an HDR set in 2016. Give it another year. Shit's still being decided with regard to standards.
 
How in the fuck is this new? It isn't.

So, how long have you had a 10-bit, 4K60, 65" consumer flat panel with a wide color gamut (Rec. 2020), 1000-nit peak white, and a standard to support it?

HDR video is not that played-out nuclear-wasteland freak show photographers have been fucking with for years.
 
I think I may have reached a point where I can't keep up with this shit.

...that or I simply don't care anymore.
 
It's definitely saturation. I just got an LG OLED and a UHD Blu-ray player with Deadpool, and the HDR 4K Deadpool does look more saturated than the regular Blu-ray version.

As for games, when I plugged in my PC and tried The Witcher 3, the sky looks bluer, the grass greener, etc. The best way I can describe it is that it looks more like a cartoon compared to my dreary, dull IPS monitor. I do like the TV, though it has a dead pixel, which I can't stand, but I'm afraid to exchange it with Amazon again due to their banning of accounts (I'm already at 5 returns this year, all of them high-priced items, and this is my second OLED because the first one was defective).
Your description makes me not want to get one. I hate it when images get oversaturated and have always avoided wide gamut monitors like the plague. I'll probably have to see these things in person to make any sort of judgement though.
 
Your description makes me not want to get one. I hate it when images get oversaturated and have always avoided wide gamut monitors like the plague. I'll probably have to see these things in person to make any sort of judgement though.


Yeah, almost everything about the LG OLED looking more colorful is down to contrast, and a game on PC looking more saturated has zero to do with HDR; it's because the LG is pushing the colors unrealistically.

HDR will become great, but initially trying to game on HDR TVs is just going to mean mapping from sRGB to BT.2020. At least now that the standards are set, future games can be coded to support HDR properly, so it will look right on TVs and monitors that support HDR.

In the meantime, Netflix and Amazon already support proper HDR, and once the consoles get their updates the gaming industry will be able to target HDR displays from the start. It will work with existing displays as well as HDR ones, and it won't have to be the crappy "tone mapped" HDR anymore.
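For anyone curious what the "map from sRGB to BT.2020" step actually involves, here's a minimal sketch (matrix values as commonly published for D65 white; treat the figures as approximate). You linearise the sRGB values and pass them through XYZ into BT.2020 primaries, so the colours keep their original appearance; skipping this step and feeding sRGB values straight into a BT.2020 container is what produces the oversaturated look.

Code:
import numpy as np

# Linear RGB -> CIE XYZ matrices (D65 white), values as commonly published
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
BT2020_TO_XYZ = np.array([[0.6370, 0.1446, 0.1689],
                          [0.2627, 0.6780, 0.0593],
                          [0.0000, 0.0281, 1.0610]])

def srgb_decode(v):
    """sRGB gamma-encoded 0..1 -> linear light."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def srgb_to_bt2020_linear(rgb):
    """Re-express a linear sRGB colour in BT.2020 primaries (no gamut expansion)."""
    m = np.linalg.inv(BT2020_TO_XYZ) @ SRGB_TO_XYZ
    return m @ np.asarray(rgb, dtype=float)

# Pure sRGB green lands well inside BT.2020 -- it is NOT the container's maximum green
print(np.round(srgb_to_bt2020_linear(srgb_decode([0.0, 1.0, 0.0])), 4))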
 
Your description makes me not want to get one. I hate it when images get oversaturated and have always avoided wide gamut monitors like the plague. I'll probably have to see these things in person to make any sort of judgement though.
It probably looks more saturated because of the gamut. I have pictures where the raw image looks amazing and, when it's converted to sRGB, it loses something. I think you can minimize it by adjusting out-of-gamut colors, but I can promise you that I've had images that I thought looked magical until they were downgraded to sRGB; after that they looked OK, as long as you hadn't seen the original.

FWIW, Netflix has HDR content and I fully expect to see HDR movies on 4K Blu-ray too. As I recall, there are already some theaters that support a commercial version of HDR, so the future is coming.
 
I don't care what it means, as long as it doesn't mean this bullshit

[Image: 11-hdr-photography.jpg]


That is an abomination. Luckily, that does not appear to be what we are going for; they actually mean HDR in a technically truthful way.
That's a bad use of HDR (or, alternatively, they wanted it to look surreal). I've seen HDR photos where they're just using it to give the image roughly the same dynamic range as the human eye.
 
Another overused effect, just like lens flare, blooming, and the over-smoothing of motion on TVs.
 
Eurogamer Article said:
And beyond that, it's worth pointing out the AV specialist sites are also discovering some fundamental issues with current HDR-equipped displays.
HDTVtest said:
Contrary to popular belief, the purpose of HDR (high dynamic range) mastering is to expand the available luminance range rather than elevate the overall brightness of HDR videos.
(╯°□°)╯︵ ┻━┻

I have not waited over a decade to put up with this crap.
 
I'm in the camp of, if I need an HDR screen to see this special HDR quality, how can I see the difference by looking at a picture on a regular monitor?
 
As far as I can tell, what Dolby is calling HDR is really just 10-bit video. When setting up shots for recording you do not have to do anything special to get a 10-bit image to look good; in fact, it makes it easier. All of the digital cinema cameras can shoot in 10-bit and some of the ultra-high-end ones record 12-bit. The biggest issue I can see with this is that one of the first things compression does to an image is throw away bit depth. I would be willing to bet that it will be more than 10 years before you see 10-bit broadcast over cable, even if they go to a 4K standard, which I also doubt will happen.

Most digital cameras out there (even the nice big expensive ones the studios use) don't have the dynamic range needed to record true HDR video. It's one of the big things that's been driving camera development for the last few years now. Everyone wants to be the first to hit the kind of dynamic range that the best film stocks could deliver (given proper lighting). And if all Dolby is trying to offer is a deeper color palette, I don't see that helping when most studio colorists are content to just make everything orange and blue because the studios don't give them time to do the job properly.
 
Overhyped marketing for a hopped up unrealistic picture.

This reminds me of the first wave of wide-gamut color monitors that didn't have proper sRGB settings, so everything was over-the-top colors. Now it is over-the-top colors and contrast.
 
Overhyped marketing for a hopped up unrealistic picture.

This reminds me of the first wave of wide-gamut color monitors that didn't have proper sRGB settings, so everything was over-the-top colors. Now it is over-the-top colors and contrast.


Wrong. An oversaturated/overhyped picture only happens when you have a mismatching gamut between the source and the display. You have probably been seeing the DCI-P3 colorspace in cinemas for some years now, and you have not seen oversaturated colors and sunburnt skin and so on, have you? HDR brings this colorspace into the home, and even that is merely a stepping stone; as display tech evolves they aim for Rec. 2020, which covers about every color the human eye can see, as far as an RGB pixel system can anyway. The sRGB colorspace is horribly limited and should die already. It cannot, for example, show true neon colors.



Listen up, people: you CANNOT see what HDR is about on an SDR monitor. I actually wish all those websites would remove those simulated "comparison" pictures, because they give people a completely wrong idea and make some shrug it off as a "gimmick". It is not; it is an evolution that finally buries the age-old sRGB gamut (and a few other things) that just keeps hopping from one standard to another and refuses to die. And gamut isn't even all that HDR is about.
 
HDR seems quite pointless in the consumer space for about 99% of the people that will be using (and can afford) HDR-capable equipment. How many of the people buying new UHD Blu-ray players and OLED TVs will actually get them professionally calibrated? The majority of the population sticks with the out-of-the-box settings or chooses "Vivid" to get that horrible overblown saturation and sharpness. If you add HDR into that mix, I can see the problem becoming even more exaggerated.

The professional space is a different beast, where everything is typically calibrated to perfection anyway. But I know the marketing machine won't stop pushing HDR as the next "big thing", since they need to justify continued TV purchases. I still haven't found a TV that can match my Panasonic 65ZT60 in picture quality anyway. OLED is making a case, but it still has too many motion-handling problems for it to be a proper replacement. I am sure this will change in the next few years as manufacturing intensifies.
 
Wrong. An oversaturated/overhyped picture only happens when you have a mismatching gamut between the source and the display. You have probably been seeing the DCI-P3 colorspace in cinemas for some years now, and you have not seen oversaturated colors and sunburnt skin and so on, have you? HDR brings this colorspace into the home, and even that is merely a stepping stone; as display tech evolves they aim for Rec. 2020, which covers about every color the human eye can see, as far as an RGB pixel system can anyway. The sRGB colorspace is horribly limited and should die already. It cannot, for example, show true neon colors.

Wide gamut and HDR are actually two different things.

Wider gamut done properly is subtle: if you put the two TVs side by side, you would only occasionally see a difference, and even then it isn't that big of a deal; without the side-by-side comparison you wouldn't really notice the "missing" color. Most of the people "wowed" by wide gamut are looking at boosted saturation from incorrect gamma.

Higher dynamic range done properly is even more subtle. All increasing the dynamic range does is reduce banding. Which is nice, but unless there is banding, it does just about nothing.

So what they do to market this "HDR" feature is all kinds of unrealistic tone-mapping enhancement to boost contrast. You could do similar effects without increased dynamic range. It's a farce.

"HDR" as is being sold and marketed, is a massively overhyped tone mapping, farce.
 
Wide gamut and HDR are actually two different things.

Wider gamut done properly is subtle: if you put the two TVs side by side, you would only occasionally see a difference, and even then it isn't that big of a deal; without the side-by-side comparison you wouldn't really notice the "missing" color. Most of the people "wowed" by wide gamut are looking at boosted saturation from incorrect gamma.

Higher dynamic range done properly is even more subtle. All increasing the dynamic range does is reduce banding. Which is nice, but unless there is banding, it does just about nothing.

So what they do to market this "HDR" feature is all kinds of unrealistic tone-mapping enhancement to boost contrast. You could do similar effects without increased dynamic range. It's a farce.

"HDR", as it is being sold and marketed, is a massively overhyped tone-mapping farce.


I know they are different things. And again, you are wrong. HDR is also about having bigger differences in brightness, more closely resembling real life than the "postcard" that SDR currently is. Right now, if you have a bright sun and a guy wearing a white shirt in the same picture, in SDR they are either both equally bright OR the guy's shirt is mapped down to grey. In HDR both can be equally white but the sun is noticeably brighter, and all this without clipping black or white details (no bloom effects, thank you), which is why either OLED or FALD is pretty much a requirement for true HDR output.
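A rough way to see that sun-and-shirt point in numbers, as a minimal sketch with made-up scene luminances and deliberately crude mapping (no real mastering curve): SDR has to either pin both near its 100-nit peak or push the shirt down towards grey, while the extra headroom of an HDR panel lets the shirt sit at a sensible level with the sun still clearly brighter.

Code:
# Assumed graded scene luminances in cd/m^2 -- illustrative numbers only
scene = {"white shirt": 300.0, "sun glint": 20_000.0}

SDR_PEAK, HDR_PEAK = 100.0, 1000.0   # classic SDR target vs a typical HDR panel

def sdr_clip(lum):
    """Option A: keep mid-tones, clip highlights -> shirt and sun both hit peak."""
    return min(lum, SDR_PEAK)

def sdr_scale(lum):
    """Option B: scale everything so the sun fits -> the shirt drops to dim grey."""
    return lum * SDR_PEAK / max(scene.values())

def hdr_clip(lum):
    """More headroom: the shirt stays put and the sun is still visibly brighter."""
    return min(lum, HDR_PEAK)

for name, lum in scene.items():
    print(f"{name:12s} SDR-clip {sdr_clip(lum):6.1f}  "
          f"SDR-scale {sdr_scale(lum):6.2f}  HDR {hdr_clip(lum):7.1f} nits")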
 
Wide gamut and HDR are actually two different things.

Wider gamut done properly is subtle: if you put the two TVs side by side, you would only occasionally see a difference, and even then it isn't that big of a deal; without the side-by-side comparison you wouldn't really notice the "missing" color. Most of the people "wowed" by wide gamut are looking at boosted saturation from incorrect gamma.

Higher dynamic range done properly is even more subtle. All increasing the dynamic range does is reduce banding. Which is nice, but unless there is banding, it does just about nothing.

So what they do to market this "HDR" feature is all kinds of unrealistic tone-mapping enhancement to boost contrast. You could do similar effects without increased dynamic range. It's a farce.

"HDR", as it is being sold and marketed, is a massively overhyped tone-mapping farce.


If you have an hour of spare time, you might want to watch this video where Scott of AVSForum and SpectraCal (the CalMAN calibration software guys) discuss HDR.
 
Excellent video there. Also, for those confusing HDR displays with HDR photography or looking for general info, here's a good quick link:

What is HDR for TVs, and why should you care?

Keep in mind, if you're looking at HDR TVs in a showroom, it's the same story as with any TV. They will be set to 'torch' mode ("dynamic" mode or similar, generally with all sorts of nasty processing and boosted color engaged to get the attention of typical buyers). Get into a menu, switch it away from showroom mode to home mode and then to a movie or cinema mode, and the TV will show what it's capable of in a truer sort of way. I haven't really started looking because I'm not in the market for one just yet.
 
Most digital cameras out there (even the nice big expensive ones the studios use) don't have the dynamic range needed to record true HDR video. It's one of the big things that's been driving camera development for the last few years now. Everyone wants to be the first to hit the kind of dynamic range that the best film stocks could deliver (given proper lighting). And if all Dolby is trying to offer is a deeper color palette, I don't see that helping when most studio colorists are content to just make everything orange and blue because the studios don't give them time to do the job properly.

Uhh.. you'll have to be more specific to be remotely correct.

Lots of SLRs are shooting in 14-bit at this point, which gives about 14 stops of dynamic range. Film is about 13 stops. A decent number of digital video systems are capturing at 12 bits. It's pretty close to film right now, and it's pretty much inevitable that digital video will surpass all film stock in the not-too-distant future. Most anything with decent production values on TV can be taking advantage of a 10-bit color space, which is what we are currently talking about for consumer displays.

As for the orange-and-teal atrocity, the deep-color content seems to be getting its color grading redone. I'm sure we'll be seeing horrific conversions from non-HDR intermediaries and shit, but good stuff will be out there for those who care. 4K and a larger color space should at least be good for sustaining some of the value of physical media.
 
With that ice picture, the shot on the left is what is wrong with movies: totally unrealistic blue saturation. When will they learn that water is not a blue slushie? The "standard" version is the realistic one.

Unfortunately HDR will not affect that at all. Unrealistic colours in movies and TV shows are (usually) an intentional creative decision rather than a flaw in the process.
 
I'd be interested in seeing a good quality projector with HDR, but it would require a much brighter lamp than typical home projectors, right? I know that Sony has a newer 4K projector with HDR, but it's not supposed to be "great". It's good, just not great. If I'm spending $10K on a projector, it's going to be great. I'll wait until it hits $5K before getting one, but even then it'll have to be very good.
 
I know they are different things. And again, you are wrong. HDR is also about having bigger differences in brightness, more closely resembling real life than the "postcard" that SDR currently is. Right now, if you have a bright sun and a guy wearing a white shirt in the same picture, in SDR they are either both equally bright OR the guy's shirt is mapped down to grey. In HDR both can be equally white but the sun is noticeably brighter, and all this without clipping black or white details (no bloom effects, thank you), which is why either OLED or FALD is pretty much a requirement for true HDR output.

Again. Bullshit.

8 bits is very close to creating shades where people can barely distinguish the difference (hence the general lack of banding).

You aren't going to notice any real difference with proper tone mapping of 10+ bits, because shades that are already available are so very close.

All that properly mapped higher bit depth does is insert a few more largely indistinguishable shades between them, though 10 bits would be handy for completely eliminating what banding still remains. It would be nice for that, but hard to market when it is so rare to start with.

Since that difference is so subtle it would never be noticed by anyone, outside of the rare banding situation.

The fact that with proper tone mapping the effect is borderline invisible leads to a host of introduced "HDR" effects to hype/market this tech (and you could do the same type of effects in 8-bit).
 
Personally, I feel like if I look at an HDR set next to a standard-color-gamut set, it's noticeably more colorful. But if I look at an HDR set by itself I can barely tell, and as I watch it it doesn't make a big difference to me because I'm not really paying attention to the exact shade of every color in the image.

So for me, if I'm buying a new TV I might buy an HDR set for a few more bucks. But I'm not going to run out and buy one just because it's better.
 
Again. Bullshit.

8 bits is very close to creating shades where people can barely distinguish the difference (hence the general lack of banding).

You aren't going to notice any real difference with proper tone mapping of 10+ bits, because shades that are already available are so very close.

All that properly mapped higher bit depth does is insert a few more largely indistinguishable shades between them, though 10 bits would be handy for completely eliminating what banding still remains. It would be nice for that, but hard to market when it is so rare to start with.


That would be true IF you were to limit the peak brightness of the screen to 100-120 cd/m², which is what the old standards aimed at: there, 8-bit is all you need and 10-bit is just extra banding removal. The HDR specs go from a 500 cd/m² peak (for OLEDs or any screen with effectively infinite blacks) to 1000 cd/m² (for LCDs with FALD or otherwise deep blacks; I don't remember the exact numbers). That's a lot more colors and shades to play with, and 8-bit will band like a mofo if it tries to render a picture with that large a dynamic range. Again, I am referring to the white shirt and bright sun example.
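To put rough figures behind that, here's a minimal sketch comparing the luminance jump between adjacent code values for a hypothetical 8-bit, plain 2.4-gamma encode stretched over 1000 nits versus 10-bit ST 2084/PQ (constants from the published spec); the ~1-2% visibility figure in the comment is only a rule of thumb for mid-tone gradients.

Code:
# Relative luminance step between adjacent code values at a few brightness levels
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_decode(e):                      # PQ signal 0..1 -> nits
    p = e ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def pq_encode(y):                      # nits -> PQ signal 0..1
    p = (y / 10000) ** M1
    return ((C1 + C2 * p) / (1 + C3 * p)) ** M2

def step_gamma8(nits, peak=1000.0, gamma=2.4, levels=255):
    """8-bit, plain 2.4 gamma stretched to 'peak' nits (a strawman, not a standard)."""
    code = min(round((nits / peak) ** (1 / gamma) * levels), levels - 1)
    lo, hi = peak * (code / levels) ** gamma, peak * ((code + 1) / levels) ** gamma
    return (hi - lo) / max(lo, 1e-9)

def step_pq10(nits, levels=1023):
    """10-bit SMPTE ST 2084 (PQ)."""
    code = min(round(pq_encode(nits) * levels), levels - 1)
    lo, hi = pq_decode(code / levels), pq_decode((code + 1) / levels)
    return (hi - lo) / lo

for nits in (0.1, 1, 10, 100, 1000):
    print(f"{nits:7.1f} nits   8-bit gamma: {step_gamma8(nits):6.1%}   10-bit PQ: {step_pq10(nits):5.2%}")
# Mid-tone steps much above ~1-2% of the luminance risk visible banding on gradients;
# the stretched 8-bit encode blows past that in the shadows, while PQ stays close to it.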

Again, if you can, watch the video I linked. Those guys are professionals in the field who know more than you and me combined on the subject of color and video.
 