Samsung and Amazon Introduce yet Another HDR Standard

Megalith

The original HDR10 standard is already old news, as Samsung and Amazon are introducing a “plus” version that flaunts “dynamic tone mapping.” My reading of it is that brightness and contrast levels will now change based on the content of a scene: think of the “dynamic” picture mode on most displays, but built into the stream or movie itself. This is actually the fifth major HDR standard, which is funny, since most people still have no clue what high dynamic range even means.

HDR10+ elevates the HDR10 open standard with the addition of Dynamic Tone Mapping. The current HDR10 standard utilizes static metadata that does not change during playback despite scene specific brightness levels. As a result, image quality may not be optimal in some scenes. For example, when a movie’s overall color scheme is very bright but has a few scenes filmed in relatively dim lighting, those scenes will appear significantly darker than what was originally envisioned by the director.
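
To make the static-vs-dynamic distinction concrete, here's a rough Python sketch under my own assumptions: a made-up tone_map() gain function, invented nit values, and a hypothetical 600-nit display. It only illustrates why a single MaxCLL-style value for the whole title squashes dim scenes while per-scene metadata doesn't; it is not how any actual HDR10+ implementation works.

```python
# Toy illustration of static vs. per-scene (dynamic) tone mapping.
# The numbers and the simple gain-based "tone map" are made up for clarity;
# real HDR10/HDR10+ metadata and tone curves are far more involved.

DISPLAY_PEAK_NITS = 600.0  # what this hypothetical TV can actually output

# (scene name, brightest highlight in that scene, in nits)
scenes = [
    ("sunny exterior", 4000.0),
    ("dim interior",    120.0),
    ("night alley",      40.0),
]

def tone_map(pixel_nits, scene_peak_nits, display_peak_nits):
    """Scale pixels so the scene's peak fits the display's peak (no boosting)."""
    gain = display_peak_nits / max(scene_peak_nits, display_peak_nits)
    return pixel_nits * gain

# HDR10-style static metadata: one peak value for the entire title,
# so the dim scenes inherit the bright scenes' headroom and get squashed.
static_peak = max(peak for _, peak in scenes)

for name, peak in scenes:
    midtone = peak * 0.1  # a representative midtone pixel in that scene
    static_out  = tone_map(midtone, static_peak, DISPLAY_PEAK_NITS)
    dynamic_out = tone_map(midtone, peak,        DISPLAY_PEAK_NITS)
    print(f"{name:15s}  static: {static_out:6.1f} nits   dynamic: {dynamic_out:6.1f} nits")
```

Running it, the dim scenes come out several times darker under the single static value than under per-scene metadata, which is exactly the complaint the press blurb above describes.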
 
Is this in anticipation of the new FreeSync 2.0? Seems like it could work well with that tech.
 
Obligatory xkcd. This will go well.
standards.png
 
Samsung's ploy to get a leg up in sales. HDR (and its derivatives) is becoming a synonym for tomato sauce: everyone has their own recipe...
 
I don't quite get it.

My personal goal when watching a movie is to view it exactly as the director envisioned. If there are scenes where HDR gets in the way of the intended artistic expression, the director either accepts that or mitigates it in their material. If there is one thing I absolutely hate, it's a component manufacturer imposing their (ultimately flawed) "workarounds" in the name of doing something differently that is surely better for the consumer. UGH.

I've seen this sort of thing with "dynamic video enhancement" on Samsung phones, Dell monitors, and various TVs. They all suck; they have to, since they're a calculated change to the source material that may or may not look better to any given individual. Video card drivers get in on the deal as well. Now I have two hardware components making changes to lighting without knowing what the other is doing.

Stop it, Samsung and Amazon.
 
Dynamic HDR is actually something we want for games, especially when one level is outdoors and the next one is underground. Filmmakers can control their lighting for their equipment so as not to cause too much crushing, so HDR10 is probably fine if you're watching a single movie or TV show. Dolby Vision is also dynamic, and this was their leg up on HDR10, but it requires a hardware decoder, and I guess Amazon wanted a license-free alternative and got their buddy Samsung to support it from the hardware side, while Windows + GPUs can probably process it from the software side (conjecture, but likely).
 
I don't quite get it.

My personal goal when watching a movie is to view it exactly as the director envisioned. If there are scenes where HDR gets in the way of the intended artistic expression, the director either accepts that or mitigates it in their material. If there is one thing I absolutely hate, it's a component manufacturer imposing their (ultimately flawed) "workarounds" in the name of doing something differently that is surely better for the consumer. UGH.

I've seen this sort of thing with "dynamic video enhancement" on Samsung phones, Dell monitors, and various TVs. They all suck; they have to, since they're a calculated change to the source material that may or may not look better to any given individual. Video card drivers get in on the deal as well. Now I have two hardware components making changes to lighting without knowing what the other is doing.

Stop it, Samsung and Amazon.
Well, you see, HDR gives you a larger dynamic range that is closer to what you see in the theater, so it is, you know, closer to what the director envisioned. It's not the same as photo HDR, where they bracket three shots and merge them into an 8-bit-per-channel image; it's actually using a floating-point channel, so you can make the sun more than 256x or 1024x as bright as the darkest detail, which is the ceiling you'd get with 8-bit and 10-bit-per-channel displays (yeah, it's a bit different with video encoding standards, but by the time it hits your panel, it's usually 6 to 10 bit). So it's basically jumping your brightness fidelity ahead in leaps and bounds, and it's up to the folks encoding the source material to focus on accuracy. You may have seen some HDR demos that make things look unnatural, but that's like looking at the original Twister DVD and saying DVD sux0rs vs. LaserVision based on that comparison.
 

In addition, although it's not the focus of HDR10+, the different HDR formats also support wide color gamut and more bits per channel, allowing a more direct conversion from the source material instead of the dumbing-down we're used to with our consumer-level equipment. Of course, that also depends on how HDR is implemented; I'm sure there will be many offenders trying to upscale from source material that's already been converted to 8-bit, standard-gamut, non-HDR video. :(
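
Since the thread keeps coming back to what "floating point" and 10-bit actually buy you, here's a small Python sketch of the SMPTE ST 2084 (PQ) transfer function that HDR10/HDR10+ signals use. The constants come from the published standard; the handful of sample code values is just my own illustration of how 10-bit PQ codes map onto absolute luminance, from fractions of a nit up to 10,000 nits, rather than just adding more shades of gray.

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10/HDR10+ video,
# to show that 10-bit HDR code values map to absolute luminance
# (roughly 0.0001 to 10,000 nits), not just "more gray steps".
# Constants are from the published ST 2084 spec; the sampling below
# is my own illustration and uses full-range normalization for simplicity.

m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(code, bit_depth=10):
    """Convert an integer PQ code value to luminance in nits (cd/m^2)."""
    e = code / (2 ** bit_depth - 1)          # normalize to 0..1 (full range)
    p = e ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for code in (64, 256, 512, 768, 1023):       # a few 10-bit code values
    print(f"code {code:4d} -> {pq_to_nits(code):9.3f} nits")
```

The takeaway is that the code values are spent where the eye needs them: half the 10-bit scale only gets you to roughly 90 nits, and the top of the scale stretches all the way to 10,000.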
 
Isn't this the same thing that Valve used in the HL2: Lost Coast demo to 'fake' HDR on current hardware? So essentially, it's faking a higher contrast range through software trickery?
 
Well, you see, HDR gives you a larger dynamic range that is closer to what you see in the theater, so it is, you know, closer to what the director envisioned. It's not the same as photo HDR, where they bracket three shots and merge them into an 8-bit-per-channel image; it's actually using a floating-point channel, so you can make the sun more than 256x or 1024x as bright as the darkest detail, which is the ceiling you'd get with 8-bit and 10-bit-per-channel displays (yeah, it's a bit different with video encoding standards, but by the time it hits your panel, it's usually 6 to 10 bit). So it's basically jumping your brightness fidelity ahead in leaps and bounds, and it's up to the folks encoding the source material to focus on accuracy. You may have seen some HDR demos that make things look unnatural, but that's like looking at the original Twister DVD and saying DVD sux0rs vs. LaserVision based on that comparison.

This is the first explanation of HDR that makes sense to me. I've always understood it to be just 10-bit vs. 8-bit. My understanding was that in the current system, 0 = black and 255 = white; I thought HDR meant 0 = black and 1023 = white, and we would just have a lot more shades of gray. Since a screen can only display so dark a black and so white a white, I've never understood what the point of HDR is.

That being said, how does the floating point help with the limitations of the screen? I.e., if I had a scene that's mostly a dark tunnel looking outside, the source material can encode the details in the dark shadows and the details in the bright light, but the TV still cannot overcome the limitations of its contrast ratio.
 
So according to the photographic evidence, Amazon's HDR10+ will completely transform low-contrast, low-backlit images into normal-looking ones?

Cuz I thought HDR was like "10,000 suns beamed directly into your retinas to simulate what your eyes do when you walk from a dark tunnel into a bright field, only like 10000% more dramatically, cuz it's Hollywood!" :p
 
This is not new. It's also supported by Dolby Vision. It's a matter of static vs dynamic HDR metadata.
 
This is the first explanation of HDR that makes sense to me. I've always understood it to be just 10-bit vs. 8-bit. My understanding was that in the current system, 0 = black and 255 = white; I thought HDR meant 0 = black and 1023 = white, and we would just have a lot more shades of gray. Since a screen can only display so dark a black and so white a white, I've never understood what the point of HDR is.

That being said, how does the floating point help with the limitations of the screen? I.e., if I had a scene that's mostly a dark tunnel looking outside, the source material can encode the details in the dark shadows and the details in the bright light, but the TV still cannot overcome the limitations of its contrast ratio.

Well, some TV makers are trying to work around those limitations; see the Vizio P-series, for example: it has a constellation of LEDs for the backlight, which means they can narrow down which LEDs to light. Given it's not OLED, it's still a very nice panel for the price.
Also, HDR / Dolby Vision etc. give you more nits (from what I read here and there; sorry, not really my cup of tea). So having more whites and blacks (no pun intended) makes for better pictures.
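
For anyone curious what "a constellation of LEDs for the backlight" means in practice, here's a toy Python sketch of full-array local dimming. The zone layout, the sample frame, and the simple max-of-the-zone policy are all made up for illustration; real TVs use much more sophisticated algorithms to hide blooming and flicker.

```python
# Toy sketch of full-array local dimming: the panel is divided into
# backlight zones and each zone's LED level is driven by the brightest
# content it has to show, so dark regions can sit near black while
# bright regions are pushed hard.

def zone_backlight(frame, zone_rows, zone_cols):
    """frame: 2D list of target luminance in nits; returns per-zone levels."""
    h, w = len(frame), len(frame[0])
    zh, zw = h // zone_rows, w // zone_cols
    levels = []
    for zr in range(zone_rows):
        row_levels = []
        for zc in range(zone_cols):
            zone = [frame[r][c]
                    for r in range(zr * zh, (zr + 1) * zh)
                    for c in range(zc * zw, (zc + 1) * zw)]
            row_levels.append(max(zone))  # drive the zone for its brightest pixel
        levels.append(row_levels)
    return levels

# A tiny 4x8 "frame": a bright window on the left, a dark tunnel on the right.
frame = [
    [800, 800, 900, 850,  2,  1,  1,  1],
    [700, 950, 900, 800,  2,  2,  1,  1],
    [750, 800, 850, 900,  3,  2,  2,  1],
    [800, 900, 950, 850,  2,  1,  1,  1],
]
print(zone_backlight(frame, zone_rows=2, zone_cols=4))
```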
 
"artistic expression"...in Hollywood today?...hahaha...nothing "artistic" about movies today...a lot of leftist agenda based crap , but not much art
 
Isn't this the same thing that Valve used in the HL2: Lost Coast demo to 'fake' HDR on current hardware? So essentially, it's faking a higher contrast range through software trickery?

No, it's not "faking"; it's providing a higher dynamic range for filmmakers to take advantage of, and a compatible ecosystem for it to be displayed on.
 
OK, WHY is it better? Explain please.

Saying that Dolby Vision is better is not a subjective statement; it's pretty much a fact...look it up anywhere online...DV is a premium version of HDR10.

- Dolby Vision mastering supports up to 10,000 nits peak brightness, with a current 4,000-nit peak brightness target
- HDR10 mastering supports up to 4,000 nits peak brightness, with a current 1,000-nit peak brightness target
- Dolby Vision mastering supports up to 12-bit color depth; HDR10 is mastered for 10 bits
- Dolby Vision mastering supports up to the BT.2020 color space; HDR10 is mastered for DCI-P3
 
This is the first explanation of HDR that makes sense to me. I've always understood it to be just 10-bit vs. 8-bit. My understanding was that in the current system, 0 = black and 255 = white; I thought HDR meant 0 = black and 1023 = white, and we would just have a lot more shades of gray. Since a screen can only display so dark a black and so white a white, I've never understood what the point of HDR is.

That being said, how does the floating point help with the limitations of the screen? I.e., if I had a scene that's mostly a dark tunnel looking outside, the source material can encode the details in the dark shadows and the details in the bright light, but the TV still cannot overcome the limitations of its contrast ratio.

The floating point allows screens with high-output backlights to boost brightness (at the expense of color accuracy) in sections of the screen where there is high-brightness content, while using a low backlight with high color accuracy in others. So there are differences in peak output capability among HDR-capable sets; some are only able to mildly improve peak output despite being fully electronically HDR compliant. The signal value for, say, the sun is usually far in excess of what an HDR screen can display, so one screen might output 700 nits, another might hit 1,000, and a different one with a full-array backlight might hit 1,500, and even then it might only maintain that peak for a few hundred milliseconds to a few seconds before dimming (due to heat, power supply limits, and the percentage of the screen at high brightness). This is where screens using OLED often have poorer peak output numbers than LCDs, since they can't really drive OLEDs that hard.
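
Here's a minimal Python sketch of the display-side tone mapping described above, assuming a made-up knee point and roll-off curve: the panel tracks the source up to some fraction of its peak, then compresses everything brighter into whatever headroom is left. Every manufacturer tunes this differently; the numbers are only for illustration.

```python
# Minimal sketch of what a set has to do when the signal asks for more
# light than the panel can produce: pass the lower range through and
# roll off the highlights into whatever peak the panel actually has.
# The knee point and the simple compression curve are toy choices.

def display_tone_map(scene_nits, display_peak, knee_fraction=0.75):
    """Map mastered luminance (nits) onto a panel with a given peak."""
    knee = display_peak * knee_fraction
    if scene_nits <= knee:
        return scene_nits                      # tracks the source exactly
    # Compress everything above the knee into the remaining headroom.
    headroom = display_peak - knee
    excess = scene_nits - knee
    return knee + headroom * excess / (excess + headroom)

for peak in (700, 1000, 1500):                  # the sets mentioned above
    sun = 4000.0                                # a mastered 4,000-nit highlight
    print(f"{peak:4d}-nit panel shows the 4000-nit highlight at "
          f"{display_tone_map(sun, peak):6.1f} nits")
```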
 
The main plus of HDR now is better colors, not crazy brightness. It's one of those things you notice more when you lose it than when you're watching it. It also gets rid of any color banding you may see. It's pretty impressive once you've seen it and I'm glad my TV does both standards.
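
A quick back-of-the-envelope on the banding point: with a simple power-law (gamma) response, the relative luminance jump between neighboring code values near mid-gray is roughly four times smaller at 10 bits than at 8 bits, which is what pushes the steps toward invisibility on smooth gradients. The gamma value and the mid-gray code choices below are my own illustrative assumptions.

```python
# Back-of-the-envelope on banding: with a power-law (gamma) curve, the
# relative luminance jump between neighboring codes is roughly
# gamma * (1 / code), so going from 8-bit to 10-bit makes each step
# about 4x finer. Gamma and the sample codes are just for illustration.

GAMMA = 2.2

def relative_step(code, bit_depth):
    levels = 2 ** bit_depth - 1
    lum      = (code / levels) ** GAMMA
    lum_next = ((code + 1) / levels) ** GAMMA
    return (lum_next - lum) / lum

# Code 118/255 and 473/1023 both land near 18% luminance (mid-gray).
print(f"8-bit step near mid-gray : {relative_step(118, 8) * 100:.2f}% luminance jump")
print(f"10-bit step near mid-gray: {relative_step(473, 10) * 100:.2f}% luminance jump")
```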
 
Go back to Fox News, Trumptard.

hahahaha!...after you go back to your mom's basement, Obamagonad...oh, and she needs the Rent-A-Center Compaq back, she has Jerry Springer to stream...haha that works!

"compatible ecosystem"...hahaha...who are you, Marlin Perkins?...millennials, huh, can't pee standing up
 