Netflix Begins HDR Rollout

Megalith

This is relevant for those of you with newer displays that can handle high dynamic range content. Only Marco Polo is getting it for now, but Daredevil is reportedly getting it next.

Earlier this year, Netflix said its near-future plans focused on HDR streaming and creating more original content. And now, according to multiple reports, Netflix has quietly rolled out support for high dynamic range video. Although it hasn't made an official announcement, an executive at the company confirmed the feature earlier today. "We are indeed live with HDR. It works with compatible TVs, both in HDR10 and Dolby Vision," Yann Lafargue, Netflix's manager of corporate communications, said to FlatpanelsHD.
 
Is this going to be more crap that we can't get on the PC and have to use a TV based application for?

It chaps my hide that we STILL cannot view 2k and 4k content through the Netflix website.

I hate that nearly every application like Netflix on televisions is a custom program. I don't understand why it can't be a web based HTML5 implementation. We for instance have a 3 year old Vizio that gets zilch support. It hasn't been updated in ages and the Netflix interface is archaic due to it being a standalone service. We can't even filter by genre. It's pathetic.

So though our TV could do HDR, I doubt that our television will get it unless we buy a new television. I wonder if that's the case for the lot of their users and if getting these new features is going to require new equipment.

From my understanding, HDR is primarily being pushed by Vizio on the manufacturer side. I wonder how much of this is manufacturers pushing the next big thing to sell televisions vs Netflix trying to garner / keep subscribers. I suppose it's win win.
 
So though our TV could do HDR, I doubt that our television will get it unless we buy a new television.
That's pretty doubtful, on all levels from the interface to LCD drivers. :p HDR-capable displays have several critical requirements, and it's unlikely that a TV a few years old, and a price-sensitive one at that, just happens to have everything it needs to support it.

It actually sounds like you should buy a new TV, or at least buy a stick or device (Chromecast, Roku, Apple TV, etc) that lets you do more modern things on it.
 
Is this going to be more crap that we can't get on the PC and have to use a TV based application for?

It chaps my hide that we STILL cannot view 2k and 4k content through the Netflix website.

I hate that nearly every application like Netflix on televisions is a custom program. I don't understand why it can't be a web based HTML5 implementation. We for instance have a 3 year old Vizio that gets zilch support. It hasn't been updated in ages and the Netflix interface is archaic due to it being a standalone service. We can't even filter by genre. It's pathetic.

So though our TV could do HDR, I doubt that our television will get it unless we buy a new television. I wonder if that's the case for the lot of their users and if getting these new features is going to require new equipment.

From my understanding, HDR is primarily being pushed by Vizio on the manufacturer side. I wonder how much of this is manufacturers pushing the next big thing to sell televisions vs Netflix trying to garner / keep subscribers. I suppose it's win win.
HDR is the real deal. You'll believe it when you see it. It also goes by Ultra HD Premium and/or Dolby Vision. Your TV can't do HDR unless it was designed with it to begin with, though. It sucks that Vizio isn't updating the firmware for your TV. You must have been an early adopter for 4K, right?
 
I don't get it. HDMI video is on a 16-235 scale: 16 is complete black and 235 is complete white. Adding more bits does not change this. If the panel is the same, it doesn't matter if it is showing 8-bit, 10-bit, or 12-bit video; contrast should not change, and color balance/reproduction should not change. In fact, the only thing that should change is the gradient. Is the jump from pixel brightness setting 50 to 51 too great for your eyes? Well, good news: we added extra levels in between 50 and 51 for your viewing pleasure. I still find it amusing that 10-bit (Hi10) video needs less bandwidth than 8-bit, because the codec doesn't have to be as choosy.
 
I've been trying to explain the concept of HDR displays to peeps for over a decade now, and most still don't get it. The content finally starts showing up, and I've almost lost the patience to bother. Almost.

I'm still pissed over Dolby buying out BrightSide way back when.

If the panel is the same, it doesn't matter if it is showing 8-bit, 10-bit, or 12-bit video; contrast should not change, and color balance/reproduction should not change.
Indeed. If the panel is the same. But what if the panel wasn't the same?

Read up on BrightSide.
 
I don't get it. HDMI video is on a 16-235 scale: 16 is complete black and 235 is complete white. Adding more bits does not change this.

They're just different scales, just like 0 to 100 is the same range on the Celsius scale as 32 to 212 on the Fahrenheit scale. One scale has more degrees than the other but that doesn't change how the temperature works.

A TV uses the 16-235 range. A PC monitor uses the 0-255 range. Video content (TV and movies) uses the 16-235 range. Video games and PCs use the 0-255 range.

When you play a video game on a TV, or use a TV on a PC, the 0-255 range is converted to the 16-235 range (unbeknownst to you, consoles are converting the 0-255 console games to the TV's 16-235 range). When you play video content (TV and movies) on a PC monitor, the 16-235 range is converted to the 0-255 range. No data is lost converting between the two. Data does get displayed incorrectly when you attempt to display data set to one range on the other range without conversion, just as if you attempted to take a bath in 70C water instead of 70F water.
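
If it helps to see it concretely, here is a rough sketch of that 8-bit conversion (in Python, purely for illustration; no claim this is how any particular player or GPU does it):

```python
# Rough sketch of the 8-bit limited (16-235) <-> full (0-255) range conversion
# described above. Purely illustrative; real players/GPUs do this per channel
# and usually add dithering.

def full_to_limited(v):
    """Map a full-range value (0-255) into limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v):
    """Map a limited-range value (16-235) back to full range (0-255)."""
    v = min(max(v, 16), 235)  # clamp blacker-than-black / whiter-than-white
    return round((v - 16) * 255 / (235 - 16))

print(full_to_limited(0), full_to_limited(255))   # 16 235
print(limited_to_full(16), limited_to_full(235))  # 0 255
```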
 
I don't get it. HDMI video is on a 16-235 scale: 16 is complete black and 235 is complete white. Adding more bits does not change this. If the panel is the same, it doesn't matter if it is showing 8-bit, 10-bit, or 12-bit video; contrast should not change, and color balance/reproduction should not change. In fact, the only thing that should change is the gradient. Is the jump from pixel brightness setting 50 to 51 too great for your eyes? Well, good news: we added extra levels in between 50 and 51 for your viewing pleasure. I still find it amusing that 10-bit (Hi10) video needs less bandwidth than 8-bit, because the codec doesn't have to be as choosy.
Same here. I thought the importance of HDR was in acquisition. So are they just talking better gradients which might be more noticeable on large 4K displays?
 
Same here. I thought the importance of HDR was in acquisition. So are they just talking better gradients which might be more noticeable on large 4K displays?
HDR means bright scenes are brighter. Dark scenes are darker. They can even combine the two. Think of a dark cave and a bright exterior all with proper definition. More contrast, higher dynamic range. This is especially noticeable with OLED screens that can show complete black, but can handle 500 nits of bright white all at the level of the individual pixel. LCDs require full-array local-dimming to achieve a similar result, with a different range of about 0.05-1000 nits.
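
To put rough numbers on that, dynamic range is often counted in photographic "stops," where each stop is a doubling of light. A quick back-of-the-envelope sketch in Python using the figures above (the OLED case isn't computed, since a true 0-nit black would make the ratio infinite):

```python
import math

# Back-of-the-envelope dynamic range in photographic "stops" (each stop is a
# doubling of light), using the rough black/peak figures mentioned above.
# Illustrative only; real measurements vary by panel and test pattern.
def stops(black_nits, peak_nits):
    return math.log2(peak_nits / black_nits)

print(round(stops(0.05, 1000), 1))  # ~14.3 stops for a FALD LCD at 0.05-1000 nits
print(round(stops(0.05, 100), 1))   # ~11.0 stops with the same black at a ~100 nit SDR peak
```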
 
Is this going to be more crap that we can't get on the PC and have to use a TV based application for?

It chaps my hide that we STILL cannot view 2k and 4k content through the Netflix website.

I hate that nearly every application like Netflix on televisions is a custom program. I don't understand why it can't be a web based HTML5 implementation. We for instance have a 3 year old Vizio that gets zilch support. It hasn't been updated in ages and the Netflix interface is archaic due to it being a standalone service. We can't even filter by genre. It's pathetic.

So though our TV could do HDR, I doubt that our television will get it unless we buy a new television. I wonder if that's the case for the lot of their users and if getting these new features is going to require new equipment.

From my understanding, HDR is primarily being pushed by Vizio on the manufacturer side. I wonder how much of this is manufacturers pushing the next big thing to sell televisions vs Netflix trying to garner / keep subscribers. I suppose it's win win.

This is why Android TV, Apple TV, and others exist. Relying on a TV manufacturer for software support is never going to end well; it's why smart TVs are nothing but garbage on the app front.
 
This DOES NOT mean that this is the kind of programming I have to look forward to in the future, right??

I swear every "photog" I see below the age of 30 thinks below is really nice looking. Even my kids kindergarten class photo was photoshopped....many of the parents complained.

[Attached image: 13.jpg]
[Attached image: HDR-Photography-Bazar-InspirationsWeb.com_.png]
 
This DOES NOT mean that this is the kind of programming I have to look forward to in the future, right??

I swear every "photog" I see below the age of 30 thinks below is really nice looking. Even my kids kindergarten class photo was photoshopped....many of the parents complained.

[Attached image: 13.jpg]
[Attached image: HDR-Photography-Bazar-InspirationsWeb.com_.png]

Makes it look like a painting to me.
 
That's pretty doubtful, on all levels from the interface to LCD drivers. :p HDR-capable displays have several critical requirements, and it's unlikely that a TV a few years old, and a price-sensitive one at that, just happens to have everything it needs to support it.

It actually sounds like you should buy a new TV, or at least buy a stick or device (Chromecast, Roku, Apple TV, etc) that lets you do more modern things on it.

We have a PC connected to it via HDMI and Steam Link. We've tried a few TV devices and are never happy with them. What I meant by "capable" is that the panel is capable of displaying HDR so long as it's software-based. The chip inside the television certainly isn't capable of doing hardware-based HDR or rendering such a thing. That'd be a laugh. Half of the stuff the TV comes with just crashes as is, haha.

So we just use it as a big display.

I do like the direction that Vizio is going, where content is merely cast to the television with their latest P Series. Come to think of it, they actually market it as a display and not a Telescreen.

I wonder if the HDR requires hardware on the other end or if it is software based.

This is why Android TV, Apple TV, and others exist. Relying on a TV manufacturer for software support is never going to end well; it's why smart TVs are nothing but garbage on the app front.

Do Apple TV, Roku and such even support 2k and 4k right now? I don't believe that they support this HDR currently.

This DOES NOT mean that this is the kind of programming I have to look forward to in the future, right??

I swear every "photog" I see below the age of 30 thinks below is really nice looking. Even my kids kindergarten class photo was photoshopped....many of the parents complained.

[Attached image: 13.jpg]
[Attached image: HDR-Photography-Bazar-InspirationsWeb.com_.png]

Gag. I would have complained too.

I don't think that this HDR is going to do quite what people think. What I don't get is why they don't just broadcast it in HDR if that's what they want to do.

Sometimes when HDR is very subtle it can add to an image. Mostly it detracts in my opinion. I can envision some weird looking content. The Walking Dead for example is a show that might look more interesting in HDR. It'd bring a 3D comic like look to the picture. HDR on a nature documentary however, I'll pass.
 
This DOES NOT mean that this is the kind of programming I have to look forward to in the future, right??

I swear every "photog" I see below the age of 30 thinks below is really nice looking. Even my kids kindergarten class photo was photoshopped....many of the parents complained.

[Attached image: 13.jpg]
[Attached image: HDR-Photography-Bazar-InspirationsWeb.com_.png]


Nope, that is not "HDR" that is "Tone mapping " or "Tone mapped".

The difference between tone-mapped "HDR" and actual HDR content is that tone-mapped content takes multiple standard-dynamic-range exposures and stacks them so that the contrast, saturation, and detail in the darkest parts of the scene can be seen as clearly as those in the highlights.

Normally, if, say, the sky in a bright scene is detailed and visible instead of being a blown-out mass of bright color, you lose the detail in the darker parts, and vice versa. That is due to the dynamic range of the camera sensor and the bit depth it saves the data at (raw data normally being 12-16 bit lossless and JPEG being 8-bit lossy), or with film, depending on the formulation and the processing.

The amount of dynamic range is normally represented in "stops"; some more info here: Exposure range - Wikipedia, the free encyclopedia

The type of HDR coming out with the UHD HDR standard moves to 10-bit or 12-bit color, up from 8-bit, and moves the peak brightness from the roughly 100 nits we have used since color TV (and in sRGB, Rec. 709, DVD, Blu-ray, Windows, and standard web content) up to 10,000 nits or higher.
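
For anyone wondering how those 10-bit code values relate to actual nits, the HDR10 flavor uses the SMPTE ST 2084 "PQ" transfer function. A rough Python sketch of that curve, just as an illustration:

```python
# Sketch of the SMPTE ST 2084 "PQ" EOTF used by HDR10: maps a normalized
# signal value (a 10-bit code divided by 1023) to absolute luminance in nits,
# topping out at 10,000 nits. Constants are from the published spec; this is
# an illustration, not a reference implementation.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_to_nits(signal):
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

for code in (0, 128, 512, 769, 1023):  # a few 10-bit code values
    print(code, round(pq_to_nits(code / 1023), 2))
# Roughly: code 512 is only ~90 nits and code 769 is ~1000 nits, so about half
# the code values sit below a normal SDR peak. PQ spends its precision where
# the eye is most sensitive instead of spreading it linearly.
```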


Here is a good breakdown of some of that:

Light Illusion


And the type of image you will get from HDR content is more like this:
[Attached image: St_Kentigerns_Church_HDR example]


Unfortunately for computer users, viewing streaming HDR content on a PC is going to call for new monitors and also new graphics cards.

The standard needs HDCP 2.2 over DisplayPort or HDMI 2.0a, and at least a monitor that is 10-bit native, can display the DCI-P3 color gamut, will do 1000-nit peak brightness, and probably has a full-array local-dimming backlight or is OLED.

HDR isn't about making super-bright displays that will cook your retinas; it is about making the darkest parts of the scene closer to black without losing shadow detail, and having the bright parts as bright as they would look to your eye in nature.

We might not get HDR movie content on PCs soon, but once capable displays start to come out, everything from movies to games has the potential to look tons better.

There is already a big backlog of HDR potential in gaming, going back to Half-Life 2 and the Lost Coast demo, and many games fake the HDR effect by making the scene darker when you look into shadows but much brighter when you look at a light source.


Some info here, and a prototype monitor that Dolby eventually bought into for their HDR stuff:

BrightSide DR37-P HDR display

This isn't a gimmick like 3D glasses. This takes the much bigger range of color and light info that modern cameras can produce and, instead of shrinking it way down into a limited range that LCDs moved past ages ago, represents it to the limits of the tech and beyond, so in the future, as displays get better, the content will not be limited.

The standard is so forward-looking that it can handle colors and brightness peaks and lows beyond what lasers and OLED can do, so the people making content can do the workflow now and it will map straight to the limits of the display; as better displays come out, it will map to them too. The systems chosen change the way gamma and color are factored.

We might not get displays capable of reaching those limits for 5-10 years, but content made now already has the data ready for when we do.
 
We have a PC connected to it via HDMI and Steam Link. We've tried a few TV devices and are never happy with them. What I meant by "capable" is that the panel is capable of displaying HDR so long as it's software-based. The chip inside the television certainly isn't capable of doing hardware-based HDR or rendering such a thing. That'd be a laugh. Half of the stuff the TV comes with just crashes as is, haha.

So we just use it as a big display.

I do like the direction that Vizio is going, where content is merely cast to the television with their latest P Series. Come to think of it, they actually market it as a display and not a Telescreen.

I wonder if the HDR requires hardware on the other end or if it is software based.



Do Apple TV, Roku and such even support 2k and 4k right now? I don't believe that they support this HDR currently.



Gag. I would have complained too.

I don't think that this HDR is going to do quite what people think. What I don't get is why they don't just broadcast it in HDR if that's what they want to do.

Sometimes when HDR is very subtle it can add to an image. Mostly it detracts in my opinion. I can envision some weird looking content. The Walking Dead for example is a show that might look more interesting in HDR. It'd bring a 3D comic like look to the picture. HDR on a nature documentary however, I'll pass.

I know the Shield TV supports it and by extension any Android TV device with the appropriate HDMI spec. Apple TV unfortunately doesn't since they only went with HDMI 1.4 on the most recent model.
 
I wonder if the HDR requires hardware on the other end or if it is software based.

As far as I know, it is both. One of the problems we are running into is that there is a possible format war for HDR. I would be very annoyed if my half-year-old JS9500 couldn't do HDR once content becomes widely available because Samsung loses the war... nonetheless, I will check this out and see how it looks.

It looks to me like this is 1080p content, though - has anyone seen whether this is 4K HDR? My TV came with the 4K HDR Maze Runner and Exodus on a WD Passport, and I was able to stomach both because they were visually amazing.
 
This takes the much bigger range of color and light info that modern cameras can produce and, instead of shrinking it way down into a limited range that LCDs moved past ages ago, represents it to the limits of the tech and beyond, so in the future, as displays get better, the content will not be limited.

This pretty much sums up the camera and content. Still, the only way an HDR set would already exist is if somehow they had magic panels with super contrast and color gamut that they just decided not to release because an 8-bit interface would not have enough steps between total black and total white.

I can just see the company now: "Hey Ted, you know how for the last 20 years we could only make LCD panels with native 2000:1 contrast? Well, I've got a way we could kill the market with a 20,000:1 ratio and 2000-nit brightness (yes, eyes will bleed)." "Sorry Ted, but we only have about 200 steps from black to white to work with; the customers will hate it." "Hey Joe, keep slapping that 1 billion to 1 contrast ratio on those gaming monitors. Ted, figure out how to measure it in a vacuum."
 
I found myself at a store today that had new HDR TVs of the OLED variety. Besides the fact that these things were 3mm or so thin (cellphone thin), all I can say is: holy shit.

I haven't been impressed with 4K TVs, nor seen a need for an upgrade. But damn, it was like looking through a window. The content was obviously curated for the demo, of course. The viewing angle was insanely good.

The price tag was $8k, so a tad on the expensive side right now. I can say that once the content catches up, OLED 4K is the next jump.
 
Another detail worth pointing out is that any modern game engine is calculating lighting internally with the equivalent of HDR. Same goes for most modern film production.
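
The last step of that pipeline is normally a tone-mapping pass that squashes the engine's floating-point light values down to whatever the display can show. A toy Python sketch of the classic Reinhard operator, just to illustrate the idea:

```python
# Toy example: a renderer computes light in unbounded floating point
# (scene-referred HDR), then tone-maps it down to an 8-bit SDR value at the
# very end. Classic Reinhard operator L / (1 + L); illustration only, real
# engines handle exposure, color, and gamma far more carefully.

def tonemap_reinhard(luminance, exposure=1.0):
    l = luminance * exposure
    mapped = l / (1.0 + l)                    # compresses [0, inf) into [0, 1)
    return round(255 * mapped ** (1 / 2.2))   # rough gamma encode to 8 bits

for scene_light in (0.01, 0.5, 1.0, 10.0, 1000.0):
    print(scene_light, tonemap_reinhard(scene_light))
# With an HDR output target, much less of this compression is needed; more of
# the engine's range can be sent toward the display (e.g. encoded with PQ).
```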
 
HDR isn't about making super-bright displays that will cook your retinas; it is about making the darkest parts of the scene closer to black without losing shadow detail, and having the bright parts as bright as they would look to your eye in nature.
Furthermore, we can finally achieve saturated colors at high luminance, as with the sun shining through stained glass windows, fall leaves, grapes, wine, etc. The colors, man. The colors.
 
I don't get it. HDMI video is on a 16-235 scale: 16 is complete black and 235 is complete white. Adding more bits does not change this. If the panel is the same, it doesn't matter if it is showing 8-bit, 10-bit, or 12-bit video; contrast should not change, and color balance/reproduction should not change. In fact, the only thing that should change is the gradient. Is the jump from pixel brightness setting 50 to 51 too great for your eyes? Well, good news: we added extra levels in between 50 and 51 for your viewing pleasure. I still find it amusing that 10-bit (Hi10) video needs less bandwidth than 8-bit, because the codec doesn't have to be as choosy.


[Attached image: spinal-Tap-11.jpg]
 
Don't know what the complaining is about. Even HDR content in 1080p looks worlds better. It adds some serious depth to the image on screen.

My Samsung JS8500 is an HDR panel. True HDR content looks amazing.
 
Aside from the bloom effects in gaming, most of what is put forth as examples of HDR I see as nothing but artistic renderings. As far as quality video goes, I would call the colors "oversaturated" and the contrasts "over-peaked." This is because, looking at the same scene with your own eyes, you will NOT see what an HDR rendering shows. You will see intense color in the brightest sunlight, but in HDR the shadows seem to be thrown out. So there is nothing realistic about it; it is more of a dreamworld effect.

Now, as far as a more lifelike and compelling display goes, I think the work needs to be in the DARK region of the video. Consider the fact that if you are in a dark room, you don't see 100% black (unless you are in a cave). You see a huge number of shades of grey, and the small amount of light that penetrates is where your actual color is. In most games, video, etc., you have black, then a little light, and that is about it. Totally unrealistic.
 
Aside from the bloom effects in gaming, most of what is put forth as examples of HDR I see as nothing but artistic renderings.
They are indeed artistic renderings, because you can't display high dynamic range on conventional displays. The content may have been HDR at one point (or not), but the content being presented by photographers has, by the time you see it, been tone-mapped down to LDR.

As far as quality video goes, I would call the colors "oversaturated" and the contrasts "over-peaked." This is because, looking at the same scene with your own eyes, you will NOT see what an HDR rendering shows.
Bingo! Because you don't have an HDR display, and web browsers currently do not support HDR file types.

Now imagine if, instead of an HDR rendering in LDR on an LDR monitor, you had HDR content on an HDR monitor. You would see something you might actually mistake for the real scene.

So there is nothing realistic about it; it is more of a dreamworld effect.
Oh, wait... so you don't actually understand at all.

Now, as far as a more lifelike and compelling display goes, I think the work needs to be in the DARK region of the video. Consider the fact that if you are in a dark room, you don't see 100% black (unless you are in a cave).
Still not enough. Even with 100% black, a lit match will be the same brightness as the sun on a conventional display.
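
To put loose numbers on it (ballpark figures, not measurements):

```python
# Ballpark illustration of the clipping problem: on a display graded for a
# ~100 nit SDR peak, anything brighter than peak white lands on the same code
# value. The luminance figures below are loose orders of magnitude only.

SDR_PEAK_NITS = 100.0

def sdr_code(scene_nits):
    normalized = min(scene_nits / SDR_PEAK_NITS, 1.0)  # clip at peak white
    return round(255 * normalized ** (1 / 2.2))        # rough gamma encode

print(sdr_code(10_000))         # 255 -> a match flame: clipped to peak white
print(sdr_code(1_600_000_000))  # 255 -> the sun: also clipped to peak white
print(sdr_code(50))             # ~186 -> a mid-bright highlight survives
```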
 