DirectX 12 Adoption “Huge” Among Developers; HDR Support Coming in 2017 as Microsoft Shares New Info

  • Microsoft predicts that adoption of High Dynamic Range will be faster than that of 4K. It’s more clearly noticeable to users than 4K, with several TV models having already begun shipping. It’s very difficult for the average consumer to tell the difference between 1080p and 4K unless they get very close to the screen, whereas HDR is something that end users, not just professional graphics artists, can see the advantage of.
Honestly, I figured the whole HDR thing was mostly AMD hype... hmm, it might actually be something cool. Is it only for the new AMD cards, or is Nvidia going to be using it as well?
 
  • Microsoft predicts that adoption of High Dynamic Range will be faster than that of 4K. It’s more clearly noticeable to users than 4K, with several TV models having already begun shipping. It’s very difficult for the average consumer to tell the difference between 1080p and 4K unless they get very close to the screen, whereas HDR is something that end users, not just professional graphics artists, can see the advantage of.
  • On a fundamental level, HDR allows a display to emit a peak brightness for certain pixels that is a hundred times higher than on today’s TVs. You can’t run every pixel at that brightness, but having parts of the image, like stars or floodlights, at that kind of brightness is a “big win.”
    • With the new features, the white background of an email will still be set at the 1.0 value, but bright spots in games, photos and movies will be able to go substantially above that.
    • Developers will be able to submit really high brightness values, and they will be clamped only when it’s absolutely necessary.
    • Color gamuts will work the same way. How colorful a scene can be is currently limited by the official range of colors that current Windows displays and HDTVs are set to. Going forward those limits will also be removed, and developers will be able to use more than the currently used 30% of the human visual range, up to 75 or 80%. This will allow games to express things visually that they can’t today.
    • When you see a white spot in a game, you don’t really know if it’s a piece of plaster, the reflection of the sun, or a glowing object, because they’re all clamped to the same 1.0 value. With HDR, diffuse surfaces like plaster could be set close to 1.0, while light sources could be two or three times brighter, allowing the user to actually distinguish what the object is. It will be a new level of realism.
    • As Windows will be able to support HDR on all of its devices, there won’t be a limit to what panel vendors can create.
In other words, you would have to make certain that the panel you buy "supports" a certain peak brightness (in areas of the screen or across the whole screen). In TV land that's measured in nits; top-end models such as the Panasonic 902/904 can do 1,000 nits, and 1,200 nits in smaller areas, and panels need to support at least 10 bits...
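To put rough numbers on the "1.0 stays put, highlights go above it" idea, here's a minimal Python sketch; the 100-nit reference white and 1,000-nit panel peak are just illustrative assumptions, not figures from Microsoft:

```python
# Toy model: scene values above 1.0 get headroom in nits and are only
# clamped when the panel's advertised peak is actually exceeded.

REFERENCE_WHITE_NITS = 100.0   # assumed brightness of "1.0" content (the email background case)
PANEL_PEAK_NITS = 1000.0       # assumed peak a top-end HDR TV can hit for small highlights

def to_display_nits(scene_value: float) -> float:
    """Map a scene value (1.0 = diffuse white) to panel output in nits."""
    requested = scene_value * REFERENCE_WHITE_NITS
    return min(requested, PANEL_PEAK_NITS)   # clamp only when truly necessary

for label, value in [("email background", 1.0), ("plaster wall", 0.9),
                     ("floodlight", 3.0), ("sun glint", 50.0)]:
    print(f"{label:>16}: scene {value:5.1f} -> {to_display_nits(value):7.1f} nits")
```

Only the sun glint actually hits the clamp here; everything else simply gets displayed brighter than SDR would allow.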
 
HDR was never really an AMD thing, it was always coming. It's just a bit of PR spin on their behalf to try and grab the brand association.
 
HDR was never really an AMD thing, it was always coming. It's just a bit of PR spin on their behalf to try and grab the brand association.

Well, to gauge consumer interest in a product, someone has to put their time and neck out there. I think that nowadays Intel shows off a lot more new technology than Nvidia does.
 
Nvidia already has stuff out that supports HDR, Shield TV:
> Capable of supporting 10-bit color format (ITU-BT-2020 compliant) and HDR.

Here they are showing off HDR last year on PC:
 
Nvidia already has stuff out that supports HDR, Shield TV:
> Capable of supporting 10-bit color format (ITU-BT-2020 compliant) and HDR.

Here they are showing off HDR last year on PC:


Anything with an HDMI 2.0a interface should support it; it's part of the specification.
 
I'm confused... Is this some sort of new tech reusing the name of a shader lighting technique introduced in DX9? They should have named it something else.
 
HDR has been promised for years. Game developers have yet to implement it because there were zero monitors on the market that supported it. I think I read somewhere that after creating the scene they wanted, a game developer has to go back, remove a ton of the cool effects, and implement bloom and other effects to simulate HDR. Why do people hate bloom effects? Because it looks out of place, as it always seems that something is missing.

This is the equivalent of Picasso creating a masterpiece and being told that he can't release that version; he needs to rub some nuts and berries together in the palm of his hand and then finger paint a simpler version for the masses.
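For what it's worth, the usual fake boils down to a bright-pass plus blur that gets added back on top of the clamped image. A rough Python/NumPy sketch of the idea, with the threshold and blur radius being arbitrary assumptions (no real engine does it this crudely):

```python
import numpy as np

def fake_hdr_bloom(img: np.ndarray, threshold: float = 0.8, radius: int = 2) -> np.ndarray:
    """Crude bloom: isolate the bright pixels, blur them, add the glow back."""
    bright = np.where(img > threshold, img, 0.0)    # bright-pass
    blurred = np.zeros_like(bright)
    h, w = img.shape
    for y in range(h):                              # naive box blur
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            blurred[y, x] = bright[y0:y1, x0:x1].mean()
    return np.clip(img + blurred, 0.0, 1.0)         # everything still tops out at 1.0

# Tiny demo: a dark frame with one "light source" pixel gets a soft glow around it.
frame = np.full((5, 5), 0.2)
frame[2, 2] = 1.0
print(fake_hdr_bloom(frame).round(2))
```

Note the final clamp: no matter how bright the source was meant to be, the output can never exceed the display's 1.0, which is exactly the limitation real HDR output removes.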
 
3D, 4K, HDR. Yawn. Just another thing they will use to try to get you to buy their stuff.
 
HDR works better on CRTs than current LED monitors because they can go lower on the black scale; they still can't get high enough brightness, though. HDR monitors/TVs versus today's LEDs should help with movies more than games; it should get rid of the flat lighting that makes movies look like soap operas.
 
No, Monkey God, this technology actually looks different. I'm not telling you to get HYPED!, as this has literally been promised for years. Just store this one in the back of your mind for when the flood of monitors and TVs comes out, since it seems we're actually going to get a real release this time.
 
It's a push that focuses on actual picture quality. I am excited.

I'll upgrade my monitor when I can get one that's 34", curved, HDR, and high-refresh-rate.
 
No, Monkey God, this technology actually looks different. I'm not telling you to get HYPED!, as this has literally been promised for years. Just store this one in the back of your mind for when the flood of monitors and TVs comes out, since it seems we're actually going to get a real release this time.

Cool. Next time I'm in the market for a new monitor/TV, I'll look into it.
 
  • Microsoft predicts that adoption of High Dynamic Range will be faster than that of 4K. It’s more clearly noticeable to users than 4K, with several TV models having already begun shipping. It’s very difficult for the average consumer to tell the difference between 1080p and 4K unless they get very close to the screen, whereas HDR is something that end users, not just professional graphics artists, can see the advantage of.
Honestly, I figured the whole HDR thing was mostly AMD hype... hmm, it might actually be something cool. Is it only for the new AMD cards, or is Nvidia going to be using it as well?

This sounds similar to Dolby's recent efforts with Dolby Vision & Dolby Cinema. They've been focusing on "better pixels" instead of just "more pixels".
 
  • Microsoft predicts that adoption of High Dynamic Range will be faster than that of 4K. It’s more clearly noticeable to users than 4K, with several TV models having already begun shipping. It’s very difficult for the average consumer to tell the difference between 1080p and 4K unless they get very close to the screen, whereas HDR is something that end users, not just professional graphics artists, can see the advantage of.

It's very difficult for an average consumer because there is no 4K content on TV... hell, even full 1080p is rare. But seriously, at native resolution you can easily see the difference between 4K and 1080p, even at TV viewing distances.
 
HDR works better on CRTs than current LED monitors because they can go lower on the black scale; they still can't get high enough brightness, though. HDR monitors/TVs versus today's LEDs should help with movies more than games; it should get rid of the flat lighting that makes movies look like soap operas.

That's why true HDR is basically limited to OLED screens and LCDs with true local dimming (preferably a full LED backlight with a lot of dimming zones). LCD TVs without local dimming that claim to have HDR support are just faking it; they don't have a big enough ANSI contrast ratio for that.
 
That's why true HDR is basically limited to OLED screens and LCDs with true local dimming (preferably a full LED backlight with a lot of dimming zones). LCD TVs without local dimming that claim to have HDR support are just faking it; they don't have a big enough ANSI contrast ratio for that.

Well, I've been wanting more TVs, at better prices, with local dimming for a long while. I've been waiting a long time for OLED tech to get better, last longer, and get cheaper. Colors and black levels are why I bought a Panny plasma about 3 years ago over any of the LED TVs on the market. I say bring on HDR and TVs that have the specs to handle it!
 
HDR, I say, will have a big impact with good HDR content; otherwise it won't be much different from LDR (low dynamic range) content. HDR and VR together would make things rather lifelike and would be an excellent combo.

As for games, most of them already render in HDR internally for better accuracy, then tone map down to LDR output.
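As a concrete example of that last step, one of the simplest and most common operators is Reinhard tone mapping; here's a minimal sketch (the exposure value is just an illustrative parameter):

```python
def reinhard_tonemap(hdr_value: float, exposure: float = 1.0) -> float:
    """Compress an unbounded scene-linear value into [0, 1) for an LDR display."""
    v = hdr_value * exposure
    return v / (1.0 + v)   # classic Reinhard: darks stay near-linear, highlights get squeezed

# Bright values like 10.0 or 100.0 end up just under 1.0 instead of hard-clipping.
for v in [0.1, 1.0, 10.0, 100.0]:
    print(v, "->", round(reinhard_tonemap(v), 3))
```

With a real HDR output path, much less of that compression is needed, since the display itself can show values well above diffuse white.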
 
Funny that the text on the slides looked like it had some bloom going on. It's great news that MS is supporting this. HDR will mean more for image quality than 4K ever could, especially for most people coming from 1000:1-contrast LCDs. Hopefully they can backport this feature to all games on any version of DirectX; there are some old games (and ones that will be old by 2017) that would be worth replaying with proper contrast.
 
Wasn't Half Life 2 promoted with HDR in 2003? So why is this HDR different than that HDR?
 
Wasn't Half Life 2 promoted with HDR in 2003? So why is this HDR different than that HDR?

Panels can't handle it; you can do an approximation of it, but it would not be the same. Remember how you could mimic more colours when your video card was limited to only 255 per channel? This is the same thing but reversed: the computer can turn them out, but the panel wouldn't know which correct colour shade to pick...

When you have 10-bit panels and allow local dimming, or use OLED, you can do something that is physically better suited to it.
 
Wasn't Half Life 2 promoted with HDR in 2003? So why is this HDR different than that HDR?

Valve's HDR in Source is not true HDR. It is simulated HDR, since monitors could not display true HDR images. Valve's is basically a bloom trick that pushes certain areas of the image to 100% brightness in an effort to fool your eyes.
 
Wasn't Half Life 2 promoted with HDR in 2003? So why is this HDR different than that HDR?

As said, this has nothing to do with the shader HDR effect; it's just an unhappy coincidence that they share the same name. The HDR effect was just a dynamic version of bloom, essentially faking "high dynamic range" by covering the details under glowy bits. HDR in monitors/TVs is an evolution of the way the picture is drawn. We are finally leaving behind the age-old and very limited sRGB/BT.709 gamut and 8-bit color depth. Colors are aiming to go towards the DCI-P3 gamut (not as deep in the greens as Adobe RGB, but bigger than that towards the red), and at minimum 10-bit color depth (preferably 12-bit) negates most of the banding that a wider gamut of colors would cause. We also get more range in brightness: two white items can be of different brightness without adding gray to the less bright object. Today there is no difference in brightness between a piece of paper and an image of the sun on the TV; HDR fixes that.

I may be a bit wrong on the actual details, but that's the nutshell of it. Here is a good article on HDR:
HDR (high dynamic range) on TVs explained - FlatpanelsHD
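To see why the jump from 8-bit to 10-bit matters once the gamut and brightness range get wider, here's a tiny sketch of how many distinct code values each bit depth can put across the same subtle gradient (pure illustration, nothing display-accurate):

```python
def quantize(value: float, bits: int) -> float:
    """Round a 0..1 signal to the nearest code value at the given bit depth."""
    levels = (1 << bits) - 1          # 255 steps for 8-bit, 1023 for 10-bit
    return round(value * levels) / levels

# A subtle gradient (think: a dusk sky) from 0.50 to 0.52 spread across 1000 pixels.
gradient = [0.50 + 0.02 * i / 999 for i in range(1000)]
bands_8 = len({quantize(v, 8) for v in gradient})
bands_10 = len({quantize(v, 10) for v in gradient})
print("distinct 8-bit code values :", bands_8)    # only a handful -> visible banding
print("distinct 10-bit code values:", bands_10)   # roughly 4x as many -> much smoother ramp
```

Stretch those same code values over a much wider gamut and brightness range and the 8-bit version runs out of steps even faster, which is why 10-bit is treated as the minimum for HDR.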
 
3D, 4K, HDR. Yawn. Just another thing they will use to try to get you to buy their stuff.

Yep. Desperately trying to keep you upgrading when upgrading becomes less and less of an issue. I just went some six years without upgrading components. Moore's Law was a real big boon for them, but it's dead now.
 
Actually, AMD are the only ones supporting 4K60 with 10-bit/HDR, as DisplayPort is the only common mass-market standard that can currently do it.
 
Actually, AMD are the only ones supporting 4K60 with 10-bit/HDR, as DisplayPort is the only common mass-market standard that can currently do it.

I'm pretty sure HDMI 2.0a does. Or is HDR limited to 30 Hz over that connection?
 
HDR requires an updated video card driver. If the panel supports it, the cable attached should be trivial...
 
Until OLED panels are mainstream, forget about HDR.

People are still mostly buying DVDs over Blu-ray right now; that's how far the mainstream cares about all this.

I mean, we're still suffering the inferiority of LCDs compared to CRTs' infinite colours, every resolution native, etc. The mainstream completely and unreservedly made it beyond clear that they'd happily accept a degradation in almost all areas of picture quality in order to save 12 inches of space behind their TV. Which you can't see anyway, because there's a TV in the way (lol/sigh).

So how much do you think they care about HDR, which, evidently, even on a tech forum not everyone understands fully?


Yeah, wake me up next decade...
 
If Blu-rays were the same price as DVDs, people would be buying Blu-rays instead. Unfortunately, companies think that they can charge a premium for Blu-ray discs.
 
If Blu-rays were the same price as DVDs, people would be buying Blu-rays instead. Unfortunately, companies think that they can charge a premium for Blu-ray discs.
At my local Target, new DVD releases are typically $18 USD while Blu-rays are typically $22. That's pinching pennies. When it comes to older releases, DVD and Blu-ray are often the same price. The days of $50 Blu-ray movies are long gone.
 
At my local Target, new DVD releases are typically $18 USD while Blu-rays are typically $22. That's pinching pennies.


Well, when the alternative is acquiring media for free, companies should understand that an extra $4 can push their potential customers to other sources.
 
If Blu-rays were the same price as DVDs, people would be buying Blu-rays instead. Unfortunately, companies think that they can charge a premium for Blu-ray discs.
Price is determined by demand. As demand increases, supply increases and prices go down. As someone mentioned before, it has to do with the market. Most people don't give a shit about Blu-ray to begin with. Your average Blu-ray player sucks. The MPAA keeps changing the encryption, requiring the players themselves to have internet access to download updates. The entertainment industry wasted the past 10 years on this tech and fucked themselves on this one. Now everyone is onto the next best thing, which is streaming.

Simply put, it isn't an arbitrary price premium. The technology is simply shit and no one wants it, so prices stay high.
 
Until OLED panels are mainstream, forget about HDR.

People are still mostly buying DVDs over Blu-ray right now; that's how far the mainstream cares about all this.

I mean, we're still suffering the inferiority of LCDs compared to CRTs' infinite colours, every resolution native, etc. The mainstream completely and unreservedly made it beyond clear that they'd happily accept a degradation in almost all areas of picture quality in order to save 12 inches of space behind their TV. Which you can't see anyway, because there's a TV in the way (lol/sigh).

So how much do you think they care about HDR, which, evidently, even on a tech forum not everyone understands fully?


Yeah, wake me up next decade...

Could have had it all with Plasma, but the negative perception given to Plasma stuck with it until the end.

Rocking a Panny ZT60 here. Properly calibrated, I will put it up against any 4K LCD TV on the market, even three years later.
 