Microsoft is bringing Auto HDR to DirectX 11 and DirectX 12 PC games

Armenius · Extremely [H] · Joined: Jan 28, 2014 · Messages: 41,731
The much-touted feature available on the Xbox Series X|S is coming to PC games on Windows 10 for those running DirectX 11 and later. The feature is currently being tested on Windows Insider Program Dev Channel build 21337 and later. It will not work with every game out there, but it is claimed to currently work on over 1,000 titles. Perhaps we can look forward to this feature making its way into 21H1?

Bigger. Bolder. Brighter. High dynamic range (HDR) offers the most impressive improvement to the visual experience in recent history. HDR unlocks an entirely new range of colors with more intensity than standard monitors, making games come to life like never before. In November 2020, the Xbox Series X|S consoles launched with the Auto HDR feature which automatically upgrades your backwards compatible games from SDR to HDR to take advantage of this amazing display innovation and provide a richer visual experience even on already-released games.
Today we’re excited to bring you a preview of Auto HDR for your PC gaming experience and we’re looking for your help to test it out. Jump to the How to Enable Auto HDR section to get started! When enabled on your HDR capable gaming PC, you will automatically get awesome HDR visuals on an additional 1000+ DirectX 11 and DirectX 12 games!

Thanks to MrTX for the original tip in the LG 48CX thread.

https://devblogs.microsoft.com/directx/auto-hdr-preview-for-pc-available-today/
 
Uhm, no thanks. A lot of games that were designed for HDR already take it too far and just look like a mess of oversaturated images, good for getting a suntan in front of the monitor but not for a pleasant gaming experience. I dread to think how bad it looks when applied to games that weren't even designed with HDR in mind. Just leave it be as it was meant to be played.
 
One question I have is if this implementation of HDR will require HDR to be enabled in the windows display settings or not. I always leave HDR disabled in the windows display settings because it makes everything look too dark and washed out when I'm just browsing web pages, etc. I do like HDR in games however. I hope that it will allow HDR to be used in games without having to enable it for non-game / non-movie content.
 
Uhm, no thanks. A lot of games that were designed for HDR already take it too far and just look like a mess of oversaturated images, good for getting a suntan in front of the monitor but not for a pleasant gaming experience. I dread to think how bad it looks when applied to games that weren't even designed with HDR in mind. Just leave it be as it was meant to be played.

But think of all the brown browns and grey greys! And all those deep, deep shadows that make the brown and grey seem so vivid and alive!

Diablo in HDR! Just picture it!

[Attached: Diablo screenshot]


I don't know if any of you guys were around to remember getting to pick between 8-bit color and 16-bit color on the desktop, and then all the arguments about 24-bit color vs 32-bit color. And how some games didn't look right if you increased the bit depth.

HDR in games reminds me of that. It'll be this thing that is a big deal, there will be a bunch of screenshots and videos showing the difference but you won't really be able to tell unless you have an HDR capable setup, and then suddenly no one will care anymore.
 
One question I have is if this implementation of HDR will require HDR to be enabled in the windows display settings or not. I always leave HDR disabled in the windows display settings because it makes everything look too dark and washed out when I'm just browsing web pages, etc. I do like HDR in games however. I hope that it will allow HDR to be used in games without having to enable it for non-game / non-movie content.
According to the article linked in the OP, you need to enable the global Windows HDR option to unlock the Auto HDR option.
But think of all the brown browns and grey greys! And all those deep, deep shadows that make the brown and grey seem so vivid and alive!

Diablo in HDR! Just picture it!

[Attached: Diablo screenshot]

I don't know if any of you guys were around to remember getting to pick between 8-bit color and 16-bit color on the desktop, and then all the arguments about 24-bit color vs 32-bit color. And how some games didn't look right if you increased the bit depth.

HDR in games reminds me of that. It'll be this thing that is a big deal, there will be a bunch of screenshots and videos showing the difference but you won't really be able to tell unless you have an HDR capable setup, and then suddenly no one will care anymore.
32-bit color just adds the alpha channel (RGBA). We don't refer to it that way anymore because it causes confusion. Some things use the alpha channel while others don't. There is no bit depth that will make RGB into 32 bits. 10 bpc = 30-bit RGB, 12 bpc = 36-bit RGB.
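The arithmetic works out like this (a quick illustrative sketch in Python; the function names are just for demonstration, nothing Windows- or DirectX-specific):

```python
# Illustrative arithmetic only: bits per channel (bpc) vs. total RGB bit
# depth, and the number of representable colors at a given depth.

def rgb_bits(bpc: int, alpha: bool = False) -> int:
    """Total bits per pixel for 3 color channels, optionally plus alpha."""
    channels = 4 if alpha else 3
    return bpc * channels

def color_count(bpc: int) -> int:
    """Distinct RGB colors representable at a given bits-per-channel."""
    return (2 ** bpc) ** 3

print(rgb_bits(8))              # 24: "true color"
print(rgb_bits(8, alpha=True))  # 32: RGBA -- same colors, plus alpha
print(rgb_bits(10))             # 30: deep color
print(rgb_bits(12))             # 36: deep color
print(color_count(8))           # 16777216 colors at 8 bpc
```

Note that adding the alpha channel changes the total bits per pixel but not the number of displayable colors, which is exactly why "32-bit color" was always a misnomer.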

You really can't illustrate the impact of HDR in an SDR screenshot. Looking at an HDR screenshot of a game on an SDR monitor will typically make it look washed out or oversaturated. That is simply not the case in practice if you have a proper display to take advantage of it (FALD LCD or OLED).

The biggest issue with HDR right now is some developers not implementing it properly in their games. It is an afterthought in some of the biggest games like RDR2, and that can create false expectations for a lot of viewers.
 
But think of all the brown browns and grey greys! And all those deep, deep shadows that make the brown and grey seem so vivid and alive!

Diablo in HDR! Just picture it!

[Attached: Diablo screenshot]

I don't know if any of you guys were around to remember getting to pick between 8-bit color and 16-bit color on the desktop, and then all the arguments about 24-bit color vs 32-bit color. And how some games didn't look right if you increased the bit depth.

HDR in games reminds me of that. It'll be this thing that is a big deal, there will be a bunch of screenshots and videos showing the difference but you won't really be able to tell unless you have an HDR capable setup, and then suddenly no one will care anymore.
LOL. Maybe the Diablo 2 remaster WILL be in HDR! I do remember the 8-bit to 16-bit jump and it was an improvement. I think HDR could be beneficial in some cases, not in others, and obviously only on HDR equipment. HDR's uses for TV and movie watching are easy to quantify; games not so much, and even then probably only for certain types and styles of games. It'll be interesting to see tho (pun intended), and it could give a better impression than higher resolutions (8K+), which only make sense on ever larger screens.
 
I don't know if any of you guys were around to remember getting to pick between 8-bit color and 16-bit color on the desktop, and then all the arguments about 24-bit color vs 32-bit color. And how some games didn't look right if you increased the bit depth.

HDR in games reminds me of that. It'll be this thing that is a big deal, there will be a bunch of screenshots and videos showing the difference but you won't really be able to tell unless you have an HDR capable setup, and then suddenly no one will care anymore.
I've had a lot of arguments with people about 16-bit vs 24-bit color, who were saying "65K colors is enough, who needs 16 million shades!" Same as the people who said "DVD resolution is enough, who needs HD," and the current argument is "24 FPS is enough for movies."

This however is not about whether HDR has benefits. It's about whether running a basic algorithm on games to make colors look more vivid on HDR displays is a good thing. I have nothing against HDR when the game is designed around it and done well. This however will probably end up looking like they applied a bloom filter to the games and dialed saturation to 9000.
 
Bigger. Bolder. Brighter. High dynamic range (HDR) offers the most impressive improvement to the visual experience in recent history. HDR unlocks an entirely new range of colors with more intensity than standard monitors, making games come to life like never before.

LOL.
 
This however will probably end up looking like they applied a bloom filter to the games and dialed saturation to 9000.

My guess is they will all end up looking like Half-Life 2: Lost Coast. A couple neat saturation effects, mind-blowing at the time, but now nothing to write home about.
 
My guess is they will all end up looking like Half-Life 2: Lost Coast. A couple neat saturation effects, mind-blowing at the time, but now nothing to write home about.
You can enable active side-by-side in the registry, shown at the end of the article, so you can see the difference in real time.
 
You can enable active side-by-side in the registry, shown at the end of the article, so you can see the difference in real time.

Yep, I saw that. It’s exactly the kind of feature you need so people can actually see some kind of notable difference.
 
Yep, I saw that. It’s exactly the kind of feature you need so people can actually see some kind of notable difference.
It's pretty obvious if you have a proper display. Unfortunately most monitors and even televisions being sold right now don't even have any local dimming feature. "HDR400" monitors use what has been called "full-screen dimming" which looks worse than SDR in practice, and is probably why there are sour opinions on HDR proliferating. I don't even think that "HDR600" with local dimming is enough to get the real impact.
 
So effectively, the only people who might see a noticeable difference with this feature are the ones who bought an expensive HDR TV, or an even more expensive FALD HDR computer monitor.

Forgive my lack of enthusiasm, but the industry did this to itself when it decided on different levels of HDR certifications, and when it became clear that it wasn’t going to be properly supported in Windows for a while. It’s just a checkbox feature on the PC, to say “yeah we can do that” for the comparatively small portion of users who bought the right kind of HDR capable display.

Hence the need for a side-by-side comparison ability, so people can point at it and say “oh yeah, it actually does look different!”
 
32-bit color just adds the alpha channel (RGBA). We don't refer to it that way anymore because it causes confusion. Some things use the alpha channel while others don't. There is no bit depth that will make RGB into 32 bits. 10 bpc = 30-bit RGB, 12 bpc = 36-bit RGB.
Let me introduce you to my friend the 16-bit 5-6-5 TFT display.
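For anyone who never met that friend: 5-6-5 packing squeezes a pixel into 16 bits, 5 for red, 6 for green, 5 for blue, which is where the "65K colors" figure comes from. A minimal sketch (function names are just for illustration, not from any real driver API):

```python
# RGB565 packing sketch: 5 bits red, 6 bits green, 5 bits blue = 16 bits,
# 65,536 representable colors. Green gets the extra bit because the eye
# is most sensitive to it.

def pack_rgb565(r: int, g: int, b: int) -> int:
    """Pack 8-bit-per-channel RGB into a 16-bit 5-6-5 value (truncating)."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def unpack_rgb565(v: int) -> tuple:
    """Expand a 5-6-5 value back to approximate 8-bit channels."""
    r = (v >> 11) & 0x1F
    g = (v >> 5) & 0x3F
    b = v & 0x1F
    # Replicate the high bits into the low bits so full white maps to 255.
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

print(hex(pack_rgb565(255, 255, 255)))  # 0xffff
print(unpack_rgb565(0xFFFF))            # (255, 255, 255)
```

The truncation in `pack_rgb565` is exactly the banding people argued about: nearby 24-bit shades collapse into the same 16-bit value.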
 
Even within the HDR certifications there's a wide range of quality. My LG is only HDR400 but honestly looks pretty decent. I have another monitor that is also HDR400 and it looks horrible.

I know HDR1000 would be much better, I have seen some better TVs at Best Buy but those monitors are too expensive right now. I can afford them, but no way I'm paying over $2k for a monitor.
 
So effectively, the only people who might see a noticeable difference with this feature are the ones who bought an expensive HDR TV, or an even more expensive FALD HDR computer monitor.

Forgive my lack of enthusiasm, but the industry did this to itself when it decided on different levels of HDR certifications, and when it became clear that it wasn’t going to be properly supported in Windows for a while. It’s just a checkbox feature on the PC, to say “yeah we can do that” for the comparatively small portion of users who bought the right kind of HDR capable display.
Not to mention the price barrier on the right kind of HDR display, which is not even available in all sizes and formats.
 
Even within the HDR certifications there's a wide range of quality. My LG is only HDR400 but honestly looks pretty decent. I have another monitor that is also HDR400 and it looks horrible.

I know HDR1000 would be much better, I have seen some better TVs at Best Buy but those monitors are too expensive right now. I can afford them, but no way I'm paying over $2k for a monitor.

If 48" isn't too big of a monitor for you, LG's CX seems like an ideal choice right now. Around $1400-$1500 at most places, or a little under $1100 for a refurb model on Woot.
 
I think the angry people just don't want to face the reality that they need to buy a display with good HDR now... ;) Very exciting development here, since W10 can use all the HDR help it can get, and good HDR is mindblowing.

Hard to say how it will turn out, until it happens, but I'm looking forward to progress.
 
Yeah, I remember all the naysayers when HD first came out. I was the first one of my friends to get an Xbox360 and as soon as they came over and played Fight Night, the argument was dead.

Also, when HD-DVD/Blu-ray came out, people were saying things like "why does anime have to be in HD?" and "I don't want to watch porn and see all the pores in the girl's skin," and other such nonsense.
 
I've had a lot of arguments with people about 16-bit vs 24-bit color, who were saying "65K colors is enough, who needs 16 million shades!" Same as the people who said "DVD resolution is enough, who needs HD," and the current argument is "24 FPS is enough for movies."

This however is not about whether HDR has benefits. It's about whether running a basic algorithm on games to make colors look more vivid on HDR displays is a good thing. I have nothing against HDR when the game is designed around it and done well. This however will probably end up looking like they applied a bloom filter to the games and dialed saturation to 9000.
Well it isn't necessarily just a basic algorithm. The thing is, many games have been internally HDR for a long time. They use floating-point math and deal in a linear light space that goes beyond what a display is capable of, and then decide how to render that onto the display's available range. So if the OS can look into those higher-depth buffers and use them to render out to a higher-range display, it could work well. How well will it work? Dunno, and I imagine it'll vary per game depending on how its internal HDR calculations work. Some games that support HDR do a shit job of it, No Man's Sky is a great example, and there's no reason to think that games without explicit support will all be great. However, it may be put to good effect in other titles.
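To illustrate that idea (this is a toy sketch of the general concept, not Microsoft's actual Auto HDR algorithm, and all the numbers here are hypothetical):

```python
# Toy illustration: renderers compute linear light values well above 1.0,
# then tone-map down to the display's range. An SDR path crushes bright
# values together; an HDR-aware path can keep them separated.

def tonemap_sdr(linear: float) -> float:
    """Reinhard-style operator: compress [0, inf) into [0, 1) for SDR."""
    return linear / (1.0 + linear)

def map_hdr(linear: float, peak_nits: float = 1000.0,
            sdr_white_nits: float = 100.0) -> float:
    """Map linear scene light to display nits, clipped at the panel's peak."""
    return min(linear * sdr_white_nits, peak_nits)

scene = [0.1, 1.0, 4.0, 20.0]  # hypothetical linear light values from a renderer
print([round(tonemap_sdr(v), 3) for v in scene])  # bright values bunch up near 1.0
print([map_hdr(v) for v in scene])                # bright values stay distinct, in nits
```

The interesting part is that none of this needs the game's cooperation if the OS can intercept the buffers before the SDR tone-map happens, which is presumably why it only works on DX11/DX12 titles.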

Be a nice option to play with, if we ever get some good HDR monitors.
 
I know this is MS and all, but I'd rather see them focus on getting this into something open instead of DirectX. Working to get this kind of feature into Vulkan would be preferable to more DirectX platform-specific (i.e., Windows-required) focus. MS has seen how they can benefit from people on other platforms buying their items, notably games on Steam, hardware like controllers, etc., when they make them (or at least allow them to be) platform agnostic. Let's hope the same happens here. I mean, maybe it will be added to DXVK in the future or something, but let's just focus on getting it into Vulkan from the start instead.
 
I agree, but this is Microsoft, and the vast majority of existing games use DirectX, so this would be more beneficial there. Also, any new games in Vulkan can add their own HDR support.
 
I think this is a good point: any game made in Vulkan was made in a world in which HDR existed, so if the game wanted to support it, it could have. (If I have the timeline right), this is for older titles.
 
I think this is a good point: any game made in Vulkan was made in a world in which HDR existed, so if the game wanted to support it, it could have. (If I have the timeline right), this is for older titles.
Microsoft seems to be hitting on a lot of gaming cylinders right now. Their recent publications on their DirectML work have been pretty awesome and sort of exciting. They have also been working with some third parties to re-port Xbox titles to the PC for the Windows store/Game Pass. Which is angering Steam users and stuff, but I get it: if they want to push the idea that Microsoft can do games, putting shitty ports in your own store isn't a great way to start.
 