HDR gaming on PC

polonyc2 · Fully [H] · Joined Oct 25, 2004 · Messages: 25,859
anyone here have an HDR gaming monitor or use a TV that has HDR support for PC gaming?...why is HDR lagging behind on PC while it's really taking off on consoles, UHD Blu-ray, and streaming (Netflix, Amazon etc all have HDR support)?...I bought an LG C7 OLED (4K + HDR support) last July (I have a separate dedicated G-Sync gaming monitor) and HDR is a real game changer in terms of visuals...way more important than the 4K resolution itself

my LG has support for all the current HDR formats including the best one, Dolby Vision...watching shows such as Altered Carbon (Netflix Dolby Vision) or Electric Dreams (Amazon HDR) is amazing because of the jaw-dropping HDR implementation...not to mention UHD Blu-rays with HDR support

why isn't HDR taking off as far as PC gaming support?...there seems to be a very limited selection of PC titles with HDR support (Mass Effect Andromeda and Battlefront 2 being some recent examples)...I know pricing is an issue for dedicated HDR gaming monitors but there has to be other reasons
 
I think the adoption rate is low because HDR monitors are still in the "coming soon" stage of development. Even after they are out and on the market, I suspect that the adoption rate will still be slow because the initial price leaks for the GSYNC HDR monitors were $3,000. If they are anywhere near the supposed leaked prices, HDR will be dead before we ever knew it launched.
 
I think the adoption rate is low because HDR monitors are still in the "coming soon" stage of development. Even after they are out and on the market, I suspect that the adoption rate will still be slow because the initial price leaks for the GSYNC HDR monitors were $3,000. If they are anywhere near the supposed leaked prices, HDR will be dead before we ever knew it launched.

Well you have a point that GSYNC HDR rumored pricing is ridiculous.

But GSYNC and HDR are two separate things. Existing HDR monitors, even with Freesync HDR, don't carry nearly as high of a price premium. So it isn't HDR that's adding all that price - that's all (rumored to be) on nVidia right now.

I like HDR a lot, I agree with the OP that it matters to me more than 4K (also coming from an LG C7). I can't speak to high framerates or VRR to support that - a lot of people say that's more impactful, and I won't discount that out of hand, I just don't have any personal experience with it.

I recently picked up a couple of cheap "good enough" 4K IPS monitors (P2715Q) to replace some old LCDs for my main computer. They don't have all the bells and whistles that I want, but right now, no monitor does - so I'm not going to spend a ton of money on what is going to be a compromise somewhere.
 
I think the adoption rate is low because HDR monitors are still in the "coming soon" stage of development. Even after they are out and on the market, I suspect that the adoption rate will still be slow because the initial price leaks for the GSYNC HDR monitors were $3,000. If they are anywhere near the supposed leaked prices, HDR will be dead before we ever knew it launched.

pricing of HDR gaming monitors definitely plays a big role but it also seems like developers aren't putting forth an effort for PC games while console games are getting a ton of HDR support...didn't AC: Origins come out of the box with HDR support on consoles while PC took longer (does the PC version even have HDR support?)...makes me want to buy a PS4 Pro to enjoy HDR gaming
 
People looked online to see if HDR was easy to use and saw a lot of issues, many of which couldn't be fixed because it was broken in the OS.
They decide it's not worth buying a new monitor for.
That's why the uptake of HDR is so slow on PC.
Monitors don't sell, manufacturers don't invest in better ones, sales stagnate further, and game devs don't bother with it.

Blame Microsoft for not knowing wtf they were doing in the first place and not giving a damn about the consumer.
I bet the huge delay in partially sorting it out was wasted trying to monetise it.
 
Seems like whenever I use HDR it fucks my color profile out of this world after quitting out of games and I have to manually turn it back off in Windows settings every single time. I dunno why it can't just figure out how to do it automatically.
 
For games I find that it works absolutely fine. Many/most AAA titles support it. Resident Evil 7, Destiny 2, Far Cry 5, Mass Effect Andromeda, Assassin's Creed Origins, Final Fantasy XV, etc.
The issue is MS's weird implementation of it on the desktop. If you just straight-up ignore that, it'll kick in and work fine on everything except the Netflix app. If you want it to work with the Netflix app, you'll have to enable it prior to opening the app. It'll look janky, but once the movie/show starts it'll look right.
 
I play on an LG OLED and the games that support HDR look great. It does annoy me a bit that there are a few console games with HDR support that don't get it on PC.

The only game it's never worked with for me is Mass Effect, which was one of the first (if not the first?) PC game to support HDR. It looks washed out and shitty. The same issue does not exist in BF1 or Battlefront 2, which both look great.
 
I play on an LG OLED and the games that support HDR look great. It does annoy me a bit that there are a few console games with HDR support that don't get it on PC.

The only game it's never worked with for me is Mass Effect, which was one of the first (if not the first?) PC game to support HDR. It looks washed out and shitty. The same issue does not exist in BF1 or Battlefront 2, which both look great.

They ended up breaking normal HDR support for that game after the fact. It worked fine for the first month or so. Dolby apparently still works fine as-is, but HDR10 ended up getting tied to the system-wide HDR setting. That was right when the first Creators Update was hitting and EA went all-in by changing how HDR was going to be handled. They also subsequently stopped updating the game almost immediately afterward :p
If you still have it installed (or still care), try toggling it on system-wide and see if that helps. There were some people on the EA threads that claimed that worked, but I haven't looked into it in 6-7 months.
 
The only game it's never worked with for me is Mass Effect, which was one of the first (if not the first?) PC game to support HDR. It looks washed out and shitty. The same issue does not exist in BF1 or Battlefront 2, which both look great.

I think that was the first game to support Dolby Vision (which is an advanced form of HDR)...Dolby Vision looks stunning on my LG C7 in terms of non-gaming content so I was hoping the game would be the showcase title for HDR on PC
 
My TV goes washed out and purple with my PC and HDR. MS and nVidia blame each other from what I can gather on forums. It may be fixed by now, but I haven't cared enough to try it again.
 
My problem has been screen tearing while in full screen, which is mandatory for HDR. (I have a C7 also) FC5 is the best example. When I enable full screen and HDR it looks awesome, but I get awful tearing even though I have the framerate capped at 60 and can easily maintain 60fps.
 
My problem has been screen tearing while in full screen, which is mandatory for HDR. (I have a C7 also) FC5 is the best example. When I enable full screen and HDR it looks awesome, but I get awful tearing even though I have the framerate capped at 60 and can easily maintain 60fps.

Interesting. What kind of connection are you using? I have no issues with vsync and HDR and am using a normal HDMI 2.0 hook-up.
 
Without some form of sync on you're going to get tearing. It doesn't matter if the fps perfectly matches the refresh rate, that just means you'll get consistent tearing in the same spot.
 
Interesting. What kind of connection are you using? I have no issues with vsync and HDR and am using a normal HDMI 2.0 hook-up.

HDMI from my 1080ti, cable doesn’t matter right?
 
Without some form of sync on you're going to get tearing. It doesn't matter if the fps perfectly matches the refresh rate, that just means you'll get consistent tearing in the same spot.

That doesn’t explain why the tearing goes away while in borderless windowed mode.
 
That doesn’t explain why the tearing goes away while in borderless windowed mode.

I haven't messed with it lately, but with some previous TVs I *had* to run everything in windowed borderless or I'd get crazy stutter with 90% of games. Fullscreen vs. windowed fullscreen uses a slightly different refresh rate and it can make a big difference depending on your monitor's specs.
I might be wrong, but I think windowed fullscreen also auto-enables vsync. If you're in that situation, give this program a shot: https://github.com/Codeusa/Borderless-Gaming/releases
It was a lifesaver for me.
 
I have question:

For people with HDR - HDR is more bits, does this affect frame rate and render time?

Is it time for a review comparing with and without in some of the more demanding games?

I am curious and thinking I want to make a TR system with the new Ryzen parts in it. HDR is something I would like to do.
 
Windowed mode enables vsync. Same for borderless. There are guides for getting vsync to have lower latency, using fastsync or whatever, on Blur Busters. It still won't have lower latency than vsync off.
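For anyone wondering how much latency vsync can actually add, here's a rough back-of-the-envelope sketch. The numbers are idealized worst cases (function name is mine); real drivers may queue additional frames on top of this:

```python
# Worst-case added display latency with vsync: a finished frame can wait
# up to one full refresh interval for the next flip, and each additional
# queued frame adds up to another interval.
def vsync_worst_case_ms(refresh_hz: float, queued_frames: int = 1) -> float:
    return queued_frames * 1000.0 / refresh_hz

print(vsync_worst_case_ms(60))      # up to ~16.7 ms at 60 Hz, double-buffered
print(vsync_worst_case_ms(60, 3))   # ~50 ms with a 3-frame render queue
```

That's why the Blur Busters guides focus on capping framerate just below refresh: it keeps the render queue empty, so you avoid the stacked intervals.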
 
HDMI from my 1080ti, cable doesn’t matter right?
The cable matters greatly if it doesn't support the bandwidth you are trying to use.
If you want 4K 60Hz with 4:4:4, or 4K/60 4:2:2 10-bit (HDR), you need the full bandwidth of HDMI 2.0.
Having said that, shorter cable lengths have fewer issues even with cheap cables.
Some short HDMI 1.4 cables turn out to be good enough to support 17.8Gbps.
Long cables (over a few metres) are where the real problems start; even when certified at 18Gbps, some can't do it.
The fatter the wire gauge, generally the better, but even then there are limits.

If you are having dropouts, connection issues or other display problems not related to gfx card function, it's worth considering whether your cable is up to the job.
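To put numbers on that, here's a rough sketch of the TMDS link-rate math; the 17.8Gbps figure above falls straight out of it. The function and its simplifications are mine (it ignores FRL, audio, and auxiliary data), but the constants come from the standard 4K60 timing:

```python
# Approximate HDMI TMDS on-wire rate. The CTA-861 4K60 timing has a
# 594 MHz pixel clock (4400 x 2250 total pixels including blanking).
PIXEL_CLOCK_4K60_HZ = 594e6

def tmds_link_gbps(pixel_clock_hz: float, bpc: int = 8, chroma: str = "4:4:4") -> float:
    if chroma == "4:2:2":
        # HDMI packs up to 12-bit 4:2:2 into the standard pixel container,
        # so the TMDS clock doesn't rise with bit depth.
        tmds_clock = pixel_clock_hz
    elif chroma == "4:2:0":
        tmds_clock = pixel_clock_hz * bpc / 8 / 2  # half the samples on the wire
    else:  # 4:4:4 / RGB: clock scales with bit depth
        tmds_clock = pixel_clock_hz * bpc / 8
    # 3 data lanes, 10 bits per TMDS character (8b/10b coding)
    return tmds_clock * 3 * 10 / 1e9

print(tmds_link_gbps(PIXEL_CLOCK_4K60_HZ, 8, "4:4:4"))   # 17.82 -> just fits HDMI 2.0's 18Gbps
print(tmds_link_gbps(PIXEL_CLOCK_4K60_HZ, 10, "4:2:2"))  # 17.82 -> 10-bit HDR fits via subsampling
print(tmds_link_gbps(PIXEL_CLOCK_4K60_HZ, 10, "4:4:4"))  # 22.275 -> too much for HDMI 2.0
```

That last line is why you can't get 4K60 HDR with full 4:4:4 chroma over HDMI 2.0, no matter how good the cable is.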
 
I have question:

For people with HDR - HDR is more bits, does this affect frame rate and render time?

Is it time for a review comparing with and without in some of the more demanding games?

I am curious and thinking I want to make a TR system with the new Ryzen parts in it. HDR is something I would like to do.

The cost in terms of rendering is minute.
 
Video performance with and without HDR is identical with every game I've personally tested.
What you might notice is that the response time on different TVs changes when HDR mode is enabled. Sometimes it's negligible but with some models it can be extreme. I leave it on in Destiny (I'm a pad player anyway), but if I were playing at a tournament level it would definitely get turned off. My TV's one of the better ones and there is still a noticeable difference in mouse responsiveness.
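For a sense of why the render cost is so small: many engines already render internally at FP16 precision and tone-map down to 8-bit, so HDR output mostly changes the final buffer format. A rough sizing sketch (not a benchmark):

```python
# Back-buffer memory at 3840x2160: 8-bit RGBA (typical SDR output) vs
# FP16 RGBA, a format commonly used for HDR swap chains.
W, H = 3840, 2160
rgba8_mib   = W * H * 4 / 2**20   # RGBA8:   4 bytes/pixel
rgba16f_mib = W * H * 8 / 2**20   # RGBA16F: 8 bytes/pixel
print(round(rgba8_mib, 1), round(rgba16f_mib, 1))  # ~31.6 vs ~63.3 MiB
```

An extra ~32 MiB and a different scanout format is noise next to the gigabytes of textures and intermediate targets a modern game already juggles, which fits the "identical performance" observation above.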
 
Video performance with and without HDR is identical with every game I've personally tested.
What you might notice is that the response time on different TVs changes when HDR mode is enabled. Sometimes it's negligible but with some models it can be extreme. I leave it on in Destiny (I'm a pad player anyway), but if I were playing at a tournament level it would definitely get turned off. My TV's one of the better ones and there is still a noticeable difference in mouse responsiveness.

Thank you.

I'm still a few months out from pulling the purchase trigger... what model/brand do you have so I can keep it in mind?
 
Thank you.

I'm still a few months out from pulling the purchase trigger... what model/brand do you have so I can keep it in mind?

I'm on a 65" Samsung KS8000. It's a model that launched about 18 months ago, so you probably won't even see them on the shelf anymore.
Rtings lists (or at least they used to) response times for major televisions, and at the time it was basically considered THE gaming television for under $3000. Considering how quickly technology moves, I'm sure there are better models now. HDR requires a totally different color calibration on that model and I know there are other models that do a better job of handling it. I have it saved as my "gaming mode," which is less than ideal.
 
I'm on a 65" Samsung KS8000. It's a model that launched about 18 months ago, so you probably won't even see them on the shelf anymore.
Rtings lists (or at least they used to) response times for major televisions, and at the time it was basically considered THE gaming television for under $3000. Considering how quickly technology moves, I'm sure there are better models now. HDR requires a totally different color calibration on that model and I know there are other models that do a better job of handling it. I have it saved as my "gaming mode," which is less than ideal.

Pretty much this. Rtings.com is a godsend when reviewing TVs like this, especially since they do focus on how well they work as PC/Gaming monitors.

As a general rule, HDR is going to impact Response Time; how much is dependent on resolution and display mode. The "HDR Game Mode" on most sets helps quite a bit.

I've got an LG OLED B6P; almost went with the KS8000 myself (it was the other finalist).
 
I have a 43" Sony Bravia X800D hooked up as my monitor. It doesn't have the higher-end backlighting so HDR doesn't really pop as much as it could, but overall it's a decent effect. I'd like to upgrade to a better HDR TV but they are all 50"+ which is too big to use as a PC monitor in my current setup.
 
I made the mistake of getting a 2016 OLED. Still looks great as a TV, but no HDR HLG support, and the HDR game mode that was added severely lowered the brightness, to the point it's useless. I can use non-gaming HDR for games, but input lag becomes roughly 80 ms, so not useful in games that require quick reflexes. (Now, I don't really regret the TV, as it's great, outside that one area. But I don't see myself upgrading any time soon).
 
Why does HDR need to have a standard when I could achieve the same thing with a 10 bpc monitor? After checking my dxdiag info dump, it appears I can't do a damn thing with HDR even with 10 bpc turned on. :rage:
 
Why does HDR need to have a standard when I could achieve the same thing with a 10 bpc monitor? After checking my dxdiag info dump, it appears I can't do a damn thing with HDR even with 10 bpc turned on. :rage:
The maximum brightness and the range of displayable colours vary a lot between displays.
To get the best result from any HDR image without fine-grained calibration for every movie (for example), there needs to be a way of mapping a display's capabilities to what the video's author intended you to see.
It doesn't just need a standard, it needs a very good one.
That's why there are so many standards; they are evolving.
This needs to happen.

The early days of many important technologies take time to establish, and early adopters pay for the privilege.
That's not to say you should be having a hard time doing things manually.
But it is still early days.
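That author-intent-to-display-capability mapping is exactly what HDR10's PQ curve (SMPTE ST 2084) plus display-side tone mapping standardize. A minimal sketch of the PQ EOTF with a naive clip to the display's peak brightness (function names are mine, and real TVs use a gradual highlight roll-off rather than a hard clip):

```python
# SMPTE ST 2084 (PQ) EOTF: maps a non-linear signal in [0, 1] to
# absolute luminance in nits, up to a 10,000-nit mastering ceiling.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

def tone_map_nits(nits: float, display_peak: float = 700.0) -> float:
    # Naive hard clip to what the panel can actually show.
    return min(nits, display_peak)

print(round(pq_eotf(1.0)))   # 10000 nits: the full PQ range
print(round(pq_eotf(0.5)))   # ~92 nits: half-signal is nowhere near half-brightness
```

The steep curve is the point: most code values are spent on the dim-to-midtone range where eyes are sensitive, and the display's job is deciding what to do with mastered highlights it can't physically reach, which is why every manufacturer's tone mapping looks different.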
 
The maximum brightness and the range of displayable colours vary a lot between displays.
To get the best result from any HDR image without fine-grained calibration for every movie (for example), there needs to be a way of mapping a display's capabilities to what the video's author intended you to see.
It doesn't just need a standard, it needs a very good one.
That's why there are so many standards; they are evolving.
This needs to happen.

The early days of many important technologies take time to establish, and early adopters pay for the privilege.
That's not to say you should be having a hard time doing things manually.
But it is still early days.

I'll just have to wait for an amazing ultrawide HDR display years from now.
 
PC has an HDR support problem - and Nvidia wants to fix it

the PC format has fallen behind in one key area: support for high dynamic range - the future of display technology...HDR screens for PC users are thin on the ground and often poorly specced, while the list of supported games stands at less than 50 per cent of the number found on consoles...are extreme spec 4K displays the answer?...

https://www.eurogamer.net/articles/digitalfoundry-2018-how-nvidia-aims-to-address-pcs-hdr-problem
 
PC has an HDR support problem - and Nvidia wants to fix it

the PC format has fallen behind in one key area: support for high dynamic range - the future of display technology...HDR screens for PC users are thin on the ground and often poorly specced, while the list of supported games stands at less than 50 per cent of the number found on consoles...are extreme spec 4K displays the answer?...

https://www.eurogamer.net/articles/digitalfoundry-2018-how-nvidia-aims-to-address-pcs-hdr-problem

Can't wait to see "nVidia GeDR", or whatever proprietary thing they throw at it, that only comes on monitors bundled with GSync for extra $$$, only works on mid-high tier cards, and only in games where they pay the developers to include support.

But it will be amazing... /rolls eyes
 
PC has an HDR support problem - and Nvidia wants to fix it

the PC format has fallen behind in one key area: support for high dynamic range - the future of display technology...HDR screens for PC users are thin on the ground and often poorly specced, while the list of supported games stands at less than 50 per cent of the number found on consoles...are extreme spec 4K displays the answer?...

https://www.eurogamer.net/articles/digitalfoundry-2018-how-nvidia-aims-to-address-pcs-hdr-problem

Bullshit article is bullshit. And this is coming from someone who considers AMD video cards to not exist. I will only buy Nvidia video cards, but there is no denying that their proprietary technologies are fragmenting the market. Pushing these 4K 120hz G-Sync HDR monitors is not going to fix HDR on PC. These monitors are going to be insanely expensive, and loaded full of features that many people cannot utilize or do not want.

I have a G-Sync monitor and I think it's incredible stuff, but it would be a lot better for the monitor market if Nvidia didn't push this proprietary technology on us. I don't want a 4K display, I'd rather stick at 1440p 144hz. If I could pick from a selection of freesync HDR monitors, I'd buy one in a heartbeat. But since I have to wait for an appealing option with G-Sync, it will likely be some time before I have an HDR monitor.

And all the while devs pay no mind to the fact that my LG OLED makes a great companion to my PC for couch gaming.
 
I play on an LG OLED and the games that support HDR look great. It does annoy me a bit that there are a few console games with HDR support that don't get it on PC.

The only game it's never worked with for me is Mass Effect, which was one of the first (if not the first?) PC game to support HDR. It looks washed out and shitty. The same issue does not exist in BF1 or Battlefront 2, which both look great.

What settings do you guys use? I have a 2017 LG OLED. It seems to either 1) not work, 2) have shitty colors, or 3) have terrible brightness.

I tried messing with the TV settings, Nvidia control panel settings, and the windows on/off. Not sure what I'm doing wrong.
 
What settings do you guys use? I have a 2017 LG OLED. It seems to either 1) not work, 2) have shitty colors, or 3) have terrible brightness.

I tried messing with the TV settings, Nvidia control panel settings, and the windows on/off. Not sure what I'm doing wrong.

I have HDR turned on in Windows. The only game that doesn't work right for me is Mass Effect. If given a choice (which only seems to exist in Frostbite games), I use HDR10, not Dolby Vision. I manage brightness in game and leave my TV settings alone.
 
I have HDR turned on in Windows. The only game that doesn't work right for me is Mass Effect. If given a choice (which only seems to exist in Frostbite games), I use HDR10, not Dolby Vision. I manage brightness in game and leave my TV settings alone.

what are your TV settings, just one of the default presets? PC named as Input? Low/High black levels? People say to use High black levels with a PC input (or on PS4 with Full range) but it makes the blacks look grey instead.

What about nvidia settings? using Default color settings? or NVIDIA color settings? and what color outputs?

There are so many options..
 
And all the while devs pay no mind to the fact that my LG OLED makes a great companion to my PC for couch gaming.

since you already have an LG OLED then you know what a game-changer HDR is with the right content...watch Altered Carbon on Netflix with Dolby Vision or Electric Dreams w/HDR on Amazon Prime and your jaw will drop, not from the 4K resolution but from the implementation of HDR...I think HDR monitors are way more important than SLI or multi-monitor setups etc

if you already have a G-Sync monitor then you're already bought into Nvidia's proprietary tech...if more developers brought HDR support to PC it would be amazing...as it is now I feel like the console experience has made a tremendous leap forward with HDR support in games like God of War, Horizon Zero Dawn etc
 
what are your TV settings, just one of the default presets? PC named as Input? Low/High black levels? People say to use High black levels with a PC input (or on PS4 with Full range) but it makes the blacks look grey instead.

What about nvidia settings? using Default color settings? or NVIDIA color settings? and what color outputs?

There are so many options..

I don't know what my TV settings are off the top of my head, and I'm not at home to check. But everything goes through an AV receiver then to my TV, so the settings for the input are the same settings I configured for my Blu-Ray player. I haven't made any setting changes in Nvidia control panel.

since you already have an LG OLED then you know what a game-changer HDR is with the right content...watch Altered Carbon on Netflix with Dolby Vision or Electric Dreams w/HDR on Amazon Prime and your jaw will drop, not from the 4K resolution but from the implementation of HDR...I think HDR monitors are way more important than SLI or multi-monitor setups etc

if you already have a G-Sync monitor then you're already bought into Nvidia's proprietary tech...if more developers brought HDR support to PC it would be amazing...as it is now I feel like the console experience has made a tremendous leap forward with HDR support in games like God of War, Horizon Zero Dawn etc

I'm halfway through Altered Carbon, and while I'm having trouble getting into the show (like the story, hate the acting), it's definitely very visually striking. I'll check out Electric Dreams, looks interesting.

And you're right, I did already buy into Nvidia's proprietary tech. My point was that locking down people's monitor choices based on their GPU choice is not good for the growth of new monitor tech. I don't care what Nvidia says; I do not believe they are positioned to fix the poor HDR adoption for PC monitors if they continue with their current tactics.
 