PC Has an HDR Support Problem

For those who say 4K is niche: well, I read the exact same comments 10 years ago about 1080p, and a few years before that about 720p. I also remember explaining to my friends with consoles the visual difference between 30fps and 60fps, and most didn't get it until they saw it on my rig (at the time a hot-rodded P4 at 3.4GHz, connected via S-Video with gold contacts to a 32" Sony Trinitron at 1024x768). As hypocritical as it was, I said the same about 120Hz (based on outdated info from high school stating human vision was limited to ~60fps), and then I got a 3D laptop with 120Hz. Since then I've been off the deep end with display tech. I've reached the limit of what I can perceive at 144Hz in terms of frame rate, but I can still make out differences in color depth, though I can occasionally be fooled by pixel manipulation (checkerboarding, mirrors, or even AA).

HDMI 2.1 looks to be a cool thing. I've commented in a number of places that I feel we're really at a point where we need to be able to test the throughput of our ports and cables. I've had to do a lot of rearranging in our living room to accommodate new stuff and keep cables hidden. This has brought on new education and experience regarding cable length thresholds vs. HDR10/4K/60Hz/4:4:4 or 1440p/144Hz. I can tell you from experience that with both DP 1.2 and HDMI 2.0 we've reached the limits any time you use a cable past 3-4'. Dolby Vision and the next couple of HDR specs on the way, plus higher refresh rates, look set to fill up HDMI 2.1 even before it gets released. When 2.0 came out, many experts said it was fake and that any high-speed 1.4 cable would do. It took me about 3 months to find out how untrue that was if you wanted all the bells and whistles of 4K over 3 feet. DP vs. HDMI doesn't matter to me anymore; I just want something that works and can hold its own for more than a couple of years. These days the options are evolving quicker than the ports or GPUs that drive them.
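Just to put rough numbers behind that, here's a back-of-the-envelope sketch of why 4K/60Hz/10-bit/4:4:4 sits right at the edge of what DP 1.2 and HDMI 2.0 can carry. The blanking-overhead factor and the usable-payload figures below are my own approximations, not spec-exact values:

```python
# Back-of-the-envelope uncompressed video bandwidth estimate.
# Assumptions: ~10% blanking overhead for reduced-blanking timings, and
# "usable" link rates that roughly account for encoding overhead.

def video_gbps(width, height, refresh_hz, bits_per_channel):
    """Approximate data rate in Gbit/s for an RGB / 4:4:4 signal."""
    blanking_overhead = 1.10  # assumed reduced-blanking timing overhead
    bits = width * height * refresh_hz * bits_per_channel * 3 * blanking_overhead
    return bits / 1e9

# Approximate usable payload rates after encoding overhead (Gbit/s).
links = {"DP 1.2": 17.28, "HDMI 2.0": 14.4, "HDMI 2.1 (announced)": 42.6}

modes = {
    "4K/60Hz/10-bit/4:4:4 (HDR10)": video_gbps(3840, 2160, 60, 10),
    "1440p/144Hz/8-bit/4:4:4": video_gbps(2560, 1440, 144, 8),
}

for name, rate in modes.items():
    fits = [link for link, cap in links.items() if rate <= cap]
    print(f"{name}: ~{rate:.1f} Gbit/s -> fits: {', '.join(fits) or 'none'}")
```

Both modes land within a gigabit or two of what the current links can deliver, which is exactly the "works on a 3' cable, flakes out on a longer one" territory. (DP 1.2 has the raw bandwidth on paper, but as far as I know the HDR metadata side of things didn't show up until DP 1.4, which is part of why the ports always feel a step behind the features.)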

Back to the root of this thread. People who want to stay at 1080p, 1440p, or even ultrawide: that's great, and there's nothing wrong with it. I game mostly at 1440p/144Hz, sometimes at 4K/60 HDR or non-HDR, and occasionally I'll test things at 1080p/120Hz, just because I remember when I first started gaming in 1080p and thinking that one day the textures should catch up to look more like a Blu-ray movie, and slowly (over a decade) they are. Ironically enough, the bigger issue isn't really display resolution but texture resolution (which is getting better, but also shows how even modern GPUs can struggle at 1080p; KCD, I'm looking at you) and actual feature support.

Which brings us back to the fact that PC gaming HDR support is a hot mess. Throw in the two dominant standards right now (HDR10 and Dolby Vision) plus the half dozen or so HDR monitor spec types and it gets even worse. Add the fact that most devs don't even know how to properly implement it in their games and it becomes a nightmare. From anal-retentive engineers claiming that a few dots on one screen look better than on the other version, to money-hungry marketing execs looking for the next big label to cash in on, I think they need to just slow down and let the market settle into something that can earn greater consumer confidence. It's not that the goalposts are moving so much as that every other day a new one is added, and no one really knows which to aim for. Hence, PC has an HDR support problem.
 
The problem with PC is that there's no software-level standard for it. NVIDIA implemented its own solution at the driver level, and then Microsoft did its own thing 3 months later that conflicts with it. It also seems that HDR is largely being ignored by Microsoft now that they've bullied their way in.

What doesn't make sense to me is there are so many console versions of multiplatform titles that do HDR, but their PC versions do not. Why make the extra effort to remove a feature from one version of the game?
 
For 4K:

1. There needs to be content.

There actually is a ton of content. Netflix is basically releasing everything they put on the platform in 4K, as is Amazon with their in-house productions. There are tons of UHD discs and a lot of console games with an X or Pro (many of which properly support HDR too, like the Forza I mentioned above, but numerous others)... That said, a very high-end panel with HDR makes a much bigger difference than 4K does. I have a Samsung 9500 LED and an LG OLED that both do a great job with HDR. There's a scene in Stranger Things where they're in a pitch-black area with a flashlight, and for the first time in many years of "high end" viewing it looked "real" (it hurt my eyes when it came on screen). And that was on the OLED, which isn't as bright as the LED...

PC gaming HDR, though, is a different story, since most of us are still using monitors, and justifying a monitor that supports HDR is harder than justifying a nice HT TV. I know some of us will plug into a high-end TV (Kyle uses a 9000-series Sammy for his main, for example), but that's not for everyone.

I don't know that it's content; I think what we're experiencing now is a saturated market with panels that are just crappy. The difference between 4K and 1080p for the average user is going to be near nothing, so they aren't demanding or seeking content. Most are getting a crappy 50-60" 4K TV and mounting it 15' from the viewing location. Sure, there's some placebo effect, but on a 65" panel the returns are gone past about 8'. Then there are those of us who designed a HT around the viewing experience and have proper hardware, properly spaced... I'm even on the outside of proper 4K at just under 8.5' for my viewing spot, but that was the farthest my wife would let me move the couch up ;).
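For what it's worth, that ~8' figure lines up with the usual 1-arcminute acuity rule of thumb. Here's a rough sketch of the math; the "20/20 vision resolves about 1 arcminute per pixel" assumption is a simplification (contrast, motion, and subpixel structure all matter too):

```python
import math

# Rough viewing-distance math using the common "1 arcminute per pixel"
# rule of thumb for 20/20 vision. A simplification, not a hard limit.

ARCMINUTE = math.radians(1 / 60)

def max_useful_distance_ft(diagonal_in, horizontal_pixels, aspect=16 / 9):
    """Distance beyond which a single pixel subtends less than 1 arcminute."""
    width_in = diagonal_in * aspect / math.sqrt(aspect ** 2 + 1)
    pixel_pitch_in = width_in / horizontal_pixels
    return pixel_pitch_in / math.tan(ARCMINUTE) / 12  # inches -> feet

for label, h_px in [("1080p", 1920), ("4K", 3840)]:
    print(f'65" {label}: extra detail stops being resolvable past '
          f'~{max_useful_distance_ft(65, h_px):.1f} ft')
```

That works out to roughly 8.5' for 1080p and 4.2' for 4K on a 65" panel, so past 8' or so even 1080p is already at the eye's limit, and a 4K set mounted 15' away buys the average buyer basically nothing.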
 
So I think this is a mistake. Consoles are going to support HDR10, as that's the spec du jour, but if the game is a port (implying it already has HDR) I sure would like to use that on my PC monitor.
 
The problem with HDR is implementation; devs usually butcher the color. Look at Far Cry 5 and Mass Effect Andromeda. Even CD Projekt Red, who are more talented than Ubisoft and EA, did a half-baked job in The Witcher 3.

FYI... the HDR problems are with Windows itself, not the games or necessarily the NVIDIA/AMD drivers. The most recent feature updates went a long way toward fixing this, and I've played the last few games I bought on PC in HDR and it looks great. It was a MESS, though, when MS first added its own messed-up HDR option to the display settings and borked NVIDIA's (and various games') existing implementations. That's largely fixed now, though it takes some know-how to understand what you're doing (and which setting needs to be set to what, and where), but it's not hard with a bit of practice tweaking.

By the way, why do all the non-tech people think that people don't PC game on TVs these days? You can get a better screen in every way with a TV, except refresh rate. But refresh rates over 60 are A) impossible at 4K using anything but the newest DisplayPort standard, and B) only for those who are deluded into thinking they're going to be the next CS:GO or DOTA pro. While higher than 60Hz IS noticeable, and desirable, it's currently not worth it for most gaming, because you already sacrifice graphical effects/quality to hit 60 FPS, let alone higher. 60 FPS is definitely more than good enough for anything short of twitch gaming by someone who genuinely has the time and opportunity to practice going pro at a game like that. Otherwise, it should be about having fun and an immersive experience, not using a shitty VA panel to get 144Hz and 144+ FPS so you can compete with the world's top pros...

I LOVE my OLED for gaming. It's by FAR the biggest screen-quality advancement I've ever seen, bigger even than going from CRT to LCD (which still had major trade-offs at first!). And BTW, the burn-in issues with OLEDs are really not a concern. All the screens have built-in protection against it, and at worst you see a very faint, very temporary image if a bright object sits on a dark background for a long time. It goes away as soon as the TV is turned off (they run maintenance cycles when turned off these days), or once you play any other content without static images for a few minutes. Maybe I'll be crying in a few years over terrible burn-in, but I have yet to see even the earliest OLED adopters complaining of permanent burn-in in any reasonable case. I haven't even seen it be a permanent issue in extreme cases, like a floor-model TV, though we still have to wait another year or two to be sure it isn't a problem over the long term... So far, it seems like it won't be.
 
HDR can go die in a fire. I'm sick of the over-exaggerated bright transitions from enclosed areas to brighter ones.

You should get your eyes checked! Not only are the colors noticeably better (millions upon millions more of them), but the darks AND the brights are so much better. Keep in mind that a lot of current games implemented the feature at the last minute and overdid the bright-versus-dark contrast in many cases.

But I'd be hard-pressed to skip HDR these days... For example, if a game is released on PC and is a decent or better port, I always play it on PC. But if that same game came out with HDR support on console and not on PC, I'd have a tough choice to make. Most other things being equal, I think I'd take the PS4 Pro or Xbox One X version with HDR over a PC version without it these days, depending on the game and whether the framerate is acceptably near or above 60.
 

I totally agree, and that's why come Black Friday I'll be looking for PS4 Pro deals ;)

Plus by then, a lot of their exclusives might drop in price.
 
Simple answer, really, as to why DisplayPort survives:

No licensing royalty.

You still need to include the port and physical connections, which costs both time and money. Throw in the fact that HDMI 2.1 will offer more bandwidth and that HDMI is already more ubiquitous, and DP starts to look redundant.
 
So was 1080p, once upon a time.

I remember drooling over expensive 1280x1024 non-interlaced 0.28mm dot pitch monitors, when 1024x768 interlaced 0.39mm dot pitch was the norm...and the argument back then was "literally nothing can take effective advantage of the higher resolution".
 
It's getting adopted quickly. Half the TVs on display at Walmart, Meijer, etc. are 4K now. Even my non-techy aunts and uncles are getting them.
 