PC Has an HDR Support Problem

Megalith

Despite the impending release of the Acer Predator X27 and Asus ROG Swift PG27UQ, Eurogamer suggests that PC games with HDR support may not necessarily grow in number, as these costly displays will entice few buyers. Obviously, adoption rates and developer interest will not improve until more modest, cheaper options are available: the author argues that these might make more sense even on a technical level.

In terms of potential spec reductions, a cheaper 60Hz equivalent to these screens would work just fine for the majority of gamers. After all, the irony is that no GPU exists that can run the latest games at 4K at anything like 120fps, let alone 144fps. Indeed, even getting a reasonably solid 60fps on today's top-end PC hardware usually requires careful settings tweaks, and G-Sync proves highly useful simply for ironing out the kinks in the 50-60fps range.
 
Makes perfect sense to me. I just want a 75-120Hz HDR 1080P 23-24" non-OLED monitor (preferably from Asus), and I don't want to pay >$200 for it.

(Don't get me wrong, OLED looks nice, but I'd get the burn in really bad for sure. LED backlit IPS panel is plenty good 'nuff.)

Edit: Freesync would be nice, but I wouldn't place a huge premium on it.
 
I will upgrade my video card many times before I get a new monitor. So while cards might not do 144fps @ 4K today, they will in the future.
 
I will upgrade my video card many times before I get a new monitor. So while cards might not do 144fps @ 4K today, they will in the future.

That's true, and I see the value of it in your use case, but if future-proofing were the goal, I'd expect it to support the "FreeSync-like" frame syncing feature of the new HDMI spec. G-Sync and FreeSync are really dead-man-walking technologies at this point.
 
The problem with HDR is implementation; devs usually butcher the color. Look at Far Cry 5 and Mass Effect Andromeda: even CD Projekt Red, who are more talented than Ubisoft and EA, did a half-baked job in The Witcher 3.
 
I will upgrade my video card many times before I get a new monitor. So while cards might not do 144fps @ 4K today, they will in the future.

This.

I know that eventually I'll have the ability to run the resolution and refresh rate. There's nothing proprietary about that.

But throw in competing VRR and HDR standards, and that's a crapshoot. I usually use my monitors for many years, and I am not afraid to spend some money on a good one, but I don't want to buy one only to find out I bought the Betamax monitor.
 
I like my QLED 144Hz Samsung. It looks much better than my old LCD. VA panels seem like a reasonable middle of the road, and the response speed is swift.
 
That's true, and I see the value of it in your use case, but if future-proofing were the goal, I'd expect it to support the "FreeSync-like" frame syncing feature of the new HDMI spec. G-Sync and FreeSync are really dead-man-walking technologies at this point.

Not a bad point.
 
Is proper HDR actually implemented anywhere yet? 1000 nits brightness? Most monitors have a third of that capability.
 
We're far from HDR. The majority of HDR monitors don't even have inky blacks, and almost all movies and games set their climactic scenes in the dark.
 
I wish [H] did an article on HDR, more specifically 4K HDR via HDMI 2.0... seems like a lot of people could use some sort of guide.

I'll give you a quick guide - it fucking sucks. Wait for HDMI 2.1, when at least the hardware shortcomings will be fixed (no more having to switch to 4:2:0 10-bit for HDR support, because HDMI 2.0 can't handle 4:4:4 10-bit), and Microsoft or NVIDIA really need to unfuck the desktop HDR settings.
 
I'll give you a quick guide - it fucking sucks. Wait for HDMI 2.1, when at least the hardware shortcomings will be fixed (no more having to switch to 4:2:0 10-bit for HDR support, because HDMI 2.0 can't handle 4:4:4 10-bit), and Microsoft or NVIDIA really need to unfuck the desktop HDR settings.
So I either see this response or an "everything is great." That's sort of why I want a [H] article. I get where you're coming from, though.
 
I'll give you a quick guide - it fucking sucks. Wait for HDMI 2.1, when at least the hardware shortcomings will be fixed (no more having to switch to 4:2:0 10-bit for HDR support, because HDMI 2.0 can't handle 4:4:4 10-bit), and Microsoft or NVIDIA really need to unfuck the desktop HDR settings.

2.0 can handle 4:4:4 HDR, just not at 4K 60Hz. Even then, Windows screws up HDR so badly that I keep HDR off on the desktop and only enable it in games, where 4:2:2 is "good enough".

HDMI 2.1 adds more bandwidth than god, and throws in VRR to boot. I think when all is said and done, HDMI 2.1 is going to do to DisplayPort exactly what USB 2.0 did to FireWire (meaning: when is the last time you saw a FireWire port?). We don't need two competing display standards, and given HDMI is more ubiquitous and now trounces DisplayPort in bandwidth, I simply don't see the other surviving. *Maybe* if DP broke into home theater systems, but HDMI is so ubiquitous outside of the PC realm that I can't see DisplayPort surviving now.
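Rough numbers behind those two posts, as a back-of-the-envelope sketch (it counts active pixels only and ignores blanking and audio, so the real margins are even tighter than shown):

```python
# Back-of-the-envelope math for why HDMI 2.0 pushes 4K60 HDR down to chroma
# subsampling while HDMI 2.1 does not. Active pixels only, no blanking.

LINK_PAYLOAD_GBPS = {
    "HDMI 2.0": 18.0 * 8 / 10,    # 18 Gbit/s raw, 8b/10b encoding -> 14.4 usable
    "HDMI 2.1": 48.0 * 16 / 18,   # 48 Gbit/s raw, 16b/18b encoding -> ~42.7 usable
}

# Bits per pixel at 10-bit depth for each chroma format.
FORMAT_BPP = {"4:4:4": 30, "4:2:2": 20, "4:2:0": 15}

def needed_gbps(width, height, hz, bpp):
    """Uncompressed video bandwidth in Gbit/s for the active picture."""
    return width * height * hz * bpp / 1e9

for fmt, bpp in FORMAT_BPP.items():
    need = needed_gbps(3840, 2160, 60, bpp)  # 4K, 60 Hz, 10-bit
    verdicts = ", ".join(
        f"{link}: {'ok' if need <= cap else 'not enough'}"
        for link, cap in LINK_PAYLOAD_GBPS.items()
    )
    print(f"4K60 10-bit {fmt}: {need:5.1f} Gbit/s  ({verdicts})")
```

4K60 10-bit 4:4:4 needs roughly 15 Gbit/s of picture data against HDMI 2.0's ~14.4 Gbit/s of payload, which is why only the subsampled modes actually work today, while HDMI 2.1 clears it with plenty of headroom.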
 
2.0 can handle 4:4:4 HDR, just not at 4K 60Hz. Even then, Windows screws up HDR so badly that I keep HDR off on the desktop and only enable it in games, where 4:2:2 is "good enough".

HDMI 2.1 adds more bandwidth than god, and throws in VRR to boot. I think when all is said and done, HDMI 2.1 is going to do to DisplayPort exactly what USB 2.0 did to FireWire (meaning: when is the last time you saw a FireWire port?). We don't need two competing display standards, and given HDMI is more ubiquitous and now trounces DisplayPort in bandwidth, I simply don't see the other surviving. *Maybe* if DP broke into home theater systems, but HDMI is so ubiquitous outside of the PC realm that I can't see DisplayPort surviving now.


I'm thinking HDMI 2.1 is going to be what HDMI 2.0 should have been from the get-go. It seems so stupid to have released 2.0 with the known shortcomings.

But as you said, all the support in the world doesn't matter when Windows screws up the desktop support anyway.
 
This logic makes no sense to me. I always try to buy PC hardware that is as future-proof as possible, and HDR support is one of those features I consider as future-proof as it gets.

You are living in the past and wasting a lot of money if you do this for a meh experience.
 
The problem with HDR is implementation; devs usually butcher the color. Look at Far Cry 5 and Mass Effect Andromeda: even CD Projekt Red, who are more talented than Ubisoft and EA, did a half-baked job in The Witcher 3.
But when done right... Forza 7 on the One X at 4K@60 with HDR is gorgeous. Tried it on my rig below, and although maxed at 80 fps, HDR makes such a huge difference there's nearly no comparison.
 
When they have OLED monitors that don't burn in, plus HDR, then I'll consider it.
 
A legit gaming monitor is a 5-10 year commitment. It just makes sense to future-proof a bit. That said, I still don't feel the need to upgrade from my 6-year-old 120Hz 1080p monitor.
 
HDR can go die in a fire. I'm sick of the overexaggerated bright transitions from enclosed areas to brighter ones.
 
To clarify, HDR / 4K works freakin beautifully on PC if watching HDR10 UHD discs.
75” FALD 4K HDR goodness is a sight to behold.

HDR PC gaming, not so much.
 
For HDR and 4:4:4, the new 4K 144Hz display is capped at 98Hz :(

More than likely a limitation related to the port. It's another detail regularly overlooked with these newer displays. Max color depth, HDR, 4K, 144Hz is going to consume a huge amount of bandwidth even with DP 1.4. Hope the cable they ship with is up to par, or else people will be complaining about dropouts too. This tech is evolving at a ridiculously uneven pace in terms of everything that needs to connect to it.
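The napkin math lines up with that ~98Hz cap. A rough sketch against DP 1.4's ~25.9 Gbit/s of payload (HBR3 after 8b/10b encoding), counting active pixels only and ignoring blanking and DSC, so the figures are approximate:

```python
# Rough check of the ~98Hz cap: uncompressed 4K 10-bit RGB/4:4:4 against the
# payload DP 1.4 (HBR3) has left after 8b/10b encoding overhead.

DP14_PAYLOAD_GBPS = 32.4 * 8 / 10   # 4 lanes x 8.1 Gbit/s raw -> ~25.9 Gbit/s usable

def needed_gbps(width, height, hz, bpp):
    """Uncompressed video bandwidth in Gbit/s for the active picture."""
    return width * height * hz * bpp / 1e9

for hz in (144, 120, 98, 60):
    need = needed_gbps(3840, 2160, hz, 30)   # 10-bit RGB / 4:4:4 = 30 bits per pixel
    verdict = "fits" if need <= DP14_PAYLOAD_GBPS else "exceeds DP 1.4"
    print(f"4K {hz:>3} Hz 10-bit 4:4:4: {need:5.1f} Gbit/s -> {verdict}")
```

144Hz works out to roughly 36 Gbit/s of picture data alone, while 98Hz lands just under the limit, which matches the cap these first panels advertise.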
 
I'm running 4K@60Hz; most games I play run 120 to 150 fps. Or I run Quake 3 Arena at over 1000 fps.
 
2.0 can handle 4:4:4 HDR, just not at 4K 60Hz. Even then, Windows screws up HDR so badly that I keep HDR off on the desktop and only enable it in games, where 4:2:2 is "good enough".

HDMI 2.1 adds more bandwidth than god, and throws in VRR to boot. I think when all is said and done, HDMI 2.1 is going to do to DisplayPort exactly what USB 2.0 did to FireWire (meaning: when is the last time you saw a FireWire port?). We don't need two competing display standards, and given HDMI is more ubiquitous and now trounces DisplayPort in bandwidth, I simply don't see the other surviving. *Maybe* if DP broke into home theater systems, but HDMI is so ubiquitous outside of the PC realm that I can't see DisplayPort surviving now.


Simple answer really as to why DisplayPort survives:

No licensing royalty.
 
HDR may not be for the mainstream yet, but here on [H] we tend to go [H]arder than the average consumer. I personally play at 4K 96Hz and don't think the author has talked to anyone from this forum.
 
Despite the impending release of the Acer Predator X27 and Asus ROG Swift PG27UQ, Eurogamer suggests that PC games with HDR support may not necessarily grow in number, as these costly displays will entice few buyers. Obviously, adoption rates and developer interest will not improve until more modest, cheaper options are available: the author argues that these might make more sense even on a technical level.

In terms of potential spec reductions, a cheaper 60Hz equivalent to these screens would work just fine for the majority of gamers. After all, the irony is that no GPU exists that can run the latest games at 4K at anything like 120fps, let alone 144fps. Indeed, even getting a reasonably solid 60fps on today's top-end PC hardware usually requires careful settings tweaks, and G-Sync proves highly useful simply for ironing out the kinks in the 50-60fps range.

I don't get this push for 4K... 4K 4K, 4K this, 4K that...

Why not focus on the beautiful 1440p right now and really push the HDR and G-Sync tech at an affordable level? Who is making these decisions up high, for crying out loud?

Even Linus Sebastian stated the same... why 4K? Why not just 3440x1440 or 1440p for now, but make it shine?
 
There are lots of console games with HDR. Some of them have multi-platform PC versions with no HDR or half-baked implementations. Makes no sense.

You shouldn't need to be maxing every other possible feature before HDR is allowed. But that's the way a lot of people here are talking: 4K, max graphics, 100+ fps, G-Sync/FreeSync, etc.

Some of us just wanna plug our game into an HDR-capable display (such as a nice HDTV. What blasphemy).
 
Because integer scaling of 1080p content looks great, better than any non-integer 1440p scaling (quick sketch below).
Because 4K creates a market for *sync displays.
Because 4K > 1440p.
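A quick illustration of the integer-scaling point above (illustrative numbers only): 1080p divides evenly into 4K but not into 1440p, so a 4K panel can show 1080p as clean 2x2 pixel blocks.

```python
# Why 1080p looks clean on a 4K panel but soft on a 1440p one:
# 4K UHD is an exact 2x multiple of 1920x1080, 1440p is not.

src_w, src_h = 1920, 1080
targets = {"1440p": (2560, 1440), "4K UHD": (3840, 2160)}

for name, (w, h) in targets.items():
    sx, sy = w / src_w, h / src_h
    if sx.is_integer() and sy.is_integer():
        kind = f"integer scale (each source pixel -> a clean {int(sx)}x{int(sy)} block)"
    else:
        kind = "fractional scale (pixels get interpolated, image goes soft)"
    print(f"1080p -> {name}: {sx:.2f}x / {sy:.2f}x, {kind}")
```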

Well, we see where 4K is headed, and it's a niche market.
 
I don't get this push for 4K... 4K 4K, 4K this, 4K that...

Why not focus on the beautiful 1440p right now and really push the HDR and G-Sync tech at an affordable level? Who is making these decisions up high, for crying out loud?

Even Linus Sebastian stated the same... why 4K? Why not just 3440x1440 or 1440p for now, but make it shine?

The full explanation escapes me at the moment, but I've read that 'throwing more pixels at the problem' really is the best way to address LCDs' shortcomings with motion. LCDs have always been clear and crisp for static images, but motion quality issues have led every vendor to develop fixes, compromises, and gimmicks to try to mask the problem. Moving to 4K or 8K isn't as big a jump for image quality as 720p to 1080p was, but there are still benefits.

I'm sure the other side of it is production. If they make anything in 4K, it's cheaper just to move all production to 4K.
 
Consoles are blazing the trail with HDR and their ports to PC will probably continue to support HDR there as well. The biggest issue is not support but the price tag of a good HDR screen with the proper backlighting capabilities and no burn-in.
 
I'll (eventually) migrate to 4k.

They've finally settled on two HDR formats. I wish it'd be one.
4K can mean different things. Is it 4:4:4/60 4k, or 4:2:2/24 4k? (Or whatever...) Yes, the technical basis for the resolution is more complex than just saying "4k". Heck, there's even some confusion over exactly what pixel array is "4k".

For 4k:

1. There needs to be content.
-- Streaming Web
-- Gaming
-- Disks
-- Networked LAN

2. The entire viewer "chain" needs to be updated to 4k.
-- Source (console, computer, Bluray)
-- Amplifier/Receiver (if needed)
-- Cables connecting all the 4k items
-- Display

I will not invest in 2 until I am satisfied that 1 provides enough "pull". As for 2, there are a myriad of costs and issues. At the most basic level, what HDMI cable works? What length?

Next, the pure horsepower needed. Looking at a PC, to get a true 4K graphics card (capable of 4:4:4/60 4K) pushes, what, $700 these days? That's just the card. It doesn't include the monitor or any other parts of the PC.

If you're just running one 4k system, the cost is high, but doable.

Right now, I have no need or desire to upgrade to 4k. I look at streaming videos at 1080p and notice all sorts of compression artifacts. Why would I go 4k to see the compression artifacts? (Banding, dropped frames. This is both Netflix and Amazon.)
 
I'll (eventually) migrate to 4k.

They've finally settled on two HDR formats. I wish it'd be one.
4K can mean different things. Is it 4:4:4/60 4k, or 4:2:2/24 4k? (Or whatever...) Yes, the technical basis for the resolution is more complex than just saying "4k". Heck, there's even some confusion over exactly what pixel array is "4k".

For 4k:

1. There needs to be content.
-- Streaming Web
-- Gaming
-- Disks
-- Networked LAN

2. The entire viewer "chain" needs to be updated to 4k.
-- Source (console, computer, Bluray)
-- Amplifier/Receiver (if needed)
-- Cables connecting all the 4k items
-- Display

I will not invest in 2 until I am satisfied that 1 provides enough "pull". As for 2, there are a myriad of costs and issues. At the most basic level, what HDMI cable works? What length?

Next, the pure horsepower needed. Looking at a PC, to get a true 4K graphics card (capable of 4:4:4/60 4K) pushes, what, $700 these days? That's just the card. It doesn't include the monitor or any other parts of the PC.

If you're just running one 4k system, the cost is high, but doable.

Right now, I have no need or desire to upgrade to 4k. I look at streaming videos at 1080p and notice all sorts of compression artifacts. Why would I go 4k to see the compression artifacts? (Banding, dropped frames. This is both Netflix and Amazon.)

There isn't really that much confusion about 4K. It's 3840 × 2160 (4K UHD) vs 4096 × 2160 (DCI 4K). You might as well bemoan the confusion over HDDs'/SSDs' advertised vs actual storage size. Framerate and chroma subsampling are not tied to the 4K resolution description in any way. However, 4K HDR is limited by the HDMI 2.0 standard to 60 FPS, and adoption of HDMI 2.1 will take a while to trickle down.

The rest of what you say has some merit, but it also echoes sentiments voiced about pretty much every major jump in display tech in the last few decades. In the end, the major reason people give these reasons is money.
 
I'll (eventually) migrate to 4k.

They've finally settled on two HDR formats. I wish it'd be one.
4K can mean different things. Is it 4:4:4/60 4k, or 4:2:2/24 4k? (Or whatever...) Yes, the technical basis for the resolution is more complex than just saying "4k". Heck, there's even some confusion over exactly what pixel array is "4k".

For 4k:

1. There needs to be content.
-- Streaming Web
-- Gaming
-- Disks
-- Networked LAN

2. The entire viewer "chain" needs to be updated to 4k.
-- Source (console, computer, Bluray)
-- Amplifier/Receiver (if needed)
-- Cables connecting all the 4k items
-- Display

I will not invest in 2 until I am satisfied that 1 provides enough "pull". As for 2, there are a myriad of costs and issues. At the most basic level, what HDMI cable works? What length?

Next, the pure horsepower needed. Looking at a PC, to get a true 4K graphics card (capable of 4:4:4/60 4K) pushes, what, $700 these days? That's just the card. It doesn't include the monitor or any other parts of the PC.

If you're just running one 4k system, the cost is high, but doable.

Right now, I have no need or desire to upgrade to 4k. I look at streaming videos at 1080p and notice all sorts of compression artifacts. Why would I go 4k to see the compression artifacts? (Banding, dropped frames. This is both Netflix and Amazon.)
Sounds like a lot of whining.
Been 40”+ 4K PC gaming since 2014 and 4K HTPC on a 75” since 2016. It’s wonderful. It’s beautiful. HDR really is worth the hype. I could never go back. Nothing else compares.
Tons of content. There are like 300+ UHD discs now, with more titles releasing weekly. Netflix has gobs of content in UHD - everything new is being released in UHD.
Big 75”+ displays are getting cheaper all the time. Shoot, you can get a decent 85” Sony FALD for under $5K now.
The time to get into 4K was like two years ago.
 