DisplayPort 1.2 > HDMI 2.0/2.0a to get 4K 60 Hz 10-bit HDR

Hello, I'm wondering: is it possible to get 4K 4:4:4 at 60 Hz with a GPU's DisplayPort 1.2 output and a 4K TV with a 10-bit screen? Some 2016 LG and Samsung TVs will have 10-bit screens, so if you connect the GPU over DisplayPort 1.2 through a converter to HDMI 2.0 (or 2.0a), will it work? And if it's not possible, on which side is the problem? Is it a DisplayPort 1.2 limitation? The converter and the lack of HDCP 2.2? Or the TV?
 
10-bit lives in the realm of professional graphics cards, not of consumer cards. A 10-bit panel inside a consumer TV is about as useful as a brain inside the skull of some candidates.
 
HDMI 2.0 only supports 4:2:0 chroma with 10-bit color at 4K resolution and 60 Hz
Consumer AMD cards support 10-bit and higher color. But NVIDIA currently has a driver lockout for 10-bit color with GeForce, limiting it to Quadro and Tesla.
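To put rough numbers on the bandwidth point above, here is a back-of-the-envelope sketch in Python. It assumes CTA-861 4K60 timing (4400 x 2250 total pixels including blanking) on HDMI and CVT-R2 reduced blanking (roughly 4000 x 2222) on DisplayPort, and it ignores audio and protocol overhead, so treat the margins as approximate.

```python
GBPS = 1e9

def video_rate(h_total, v_total, hz, bits_per_pixel):
    """Uncompressed video data rate in Gbps for a given timing and pixel format."""
    return h_total * v_total * hz * bits_per_pixel / GBPS

# Usable payload after 8b/10b line coding
hdmi20_payload = 18.0 * 8 / 10      # 3 TMDS channels x 6 Gbps  -> ~14.4 Gbps
dp12_payload   = 21.6 * 8 / 10      # 4 lanes x 5.4 Gbps (HBR2) -> ~17.28 Gbps

# bits per pixel: 4:4:4 = 3 x depth, 4:2:2 = 2 x depth, 4:2:0 = 1.5 x depth
cases = {
    "HDMI 4K60 8-bit 4:4:4":   video_rate(4400, 2250, 60, 24),  # ~14.26 Gbps
    "HDMI 4K60 10-bit 4:4:4":  video_rate(4400, 2250, 60, 30),  # ~17.82 Gbps
    "HDMI 4K60 10-bit 4:2:0":  video_rate(4400, 2250, 60, 15),  # ~8.91 Gbps
    "DP1.2 4K60 10-bit 4:4:4": video_rate(4000, 2222, 60, 30),  # ~16.00 Gbps
}

for name, rate in cases.items():
    limit = dp12_payload if name.startswith("DP") else hdmi20_payload
    verdict = "fits" if rate <= limit else "does not fit"
    print(f"{name}: {rate:5.2f} Gbps needed vs {limit:.2f} Gbps available -> {verdict}")
```

The takeaway: with CEA timing, 4K60 10-bit 4:4:4 needs about 17.8 Gbps, which is more than HDMI 2.0's ~14.4 Gbps payload, while DP 1.2's ~17.28 Gbps just covers it when reduced blanking is used.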
 
Well, the 2016 TV trend is HDR, so 10-bit panels.
A pity, but HDMI 2.0 can be software-updated to 2.0a, so if 2016 TVs have 2.0a, is it possible to get 4:4:4 10-bit 4K at 60 Hz?
 
There is a good reduction in banding in Alien: Isolation when using >8-bit color.
Unfortunately, Alien: Isolation is the only game I am aware of which supports this feature.

You can still benefit from a >8-bit output in many games which use the GPU LUT for adjusting gamma/brightness, however - on NVIDIA cards in particular, since they process the LUT with as many bits as the currently selected output (see the gradient sketch at the end of this post).

This is all separate from 10-bit HDR which treats bit-depth very differently than SDR.

NVIDIA has had 10-bit and 12-bit support on consumer cards for more than a year at this point.
As with AMD, however, this is not supported in OpenGL.
My understanding is that professional applications still require you to be using a pro card (Quadro/Tesla/FirePro) on Windows 7 with the desktop compositor disabled.
Since you cannot disable the compositor on Windows 8/10 (at least not in any officially supported manner) I don't believe you can get a 10-bit output from professional applications on those operating systems.

EDIT: Apparently you only need to disable the compositor with AMD cards, so maybe NVIDIA cards still work on Windows 8/10. And perhaps that's true of newer FirePro cards too - I don't have a system with one to test that.
My point was more that you cannot enable 10-bit color in professional applications whether you have an AMD or an NVIDIA GPU, if it's a consumer card.
Which is very frustrating since my GPU is outputting a 12-bit signal to my 10-bit display, yet most applications are limited to an 8-bit output.
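To see why the extra LUT precision reduces banding, here is a small illustrative sketch (not tied to any specific GPU, driver, or game): it quantizes a smooth gradient to 8 and 10 bits and counts how many distinct steps survive.

```python
import numpy as np

# One 4K-wide row of a smooth 0..1 luminance ramp
gradient = np.linspace(0.0, 1.0, 3840)

def quantize(signal, bits):
    """Snap values to the nearest code of a `bits`-deep output."""
    levels = 2 ** bits - 1
    return np.round(signal * levels) / levels

for bits in (8, 10):
    steps = len(np.unique(quantize(gradient, bits)))
    print(f"{bits}-bit output: {steps} distinct steps across the ramp")
```

Four times as many steps over the same range is what keeps gradients smooth once a gamma or brightness curve compresses part of them, which is exactly where the LUT precision mentioned above comes into play.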
 
The only thing that changed in HDMI 2.0a is the data being transmitted in the packets. The data packet was updated to carry the HDR metadata. The bandwidth is still the same as HDMI 2.0, though, which means it can't carry 4K 4:4:4 at 60 Hz with 10-bit color.
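For context, the metadata in question is HDR10's static metadata: the SMPTE ST 2086 mastering-display values plus MaxCLL/MaxFALL, carried in an InfoFrame packet. A rough sketch of what those few numbers are (field names here are illustrative, not any official API):

```python
from dataclasses import dataclass

@dataclass
class HDRStaticMetadata:
    # Mastering display colour volume (SMPTE ST 2086)
    red_primary: tuple      # (x, y) chromaticity, e.g. (0.708, 0.292) for BT.2020 red
    green_primary: tuple
    blue_primary: tuple
    white_point: tuple      # e.g. (0.3127, 0.3290) for D65
    max_luminance: float    # mastering display peak, in nits (e.g. 1000.0)
    min_luminance: float    # mastering display black, in nits (e.g. 0.0001)
    # Content light level (CTA-861.3)
    max_cll: int            # brightest pixel anywhere in the content, in nits
    max_fall: int           # highest frame-average light level, in nits
```

Because this is a handful of values per stream rather than anything per pixel, carrying it costs no link bandwidth - which is also why 2.0a doesn't change the 4:4:4 10-bit situation at 4K60.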
 
So you buy a 10-bit 4K TV and you can't get 4K 4:4:4 60 Hz 10-bit. Just wow. So where is the benefit?
 
That's why HDMI needs to be killed. It's an inferior interface technology compared to DisplayPort.
 
Video is stored as 4:2:0
Televisions are mainly used to watch video.

What's actually worse is that, until HDMI 2.0, the minimum you could transfer over HDMI was 4:2:2 so you would have to upsample chroma, and then it may be upsampled again to 4:4:4 by the TV for processing, or downsampled back to 4:2:0.
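A minimal sketch of what 4:2:0 means in practice, purely illustrative: luma stays at full resolution, chroma is stored once per 2x2 block, and the display chain has to upsample it again.

```python
import numpy as np

h, w = 4, 8
luma   = np.random.rand(h, w)       # Y: one sample per pixel (full detail kept)
chroma = np.random.rand(h, w, 2)    # Cb/Cr at full resolution (4:4:4)

# 4:2:0 - average each 2x2 block of chroma, keeping a quarter of the samples
chroma_420 = chroma.reshape(h // 2, 2, w // 2, 2, 2).mean(axis=(1, 3))
print(chroma.shape, "->", chroma_420.shape)     # (4, 8, 2) -> (2, 4, 2)

# For display (or for a pre-2.0 HDMI link that needs at least 4:2:2), chroma is
# upsampled again - here a crude nearest-neighbour repeat back to full size
chroma_up = chroma_420.repeat(2, axis=0).repeat(2, axis=1)
print(chroma_up.shape)                          # (4, 8, 2), but chroma detail is gone
```

That is the chain described above: the content is authored at 4:2:0, the link may force 4:2:2 or 4:4:4, and the TV may convert yet again for its own processing.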
 
And HDMI requires an IP license to implement while DisplayPort is free. I never understood why the industry didn't switch for 4K.
 
So maybe there is a logic to buying a 2015 4K 60 Hz 4:4:4 8-bit TV as a PC monitor?
Sure, many people do. But if you're looking for HDR at 60 Hz that probably isn't going to happen anytime this year. Hopefully the new generation of video cards being released this year will have DP 1.3 or 1.4 and we'll start seeing HDR equipped computer monitors in the >40" size range.
 
Yes, considering HDCP still works with DisplayPort it makes no fucking sense why it isn't the dominant technology. HDMI is really just >.<
 
Well, one thing is that DisplayPort is severely limited in cable length, especially at high resolution + high refresh rates that use a lot of bandwidth. HDMI has no problem doing longer runs of 25' to 50', at least at 1080p. DisplayPort 1.2 does about 16'; 1.3 is rated for 6' or so. That is one aspect where DisplayPort really falls short. It also seems like a lot of monitors have problems on wakeup when using DisplayPort.

Regarding DP 1.3 and HDR, as far as gaming goes I'd still be interested in 1440p at 170 Hz or 3440x1440 at 144 Hz, since 4K will be back to being limited to 60 Hz all over again when using HDR.

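Rough numbers on that refresh-rate trade-off, assuming active pixels plus roughly 15% blanking overhead (real CVT timings vary) against DP 1.3's ~25.92 Gbps payload after 8b/10b coding - a sketch, not a spec-level calculation.

```python
DP13_PAYLOAD_GBPS = 32.4 * 8 / 10   # 4 lanes x 8.1 Gbps (HBR3), 8b/10b coded
BLANKING = 1.15                     # assumed overhead for blanking intervals

def needed_gbps(w, h, hz, bits_per_pixel):
    return w * h * hz * bits_per_pixel * BLANKING / 1e9

modes = {
    "2560x1440 @ 170 Hz, 8-bit":        needed_gbps(2560, 1440, 170, 24),
    "3440x1440 @ 144 Hz, 8-bit":        needed_gbps(3440, 1440, 144, 24),
    "3840x2160 @  60 Hz, 10-bit (HDR)": needed_gbps(3840, 2160, 60, 30),
    "3840x2160 @ 120 Hz, 10-bit (HDR)": needed_gbps(3840, 2160, 120, 30),
}

for name, rate in modes.items():
    verdict = "fits DP 1.3" if rate <= DP13_PAYLOAD_GBPS else "exceeds DP 1.3"
    print(f"{name}: ~{rate:.1f} Gbps -> {verdict}")
```

Which lines up with the complaint: 1440p and ultrawide high-refresh modes fit comfortably, but 4K with 10-bit HDR much above 60 Hz does not.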
 
And DisplayPort is the only way to do it now.
I don't understand why we'd have to wait.

DP 1.2 is capable of 4K 4:4:4 10-bit at 60 Hz, so is it the TV's fault because of HDMI 2.0? It just doesn't make any sense: they roll out 10-bit panels, but there's no connector on the TV that can benefit from 4:4:4 10-bit. Just wow. And the only way to do 4K 4:4:4 10-bit now is to use a 4K PC monitor (a pricey Dell, etc.), right?
 
The data packet in DP 1.2 doesn't carry the HDR metadata; that is why you have to wait. If you tried to pass HDR over a DP 1.2 connection, the scaler wouldn't know what to do with it, so it would just ignore it.

I think TV manufacturers wave off the chroma issue because most people won't be able to tell the difference between 4:4:4 and 4:2:0 in most use cases. As to why HDMI, I think it can reasonably be explained by cronyism, or the "media mafia."

Yes, to your last question. Although there are panels in a range of prices that support 10-bit color, either with or without FRC. Just look for monitors that quote colors supported as 1.07 billion.
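On the "with or without FRC" point: an FRC panel approximates a 10-bit level on 8-bit hardware by alternating between the two nearest 8-bit codes over successive frames so the time average lands in between. A toy sketch of the idea (illustrative only, not how any particular panel or scaler implements it):

```python
import numpy as np

def frc_frames(target_10bit, n_frames=4):
    """Return n_frames of 8-bit codes whose average approximates a 10-bit value."""
    low = target_10bit // 4                 # nearest 8-bit code below the target
    frac = (target_10bit % 4) / 4           # how far toward the next 8-bit code
    frames = np.full(n_frames, low)
    frames[: round(frac * n_frames)] += 1   # show the higher code for part of the time
    return frames

frames = frc_frames(513)                    # a 10-bit value between 8-bit codes 128 and 129
print(frames, "time-averaged 10-bit value:", frames.mean() * 4)
```

A true 10-bit panel displays the in-between level directly; FRC only averages it over time, which is usually close enough visually.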
 
I hope Polaris & Pascal get HDMI 2.0a at least, if not DP 1.3 as well.

4K 4:4:4 @ 24/30 Hz 10-bit is good enough to play most video content coming out for a while, assuming any software or streaming vendor manages to get in bed with the cartel enough to allow PCs. You don't need the cartel's blessing to play non-commercial footage either.

I also wonder if NVIDIA can update the 950/960 Maxwells to HDMI 2.0a as well; they came with a newer video block that has HDCP 2.2 support, while the 970/980/Ti/Titan are HDCP 1.3. Some of the early HDMI 2.0 TVs only needed firmware updates; it depended on which chipsets they used.
 
So this is not a DP 1.2 bandwidth issue, it's an HDCP 2.2 issue, and to get 4K 4:4:4 60 Hz 10-bit we need to wait for DP 1.3, even though DP 1.2 can do the same. Greed is ruling the world.
 