HDMI 2.0: Too High Latency?

It seems like HDMI 2.0 running at full clock (600 MHz?) can only manage around 30 ms of input lag. But when the TV downclocks it to 300 MHz (Game Mode), the input lag drops into the low 20 ms range. I'm just using the latest Samsung TVs as an example. You can see the downclock in Game Mode from the reduced bandwidth, which limits 4K 60 Hz to 4:2:2.

Then, looking at competing TVs with HDMI 2.0, all of them also only manage around 30 ms when running at their full advertised clock speed. I have never seen a 4K TV with HDMI 2.0 go under 30 ms of input lag with 4:4:4 support, LG OLEDs included.

So my theory is that the higher clock of HDMI 2.0 actually causes higher latency, resulting in higher input lag. Am I wrong? Is there any research on this? Are we going to be stuck with HDMI 2.0 for a long time?
 
If it's still displaying 60 Hz over a 300 MHz pixel clock at 4K, then I imagine the image is actually going to be interlaced regardless of the amount of chroma information. 3840x2160 is 8,294,400 pixels. Displaying that 60 times a second is 497,664,000 pixels per second. You're not going to get that on a 300 MHz pixel clock.
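
For what it's worth, here's that arithmetic written out as a quick sketch, alongside the standard 4K60 timings as I understand them (4400x2250 total including blanking, with 4:2:0 being the subsampling mode that actually halves the clock). Treat the specific timing numbers as assumptions rather than a spec citation.

```python
# Quick pixel-rate arithmetic for 3840x2160 @ 60 Hz (sketch; timing figures are assumptions).
ACTIVE_W, ACTIVE_H, REFRESH = 3840, 2160, 60

# Active pixels only, as in the post above.
active_rate = ACTIVE_W * ACTIVE_H * REFRESH      # 497,664,000 pixels per second

# Standard 4K60 timing with blanking is 4400 x 2250 total (assumed here).
TOTAL_W, TOTAL_H = 4400, 2250
full_clock_hz = TOTAL_W * TOTAL_H * REFRESH      # 594,000,000 -> the 594 MHz 4:4:4 clock

# 4:2:0 carries half the data of 4:4:4, which is how 4K60 fits a ~297 MHz clock.
subsampled_clock_hz = full_clock_hz / 2          # ~297 MHz, i.e. the "300 MHz" figure

print(f"active pixel rate: {active_rate / 1e6:.1f} Mpx/s")
print(f"4:4:4 clock:       {full_clock_hz / 1e6:.0f} MHz")
print(f"4:2:0 clock:       {subsampled_clock_hz / 1e6:.0f} MHz")
```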
 
If my memory is correct, the 434k has sub-30 ms input lag via HDMI 2.0, and so does the Wasabi Mango.

Those Samsung TVs seem to have horrible input lag problems and I don't know why.

Also, my new Philips 4K monitor has sub-30 ms input lag via HDMI 2.0.

I think it comes down to the controller board, but I could be wrong.
 
It will be down to the processing time needed to handle a higher-bitrate image.
The speed of the onboard CPU, the efficiency of the software it runs, what that software tries to do, and how much data it has to process all affect the time it takes to process a frame.
It won't be down to the clock rate as you suggest. That would have the opposite effect (faster clock = less lag), but the effect is so small you won't notice it.
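
To put rough numbers on "more data per frame = more processing time", here's a back-of-envelope sketch. The 1 GB/s scaler throughput is a made-up placeholder, not a real chipset figure.

```python
# Illustrative only: processing time scales with the data in each frame,
# not with the HDMI link clock. The throughput figure is a made-up placeholder.
def frame_megabytes(width, height, bytes_per_pixel=3):
    """Uncompressed frame size in MB (8-bit RGB / 4:4:4 assumed)."""
    return width * height * bytes_per_pixel / 1e6

def processing_time_ms(frame_mb, throughput_gb_per_s):
    """Time a hypothetical scaler needs to touch every byte once."""
    return frame_mb / throughput_gb_per_s   # MB / (1000 MB/s) comes out in ms

for name, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160)]:
    mb = frame_megabytes(w, h)
    print(f"{name}: {mb:.1f} MB/frame -> ~{processing_time_ms(mb, 1.0):.1f} ms at a hypothetical 1 GB/s")
```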

Note that tested lag figures are for 1080p;
4K test kits aren't available yet.
 
The 2.0 protocol doesn't have anything that gives it inherently more latency (they didn't change much other than the max clocks); it's the chipsets/controllers that determine this. It doesn't help that most TV firmware lacks a mode where it gets out of the fucking way with image "processing and enhancement" and just hands the damn signal off to the TCON. A lot of the first-gen chipsets kinda suck too, IMO.

Your theory is pretty far off: total bandwidth is not linked to latency in most signals, and a higher frequency actually reduces raw latency. It can sometimes look that way (e.g. DDR3 --> DDR4), but that usually comes from differences in encoding methods, word size, intentional timing/clock delays for stability, and other things.
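
To put numbers on the DDR3 --> DDR4 example: CAS latency in cycles goes up with the clock, but the absolute delay barely moves. The CL values below are typical ones assumed for illustration, not tied to any specific module.

```python
# Absolute CAS latency in nanoseconds: more cycles at a higher clock,
# roughly the same wall-clock delay. CL values are typical, assumed examples.
def cas_latency_ns(transfer_rate_mt_s, cl_cycles):
    clock_mhz = transfer_rate_mt_s / 2       # DDR: two transfers per clock cycle
    return cl_cycles / clock_mhz * 1e3       # cycles / MHz -> nanoseconds

print(f"DDR3-1600 CL11: {cas_latency_ns(1600, 11):.2f} ns")   # ~13.75 ns
print(f"DDR4-2400 CL17: {cas_latency_ns(2400, 17):.2f} ns")   # ~14.17 ns
```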

I can't wait until TVs end up forced onto DisplayPort or a similarly more modern interface; it's looking like they might have to for HDR/Rec. 2020/etc. Fuck HDMI.

Also, as noted, there's still no real way to test 4K input lag yet, so any number is a 'guesstimate' from 1080p.
 
Are we all in for a bad surprise when 4K testing equipment comes out? If the issue is image-enhancement bullshit, would that be better or worse at a 4K set's native resolution?

Is there an objective way to estimate 4K input lag when you know the 1080p lag?
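
My rough understanding of why that's hard: at a fixed 60 Hz, the scanout-related share of the lag is the same at 1080p and 4K (the frame period doesn't change), so the only part that could differ is the processing pipeline, which is exactly the part a 1080p number can't predict. A sketch with illustrative, assumed figures:

```python
# Rough decomposition of a measured lag figure (all numbers here are illustrative assumptions).
REFRESH_HZ = 60
frame_period_ms = 1000 / REFRESH_HZ        # 16.7 ms, identical at 1080p and 4K

measured_1080p_lag_ms = 30.0               # example reading from a 1080p lag tester
scanout_share_ms = frame_period_ms / 2     # mid-screen measurements include ~half a frame of scanout
processing_1080p_ms = measured_1080p_lag_ms - scanout_share_ms

print(f"scanout share (fixed by refresh rate): ~{scanout_share_ms:.1f} ms")
print(f"processing share at 1080p:             ~{processing_1080p_ms:.1f} ms")
print("4K lag = the same scanout share + whatever the 4K processing path adds (unknown)")
```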
 