Why do TVs/monitors mostly only have 1x HDMI 2.0?

euskalzabe

[H]ard|Gawd
Joined
May 9, 2009
Messages
1,478
The more I read lately about new HDR, 4K TVs and other displays, the more baffled I am. The great majority tend to include one HDMI 2.0 port while any other HDMI ports are 1.4. What gives? What is the logic behind this? When monitors have DisplayPort it's not such a huge problem, but what about DP-less monitors and TVs? Do manufacturers expect us to use only ONE HDMI 2.0 port?

What is the problem here? Is HDMI 2.0 any more expensive to implement? I see zero reason for any display released at this point in 2016 to have anything other than all HDMI ports being 2.0. Having just one out of several be 2.0 is illogical. Having none of the ports be 2.0 is laughable.

Unless of course they do it on purpose so people will be left out of HDMI 2.0 sooner rather than later and they are then forced to upgrade yet again... but I refuse to be such a cynical person.

Can anybody explain this?
 
HDMI 2.0 is more expensive. It's literally a few pennies saved by the TV manufacturer, multiplied across thousands of units. I would also guess that some of the electronics further down the pipeline would need to be more robust to handle multiple 2.0 signals, but yeah, it's mostly greed.
 
HDMI 2.0 is more expensive.

That's what I thought at first, but according to hdmi.org, this just isn't true:

Will there be any new royalty and/or increase in current royalties for products that implement HDMI 2.0 features?
No. HDMI Adopters will continue to pay the same royalties on HDMI 1.x products in accordance with the existing Adopter Agreement. There is no additional royalty for implementing the HDMI 2.0 specification.

So, if what HDMI says isn't a blatant lie... they are not charging any more or less to manufacturers who already license 1.x. Which brings me back to my initial confusion: why not make all HDMI ports in displays 2.0...
 
Gotcha. I guess even though the hardware adjustment only implies a difference of pennies per unit, at the macro scale it'd make a tangible cost difference. It seems that 2016 is a flushing year, clearing out inventory of HDMI 1.x parts while progressively ramping up 2.0 parts. This reinforces my suspicion that 2017 will be the real year for HDMI 2.0 and the features it enables, like HDR, to reach a mid-range ~$500 price point.

I've waited 7 years with my current 1080p TV, I can wait one more :)
 
I feel HDMI, like a lot of things, is just a system of control. The specification never thinks too far ahead, and it seems the features it does support are there for TV manufacturers to price-gouge customers, since each manufacturer can decide which features of the spec it supports.

On the other side, DisplayPort does everything HDMI does and more, and the specification is forward-thinking. And now we can see the TV industry (MPEG LA) bullying users of DP by demanding a $0.20 USD license and royalty fee on all products sold with DP. That could be costly compared to HDMI's $10,000 annual fee + $0.04 fee per product in higher volumes. It would certainly discourage any adoption of DP in televisions to supersede HDMI.
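
To put rough numbers on the "in higher volumes" bit, here's a back-of-the-envelope sketch using only the fee figures quoted above (I haven't checked them against the actual adopter agreements, so treat it as illustrative):

```python
# Break-even volume between the two royalty models mentioned above.
# Figures are the ones quoted in this thread, not verified against the
# actual DP (MPEG LA) or HDMI adopter agreements.
DP_FEE_PER_UNIT = 0.20     # USD per product shipped with DisplayPort
HDMI_ANNUAL_FEE = 10_000   # USD flat annual adopter fee
HDMI_FEE_PER_UNIT = 0.04   # USD per product shipped with HDMI

# HDMI becomes cheaper once: 10_000 + 0.04*n < 0.20*n
break_even = HDMI_ANNUAL_FEE / (DP_FEE_PER_UNIT - HDMI_FEE_PER_UNIT)
print(f"HDMI licensing is cheaper past ~{break_even:,.0f} units/year")  # ~62,500
```

So on those quoted figures, anyone shipping more than roughly 62,500 units a year pays less under the HDMI model, which is basically every TV maker.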
 
We aren't talking royalties; we are talking expenses to implement.

Exactly this.

HDMI 2.0 runs the pixel clock at nearly twice the speed of 1.4. They don't add more lanes, so this is really pushing the interface to the limit. Not sure how much faster they can push this semi-parallel legacy bus.

Decoding the higher-speed signal requires more powerful DSPs, and validation of the circuit is more expensive. The price of DSPs has fallen rapidly in the past, but that may not continue now that Moore's law has hit a wall.
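
For anyone who wants the arithmetic behind "nearly twice the speed", here's a rough sketch using the nominal spec figures (3 TMDS lanes, 8b/10b coding, and the standard CTA-861 4K60 timing with blanking included). It's ballpark math, not a cable-qualification calculation:

```python
# Rough arithmetic behind HDMI 1.4 vs 2.0 bandwidth and what 4K60 needs.
# Nominal spec figures: 3 TMDS data lanes, 10 bits per lane per TMDS clock.
LANES = 3
TMDS_OVERHEAD = 10 / 8                 # 8b/10b coding

def wire_rate_gbps(tmds_clock_mhz):
    return tmds_clock_mhz * 1e6 * 10 * LANES / 1e9

print(f"HDMI 1.4 max (340 MHz TMDS clock): {wire_rate_gbps(340):.1f} Gbps")  # 10.2
print(f"HDMI 2.0 max (600 MHz TMDS clock): {wire_rate_gbps(600):.1f} Gbps")  # 18.0

# 3840x2160 @ 60 Hz, 8-bit 4:4:4, with CTA-861 blanking (4400 x 2250 total):
pixel_clock_hz = 4400 * 2250 * 60      # ~594 MHz
needed_gbps = pixel_clock_hz * 24 * TMDS_OVERHEAD / 1e9
print(f"4K60 8-bit 4:4:4 needs ~{needed_gbps:.1f} Gbps on the wire")         # ~17.8
```

In other words, 4K60 4:4:4 basically saturates the 2.0 link already, which is why there's so little headroom left in this bus.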
 
HDMI 2.0 runs the pixel clock at nearly twice the speed of 1.4... Decoding the higher-speed signal requires more powerful DSPs, and validation of the circuit is more expensive...

Given how lots of AVRs over the years (cough, Onkyo) have had problems with HDMI v1.4 boards overheating themselves to death... it makes me wonder about the longevity of HDMI v2+ boards in AVRs.
 
HDMI is an evil shitty cartel, but nicer TVs have more ports. My JS doesn't differentiate between any of its ports, and all are 4:4:4 capable.

Monitors shouldn't even bother and should just stick with DP; also, people following the standards seem to indicate that next-gen (Rec. 2020, 8K, etc.) might dump HDMI completely. Sounds good to me, fuck HDMI.
 
Monitors shouldn't even bother and should just stick with DP...

The main problem with DP is that it disconnects when you turn the monitor off.
 
Definitely not an interface problem; it could be a hardware design choice, but a shitty driver/OS implementation is most likely.
No, it's in the DP spec, as it happens on AMD and Nvidia cards, on different monitors and OSes, and with different cables.
 
Maybe I'm missing something obvious...but why is this a problem?
Windows and programs resize since the OS reverts to something like 1024x768. Also, your desktop icons will rearrange their layout.
 
Windows and programs resize since the OS reverts to something like 1024x768. Also, your desktop icons will rearrange their layout.

Odd, I've never had that happen on the DP monitors I've used. The last time I remember seeing something like that (a change to the default resolution) was with ancient AMD HD6000 drivers a few years ago, when uninstalling/re-installing drivers in W7 (and once the right drivers were installed it snapped back to the correct res). But hitting the power button on the monitor by itself never seemed to trigger that behavior, IIRC.


LG monitor firmware bug perhaps? Although I suppose your 21:9 aspect ratio might be something that causes Windows a headache too, since it isn't exactly standard.
 
Odd, I've never had that happen on the DP monitors I've used... LG monitor firmware bug perhaps? Although I suppose your 21:9 aspect ratio might be something that causes Windows a headache too, since it isn't exactly standard.

I am using HDMI on my 34" LG.
I use DP on my HP and LG monitors upstairs on the server which has a Quadro card.
 
DisplayPort's signal strength over distance is very poor. You have to sit more or less on top of your computer case. You could stretch it to 10'-15', but high-rez, high-Hz monitors will probably recommend 6'. HDMI can be run across a house, presentation area, store, etc., up to 50' or even 75' in some cases without repeaters on high-quality lines. That's the biggest difference to me. I was surprised by such a step backwards in distance capability, considering how much more modern a cable spec DisplayPort is. It seems weaker than USB over distance. Of course, bandwidth demands are a lot higher with higher-resolution monitors, and high-rez monitors are heading toward 144 Hz-200 Hz by the end of the year or so (which will ultimately give more Hz leeway to allow for the demands of HDR-capable monitors at those refresh rates too).

Here is yet another update to the TftCentral articles, only concerning AUO:
LCD and TFT Monitor News

In short:
- 25'' & 27'' 1080p 240hz TN panels end of 2016
- 35'' 3440x1440 now 200hz VA end of 2016
- 31.5'' 1440p 144hz VA Q4 production (same as planned Samsung panel)
- 27'' 1440p 144hz VA in planning phase
- 27'' 4k 144hz AHVA (IPS) mass production in 2017
- 240hz 1440p planned in 2017

(Updated OP)

(attached chart: A7QjyOz.png)



Regarding a single HDMI 2.0/HDCP 2.2 18 Gbps connector on TVs - most people with a high-end TV would be using a surround-sound receiver capable of 4K HDMI 2.0 60 Hz passthrough, so in that case the receiver would have multiple HDMI 2.0 inputs outputting a single signal to the TV's one HDMI 2.0 input.
 
The thing I explain to a lot of people who aren't familiar with corporate attitudes is how these decisions go into runaway combustion as soon as a single 'high quality' box is ticked. You want to include 2x HDMI 2.0 ports on the TV? That will increase the BOM cost by $0.50 per unit, so that increases the finished-goods cost by $10 because reasons. Now the TV is $10 more expensive, which makes it compete closer to higher-end products. In order to stand out against those products, more features need to be added, which increases the BOM, which causes an exponential increase in final price, which means it is now competing against a different tier, which means more features, which means....


You get the idea. It is really, REALLY hard to convince a group of suits to abandon this mentality.
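
If it helps, here's a toy version of that feedback loop in code; every number in it is invented purely to show the shape of the snowball, not real pricing data:

```python
# Toy model of the "tier creep" feedback loop described above.
# All numbers are made up purely for illustration.
price = 499.0
price += 10.0                           # the $0.50 BOM bump "becomes" $10 at retail
tier_boundaries = [500, 550, 600, 650]  # hypothetical competitive price tiers
feature_cost = 60.0                     # retail cost of features added to compete in a higher tier

for tier in tier_boundaries:
    if price > tier:                    # crossed into the next tier...
        price += feature_cost           # ...so add features to stand out there

print(f"Started at $499, ended at ${price:.0f}")  # the $10 bump snowballs to $749
```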
 
DisplayPort's signal strength over distance is very poor. You have to sit more or less on top of your computer case...

Regarding a single HDMI 2.0/HDCP 2.2 18 Gbps connector on TVs - most people with a high-end TV would be using a surround-sound receiver capable of 4K HDMI 2.0 60 Hz passthrough...
Run lengths are always going to be a problem where bandwidth is concerned. Sure, you can run 50' of HDMI cable in your house, but you're probably only going to get 1080p if you're lucky. Good luck getting 4K 60 Hz with a good picture over HDMI 2.0 using a cable that is more than 4 metres long.
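
For context on why the runs get so short at 4K60, here are the nominal per-lane signaling rates (spec maximums; actual reach depends heavily on cable quality). It's largely the per-lane symbol rate, more than total bandwidth, that makes long passive copper runs hard:

```python
# Nominal per-lane rates for the common link versions (spec maximums).
# Higher per-lane rates mean more attenuation / inter-symbol interference,
# which is what shortens passive cable runs.
links = {
    "HDMI 1.4 (340 MHz TMDS)": (3, 3.4),  # (lanes, Gbps per lane)
    "HDMI 2.0 (600 MHz TMDS)": (3, 6.0),
    "DP 1.2 (HBR2)":           (4, 5.4),
    "DP 1.3/1.4 (HBR3)":       (4, 8.1),
}
for name, (lanes, per_lane) in links.items():
    print(f"{name:25s} {lanes} x {per_lane:.1f} Gbps = {lanes * per_lane:4.1f} Gbps total")
```

The old "run it 50 feet" HDMI experience mostly dates from 1080p-era signals at a third of the 2.0 per-lane rate.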
 
DisplayPort's signal strength over distance is very poor. You have to sit more or less on top of your computer case...

Run lengths are always going to be a problem where bandwidth is concerned. Sure, you can run 50' of HDMI cable in your house, but you're probably only going to get 1080p if you're lucky...

It should also be noted that it took HDMI a very long time to get where it is today. Hell, IIRC it wasn't until HDMI v1.3 or v1.4 that you could even run HDMI cables longer than 2 meters without snowflakes happening (at any resolution)... because the idiots designed a digital interconnect without any error correction. HDMI being a theater standard (more so than DP), it also has a head start on higher-power repeaters.

For really long runs in the industry though you're still not going to use either.

I work in live productions... we had Cirque du Soleil Toruk come through. That show does not use a painted set; it uses 40x high-powered digital projectors to supply all the color needs of the set, all networked, all working as one to seamlessly color the stage/set without skipping or lag. Their touring rig has $4,000,000 USD in just computer projectors (each retails at $100,000), not including the lens barrels or lamps, the computer gear to drive it, the aerial rigging to fly it, or the cabling monstrosity to interconnect it. And every projector needed about 100-150 meters of cable. They weren't using HDMI or DP... it was all DVI from what I could see.
 
apples, oranges and all that...

DisplayPort's signal strength over distance is very poor. You have to sit more or less on top of your computer case...

All quoted lengths are pretty much bullshit unless you post resolution, aka bandwidth. Repeaters don't count, though I'll give you there aren't really any for DP. (Personally I'd send data over a network and output locally; that distance problem is very solved.)

HDMI length is pretty ass at 4K 60 Hz too, and that's the limit for 2.0, and it took them forever. Your chart lists bandwidths that would take multiple HDMI connections, probably a dozen+ at the "long lengths" where HDMI "works". (HDTV as broadcast/viewed is like 1/9th the bandwidth of 4K PC monitor use.)
On the wire it's really just comparing LVDS vs TMDS, and with the way display bandwidth is heading, DP being more like a network interface is the winner.
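
Rough numbers behind the "broadcast HDTV is a small fraction of 4K PC bandwidth" point; the exact ratio depends on what you count (interlacing, chroma subsampling, bit depth), so this is just raw uncompressed pixel rate under one set of assumptions:

```python
# Raw uncompressed pixel-rate comparison (one set of assumptions; broadcast
# is of course heavily compressed on top of this).
def rate_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

hdtv = rate_gbps(1920, 1080, 60, 12)   # 1080p60 as viewed, 4:2:0, 8-bit
pc4k = rate_gbps(3840, 2160, 60, 24)   # 4K60 desktop, 4:4:4, 8-bit

print(f"HDTV as viewed: ~{hdtv:.1f} Gbps")    # ~1.5
print(f"4K60 PC monitor: ~{pc4k:.1f} Gbps")   # ~11.9
print(f"ratio: ~{pc4k / hdtv:.0f}x")          # ~8x, same ballpark as the 1/9th figure above
```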
 
my $250 42" 4K TV has 3 HDMI 2.0 ports on it

And which TV might that be?

I work in live productions... we had Cirque du Soleil Toruk come through... Their touring rig has $4,000,000 USD in just computer projectors (each retails at $100,000), not including the lens barrels or lamps, the computer gear to drive it, the aerial rigging to fly it, or the cabling monstrosity to interconnect it.

Jeez, it's hard to even wrap my head around that kind of setup (and expenses).

What's getting more and more clear is that 2016 is not a good year to buy a new TV in the $500-700 price range. Besides the lack of several HDMI 2.0 ports, most TVs process an HDR signal but cannot display anything over 400 nits, and they tend to use 8-bit panels instead of 10-bit ones, which also brings DCI-P3 coverage down from the preferable 90%. It seems that in 2017 these features will have to be included in any TV over $500 that wants to be advertised as HDR-ready, and by then all HDMI ports will have to be 2.0 out of the necessity of attaching more HDR-ready peripherals.

I guess it's a wait and see kind of year.
 
And which TV might that be?
It's that Seiki 4K TV that was on sale this past winter.
 
Windows and programs resize since the OS reverts to something like 1024x768. Also, your desktop icons will rearrange their layout.

This sounds like an OS issue, not a displayport issue. It'd make more sense to resize things when a monitor is reconnected.
 
This sounds like an OS issue, not a displayport issue. It'd make more sense to resize things when a monitor is reconnected.
Well, it did it with Windows 7 and 10 (never had 8) with different video cards, cables, and monitors.
Just google Display Port Disconnect and see what you see.
 
This is the size the screen opens up at when I remote into the machine,
(screenshot: dp-issue1.jpg)


Here is the Device Manager for my server with 2 DP monitors (HAL-9000); the machine I am on is on the right.
Note that there are no Monitors in the list; both monitors are plugged in but they are powered off:
(screenshot: dp-issue2.jpg)
 