HDMI 2.0 officially announced: 18Gbps bandwidth, 60fps 4K, 32 channel audio

Not sure if anyone posted this yet, but if not:
Only just after it leaked out, the folks at HDMI Licensing are announcing HDMI 2.0 officially. Arriving just in time for the wide rollout of a new generation of Ultra HDTVs, it adds a few key capabilities to the connection standard. With a bandwidth capacity of up to 18Gbps, it has enough room to carry 3,840 x 2,160 resolution video at up to 60fps. It also has support for up to 32 audio channels, "dynamic auto lipsync" and additional CEC extensions. The connector itself is unchanged, which is good for backwards compatibility but may disappoint anyone hoping for something sturdier to support all of those suddenly-popular dongles. The cables won't change either, as the group claims current high-speed Category 2 wires can handle the increased bandwidth. Some companies have suggested upgrade paths for their UHDTVs already on the market -- hopefully we'll find out more about those plans this week at IFA 2013.

Source:
http://www.engadget.com/2013/09/04/hdmi-2-0-official-4k-60fps-32-channel-audio/
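A quick back-of-the-envelope check of those headline numbers (my own sketch, assuming the usual CTA 4400 x 2250 total timing for 4K and HDMI's 8b/10b TMDS coding, neither of which is stated in the article):

```python
# Rough sanity check of the 18 Gbps claim for 3840x2160 @ 60 Hz, 8-bit RGB.
# Assumptions (mine, not from the article): CTA-style total timing of
# 4400 x 2250 pixels including blanking, and 8b/10b TMDS coding overhead.

h_total, v_total = 4400, 2250      # active 3840x2160 plus blanking
refresh_hz = 60
bits_per_pixel = 24                # 8-bit RGB / 4:4:4

pixel_clock = h_total * v_total * refresh_hz          # ~594 MHz
data_rate = pixel_clock * bits_per_pixel              # bits of video per second
wire_rate = data_rate * 10 / 8                        # 8b/10b coding overhead

print(f"pixel clock : {pixel_clock / 1e6:.0f} MHz")
print(f"video data  : {data_rate / 1e9:.2f} Gbps")
print(f"on the wire : {wire_rate / 1e9:.2f} Gbps (vs 18 Gbps total)")
```

That lands at roughly 17.8 Gbps on the wire, which is why 18 Gbps is exactly enough for 4K60 at 8-bit 4:4:4 and not much more.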

This means we can finally look forward to true 4K 60fps monitors. Because obviously DisplayPort, with its lack of mandatory copy protection, was not good enough.
4K TV mass production should help lower the cost of 4K monitors as well.
 
How are the Asus PQ321Q and Sharp PN-K321 4K monitors not considered to be true 4k 60FPS monitors?
 
How are the Asus PQ321Q and Sharp PN-K321 4K monitors not considered to be true 4k 60FPS monitors?

I'm guessing he's referring to the fact that every 4K display out there that can take a 4K 60Hz input (see http://www.noteloop.com/kit/4k/ for a list) relies on tiling the image across multiple streams, over one or more cables, rather than a single stream over a single cable. That should end now that HDMI 2.0 is here.
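To put rough numbers on why the tiling was necessary, here is a sketch of the active-pixel data rates (my own figures, assuming 8-bit RGB and ignoring blanking; the dual scalers inside those tiled monitors are the other half of the story):

```python
# Why first-gen 4K monitors tiled the panel: active-pixel data rates only,
# blanking ignored, 8-bit RGB assumed (rough figures, my own illustration).

def active_rate_gbps(width, height, hz, bpp=24):
    return width * height * hz * bpp / 1e9

full_4k60  = active_rate_gbps(3840, 2160, 60)   # ~11.9 Gbps for the whole frame
half_tile  = active_rate_gbps(1920, 2160, 60)   # ~6.0 Gbps per tile
hdmi14_max = 340e6 * 24 / 1e9                   # ~8.16 Gbps of video data (HDMI 1.4, 340 MHz clock)

print(f"full 4K60 frame   : {full_4k60:.1f} Gbps (doesn't fit HDMI 1.4's ~{hdmi14_max:.1f} Gbps)")
print(f"one 1920x2160 tile: {half_tile:.1f} Gbps (two tiles, two streams -> 60 Hz)")
```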
 
Because they split the screen in MST mode to drive both halves at 60Hz. Among other issues, this means that CrossFire doesn't benefit from the frame-pacing driver.
 
In the press release they said they are already starting work on the next version which they will need if they want to support Rec. 2020's 4K 120Hz, 8K 60Hz or 8K 120Hz.

Hopefully it won't be as delayed as 2.0 now that Apple lost the signalling battle.
 
I, for one, welcome our cabled overlords.

Seriously though, anything for the advancement of technology. I haven't had a nice experience with DisplayPort (card issue? Doing it wrong? Who knows), but consumers looking for the simple setup that HDMI provides should be happy.
 
So I wonder now if I can use 3D gaming at full 1080p 60fps with my current setup. IIRC, I am limited to 720p 60fps or 1080p 24fps because of the limitations of HDMI.
 
In the press release they said they are already starting work on the next version which they will need if they want to support Rec. 2020's 4K 120Hz, 8K 60Hz or 8K 120Hz.
Theoretically, HDMI 2.0 could do reduced-quality 4K 120Hz by doing 4:2:2 at 8-bit per channel, and using packet-based error correction instead of 8b/10b encoding.

4K 120Hz 4:2:2 at 8 bits per channel is only about 15.9 Gbps, which theoretically fits within HDMI 2.0's 18 Gbps.
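For what it's worth, the arithmetic behind that 15.9 figure (my own check, counting active pixels only):

```python
# 3840x2160 @ 120 Hz with 4:2:2 chroma subsampling at 8 bits per channel
# carries 16 bits per pixel on average (Y on every pixel, Cb/Cr shared by pairs).
width, height, hz = 3840, 2160, 120
bits_per_pixel = 16                        # 4:2:2, 8-bit

rate = width * height * hz * bits_per_pixel
print(f"{rate / 1e9:.1f} Gbps")            # ~15.9 Gbps of pixel data vs the 18 Gbps link
```

Whether it would still fit once blanking and link-coding overhead are added is exactly why the "theoretically" hedge is there.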
 
So what are the plans to provide 4K content?

Connect PC, set desktop to 4K, instant content. (sorry couldn't resist)

When it comes to broadcasts, the first big thing will likely be the soccer World Cup (it usually serves as a broadcast technology demonstrator).
 
So... Will it manage to be as broken as v1.0 was on release?
 
None of the video cards out now have HDMI 2.0 in them, correct? So even if a monitor comes out with HDMI 2.0, we would have to upgrade our video cards. I bought the Titan, so that is a large amount to just throw away, though gaming on one large 4K panel is enticing. Maybe they will come out with a DisplayPort to HDMI 2.0 adapter that would work?
 
No, none of the cards have it yet, and HDMI 2.0 is still in the process of being launched. It might be used in the next-gen AMD Hawaii-based cards (that's a wild guess, not based on anything other than hope), but if not then I would certainly expect it to start appearing by the time NVIDIA Maxwell-based cards hit.

This is one of those "look what the future could hold" things, and certainly not cause to regret/plan any specific graphics card purchases :)
 
When it comes to broadcasts, the first big thing will likely be the soccer World Cup (it usually serves as a broadcast technology demonstrator).

I can confirm that Brazil's World Cup will be recorded in 4K. For price comparison, the 55" 4K TVs here are being sold "cheap" for roughly 5k USD including all taxes. Sony is trying to expand the 4K audience in Brazil by all means, including selling at a loss.
 
I can confirm that Brazil's World Cup will be recorded in 4K. For price comparison, the 55" 4K TVs here are being sold "cheap" for roughly 5k USD including all taxes. Sony is trying to expand the 4K audience in Brazil by all means, including selling at a loss.
At $5K, there's still a big profit margin on a 4K TV. Especially since Sony no longer owns the LCD plants that are making the 4K LCDs.
 
I don't get the 32 channel audio part. An audio "cube" setup is 3 x 3 x 3 = 27...

5 additional subwoofers?

And I thought my 10.2 setup is cool...
 
golf clap, thanks HDMI for another useless standard that will last little more than 2 years. bravo.
 
I don't get the 32 channel audio part. An audio "cube" setup is 3 x 3 x 3 = 27...

5 additional subwoofers?

And I thought my 10.2 setup is cool...

Left sub, right sub, rear sub, dual front subs? Seems legit
 
golf clap, thanks HDMI for another useless standard that will last little more than 2 years. bravo.

Yup, big opportunity missed here. Going to be stuck with a mediocre interface for the next decade. How display manufacturers think 60Hz is still acceptable is beyond me. I guess they will rely on the cheap and much worse way of getting increased smoothness: having the TV interpolate and create fake frames/higher refresh rates, with the associated lag.
 
This is good news, I've eyed some of the current 4K offerings but 30Hz is a dealbreaker.

I've got a U2412M (couldn't justify the $900 for a 1440p at the time, this was before the $400 ones hit the market) and it seems like a good time to hold onto it and wait. I've occasionally found myself eyeing 30" 2560x1600 monitors (especially when they go on sale for $700), but I'm hoping to avoid scratching the upgrade itch until I can get a 4K monitor. 32" 4K 60Hz for under $1000 maybe?
 
I'm in the same boat as you. I picked up one of Microcenter's 27" 1440p monitors, and I'm holding out with this one until I can get a 4K 30-incher for $600-750.
 
Yup, big opportunity missed here. Going to be stuck with a mediocre interface for the next decade. How display manufacturers think 60Hz is still acceptable is beyond me. I guess they will rely on the cheap and much worse way of getting increased smoothness: having the TV interpolate and create fake frames/higher refresh rates, with the associated lag.

Because they don't cater to the maybe 5%? I'm not saying they shouldn't be doing more, just giving a reason why they probably aren't.
 
I was hoping HDMI 2.0 would have bandwidth for support of 4k at 120Hz or 4k 3D at 60Hz.

I guess all hopes are on DisplayPort 1.3 (40Gbit/s raw please)
 
Because they don't cater to the maybe 5%? I'm not saying they shouldn't be doing more, just giving a reason why they probably aren't.

Or more likely because they want to keep charging money for upgrades. Basically the manufacturers are the ones who push them to do this, for several reasons: 1) a more robust chip might cost a little more money per port; 2) every time they screw you over by making a shitty standard, they can force some amount of the population to buy new TVs, receivers and video cards every couple of years to get something to work with their new console, etc. This is why HDMI is BS: everyone is too stupid to see they have been doing this to us over and over.
 
Or more likely because they want to keep charging money for upgrades. Basically the manufacturers are the ones who push them to do this, for several reasons: 1) a more robust chip might cost a little more money per port; 2) every time they screw you over by making a shitty standard, they can force some amount of the population to buy new TVs, receivers and video cards every couple of years to get something to work with their new console, etc. This is why HDMI is BS: everyone is too stupid to see they have been doing this to us over and over.

In general, I agree with your skepticism. But in this case they are near the limits of low cost chip manufacturing. Look at the current speeds on Ethernet for example. Most motherboards come with 1 gigabit Ethernet. 40 gig adapters cost as much as a cheap PC and 100 gig is nonexistent.
 
Prices come down when a standard goes into mass production. Why is it that DVI, which HDMI is based on, was able to scale to increasing bandwidth long past its spec, but HDMI can't? Shitty ARM chips can do 4K now.
 
Prices come down when a standard goes into mass production. Why is it that DVI, which HDMI is based on, was able to scale to increasing bandwidth long past its spec, but HDMI can't? Shitty ARM chips can do 4K now.

Well, DVI did it by going dual-link. I don't know about ARM chips.

Too bad they didn't jump on high frame rates combined with deep color before they went to 4k. But the TV manufacturers needed something to sell into a saturated HDTV market so they panicked after 3D flopped and jumped on 4K.

I think if you want 120Hz plus 48-bit color plus 4K you need in the neighborhood of 80 gigabits/s. That will probably take a next-gen manufacturing technology to produce cheaply.
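Rough numbers behind that ~80 Gbps estimate (my own sketch, assuming a CTA-style 4400 x 2250 total timing and 8b/10b coding like today's TMDS links):

```python
# 3840x2160 @ 120 Hz with 48-bit ("deep") color: rough link requirement.
# Assumptions are mine: 4400x2250 total timing and 8b/10b coding overhead.
h_total, v_total, hz = 4400, 2250, 120
bits_per_pixel = 48

data_rate = h_total * v_total * hz * bits_per_pixel    # ~57 Gbps of video data
wire_rate = data_rate * 10 / 8                         # ~71 Gbps on the wire

print(f"video data : {data_rate / 1e9:.0f} Gbps")
print(f"on the wire: {wire_rate / 1e9:.0f} Gbps")
```

Roughly 71 Gbps on the wire, so "in the neighborhood of 80 gigabits/s" is about right once you leave some headroom.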
 
Ridiculous. Barely at the bandwidth of DP. HDMI 2.0 wasn't made future-proof at all...
 
I was hoping HDMI 2.0 would have bandwidth for support of 4k at 120Hz or 4k 3D at 60Hz.

I guess all hopes are on DisplayPort 1.3 (40Gbit/s raw please)

Personally I was hoping they'd go straight to supporting 8K@120Hz@12-bit already, so the interface would last a long while...
 
Wth, 8k@10bit@120hz requires 125Gbps? :eek:
Though we will probably laugh about that considering we have years before it drops and it will start at 30Hz.

1080p has had a consumer run of like 5 years (please don't point out the 1080i half-refresh-per-second stuff :D).
Considering tech will progress faster, we might be looking at a 3-year period before 8K UHD drops, so if the bandwidth doubles every year we can expect 160Gbps by then.
Of course it might be time to replace all those ground pins with a unified ground, since the pins run their signals in pairs.
Eventually glass fiber will give a near-infinite (still finite, of course) amount of signalling speed thanks to nearly no signal loss.
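For reference, the 8K figure quoted above checks out roughly (my own arithmetic, active pixels only, before blanking or link-coding overhead):

```python
# 7680x4320 @ 120 Hz at 10 bits per channel (30 bits per pixel), active pixels only.
width, height, hz = 7680, 4320, 120
bits_per_pixel = 30

rate = width * height * hz * bits_per_pixel
print(f"{rate / 1e9:.0f} Gbps")    # ~119 Gbps before blanking/encoding overhead
```

About 119 Gbps of raw pixel data, so the ~125 Gbps mentioned earlier is in the right ballpark, and a real link would need more once overhead is added.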
 
Well, DVI did it by going dual-link. I don't know about ARM chips.

Too bad they didn't jump on high frame rates combined with deep color before they went to 4k. But the TV manufacturers needed something to sell into a saturated HDTV market so they panicked after 3D flopped and jumped on 4K.

I think if you want 120hz plus 48 bit color plus 4K you need in the neighborhood of 80 gigabits/s. That will probably take a next-gen manufacturing technology to produce cheaply.

I don't need everything, but this was pathetic. They designed HDMI 2.0 to last exactly one and only one standard, ironically a standard that is already shipping. DVI, a 10-year-old standard, had more bandwidth than HDMI up until very recently. That is how bad HDMI is. Right now, most people buying new LightBoost monitors are using DVI. If they had at least gone with 40Gb/s, people could have chosen over the years between 8K at 60Hz or 4K at 120Hz. But as it stands they won't be getting much of any choice.

You know what the irony of this all is? They think it increases sales, but in reality it probably decreases them, because their marketing and sales departments will never tally up the millions and millions of sales they lost because someone like me said: well, I could buy a 4K TV, but then I would need to drop money on a new receiver, or video card, etc., and once I do that I will turn around in a couple of years and have to replace it ALL again if I want to upgrade. So nah, I will just sit this one out.
 
32-channel audio isn't 32 individual speakers. It does mean, however, that different sound elements can be isolated and transmitted individually, which makes HDMI more useful for multitrack audio/mixing and so on.

Dolby's new surround tech ignores discrete channels anyway, I believe - it positions the audio absolutely, and leaves it up to the amp to map that onto the current speaker setup.

Whereas with an old surround signal like 5.1, 7.1, etc., there are discrete channels that were mixed in a calibrated studio environment. Your receiver isn't making any decisions; it simply decodes the signal and routes, for example, the "front left speaker" signal to whatever speaker is plugged into that jack on your receiver. If you arrange your speakers oddly, like having your front left and right too close together or unequally spaced or ..., you will not get an accurate mix and things will sound wonky.

The audio being positioned absolutely, if I understand correctly, means that the source media will essentially have discrete audio channels for individual sound effects and dialog, with 3D spatial positioning coordinates encoded with the actual sounds, and the receiver having to decide which speaker to send each sound to based on its own awareness of the space it resides in and the number/size/arrangement of available speakers. This makes the setup stage, where you use the mic that comes with most systems or the settings within the menus, rather important.

Pretty neat stuff if you set it up and calibrate it!
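To make the "receiver decides which speaker gets what" idea concrete, here's a toy sketch (entirely my own illustration, not any actual Dolby or HDMI algorithm): an audio object carries a 3D position, and the renderer weights it across whichever speakers happen to be closest.

```python
# Toy illustration of object-based audio rendering: a sound carries a 3D
# position, and the renderer (the AVR, in practice) decides how to spread it
# across the speakers it knows about. NOT Dolby's real algorithm, just a
# nearest-speaker inverse-distance weighting to show the idea.
import math

speakers = {                      # hypothetical room layout, metres from the listener
    "front_left":  (-1.5,  2.0, 0.0),
    "front_right": ( 1.5,  2.0, 0.0),
    "rear_left":   (-1.5, -2.0, 0.0),
    "rear_right":  ( 1.5, -2.0, 0.0),
    "top_middle":  ( 0.0,  0.0, 2.0),
}

def render(obj_pos):
    """Return per-speaker gains for one audio object placed at obj_pos."""
    inv_dist = {
        name: 1.0 / (math.dist(obj_pos, pos) + 1e-6)   # closer speakers weigh more
        for name, pos in speakers.items()
    }
    total = sum(inv_dist.values())
    return {name: w / total for name, w in inv_dist.items()}

# A helicopter passing overhead and slightly to the left:
for name, gain in render((-0.5, 0.5, 1.8)).items():
    print(f"{name:12s} {gain:.2f}")
```

The point is just that the same bitstream can be rendered sensibly on a 5.1 rig or a 24-speaker cube, because the channel mapping is decided at playback time.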

I also think they're doing themselves a disservice by making the standard so short-lived. It seems very short-sighted, but then that seems to be very typical of HDMI.
 
I don't need everything, but this was pathetic. They designed HDMI 2.0 to last exactly one and only one standard, ironically a standard that is already shipping. DVI, a 10-year-old standard, had more bandwidth than HDMI up until very recently. That is how bad HDMI is. Right now, most people buying new LightBoost monitors are using DVI. If they had at least gone with 40Gb/s, people could have chosen over the years between 8K at 60Hz or 4K at 120Hz. But as it stands they won't be getting much of any choice.

You know what the irony of this all is? They think it increases sales, but in reality it probably decreases them, because their marketing and sales departments will never tally up the millions and millions of sales they lost because someone like me said: well, I could buy a 4K TV, but then I would need to drop money on a new receiver, or video card, etc., and once I do that I will turn around in a couple of years and have to replace it ALL again if I want to upgrade. So nah, I will just sit this one out.

As soon as I saw 60Hz, I decided to skip 4K until they come out with something better.
 