HDMI 2.1 Detailed – 8K@60Hz, 4K@120Hz, Dynamic HDR, Variable Refresh Rate, 48Gbps Bandwidth

HDMI 2.1 Detailed – 8K@60Hz, 4K@120Hz, Dynamic HDR, Variable Refresh Rate, 48Gbps Bandwidth.
http://wccftech.com/hdmi-2-1-detailed-8k60hz-4k120hz/?utm_source=dlvr.it&utm_medium=twitter

The article is very well written. This is one time that you should read a WCCFTECH article. ;)

The interesting part for PC Gamers is this blurb. This Game Mode VRR sounds awfully familiar? Right on the tip of my tongue...


Game Mode VRR features variable refresh rate, which enables a 3D graphics processor to display the image at the moment it is rendered for more fluid and better detailed gameplay, and for reducing or eliminating lag, stutter, and frame tearing.

Game Mode VRR
Q: Does this require the new HDMI cable?

A: No

Q: Will this work with 8K@60 or 4K@120Hz?

A: Yes if those features are implemented along with Higher Video Resolution. That will require the new 48G cable

Q: Is this primarily for consoles or will PCs utilize this also?

A: It can be used for both.

Q: Will this result in more gaming PCs connecting to HDMI displays, either monitors or TVs?

A: The intent of the feature is to enable HDMI technology to be used in these applications. Given that HDMI connectivity already has a strong presence in this area, we expect that use of HDMI technology in gaming will continue to grow.



R.I.P. current video cards? Or do they support this feature set already? I see that the GTX 1080 has HDMI 2.0b and the RX 480 has HDMI 2.0. I know that AMD has FreeSync 2 HDR monitors coming soon and Nvidia is unveiling a 4K 144Hz HDR GSYNC monitor tonight. I'm more concerned about those of us who buy big TVs for monitors. Really neat that the consoles get VRR support with this.
 
It looks like hdmi FINALLY leapfrogged (or will when released) displayport in bandwidth. I believe dp 1.4 only goes up to around 32.4 Gbps of bandwidth, does it not?

I don't expect any gpus this year to have this, but perhaps next year. If this gets implemented quickly, there will be less need to have secondary displayport or supermhl inputs on the tv/monitor displays.

EDIT: this will be superior to displayport 1.4 because apparently even that does not allow 4k @ 120Hz with HDR and 10 bit color or more. You have to start compressing things. This can't come soon enough. First we need the gpus and cables to come with this, then the displays.
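If you want to sanity-check that, here's a rough back-of-the-envelope calculation. This is only a sketch: the ~20% blanking overhead and the effective payload figures (8b/10b coding for DP, 16b/18b FRL for HDMI 2.1) are my ballpark assumptions, not spec-exact numbers.

```python
# Rough uncompressed bandwidth needed for a video mode vs. link payload.
def raw_gbps(h, v, refresh_hz, bits_per_component, blanking=1.2):
    """Approximate uncompressed data rate in Gbit/s.
    blanking=1.2 assumes ~20% extra pixels for blanking intervals (ballpark)."""
    return h * v * blanking * refresh_hz * bits_per_component * 3 / 1e9

need = raw_gbps(3840, 2160, 120, 10)          # 4K @ 120 Hz, 10-bit 4:4:4

dp14_payload   = 32.4 * 8 / 10                # DP 1.4: 32.4 Gbps raw, 8b/10b coding
hdmi21_payload = 48.0 * 16 / 18               # HDMI 2.1: 48 Gbps raw, 16b/18b FRL coding

print(f"4K120 10-bit needs ~{need:.1f} Gbps")
print(f"DP 1.4 payload     ~{dp14_payload:.1f} Gbps -> {'fits' if need <= dp14_payload else 'needs compression'}")
print(f"HDMI 2.1 payload   ~{hdmi21_payload:.1f} Gbps -> {'fits' if need <= hdmi21_payload else 'needs compression'}")
```

That works out to roughly 36 Gbps needed against ~26 Gbps usable on DP 1.4 and ~43 Gbps usable on HDMI 2.1, which is exactly why DP 1.4 has to reach for compression at 4k120 with 10 bit color while HDMI 2.1 doesn't.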

It would be a great boon to have the new upgraded Xbox console equipped with one of these to kick the display makers into gear. Imagine, a console with variable refresh rate support for smoother playback at higher resolutions? More impetus to get certain tvs/displays to support freesync 2 for lower latency? This can only be good.
 
I wouldn't even consider any graphics card or 4k tvs without HDMI 2.1. It's too good to pass up. 4k 120hz 4:4:4 at 10-bits of awesome colors. And the other additions to HDR and such...
 
I wouldn't even consider any graphics card or 4k tvs without HDMI 2.1. It's too good to pass up. 4k 120hz 4:4:4 at 10-bits of awesome colors. And the other additions to HDR and such...


For tvs, I get it, for gpus? Those are easier to offload to all the people who have no short term hope of taking advantage of the higher bandwidth in the first place. I still plan on getting vega even though I'm sure it won't have this, and when navi hits, hopefully it will have this baked in.
 
I wouldn't even consider any graphics card or 4k tvs without HDMI 2.1. It's too good to pass up. 4k 120hz 4:4:4 at 10-bits of awesome colors. And the other additions to HDR and such...

lol, you make it sound like old hat.
 
Don't expect this to show up on anything this year, unless it's at the very end of the year. According to the OC3D article the standard is not finalized yet and won't even go out to all HDMI 2.0 adopters until Q2.
 
The first HDMI 2.0 TV was announced the same month as HDMI 2.0 was released I believe. So you might have announcements in 2017Q2 when HDMI 2.1 is expected to be launched.

In terms of PC displays, I think it was around 1.5 years until the first HDMI 2.0 monitor. Even then I don't think HDMI 2.0 monitors are commonplace.

In terms of PC outputs: Nvidia's first release post-announcement, GM2xx (1 year out), had HDMI 2.0. AMD supported HDMI 2.0 with its Carrizo APU (2 years out), but its discrete GPU released at the same time, Fury, did not. It wasn't until Polaris (3 years) that they did with a discrete GPU. Intel still does not support HDMI 2.0 with Kaby Lake.

Regarding the VRR, I'm not sure this will necessarily lead to what people think it will. Even though the interface standard itself will now support VRR, it does not necessarily mean the device on either end has to.
 
I wouldn't even consider any graphics card or 4k tvs without HDMI 2.1. It's too good to pass up. 4k 120hz 4:4:4 at 10-bits of awesome colors. And the other additions to HDR and such...
So I guess you won't consider buying a graphics card or TV within the next 2 years. Talk about living in the "real world".
 
I wouldn't even consider any graphics card or 4k tvs without HDMI 2.1. It's too good to pass up. 4k 120hz 4:4:4 at 10-bits of awesome colors. And the other additions to HDR and such...

That's great if you're using your TV as a PC monitor (although how many PCs can run 4k @ 120hz?), but for many people that own current 4k TVs the HDMI 2.1 features just won't be utilised. Unless you specifically need the features that 2.1 offers, I don't think it's a big worry for current TV owners or those like myself looking to purchase a new TV this year. Also, I imagine stuff like the new HDR and VRR features could be firmware updated.
 
This is a much better design path to take than the previous HDMI updates. Those were all aimed at "release faster standard when we have DSP capable of it."

Now they're building in a long upgrade path, so hopefully the 48G cables will be the last you ever buy. TVs will use the higher resolution modes when necessary, and video cards will slowly add support for higher resolution modes over time, since that controller hardware can suck down power.

As long as they're not Thunderbolt active cable price level, this should be fine.

This is similar to SATA, which built future support into the standard for up to 6Gbps. That's why SATA "1.5" cables you bought in 2003 still work with SATA 6Gbps. This makes for an easier upgrade path, although video card makers are still going to specify resolution limits on each generation of HDMI 2.1 ports they release. Because battery life.

I like the idea of VRR being included, but just like every other variable frame "standard," this one is optional. Until somebody forces all makers to support one, I won't get my hopes up.
 
I like the idea of VRR being included, but just like every other variable frame "standard," this one is optional. Until somebody forces all makers to support one, I won't get my hopes up.

Microsoft's new XBONE will support it at release this year. At least one TV manufacturer will support it at launch I bet. That manufacturer will own the console crowd.
 
Another one I like is calling video cards an "investment". Funny since every video card in history has become cheaper and cheaper with time...
 
Or cheaping out on monitors while buying top of the line GPUs.

Back on topic: when would we start seeing computer monitors with those specs? By the time next HDMI spec update comes out?

I am a bit sour on this because the 4k monitor I bought last year came out with HDMI 1.4 lol.
 
Or cheaping out on monitors while buying top of the line GPUs.

Back on topic: when would we start seeing computer monitors with those specs? By the time next HDMI spec update comes out?

I am a bit sour on this because the 4k monitor I bought last year came out with HDMI 1.4 lol.
Will probably start to trickle out Q3-Q4 this year. Nothing was said about what version of HDMI is used in the recently announced PG27UQ. We'll still have to wait for video cards to come out with HDMI 2.1, but Volta is rumored to be coming out around this time as well.
 
Don't worry, it is still too slow. 8K but not 120 hz...awesome......so we will see DP do 120hz but HDMI won't....HDMI is always a gen behind and is always trash...why 2 standards when the main standard is always behind the power curve?....fuckers

I miss the day cables were 10 years ahead of displays :/ Fuck this generational garbage...make this shit right once every decade you fucks.


Yes this BS pisses me off.
 
Don't worry, it is still too slow. 8K but not 120 hz...awesome......so we will see DP do 120hz but HDMI won't....HDMI is always a gen behind and is always trash...why 2 standards when the main standard is always behind the power curve?....fuckers

I miss the day cables were 10 years ahead of displays :/ Fuck this generational garbage...make this shit right once every decade you fucks.


Yes this BS pisses me off.
Just a cord bro
 
Don't worry, it is still too slow. 8K but not 120 hz...awesome......so we will see DP do 120hz but HDMI won't....HDMI is always a gen behind and is always trash...why 2 standards when the main standard is always behind the power curve?....fuckers

I miss the day cables were 10 years ahead of displays :/ Fuck this generational garbage...make this shit right once every decade you fucks.


Yes this BS pisses me off.

You guys know that nobody can tell the difference between 4k and higher resolutions for content consumption, right?

And there's no effort to provide anything higher than 4k content, ever. No, not even this "8k" satellite broadcast in Japan counts, since it's limited to 100Mbps HEVC.

https://recombu.com/digital/article/what-is-super-hi-vision-shv-ultra-high-definition_M10844.html#

AKA, the same bullshit 1/4 rate Netflix 4k 15-25Mbps uses. "8k" broadcast = same quality and bit-rate as 4k Blu-Ray.

By comparison, Ultra HD Blu-Ray runs from 82 to 128Mbps:

https://en.wikipedia.org/wiki/Ultra_HD_Blu-ray

A real 8k broadcast would at least top 300-400mbps. But bandwidth is scarce, and people really can't tell the difference, so we're locking the standard at FULL 4k-quality.
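Rough per-pixel math with the bitrates quoted above backs that up. Just a sketch: it holds frame rate constant and ignores encoder and content differences.

```python
# Compressed bits-per-pixel for the HEVC bitrates being discussed.
def bits_per_pixel(mbps, width, height, fps=24):
    return mbps * 1e6 / (width * height * fps)

streams = {
    "Netflix 4k (~16 Mbps)":        (16,  3840, 2160),
    "UHD Blu-ray (~100 Mbps)":      (100, 3840, 2160),
    "NHK 8k satellite (~100 Mbps)": (100, 7680, 4320),
}

for name, (mbps, w, h) in streams.items():
    print(f"{name:31s} {bits_per_pixel(mbps, w, h):.2f} bits/pixel")

# Bitrate an 8k stream would need to match UHD Blu-ray's bits-per-pixel:
print(f"8k at UHD Blu-ray quality: ~{100 * (7680 * 4320) / (3840 * 2160):.0f} Mbps")
```

Per pixel, the 100Mbps 8k broadcast lands much closer to Netflix 4k than to UHD Blu-ray; matching Blu-ray's bits-per-pixel at 8k would take roughly 400Mbps, which is where that 300-400Mbps figure comes from.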

So what possible imagined impasse could you be bitching about here? This is plenty of bandwidth for 4k 144Hz HDR, or 5k 100 Hz (for content creators). Call back when there's tons of demand for more pixels than we already have, Apple's laptop and handheld "retina" displays have been set-in-stone WELL BELOW 4k for five years now, and they seem to be in no hurry to push beyond 5k in their "content creation" iMac.
 
You guys know that nobody can tell the difference between 4k and higher resolutions for content consumption, right?

And there's no effort to provide anything higher than 4k content, ever. No, not even this "8k" satellite broadcast in Japan counts, since it's limited to 100Mbps HEVC.

https://recombu.com/digital/article/what-is-super-hi-vision-shv-ultra-high-definition_M10844.html#

AKA, the same bullshit 1/4 rate Netflix 4k uses. "8k" broadcast = same quality and bit-rate as 4k Blu-Ray.

By comparison, Ultra HD Blu-Ray runs from 82 to 128Mbps:

https://en.wikipedia.org/wiki/Ultra_HD_Blu-ray

A real 8k broadcast would at least top 300-400mbps. But bandwidth is scarce, and people really can't tell the difference, so we're locking the standard at FULL 4k-quality.

So what possible imagined impasse could you be bitching about here? This is plenty of bandwidth for 4k 144Hz, or 5k 100 Hz (for content creators). Call back when there's tons of demand for more pixels than we already have, Apple's laptop and handheld "retina" displays have been set-in-stone for five years now, and they seem to be in no hurry to push beyond 5k in their "content creation" iMac.
For the most part I agree. 8K is just another trick to get people to buy new televisions.
 
Don't worry, it is still too slow. 8K but not 120 hz...awesome......so we will see DP do 120hz but HDMI won't....HDMI is always a gen behind and is always trash...why 2 standards when the main standard is always behind the power curve?....fuckers

I miss the day cables were 10 years ahead of displays :/ Fuck this generational garbage...make this shit right once every decade you fucks.


Yes this BS pisses me off.


You should not be pissed off at this news though. hdmi, for the first time in a long time if ever, leapfrogged the specs of displayport.

displayport 1.4 tops out at around 32.4 Gbps compared to hdmi 2.0's pathetic 18 Gbps

hdmi 2.1? 48 Gbps. This is the new top dog. I don't know the bandwidth of supermhl, but that has no traction anywhere.

even thunderbolt 3 only goes up to 40Gbps of bandwidth.

This gives us the headroom we were craving. I don't even care about 8k, maybe for some future vr display, but for right now I just want a 4k display with a vrr range of up to 120Hz that can cover the entirety of rec 2020 with 12 bit color.

NOT EVEN DISPLAYPORT 1.4 can do that today, this new hdmi can. That is a good thing, so cheer up.
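Using the same sort of rough uncompressed-bandwidth math, here is roughly where each link tops out at 4k 4:4:4. Treat it as a sketch: the usable-payload figures and the ~20% blanking overhead are approximations I'm assuming, not official spec tables.

```python
# Approximate max refresh rate at 4K 4:4:4, uncompressed, per link.
LINKS_GBPS = {               # usable payload after line-coding overhead (approx.)
    "HDMI 2.0": 18.0 * 8 / 10,
    "DP 1.4":   32.4 * 8 / 10,
    "HDMI 2.1": 48.0 * 16 / 18,
}

def max_refresh_hz(payload_gbps, h=3840, v=2160, bpc=10, blanking=1.2):
    # payload / (pixels per frame incl. blanking * 3 components * bits per component)
    return payload_gbps * 1e9 / (h * v * blanking * 3 * bpc)

for name, payload in LINKS_GBPS.items():
    caps = ", ".join(f"{bpc}-bit ~{max_refresh_hz(payload, bpc=bpc):.0f} Hz"
                     for bpc in (8, 10, 12))
    print(f"{name:8s} 4K 4:4:4 max: {caps}")
```

With the reduced-blanking timings real displays use at 4k120 the numbers come out a bit higher, but the ordering holds: hdmi 2.0 is stuck around 4k60 8-bit, dp 1.4 runs out of room before 4k120 at 10/12 bit, and hdmi 2.1 is the only one with headroom for it.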
 
I think his point is that higher refresh rates are out the window because it still looks like it's locked to 60 Hz.
 
You should not be pissed off at this news though. hdmi, for the first time in a long time if ever, leapfrogged the specs of displayport.

displayport 1.4 tops out at around 32.4 Gbps compared to hdmi 2.0's pathetic 18 Gbps

hdmi 2.1? 48 Gbps. This is the new top dog. I don't know the bandwidth of supermhl, but that has no traction anywhere.

even thunderbolt 3 only goes up to 40Gbps of bandwidth.

This gives us the headroom we were craving. I don't even care about 8k, maybe for some future vr display, but for right now I just want a 4k display with a vrr range of up to 120Hz that can cover the entirety of rec 2020 with 12 bit color.

NOT EVEN DISPLAYPORT 1.4 can do that today, this new hdmi can. That is a good thing, so cheer up.

and the next displayport will be 60 or 100Gbps....this HDMI was just announced...a new displayport standard will come soon as well.
You guys know that nobody can tell the difference between 4k and higher resolutions for content consumption, right?

And there's no effort to provide anything higher than 4k content, ever. No, not even this "8k" satellite broadcast in Japan counts, since it's limited to 100Mbps HEVC.

https://recombu.com/digital/article/what-is-super-hi-vision-shv-ultra-high-definition_M10844.html#

AKA, the same bullshit 1/4 rate Netflix 4k 15-25Mbps uses. "8k" broadcast = same quality and bit-rate as 4k Blu-Ray.

By comparison, Ultra HD Blu-Ray runs from 82 to 128Mbps:

https://en.wikipedia.org/wiki/Ultra_HD_Blu-ray

A real 8k broadcast would at least top 300-400mbps. But bandwidth is scarce, and people really can't tell the difference, so we're locking the standard at FULL 4k-quality.

So what possible imagined impasse could you be bitching about here? This is plenty of bandwidth for 4k 144Hz HDR, or 5k 100 Hz (for content creators). Call back when there's tons of demand for more pixels than we already have, Apple's laptop and handheld "retina" displays have been set-in-stone WELL BELOW 4k for five years now, and they seem to be in no hurry to push beyond 5k in their "content creation" iMac.
BULLSHIT!

I can see the difference easily, especially with photos and other stuff. I can think of a few games right now I would play at 8K that would be far better than 4K. I have a 4K 27 in and I can tell there is more potential for better clarity.

I can see photos being grand on 8K. No need to zoom or anything. Basically 100% image on display is epic, or 50 megapixel stitched landscapes *drool*. If you have used 4K or any other high res display you would know what you said is ridiculous. I can tell the difference between a 1080P phone and a 1440P phone, and that is a 5 or 6 in screen. You don't think I can tell on a 32 or 40 in screen 1-2 feet away....good god man.

Seeing an individual pixel and the gap is very different from seeing 2 different colored pixels side by side. It's the same BS as people saying 24 or 30 frames is fast enough...no jack it isn't. That is the flicker threshold and not what we can actually tell in movement. We can easily see 600 FPS in fluidness if not 1000 FPS....go look at Blur Busters.

This applies to pixel density too, and 8K at 32 or 40 in is very useful and realistic. 4K isn't that amazing at 27 inches.

The games I would play would be Star Wars: Empire at War (4K is awesome as is and runs perfectly), RCT3, Rome: Total War, AOE3, and so many more.

500 DPI vs 300 DPI is very noticeable on cell phones, and these displays wouldn't even touch 300 DPI!!!!!


4K 32in is 137 DPI
8K 32 in is 275 DPI
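Those DPI figures check out; the arithmetic is just the diagonal pixel count divided by the diagonal size (quick sketch):

```python
# Pixels per inch from resolution and diagonal size.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'4K 32": ~{ppi(3840, 2160, 32):.0f} PPI')   # ~138
print(f'8K 32": ~{ppi(7680, 4320, 32):.0f} PPI')   # ~275
print(f'4K 27": ~{ppi(3840, 2160, 27):.0f} PPI')   # ~163
```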

8K is very much acceptable and easily noticeable.

Now stop being drunk

Also, H.265 vs H.264 bit rates are not directly comparable. *facepalm*


What I would love is to see 60fps movies (120 fps would be awesome but let's be realistic....the data would be off the charts) because the blur on 24fps sucks...watching the Ben-Hur Blu-ray and I really want more fps :/ 24fps sucks
 
Congratulations, your eyes are Juan in a Million. At normal viewing distances most people can't tell the difference between 4k and 8k, FOR THE CONSUMPTION OF MEDIA.

Gaming may be a special case, but that would depend on your screen size. But since movies are already a motion-blurred, 4:2:0-castrated mess, you shouldn't expect much from people's eyes spotting differences above 4k. And SLR-quality pictures, how many people view those on a daily basis?

Also, I compared HEVC for all three data rates. Your superior eyesight must have missed that fairly obvious detail.

Netflix 4k = 15-25Mbps HEVC
UHD Blu-Ray = 82 to 128Mbps HEVC
8k satellite transmission = 100Mbps HEVC. Check the fucking article I linked, where they said they would have to run it at 200Mbps if they used h.264.

Sounds like the same bandwidth as 4k top-quality movies to me. So no, not 8k resolution, the same high-resolution blurry mess as Netflix 4k.

Here, I did the hard part for you, and quoted the article.

However, Dr Kubota of NHK expects that 100Mbps Super Hi-Vision data streams will cause congestion for IP networks, and predicts that broadcasters will opt for satellite at first, followed by terrestrial transmissions.

NHK is working on satellite technology using the 21GHz band, and has tested a new type of terrestrial TV modulation scheme in the laboratory.

The MPEG Forum's HEVC (High Efficiency Video Coding) system for real-time decoding promises to halve the bitrate of an 8K stream from 200Mbps with MPEG-4 H.264 today.
 
Congratulations, your eyes are Juan in a Million. At normal viewing distances most people can't tell the difference between 4k and 8k, FOR THE CONSUMPTION OF MEDIA.

Also, I compared HEVC for all three data rates. Your superior eyesight must have missed that fairly obvious detail.

Netflix 4k = 15-25Mbps HEVC
UHD Blu-Ray = 82 to 128Mbps HEVC
8k satellite transmission = 100Mbps HEVC. Check the fucking article I linked, where they said they would have to run it at 200Mbps if they used h.264.

Sounds like the same bandwidth as 4k top-quality movies to me. So no, not 8k resolution, the same high-resolution blurry mess as Netflix 4k.

Here, I did the hard part for you, and quoted the article.
Ah, I thought you said HEVC 4K stream vs H.264 1080p BD disc. They have similar bit rates but quality isn't comparable, because it is 265 vs 264.


And no, my eyes aren't better; again, anyone can see 300 DPI...good god. That is what photos are printed at -_- and those are basic photos; high quality photos are 600 DPI and very noticeable; same goes for 150 DPI vs 300 DPI lol

Pretend all you want that human eyes are shit but they aren't.

So again, stop being drunk, because you have never used any of these things...it's obvious.

Your stance is ridiculous because if eyes were that crappy we wouldn't be able to tell the difference between a 720P, 1080P, or 1440P cellphone, and that's plainly not the case. Spew whatever BS you want. You are factually wrong on human eyes and DPI.
 
Ah, I thought you said HEVC 4K stream vs H.264 1080p BD disc. They have similar bit rates but quality isn't comparable, because it is 265 vs 264.


And no, my eyes aren't better; again, anyone can see 300 DPI...good god. That is what photos are printed at -_- and those are basic photos; high quality photos are 600 DPI and very noticeable; same goes for 150 DPI vs 300 DPI lol

Pretend all you want that human eyes are shit but they aren't.

People desire 300 dpi because the magazine is one foot away from your face.

Monitors can get away with half that DPI, because the typical viewing distance is two feet. So 4k is fine for most people.

And TVs typically are 6-8 feet away from people, so the viewing distance game comes into play again.
 
People desire 300 dpi because the magazine is one foot away from your face.

Monitors can get away with half that DPI, because the typical viewing distance is two feet. So 4k is fine for most people.

And TVs typically are 6-8 feet away from people, so the viewing distance game comes into play again.

TL;DR:

~720 PPI is the limit for average human vision at 1 foot away
~360 PPI at 2 feet away


Again, this is why photos are printed at 600 or higher DPI if you want the best quality. If you had ever done anything in this field you would know, so stop talking out of your ass.

Dear god, you're dense.

http://www.ubergizmo.com/what-is/ppi-pixels-per-inch/

Again, you don't know what you're talking about.

“If the average reading distance is 1 foot (12 inches = 305 mm), p @0.4 arc minute is 35.5 microns or about 720 ppi/dpi. p @1 arc minute is 89 microns or about 300 dpi/ppi. This is why magazines are printed at 300 dpi – it’s good enough for most people. Fine art printers aim for 720, and that’s the best it need be. Very few people stick their heads closer than 1 foot away from a painting or photograph.”

http://techdissected.com/ask-ted/ask-ted-how-many-ppi-can-the-human-eye-see/

Again, you don't know shit on this topic. 300 PPI for a monitor, which you view at 2 feet away, is gold.
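For anyone who wants to plug in their own viewing distance, the arc-minute figures from that quote convert to PPI with basic small-angle geometry. A quick sketch (the 1.0 and 0.4 arc-minute acuity values are the ones from the quoted article):

```python
# PPI resolvable at a given viewing distance for a given angular acuity.
import math

def resolvable_ppi(distance_in, arc_minutes):
    """PPI at which adjacent pixels subtend the given angle at this distance."""
    pixel_pitch_in = distance_in * math.tan(math.radians(arc_minutes / 60.0))
    return 1.0 / pixel_pitch_in

for dist in (12, 24):  # 1 foot (print/phone), 2 feet (desktop monitor)
    print(f'{dist}" away: ~{resolvable_ppi(dist, 1.0):.0f} PPI at 1.0 arcmin, '
          f'~{resolvable_ppi(dist, 0.4):.0f} PPI at 0.4 arcmin')
```

That lands at roughly 290/720 PPI at one foot (the article's 89 and 35.5 micron figures) and roughly 145/360 PPI at two feet, so both the "300 PPI at monitor distance is gold" position and the "monitors can get away with half that" position fall out of the same math depending on which acuity number you pick.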
 
and the next displayport will be 60 or 100Gbps....this HDMI was just announced...a new displayport standard will come soon as well.

BULLSHIT!

I can see the difference easily, especially with photos and other stuff. I can think of a few games right now I would play at 8K that would be far better than 4K. I have a 4K 27 in and I can tell there is more potential for better clarity.

I can see photos being grand on 8K. No need to zoom or anything. Basically 100% image on display is epic, or 50 megapixel stitched landscapes *drool*. If you have used 4K or any other high res display you would know what you said is ridiculous. I can tell the difference between a 1080P phone and a 1440P phone, and that is a 5 or 6 in screen. You don't think I can tell on a 32 or 40 in screen 1-2 feet away....good god man.

Seeing an individual pixel and the gap is very different from seeing 2 different colored pixels side by side. It's the same BS as people saying 24 or 30 frames is fast enough...no jack it isn't. That is the flicker threshold and not what we can actually tell in movement. We can easily see 600 FPS in fluidness if not 1000 FPS....go look at Blur Busters.

This applies to pixel density too, and 8K at 32 or 40 in is very useful and realistic. 4K isn't that amazing at 27 inches.

The games I would play would be Star Wars: Empire at War (4K is awesome as is and runs perfectly), RCT3, Rome: Total War, AOE3, and so many more.

500 DPI vs 300 DPI is very noticeable on cell phones, and these displays wouldn't even touch 300 DPI!!!!!


4K 32in is 137 DPI
8K 32 in is 275 DPI

8K is very much acceptable and easily noticeable.

Now stop being drunk

Also, H.265 vs H.264 bit rates are not directly comparable. *facepalm*


What I would love is to see 60fps movies (120 fps would be awesome but let's be realistic....the data would be off the charts) because the blur on 24fps sucks...watching the Ben-Hur Blu-ray and I really want more fps :/ 24fps sucks
I cannot see the difference in phones. And I've tried.
 
I'm sure displayport will release a higher spec to stay relevant, but my immediate concerns over content are abated with the impending hdmi upgrade. Being a large display aficionado on the desktop I have to use tvs, and they only want to use hdmi. And until now that meant no input could go higher than 4k @60Hz with 8 bit color.

With hdmi 2.1, want 4k @ 120Hz? Go for it. Throw in 10/12 bit color on top with HDR? No problem. This will be more than enough to tide us over until monitor makers start making monitors larger than the 27 to mid-30 inch range (ultrawide displays do not count).

I would still like more bandwidth headroom, and hopefully displayport will provide that for 8k and over 120Hz without color compression. And I'd still like to see 4k @ 240Hz to test out 120Hz-per-eye 3D with some future display, so I can see what that new ultra-high-framerate video Ang Lee was toying with in Billy Lynn's Long Halftime Walk looks like.
 
I cannot see the difference in phones. And I've tried.
Have you had them side by side with a high quality photo that isn't blurry?

If you did and didn't notice, it's just that you're not trained to see it. It is like the average person not seeing ghosting or bad colors....it's there...you just don't know what you're looking at.
 
Have you had them side by side with a high quality photo that isn't blurry?

If you did and didn't notice, it's just that you're not trained to see it. It is like the average person not seeing ghosting or bad colors....it's there...you just don't know what you're looking at.
I did have them side by side and honestly couldn't tell, but I don't wanna say you are wrong, I just disagree for me.
 
I did have them side by side and honestly couldn't tell, but I don't wanna say you are wrong, I just disagree for me.
It isn't a night and day thing. You need to look at certain types of things like complex patterns, small words, or hair. Hair is a really easy way to tell.
 