Is HDMI 2.1 a hardware update or software update?

Panel
I was curious... HDMI 2.1 is looking to have a release date sometime in 2017. It just got me thinking, will this be one of those updates where we actually need to purchase new hardware (TVs, Monitors) to take advantage of the new HDMI standard, or will it be implemented through software so that all our current equipment can automatically take advantage of everything?

I'm not sure if there's a definitive answer on this or just rumors, but either way, I wouldn't mind getting an idea of what's to come.
 
They're not going to increase bandwidth from 18Gbps to 48Gbps via a software update.

There are features like Dynamic HDR and Game Mode VRR which could theoretically be added to existing HDMI 2.0 devices via a software update, but I think they realize how much of a mess HDMI 2.0 was when they allowed 10.2Gbps and 18Gbps implementations, so they probably don't want that to happen again.
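To put some rough numbers on that bandwidth jump (just my own back-of-the-envelope Python, ignoring blanking intervals and TMDS/FRL encoding overhead, so it's a sketch rather than the spec):

# rough uncompressed video data rate, ignoring blanking and link encoding overhead
def data_rate_gbps(width, height, hz, bits_per_channel=10, channels=3):
    return width * height * hz * bits_per_channel * channels / 1e9

print(data_rate_gbps(3840, 2160, 60))   # ~14.9 Gbps: fits under HDMI 2.0's 18 Gbps
print(data_rate_gbps(3840, 2160, 120))  # ~29.9 Gbps: no firmware trick pushes this through 18 Gbps
print(data_rate_gbps(7680, 4320, 60))   # ~59.7 Gbps: even 48 Gbps needs compression or 4:2:0 here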
 
Will we see the 2017 models of HDTVs come with HDMI 2.1, or will we be waiting till 2018?
 
I assume that, as with previous versions, most of the new features are optional. So yes, a software upgrade MAY be possible, but it doesn't mean it will give you the features you want.
 
I bet it's a hardware update.

I'm fucking bummed out that I bought, only a few months ago, a 75" Sony ZD9 and a 65" LG OLED (OLED65G6P) ($10k + $7k).

And they sure as fuck don't have HDMI 2.1 (or even 2.0b).

Those are seriously kickass TVs, but as soon as 2.1 comes out I'm gonna feel pretty butthurt.
 
I bet it's a hardware update.

I'm fucking bummed out that I bought, only a few months ago, a 75" Sony ZD9 and a 65" LG OLED (OLED65G6P) ($10k + $7k).

And they sure as fuck don't have HDMI 2.1 (or even 2.0b).

Those are seriously kickass TVs, but as soon as 2.1 comes out I'm gonna feel pretty butthurt.
Your TVs aren't going to suddenly look like crap when HDMI 2.1 is out. Enjoy them for as long as possible. Tech is always improving.
 
Your TVs aren't going to suddenly look like crap when HDMI 2.1 is out. Enjoy them for as long as possible. Tech is always improving.
I know I'm gonna feel pretty bummed when the 2.1 tech is out and I won't be able to do 4K at 120Hz.

But yeah, so far these TVs are stunning and I'm enjoying them quite a lot.
 
I know I'm gonna feel pretty bummed when the 2.1 tech is out and I won't be able to do 4K at 120Hz.

But yeah, so far these TVs are stunning and I'm enjoying them quite a lot.

Dude, what GPU are you using? Because unless you are running GTX 1080 SLI or a Titan, you cannot fully enjoy even 4K 60Hz except in old games from the Xbox 360 and PS3 era. The hardware requirements for 4K @ 120Hz are just insane.

Just enjoy your TVs; you have the best of what both LED and OLED can offer.
 
Meanwhile, Blu-ray and DVD are still 4:2:0, so I see no reason to be bummed that there's now a 4:4:4-and-beyond spec out there that would require 10x the space of a Blu-ray at the same resolution.
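For anyone wondering what the subsampling part of that costs, here's a quick sketch of the raw sample counts (this only covers the chroma subsampling factor; the "10x" figure would also fold in bit depth, frame rate and bitrate, and the resolution here is just an example):

# samples per frame: luma is always full resolution; chroma is one sample per
# 2x2 pixel block in 4:2:0 and one per pixel in 4:4:4 (two chroma planes either way)
def samples_per_frame(width, height, subsampling):
    luma = width * height
    chroma = 2 * (width // 2) * (height // 2) if subsampling == "4:2:0" else 2 * width * height
    return luma + chroma

print(samples_per_frame(3840, 2160, "4:2:0"))  # 12,441,600 samples per 4K frame
print(samples_per_frame(3840, 2160, "4:4:4"))  # 24,883,200 samples: 2x the raw data before compression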
 
I was curious... HDMI 2.1 is looking to have a release date sometime in 2017. It just got me thinking, will this be one of those updates where we actually need to purchase new hardware (TVs, Monitors) to take advantage of the new HDMI standard, or will it be implemented through software so that all our current equipment can automatically take advantage of everything?

I'm not sure if there's a definitive answer on this or just rumors, but either way, I wouldn't mind getting an idea of what's to come.
On the graphics card side: I guess there may be a tiny possibility that current hardware might support HDMI 2.1 (full bandwidth). At least nVidia started to promise DP 1.3 and 1.4 support with the Pascal chipset, even though those standards were not fully ready, so final DP 1.3 & 1.4 support will require at least driver (and probably BIOS) updates.
(Exact wording is: DisplayPort 1.2 Certified, DisplayPort 1.3/1.4 Ready.) https://www.nvidia.com/en-us/geforce/products/10series/geforce-gtx-1080/

On the display side, I am quite sure that none of the current TCONs will support full HDMI 2.1 bandwidth.
Dell's new 8K display requires two TCONs, which means that 8K@120Hz will need four of them.
The first commercial 8K broadcasts were trialled in 2016 (Rio Olympics), and the 2020 Tokyo Olympics will offer 8K over-the-air broadcasts in Japan. So I guess we will not see affordable 8K TCONs and televisions until 2018.
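Just to spell out that TCON scaling (assuming, and this is my assumption rather than anything Dell has stated, that TCON throughput scales roughly linearly with pixel rate):

# pixel rate doubles going from 8K@60 to 8K@120, so the TCON count estimate doubles too
pixels_8k60 = 7680 * 4320 * 60     # ~2.0 Gpix/s, driven by two TCONs in Dell's panel
pixels_8k120 = 7680 * 4320 * 120   # ~4.0 Gpix/s, hence the guess of four of the same TCONs
print(pixels_8k120 / pixels_8k60)  # 2.0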
 
Dude, what GPU are you using? Because unless you are running GTX 1080 SLI or a Titan, you cannot fully enjoy even 4K 60Hz except in old games from the Xbox 360 and PS3 era. The hardware requirements for 4K @ 120Hz are just insane.

Just enjoy your TVs; you have the best of what both LED and OLED can offer.
That's true, but I was looking forward to having a solid 120Hz 4K monitor so that it could be future-proof. Each year's new hardware would be able to push more and more frames. It seems that a lot of people in this thread are saying that I won't be able to get 120Hz through a software update, AND that there will be no new TVs that carry a hardware update. So essentially, I have to go with something else for the time being.
 
It would be nice to run the Windows desktop in 4:4:4 10-bit @ 60Hz, with no need to switch to 4:2:0 for HDR.

Beyond that, I'm not really upset about missing out on HDMI 2.1. My stupid-expensive Monster cable should handle more than 18Gbps, but my KS8000 will not. Even if Samsung could just release an upgraded One Connect box, they probably would not do that. Not to mention I might have to upgrade my GTX 1080 in order to provide a source signal for the upgraded standard.

I would probably be a little more concerned if I'd just spent many thousands on a very expensive OLED.
 
I expect HDMI 2.1 48Gbps TCON chips in 2019. That silicon will be hard to create. Anything that claims HDMI 2.1 but isn't running at 48Gbps is garbage. The HDMI consortium are a bunch of con artists, so be aware.
 
I expect HDMI 2.1 48Gbps TCON chips in 2019. That silicon will be hard to create. Anything that claims HDMI 2.1 but isn't running at 48Gbps is garbage. The HDMI consortium are a bunch of con artists, so be aware.
What makes you think it will be 2019 and not 2018?
The delay (i.e. not 2017) and the removal of 3D from future models are making me consider a 55" or possibly 65" E6 OLED now, even though I had previously decided that they were not for me (WRGB, low brightness, and bad motion handling).
 
Dude, what GPU are you using? Because unless you are running GTX 1080 SLI or a Titan, you cannot fully enjoy even 4K 60Hz except in old games from the Xbox 360 and PS3 era. The hardware requirements for 4K @ 120Hz are just insane.

Just enjoy your TVs; you have the best of what both LED and OLED can offer.
My GTX 1080 runs the newest games just fine at 4K. Might see drops to around 45fps at times using max settings but it mostly sits at 60fps or higher. The only game that gives me trouble that I own is Watch Dogs 2. That's Ubisoft for you though.
 
For some TVs, like Samsungs that have a separate box for inputs, they could offer an upgrade kit that just replaces that box. That said, it is unlikely.

We would also need support for variable refresh rate from other devices. AMD GPUs might work fine, but with Nvidia insisting on their G-Sync I am not so sure we could get VRR working with them. Consoles would be the ones that would benefit from it the most, and maybe the PS4 and Xbox could be upgraded to support it. As I understand it, even HDMI 2.0 should support VRR, so a software update for both the consoles and TVs could enable it.
 
My GTX 1080 runs the newest games just fine at 4K. Might see drops to around 45fps at times using max settings but it mostly sits at 60fps or higher. The only game that gives me trouble that I own is Watch Dogs 2. That's Ubisoft for you though.


I, on the other hand, cannot stand framerate dips at all. Stable 60Hz 99% of the time or nothing. I play less demanding games at 1620p and more demanding ones at 1440p with the eye candy maxed out on both (minus antique MSAA; SMAA/FXAA is good enough). A single GTX 1080 just does not do it for me at 4K.
 
I, on the other hand, cannot stand framerate dips at all. Stable 60Hz 99% of the time or nothing. I play less demanding games at 1620p and more demanding ones at 1440p with the eye candy maxed out on both (minus antique MSAA; SMAA/FXAA is good enough). A single GTX 1080 just does not do it for me at 4K.
Which display do you use again? (sorry if this is in your signature; I'm on mobile and can't currently see it).
 
Samsung 49KS7500, the European equivalent of the KS8500.
Is there any way to "force" 1080p at 120 on this TV? I've heard mixed things, though most people on [H] seem to be in agreement that there's not.
 
Is there any way to "force" 1080p at 120 on this TV? I've heard mixed things, though most people on [H] seem to be in agreement that there's not.

Unfortunately, no. The panel is 120Hz, but the firmware does not accept anything beyond 60Hz.
 
I, on the other hand, cannot stand framerate dips at all. Stable 60Hz 99% of the time or nothing. I play less demanding games at 1620p and more demanding ones at 1440p with the eye candy maxed out on both (minus antique MSAA; SMAA/FXAA is good enough). A single GTX 1080 just does not do it for me at 4K.
You are picky, lol. 45-50 frames is hardly noticeable to me, and like I said, games usually run at 60 frames or higher. With that said, I'll be upgrading to a 1080 Ti or the latest AMD card soon enough. That will push my games to 60fps+. Except for Watch Dogs 2...
 
You are picky, lol. 45-50 frames is hardly noticeable to me, and like I said, games usually run at 60 frames or higher. With that said, I'll be upgrading to a 1080 Ti or the latest AMD card soon enough. That will push my games to 60fps+. Except for Watch Dogs 2...
Minimum framerates are all about your CPU. Watch Dogs 2 especially - that game needs an 8-core CPU.
Anything below 60 FPS on a 60Hz fixed refresh screen looks terrible.
 
Minimum framerates are all about your CPU. Watch Dogs 2 especially - that game needs an 8-core CPU.
Anything below 60 FPS on a 60Hz fixed refresh screen looks terrible.
Highly doubt an 8-core is going to give me much better frames in Watch Dogs 2. I have a 6800K @ 4.3GHz.
 
You are picky, lol. 45-50 frames is hardly noticeable to me, and like I said, games usually run at 60 frames or higher. With that said, I'll be upgrading to a 1080 Ti or the latest AMD card soon enough. That will push my games to 60fps+. Except for Watch Dogs 2...

I know, but in my defense I came from an Eizo 120Hz screen, so the drop to 60Hz max was already jarring enough. :D
 
I know, but in my defense I came from an Eizo 120Hz screen, so the drop to 60Hz max was already jarring enough. :D
That I can understand. I'm used to 60Hz so a drop to 45-50fps isn't jarring for me.
 
Highly doubt an 8-core is going to give me much better frames in Watch Dogs 2. I have a 6800K @ 4.3GHz.
Oh, I would have assumed quad-core.
It really seems like the game needs an 8-core CPU to stay above 60 FPS.
There are no test results that I've seen for a 6-core, but when you compare 4 to 8, I doubt 6 would be enough.

[Attached image: Watch Dogs 2 CPU benchmark chart]
 
Minimum framerates are all about your CPU. Watch Dogs 2 especially - that game needs an 8-core CPU.
Anything below 60 FPS on a 60Hz fixed refresh screen looks terrible.
If I may ask, what exactly is the difference between a fixed refresh screen and a variable refresh screen? I'm not sure what "fixed refresh" is... some sort of inferiority in TV screens?

MaZa, tagging you because you have a Samsung KS and likely know what the difference is.
 
Oh, I would have assumed quad-core.
It really seems like the game needs an 8-core CPU to stay above 60 FPS.
There are no test results that I've seen for a 6-core, but when you compare 4 to 8, I doubt 6 would be enough.

[Attached image: Watch Dogs 2 CPU benchmark chart]
Would not make a difference between a 6-core and an 8-core at 1440p or above.
 
If I may ask, what exactly is the difference between a fixed refresh screen and a variable refresh screen? I'm not sure what "fixed refresh" is... some sort of inferiority in TV screens?
Fixed refresh means that the screen always updates at a single refresh rate like 60Hz.
Variable refresh means that the screen syncs the refresh rate to the framerate, which eliminates stutter due to framerate drops.

A constant 59 FPS on a 60Hz fixed refresh screen will stutter once every second.
59 FPS on a VRR display will be perfectly smooth, because the screen will be refreshing at 59Hz.

VRR technologies include G-Sync, VESA Adaptive-Sync, FreeSync, and now HDMI 2.1's Game Mode VRR.
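If it helps to see why that constant 59 FPS case stutters on a fixed 60Hz screen, here's a tiny sketch (it assumes V-Sync, with each refresh showing the most recent fully rendered frame; the numbers are illustrative, not measured):

# which rendered frame is on screen at each of one second's 60 refreshes,
# given a perfectly steady 59 FPS source
refresh_hz, fps = 60, 59
shown = [int((tick / refresh_hz) * fps) for tick in range(refresh_hz)]
repeats = sum(1 for a, b in zip(shown, shown[1:]) if a == b)
print(repeats)  # 1 -> one frame gets displayed twice every second, which reads as a stutter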

Would not make a difference between a 6-core and an 8-core at 1440p or above.
Resolution has nothing to do with it, unless you are picking graphical settings that your GPU can't handle.
 
Fixed refresh means that the screen always updates at a single refresh rate like 60Hz.
Variable refresh means that the screen syncs the refresh rate to the framerate, which eliminates stutter due to framerate drops.

A constant 59 FPS on a 60Hz fixed refresh screen will stutter once every second.
59 FPS on a VRR display will be perfectly smooth, because the screen will be refreshing at 59Hz.

VRR technologies include G-Sync, VESA Adaptive-Sync, FreeSync, and now HDMI 2.1's Game Mode VRR.

Resolution has nothing to do with it, unless you are picking graphical settings that your GPU can't handle.
Watch Dogs 2 just isn't well optimized. My 6800K gets the same frames as a 6900K. Can you prove otherwise?
 
VRR technologies include G-Sync, VESA Adaptive-Sync, FreeSync, and now HDMI 2.1's Game Mode VRR
Thanks for the information. I honestly feel quite dumb, as I should have realized that adaptive sync is the same thing as variable refresh.

Anyway, what's this HDMI Game Mode you're talking about? I've heard rumors of FreeSync being added to TVs in 2017 or 2018 and that FreeSync has been tested over HDMI, but I never realized that HDMI was making its own version of adaptive sync altogether. Can you tell me a bit more about this?
 
The only details so far are:

http://www.hdmi.org/press/press_release.aspx?prid=145 said:
  • Game Mode VRR features variable refresh rate, which enables a 3D graphics processor to display the image at the moment it is rendered for more fluid and better detailed gameplay, and for reducing or eliminating lag, stutter, and frame tearing.
 
A constant 59 FPS on a 60Hz fixed refresh screen will stutter once every second.
59 FPS on a VRR display will be perfectly smooth, because the screen will be refreshing at 59Hz.
That's not how games work. You get tearing, not stutter.
 
That's not how games work. You get tearing, not stutter.
If you disable V-Sync, you always get tearing no matter what.
When the framerate is out of sync with the refresh rate - like 59 FPS at 60Hz - you also get stuttering, but the tearing might be so bad that you miss it.

And if you thought you disabled V-Sync but aren't seeing tearing, you're running in windowed mode and that's applying V-Sync to the game instead.
 
Why would you get stuttering? You don't get one repeated frame each second or anything like that.
Stuttering comes from a highly variable framerate.
 
Why would you get stuttering? You don't get one repeated frame each second or anything like that.
Stuttering comes from a highly variable framerate.
Because there is no way that you can update a display 60 times a second with less than 60 images (except for divisors) and have it appear smooth.
 
I don't see why not. Displaying 59 FPS or 60 FPS, without vsync, changes nothing but tearing points.
 