How HDMI 2.1 Makes Big-Screen 4K PC Gaming Even More Awesome

Zarathustra[H]

As we reported last month, a new version of the HDMI standard is soon upon us. Many of you know that I use a large 4K TV as my primary monitor on my desktop. As fantastic as this solution has been, I've always wondered what if. What if I could have refresh rates higher than 60 Hz, or what if I could have 12-bit color depth at 4K? Well, CNET's Geoff Morrison has an interesting article up (beware of the autoplay video) about how the new HDMI 2.1 standard may make this a reality, and may allow for a new Game Mode Variable Refresh Rate in TVs.

If this happens, I wonder if Nvidia will allow us to use it, or if they will restrict us to their proprietary G-Sync standard.

But now, thanks to the new HDMI 2.1 specification, future TVs could eliminate -- or at least diminish -- many of those compromises. The greater bandwidth possible with 2.1 means higher resolutions and higher frame rates, but that's not all. The Game Mode VRR feature could potentially improve input lag, bringing TVs into closer parity with computer monitors for this important spec. It could also remove, or at least reduce, two serious visual artifacts: judder and image tearing.
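For a sense of the bandwidth math behind that, here's a rough back-of-the-envelope sketch. It ignores blanking intervals (so real requirements are a bit higher), and the usable-payload figures are my own assumption of roughly 8b/10b line coding for HDMI 2.0 and 16b/18b for HDMI 2.1's FRL signaling, so treat the output as illustrative only:

```python
# Rough back-of-the-envelope math for why HDMI 2.0 runs out of bandwidth
# around 4K60 8-bit and what HDMI 2.1's extra headroom buys. Blanking
# intervals are ignored, so real requirements are somewhat higher, and the
# usable-payload figures assume roughly 8b/10b coding for HDMI 2.0 and
# 16b/18b for HDMI 2.1 FRL. Illustrative only.

def video_data_rate_gbps(width, height, refresh_hz, bits_per_channel):
    """Uncompressed 4:4:4 RGB payload in Gbit/s (no blanking, no overhead)."""
    bits_per_pixel = 3 * bits_per_channel      # R, G and B channels
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_2_0_PAYLOAD = 18.0 * 8 / 10    # ~14.4 Gbps usable out of 18 Gbps
HDMI_2_1_PAYLOAD = 48.0 * 16 / 18   # ~42.7 Gbps usable out of 48 Gbps

modes = {
    "4K  60 Hz  8-bit": (3840, 2160, 60, 8),
    "4K  60 Hz 12-bit": (3840, 2160, 60, 12),
    "4K 120 Hz 10-bit": (3840, 2160, 120, 10),
}
for label, mode in modes.items():
    rate = video_data_rate_gbps(*mode)
    print(f"{label}: {rate:5.1f} Gbps | "
          f"fits HDMI 2.0: {rate <= HDMI_2_0_PAYLOAD} | "
          f"fits HDMI 2.1: {rate <= HDMI_2_1_PAYLOAD}")
```

The last two rows are exactly the wishes above: 12-bit color at 4K and 4K at 120 Hz don't fit in HDMI 2.0's pipe, but both fit comfortably in 2.1's.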
 
I wouldn't be surprised if HDMI's VRR feature is the only part of the spec Nvidia doesn't implement.

But that raises the question: what does HDMI have that DP-based displays won't?
 
The existence of the ports on TVs? :p

Sigh, so true. We won't see DP coming anytime soon (in any meaningful presence) either, because most panel makers don't want to cannibalize monitor sales by making their TVs too attractive to PC users (and the marginal cost of adding a DP socket is probably vastly outweighed by the lack of marketing and sales upside).
 
The existence of the ports on TVs? :p

Haha, touché...

But I was referring to the panel techs :p

We are getting high-refresh 4K monitors (one is even coming out this year), and we are only now reaching the point where HDMI-based panels will see high-refresh adoption.

Both types of displays have HDR, both will have VRR; anything else I'm missing?

Oh, unless you want to count input lag on TVs >.>
 
The green team has a 2:1 advantage over Radeon (Source). I don't think that nVidia's standard matters too much. It would be nice if everyone jumped on a single bandwagon though.
 
Yeah, the cold hard truth is, there are no financial or business incentives for nVidia to support the open standard, especially considering that they are also making money, even if relatively little, from G-Sync displays, whereas they'd get $0 from other displays.

I think we can all agree that we wish nVidia would support this open standard, if for nothing beyond opening up more monitor options for all of us. FreeSync monitors vastly outnumber G-Sync ones.
 
Considering FreeSync is vastly more popular than G$tink, the writing is on the wall, especially if Vega and future AMD cards can do okay. DisplayPort was the only standard that could do it (HDMI 2.0 was inferior junk in so many ways), but TV manufacturers won't implement it because the plebs don't know about it or use it much.

This will lead to Nvidia having to bend the knee or become an irrelevant choice for driving the increasing number of 4K TVs that are appearing. 4K gaming is here now thanks to the upscaling PS-Apprentice. I've been driving a TV with my PC for 6-7 years now; in the future this will be a 4K large-size screen with active sync or similar, like many of you already do here. If Nvidia doesn't support that and AMD does, even with a 5% performance penalty at launch and the usual wait on AMD's mid-2000s-feeling bundled drivers to be game ready and all, it'll still be a vastly better experience, especially as frame rates increase on the panels used.

This is a case of Nvidia having to, or the AMD guys will just keep turning around and pointing this out until Nvidia's own fans whinge enough that they include it with the 1182 Ti. Because a feature is never cool until Nvidia does it.

Sigh, so true. We won't see DP coming anytime soon (in any meaningful presence) either, because most panel makers don't want to cannibalize monitor sales by making their TVs too attractive to PC users (and the marginal cost of adding a DP socket is probably vastly outweighed by the lack of marketing and sales upside).

Modern TVs are fucking cheap POSs. Go look at any older late-00s TV; it'll have 4-5 HDMI ports. Want to plug your Praystation, PC and some streaming box into even the super fancy OLED ones today? Yeah, good luck. One plug, two if you're very lucky. The fucking assholes would never have included DP even if it were far, far superior, which it already was.


edit: The Vega delays are now making more sense. They already had TDP ratings out a while ago, so yields have likely been well known for a while now; they are potentially waiting for the damn HDMI chips to avoid another Fury X no-HDMI-2.0 debacle. Which also means, if there is a Ti, it'll come after those chips... I'd almost be willing to bet $ on that.
 

A lot of ranting here. Right now, Nvidia is the only choice for large-format 4K since AMD decided not to bother with HDMI 2.0. Maybe Nvidia uses it in the future, maybe Nvidia does not. Whoever does gets my business.

And every 'modern' TV in my home, which includes three 4K sets and a 2013 Panasonic plasma, has more than 3 HDMI ports. If you need more than that, buy a receiver or a soundbar with multiple inputs.
 
I can get 80 Hz @ 4K out of the DisplayPort connection of my 980 Ti. That's the limit though; any more than that and the PC has a fit.
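That lines up with a quick back-of-the-envelope check. DP 1.2 at HBR2 over 4 lanes is 21.6 Gbps raw, roughly 17.28 Gbps of payload after 8b/10b coding; the blanking overhead below is an assumed ~5% reduced-blanking figure, so the numbers are only illustrative:

```python
# Rough check of why ~80 Hz is about the ceiling for 4K over DisplayPort 1.2.
# DP 1.2 (HBR2, 4 lanes) carries 21.6 Gbps raw, ~17.28 Gbps of payload after
# 8b/10b coding. The ~5% blanking overhead is an assumed reduced-blanking
# figure, so treat the output as illustrative rather than exact.

DP12_PAYLOAD_GBPS = 4 * 5.4 * 8 / 10   # ~17.28 Gbps usable
BLANKING = 1.05                         # assumed reduced-blanking overhead

def required_gbps(width, height, hz, bpp=24):
    """Approximate link payload needed for an 8-bit 4:4:4 mode."""
    return width * height * hz * bpp * BLANKING / 1e9

for hz in (60, 80, 100):
    need = required_gbps(3840, 2160, hz)
    fits = "fits" if need <= DP12_PAYLOAD_GBPS else "does not fit"
    print(f"4K @ {hz:3d} Hz: ~{need:5.2f} Gbps needed -> {fits} "
          f"in ~{DP12_PAYLOAD_GBPS:.2f} Gbps")
```

4K at 80 Hz just squeaks under the DP 1.2 payload; 100 Hz does not, which is roughly where the "PC has a fit" behavior kicks in.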
 
Fewer G-Sync monitors, but more owners with the capability to use them.

You underestimate just how many people are willing to pair a $300~$500 GPU with a <$200 monitor, when the budget allocation should have been the other way around, in the same way that audiophiles recommend a better sound system before getting a better sound card.

Then there is also the G-Sync premium on top, when you compare it to an otherwise almost identical monitor (e.g. the MG279Q vs. the PG279Q).

I nearly always recommend AMD when the choice is between the RX 480 and the GTX 1060, no question if a high-refresh FreeSync monitor is thrown into the equation. The problem I have with AMD is that they completely screwed the pooch when they decided to leave HDMI 2.0 off their Fury lineup, which IMHO, even as an nVidia user, was a colossal mistake.

It's not so much of a mistake now that 4K FreeSync panels are much more common, but first impressions are everything, and back then 4K FreeSync was rare; that soured rep is still in a lot of people's heads.

I honestly hope AMD doesn't make this mistake with HDMI 2.1; otherwise HDMI 2.1 VRR could very well never see the light of day until whatever HDMI comes out next, barring the somewhat unlikely scenario that nVidia suddenly decides to support it. I wouldn't get my hopes up, but it would not be entirely surprising, as the HDMI 2.1 and G-Sync markets have very little overlap between them; even then, we definitely won't be seeing it until post-Volta at the very least, more realistically post-post-Volta. With AMD finishing up Vega, they might have a time advantage on their hands, as they can get to work on HDMI 2.1 VRR earlier than nVidia.

That's the GPU side; the TV side is a WHOLE different question.
 
I feel variable refresh is still only a secondary option for when you can't get a stable 1:1 refresh. (Variable refresh, like G-Sync but just paired with a fixed performance level, is worth it for the reduction in input lag even at just 60 Hz. I wouldn't mind seeing that at all!) It's also useful as a continued excuse for badly optimized games and consoles.

Higher color depth without any chroma subsampling is very welcome however.

Refresh rates higher than 60 Hz, unless you need lower input latency in a competitive game, feel pointless without a backlight-strobing mode, as the motion-blur reduction from just increasing the refresh rate is not significant when sample-and-hold ruins it all.
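Some rough numbers behind that point (purely illustrative, with an assumed object speed):

```python
# Simplified numbers behind the sample-and-hold argument: on a sample-and-hold
# panel each frame stays on screen for the full refresh period, so an object
# your eye is tracking smears across roughly (speed in px/s) x (persistence).
# A strobed backlight cuts persistence to the pulse width instead. Values are
# illustrative, not measurements.

def smear_px(speed_px_per_s, persistence_ms):
    return speed_px_per_s * persistence_ms / 1000.0

speed = 1920  # e.g. an object crossing half a 4K screen's width per second

cases = [
    ("60 Hz sample-and-hold",          1000 / 60),   # ~16.7 ms persistence
    ("120 Hz sample-and-hold",         1000 / 120),  # ~8.3 ms
    ("strobed backlight, ~2 ms pulse", 2.0),
]
for label, persistence in cases:
    print(f"{label:32s}: ~{smear_px(speed, persistence):4.1f} px of perceived smear")
```

Doubling the refresh rate only halves the smear, while strobing cuts it by roughly an order of magnitude.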
 
Modern TVs are fucking cheap POSs. Go look at any older late-00s TV; it'll have 4-5 HDMI ports. Want to plug your Praystation, PC and some streaming box into even the super fancy OLED ones today? Yeah, good luck. One plug, two if you're very lucky.
That's probably because anyone buying a higher-end TV actually has a proper sound system, with a good receiver/amp and everything plugged into it. I don't know who would spend thousands on a TV and then live with the integrated tinny speakers... Also, the new OLED craze is about thinner TVs, and with all the electronics moved into the base, few people want a ton of HDMI inputs; it keeps the cable clutter down. They could add more, but obviously the market doesn't need that.
 
That's probably because anyone buying a higher-end TV actually has a proper sound system, with a good receiver/amp and everything plugged into it. I don't know who would spend thousands on a TV and then live with the integrated tinny speakers... Also, the new OLED craze is about thinner TVs, and with all the electronics moved into the base, few people want a ton of HDMI inputs; it keeps the cable clutter down. They could add more, but obviously the market doesn't need that.

My display is wall-mounted; stuffed if I'm running any more than one HDMI cable up the damn wall! To make it worse I also have a projector; there's no way I'm running more than one HDMI cable up the wall and through the roof cavity.
 
Of course they won't. Nvidia use an open standard? Bahaha.

Here is the thing. Have you ever wondered why we don't see monitors with both FreeSync and G-Sync included, even though there is no technical reason the two couldn't coexist?


Nvidia licensing. Nvidia will push G-Sync and refuse to acknowledge the existence of alternatives.
 
Here is the thing. Have you ever wondered why we don't see monitors with both FreeSync and G-Sync included, even though there is no technical reason the two couldn't coexist?


Nvidia licensing. Nvidia will push G-Sync and refuse to acknowledge the existence of alternatives.

I am under the impression that both require dedicated hardware to work, and that it might prove problematic if you tried to force the two together (like two sets of controllers driving the same set of pixels).
 
I am under the impression that both require dedicated hardware to work, and that it might prove problematic if you tried to force the two together (like two sets of controllers driving the same set of pixels).

My understanding is that with G-Sync you need dedicated hardware in the monitor that Nvidia cards communicate with, but that with FreeSync you just need a panel capable of switching refresh rates on the fly, and the GPU does the work.

This could be a vast oversimplification, as I have never read up on it, but I feel like there shouldn't be anything technical precluding both from working on the same screen.
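To make that mental model concrete, here's a purely conceptual sketch of the adaptive-sync idea (not any vendor's actual protocol; the panel limit and frame times below are made up for illustration): the display holds off its next scan-out until the GPU presents a frame, rather than refreshing on a fixed grid.

```python
# Purely conceptual model of adaptive refresh (not G-Sync's or FreeSync's
# actual protocol). The panel scans out as soon as the GPU has a frame ready,
# but never faster than its maximum refresh rate. Real panels also have a
# minimum rate below which the previous frame is repeated; that case is left
# out here. All numbers are illustrative assumptions.

PANEL_MAX_HZ = 120
MIN_INTERVAL = 1.0 / PANEL_MAX_HZ   # can't scan out more often than this

def scanout_time(last_scanout, frame_ready):
    """When the panel displays a frame that the GPU finished at frame_ready."""
    return max(frame_ready, last_scanout + MIN_INTERVAL)

# Irregular GPU frame-completion times in seconds (roughly 45-100 fps):
frame_done = [0.012, 0.034, 0.045, 0.068, 0.095]
t = 0.0
for done in frame_done:
    t = scanout_time(t, done)
    print(f"frame ready at {done*1000:5.1f} ms -> displayed at {t*1000:5.1f} ms")
```

With a fixed 60 Hz refresh and vsync on, each of those frames would instead wait for the next 16.7 ms boundary, which is where the extra latency and judder come from; with vsync off you'd get tearing instead.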

There should also be nothing stopping Nvidia from making their video cards FreeSync compatible through a driver update.

I would not be surprised if the license that monitor manufacturers have to sign in order to get G-Sync hardware includes an agreement that they will not make the monitors compatible with any other dynamic refresh rate standard.

Nvidia are not idiots. They realize most people buy monitors way less often than they buy GPUs, and once they have a G-Sync monitor they are less likely to experiment with AMD GPUs.

They have also shown time and time again that they are perfectly comfortable with customer hostile policies, like this kind of lock-in.

I love Nvidia's products. I have their current halo card in my rig, but IMHO this type of manipulative shit trying to lock users in, and the likes of GameWorks manipulating game developers into giving their games Nvidia-exclusive features, is just plain shameless and evil.
 
I love Nvidia's products. I have their current halo card in my rig, but IMHO this type of manipulative shit trying to lock users in, and the likes of GameWorks manipulating game developers into giving their games Nvidia-exclusive features, is just plain shameless and evil.
This is why I continue to buy AMD cards even when Nvidia has superior performance. I refuse to support shady tactics like that.
 
I am under the impression that both require dedicated hardware to work, and that it might prove problematic if you tried to force the two together (like two sets of controllers driving the same set of pixels).

FreeSync does not require any additional hardware, just a standard DP 1.2a controller board and a compatible panel, which nearly ALL G-Sync monitors have. Nvidia forbids manufacturers from enabling Adaptive-Sync (brand-agnostic FreeSync) when they include Nvidia's proprietary G-Sync hardware in their product.
 
If they put G-Sync in I wouldn't buy it; I already tried that buggy POS and waited months for drivers to address its single-digit FPS in windowed mode, rather than playing full-screen only like every stupid fanboy said to.
 
This is why I continue to buy AMD cards even when Nvidia has superior performance. I refuse to support shady tactics like that.


I wish I had the freedom to do so. AMD simply doesn't have a single GPU that's fast enough for me anymore, and I've been disappointed by CrossFire and SLI too many times to ever want to do that again.

4K takes quite a beast of a card to drive properly.

Make no mistake though: if there were an AMD card fast enough to run 4K at ultra settings with a 60 fps minimum frame rate, I'd switch in a heartbeat.
 
The green team has a 2:1 advantage over Radeon (Source). I don't think that nVidia's standard matters too much. It would be nice if everyone jumped on a single bandwagon though.
Something tells me you got that wrong; it's more like a 4:1 advantage. Just look at how fast the GeForce GTX 1060 is selling, like hot cakes, gaining 0.50%+ each month. I bet in a few more months it will be in 3rd place, and by July maybe 2nd if not 1st.
 
Modern TVs are fucking cheap POSs. Go look at any older late-00s TV; it'll have 4-5 HDMI ports. Want to plug your Praystation, PC and some streaming box into even the super fancy OLED ones today? Yeah, good luck. One plug, two if you're very lucky. The fucking assholes would never have included DP even if it were far, far superior, which it already was.

Stop buying $299 TVs at Wal-Mart.

Every LG OLED that BB has listed has at least 3 and the majority have 4.

Samsung KS8000: 4
Samsung KU7000: 3
Sony X850D: 4
Vizio D3: 4
Sharp 7000U: 4
Toshiba 621I: 3
 
FreeSync does not require any additional hardware, just a standard DP 1.2a controller board and a compatible panel, which nearly ALL G-Sync monitors have. Nvidia forbids manufacturers from enabling Adaptive-Sync (brand-agnostic FreeSync) when they include Nvidia's proprietary G-Sync hardware in their product.

"Standard DP 1.2a controller board" is what I meant by "additional" hardware. You can't turn a DP 1.2 monitor into a DP 1.2a FreeSync one via a firmware update, for example; it requires a new controller board that conforms to the DP 1.2a spec.

I was, however, under the impression that the G-Sync module didn't go through a DP 1.2a board, but rather just a standard DP 1.2 board (considering G-Sync modules pre-date FreeSync), and that if FreeSync were, technically, to be integrated into G-Sync, you'd require two sets of controllers (one for G-Sync, the other being DP 1.2a) that have nothing to do with each other.

Guess my impression wasn't correct.
 
It is so funny how all these IDIOTS buy TVs at Wal-Mart and then complain that they suck. Humans need to go extinct.
 
I was surprised to see the new HDMI 2.1 spec reaching 10K at up to 120 fps, and it's coming soon. Most gamers use DisplayPort anyway, but it's good to finally see HDMI catch up a bit.

* The physical connectors and cables look the same as today's
* Improved bandwidth from 18 Gbps (HDMI 2.0) to 48 Gbps (HDMI 2.1)
* Can carry resolutions up to 10K and frame rates up to 120 fps
* New cables required for higher resolutions and/or frame rates
* Spec is still being finalized, expected to publish April-June 2017
* First products could arrive late 2017, but many more will ship in 2018

HDMI 2.1: What you need to know
https://www.cnet.com/news/hdmi-2-1-what-you-need-to-know/

8K
http://www.hdmi.org/manufacturer/hdmi_2_1/index.aspx

HDMI 2.1
https://en.wikipedia.org/wiki/HDMI#Version_2.1

HDMI 2.1 as Fast As Possible

 
If they can make a 65" TV with no input lag in game mode I will buy that shit tomorrow. Couch gaming with a mouse sucks when your TV still manages to put out 28 ms of input lag in game mode.
 
If they can make a 65" TV with no input lag in game mode I will buy that shit tomorrow. Couch gaming with a mouse sucks when your TV still manages to put out 28 ms of input lag in game mode.

I don't feel the need for resolutions higher than 4K. It's just needlessly GPU-crippling, and I also don't much care for the couch/living-room gaming experience, but I currently use a 48" 4K TV at my desk.

Since no one seems to want to make a ~42" 4K G-Sync monitor, if some company were to come out with a TV with low lag, true 4:4:4 chroma, a truly usable 120 Hz panel and this new VRR feature, I'd definitely give it some real consideration as the next best thing.
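For perspective on that 28 ms figure, a quick conversion into frames at different refresh rates (display lag only; the rest of the input chain adds its own latency on top):

```python
# Convert a display's game-mode input lag into frames at various refresh
# rates. This only covers the display itself; USB polling, the game loop and
# the GPU's render queue add their own latency on top.

display_lag_ms = 28.0   # the game-mode figure mentioned above

for hz in (60, 120):
    frame_ms = 1000 / hz
    print(f"at {hz:3d} Hz: {frame_ms:4.1f} ms per frame -> "
          f"{display_lag_ms / frame_ms:.1f} frames of display lag")
```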
 
As we reported last month, a new version of the HDMI standard is soon upon us. Many of you know that I use a large 4K TV as my primary monitor on my desktop. As fantastic as this solution has been, I've always wondered what if. What if I could have refresh rates higher than 60 Hz, or what if I could have 12-bit color depth at 4K? Well, CNET's Geoff Morrison has an interesting article up (beware of the autoplay video) about how the new HDMI 2.1 standard may make this a reality, and may allow for a new Game Mode Variable Refresh Rate in TVs.

If this happens, I wonder if Nvidia will allow us to use it, or if they will restrict us to their proprietary G-Sync standard.

But now, thanks to the new HDMI 2.1 specification, future TVs could eliminate -- or at least diminish -- many of those compromises. The greater bandwidth possible with 2.1 means higher resolutions and higher frame rates, but that's not all. The Game Mode VRR feature could potentially improve input lag, bringing TVs into closer parity with computer monitors for this important spec. It could also remove, or at least reduce, two serious visual artifacts: judder and image tearing.


For TV makers to REALLY start giving a damn about adding the feature, AMD needs to lobby Microsoft HARD to include it in their upgraded Scorpio console. It might be too late to add that support to the console, but if they could, it would be a game changer: a console would have support for variable refresh rates at up to 120 Hz. If they can't add HDMI 2.1 support, AMD should at LEAST try to get Microsoft to add support for VRR over HDMI 2.0 on that console; that would get more TV makers to do it as a minimum to cater to the gaming crowd.


Once these markets begin to intersect as far as features go, the tyranny of overpriced monitors with tiny screens will begin to slide into the abyss where it belongs.
 
...

edit: The Vega delays are now making more sense. They already had TDP ratings out a while ago, so yields have likely been well known for a while now; they are potentially waiting for the damn HDMI chips to avoid another Fury X no-HDMI-2.0 debacle. Which also means, if there is a Ti, it'll come after those chips... I'd almost be willing to bet $ on that.


I would LOVE for that to be true, but I have little expectation of AMD putting that controller or whatever is needed onto Vega in time for a launch in a couple of months. Is that even something that can be added that soon? I thought they had to do some more major redesigns of the GPU layout or something to add that kind of support. Is it as easy as adding some HDMI 2.1 controller chip?
 
I don't feel the need for resolutions higher than 4K. It's just needlessly GPU-crippling.
There is no such thing as needlessly. Higher resolution is better, until you can no longer tell the difference in a game without AA. Remember, games are not movies; there is no natural anti-aliasing. You have to up the resolution until individual pixels are so small that they are completely indiscernible and you can't see jagged edges on any line.
 
There is no such thing as needlessly. Higher resolution is better, until you can no longer tell the difference in a game without AA. Remember, games are not movies; there is no natural anti-aliasing. You have to up the resolution until individual pixels are so small that they are completely indiscernible and you can't see jagged edges on any line.

You could do that, but at that point, wouldn't good anti-aliasing be computationally cheaper, and look just as good?

And once you hit a certain DPI, using DSR instead of increasing the resolution should provide similar results.

Personally I like 100 DPI for the desktop. I hate any kind of desktop scaling, and don't see any need to go above this. A desktop screen is not a phone; we don't hold it mere inches from our eyes (at least most of us don't).
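For what it's worth, the usual rule of thumb for "pixels become indiscernible" is somewhere around 60 pixels per degree (one pixel per arcminute of vision). Here's a quick sketch; the screen sizes and especially the viewing distances are just example assumptions:

```python
import math

# Quick pixels-per-degree check against the ~60 px/degree (one pixel per
# arcminute) rule of thumb for when individual pixels stop being visible.
# Screen sizes and viewing distances are example assumptions.

def ppi(diagonal_in, width_px, height_px):
    """Pixel density from a panel's diagonal and resolution."""
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_per_degree(density_ppi, distance_in):
    """How many pixels span one degree of the viewer's field of view."""
    return density_ppi * 2 * distance_in * math.tan(math.radians(0.5))

setups = [
    ('48" 4K TV at a desk',         48, 3840, 2160, 32),
    ('27" 1440p monitor at a desk', 27, 2560, 1440, 28),
    ('65" 4K TV from a couch',      65, 3840, 2160, 96),
]
for label, diag, w, h, dist in setups:
    density = ppi(diag, w, h)
    ppd = pixels_per_degree(density, dist)
    print(f'{label:30s}: {density:5.1f} PPI at {dist}" -> ~{ppd:5.1f} px/degree')
```

By that yardstick a 48" 4K screen at desk distance still falls a bit short of the threshold, which is consistent with the argument that more resolution (or good AA / DSR) still has visible benefits there.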
 