How HDMI 2.1 Makes Big-Screen 4K PC Gaming Even More Awesome

You could do that, but at that point, wouldn't good anti-aliasing be computationally cheaper, and look just as good?

And once you hit a certain DPI, using DSR instead of increasing the resolution should provide similar results.

Personally, I like 100 DPI for the desktop. I hate any kind of desktop scaling and don't see any need to go above this. A desktop screen is not a phone; we don't hold it mere inches from our eyes (at least most of us don't).
The only good anti-aliasing is supersampling. Everything else is inferior to upping the resolution.

DSR is just a fancy name for supersampling. The only difference compared to SSAA is that it supersamples everything, not just apparent edges.
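
For anyone unfamiliar, the core idea behind any supersampling scheme is just "render at a multiple of the output resolution, then filter back down." Here's a minimal sketch of that, using a plain box-filter average; as I understand it, NVIDIA's actual DSR resample uses a smarter, configurable Gaussian-style filter, so treat this as the concept only:

    import numpy as np

    def supersample_downscale(hi_res, k):
        # Average k x k blocks of a (H*k, W*k, 3) image down to (H, W, 3).
        # Every output pixel is built from k*k rendered samples, so
        # everything gets anti-aliased, not just geometry edges.
        h, w, c = hi_res.shape
        assert h % k == 0 and w % k == 0, "hi-res frame must be a k-multiple"
        return hi_res.reshape(h // k, k, w // k, k, c).mean(axis=(1, 3))

    # Small stand-in frame; a real 4K render for a 1080p target would be (2160, 3840, 3).
    rendered = np.random.rand(216, 384, 3)
    output = supersample_downscale(rendered, 2)  # 4 samples per output pixel
    print(output.shape)                          # (108, 192, 3)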

Scaling is not the devil; the bad scaling that unfortunately every Windows version has is the problem. There is absolutely no problem with UI scaling on an Android phone, and the crisp, clear fonts and menus look great. We're currently missing out on that on the desktop due to the absolutely shit implementation of scaling in Windows.

So while I agree that scaling currently looks bad for desktop use, the topic here is gaming.

Saying scaling is shit and therefore we should never up the resolution again is not a solution.

MS should be pressured to make it better, or die. And hope, for the sake of everything that's holy, that Windows 10 is not the last Windows.
 
Well, I still think it comes down to power vs. performance. Yes, no aliasing looks better, but GPU requirements grow roughly with the pixel count, so stepping the resolution up (say, 1080p to 4K) means four times the pixels to render. Pretty quickly we are going to reach the point where the overwhelming majority of users will think that the marginal improvement they get is not worth having to buy a $1,200 GPU.
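
To put rough numbers on that scaling (raw pixel counts only; real-world cost also depends on the game, settings, and memory bandwidth, so this is only a first-order sketch):

    resolutions = {
        "1080p": (1920, 1080),
        "1440p": (2560, 1440),
        "4K":    (3840, 2160),
        "8K":    (7680, 4320),
    }
    base = 1920 * 1080
    for name, (w, h) in resolutions.items():
        pixels = w * h
        # Pixel count is the first-order driver of shading / fill-rate cost.
        print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.1f}x the pixels of 1080p")

So 4K is roughly 4x the work of 1080p and 8K roughly 16x, which is exactly where the $1,200-GPU worry comes from.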
 
Pretty quickly we are going to reach the point where the overwhelming majority of users will think that the marginal improvement they get is not worth having to buy a $1,200 GPU.
Then they'll find other reasons to buy new GPUs. Or they'll come around.

In 1994 most people thought 320x200 was enough for gaming: why would I want 640x480?
In 1999 most people thought 640x480 was enough for gaming: why would I want 1024x768?
In 2004 most people thought 1024x768 was enough for gaming: why would I want 1440x900?
In 2009 most people thought 1680x1050 was enough for gaming: why would I want 1920x1080?
In 2014 most people thought 1920x1080 was enough for gaming: why would I want 3840x2160?

Do you see the pattern there?

And now, in 2017, two years ahead of that schedule, most people are already asking why we need 8K.

Like it or not, resolution is doubling roughly every five years, so by 2019 I expect 8K monitors to be widely available and in use.
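
For the curious, here are the pixel counts behind each of those jumps. This is just quick arithmetic on the resolutions already listed above, nothing more:

    steps = [
        (1994, (320, 200),   (640, 480)),
        (1999, (640, 480),   (1024, 768)),
        (2004, (1024, 768),  (1440, 900)),
        (2009, (1680, 1050), (1920, 1080)),
        (2014, (1920, 1080), (3840, 2160)),
    ]
    for year, (ow, oh), (nw, nh) in steps:
        old_mp, new_mp = ow * oh / 1e6, nw * nh / 1e6
        # Compare the "good enough" resolution of the day with the next step up.
        print(f"{year}: {ow}x{oh} ({old_mp:.2f} MP) -> {nw}x{nh} ({new_mp:.2f} MP), "
              f"{new_mp / old_mp:.1f}x the pixels")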
 
I do, but screen sizes have also grown to keep desktop DPI at about 100.

I have a 48" 4k TV on my desk, I use at about 2.5ft distance. I don't think I either can or want to go any bigger :p Even the 48" screen is a tad big. (I think 4k would be perfect at ~42-43")

(attached image: screen.jpg)
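
Quick sanity check on those sizes, assuming a standard 16:9 3840x2160 panel: pixel density is just the diagonal pixel count divided by the diagonal in inches.

    import math

    def ppi(width_px, height_px, diagonal_in):
        # Pixels per inch along the panel's diagonal.
        return math.hypot(width_px, height_px) / diagonal_in

    for diag in (42, 43, 48):
        print(f'{diag}" 4K: {ppi(3840, 2160, diag):.0f} PPI')

That lands at roughly 102-105 PPI at 42-43" and about 92 PPI at 48", which lines up with "perfect at ~42-43"" for a 100 DPI desktop.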
 
So I just bought some GTX 1050s for my family's HTPCs, which have the HDMI 2.0b standard... hope that gets updated for 2.1. :cautious:
 
I have a 48" 4K TV on my desk that I use at about 2.5 ft viewing distance.

So do you need just my address to send that stuff to me? Or should I meet you somewhere? What works best for you? :p
 
So I just bought some GTX 1050s for my family's HTPCs, which have the HDMI 2.0b standard... hope that gets updated for 2.1. :cautious:

Wow, those are some beefy GPUs for HTPC use.

I currently have GT 720s in my three HTPCs. I've been waiting for and hoping Nvidia will launch follow-up low-end GPUs so I can add newer versions that have hardware HEVC decoding, because 1050s seem like total overkill, are too expensive, and use too much power (IMHO) for my movie-watching needs.

I'm starting to think that Nvidia has decided to permanently cede the low end to on-board graphics, which is a shame, as none of the on-board solutions from either AMD or Intel have been fully stable in my builds.
 
HDMI 2.1? That's nice.

But it will take another 7 years for it to appear as standard on laptops. Laptops today still ship with shitty HDMI 1.4 ports.

Does it really cost that much to make HDMI 2.0 the standard on all laptops?

And what about my Thunderbolt 3 port... will there one day be an HDMI adapter for Thunderbolt 3 that can support higher than 4K@60Hz?
 
...1050s seem like total overkill, are too expensive, and use too much power (IMHO) for my movie-watching needs.
You sure we're talking about the same card? https://www.techpowerup.com/reviews/MSI/GTX_1050_Ti_Gaming_X/25.html

Idle power draw is 3 watts, which IMO is incredibly efficient. TDP is only 75 watts. Noise is non-existent. I did a lot of research, and considered it probably the ultimate HTPC video card for a budget of ~$100.

All the problems I have ever had with HTPCs have been due to Intel drivers. Screen flicker, copy-protection crap, and whatnot were always limited to my integrated-graphics systems and were resolved with a dedicated video card, which offers far better overall performance anyway, since HTPCs are used for just about everything these days. Heck, on my own HTPC I'm running a 290X, and when I retire my 295X2 from my primary rig, it will go in my HTPC. :D
 
You have a point. I was looking at TDP, because I use tiny little picoPSUs in my builds to make them more power-efficient and quieter. My GT 720s use, what, 17 W TDP? I can't remember; it's been a while. I could probably get away with a 1050, as I never load it up, but I'd still be uncomfortable having a device in my little HTPCs that could draw more power than the PSU can handle :p
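
For anyone wanting to sanity-check that worry, the budget math is simple. Every number below is a made-up placeholder (a 90 W picoPSU and rough peak draws), not a measurement from an actual build:

    # Hypothetical HTPC power budget; every figure is a placeholder estimate.
    psu_rating_w = 90              # e.g. a small picoPSU
    components_w = {
        "CPU (peak)":         35,
        "motherboard + RAM":  15,
        "SSD + fans":          6,
        "GPU (GT 720-class)": 19,  # a GTX 1050-class card would be ~75 W here
    }
    total = sum(components_w.values())
    print(f"Worst-case draw: {total} W of a {psu_rating_w} W PSU "
          f"({psu_rating_w - total} W headroom)")

Swap the GPU line for a 75 W card and the same build blows past the budget, which is exactly the discomfort described above.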
 
I wouldn't be surprised if the new TV mode was chosen with the two dominant consoles in mind, you know, the usual video game machines attached to TVs, which happen to have AMD chips.
:p :)

Keep discussing PC market dominance, though :)
 
Anyone know if the newly increased HDMI 2.1 bandwidth is more or less than the USB 3 standard? As much as I appreciate HDMI as a backwards-compatible standard, I'd still like to see USB-C connectors start appearing on monitors/TVs/GPUs.
 
HDMI 2.1 is WAY more bandwidth.

USB 3.0 ~ 5 Gbps
USB 3.1 ~ 10 Gbps
Thunderbolt 3 (over the USB Type-C connector) ~ 40 Gbps

HDMI 2.1 ~ 48 Gbps

It supersedes all other video connectors, including DisplayPort 1.3/1.4, by a wide margin.

The only standard that might be in the same ballpark is superMHL, but that one doesn't seem to publish explicit bandwidth numbers.
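
For a feel for why 48 Gbps is needed, here's the raw, uncompressed pixel-data math. This counts active pixels only and ignores blanking intervals and link-encoding overhead, so the real on-the-wire requirements are somewhat higher:

    def raw_gbps(width, height, fps, bits_per_channel=8, channels=3):
        # Uncompressed video payload: pixels per second times bits per pixel.
        return width * height * fps * bits_per_channel * channels / 1e9

    modes = [
        ("4K @ 60 Hz, 8-bit",   3840, 2160, 60,  8),
        ("4K @ 120 Hz, 10-bit", 3840, 2160, 120, 10),
        ("8K @ 60 Hz, 10-bit",  7680, 4320, 60,  10),
    ]
    for name, w, h, fps, bpc in modes:
        print(f"{name}: ~{raw_gbps(w, h, fps, bpc):.1f} Gbps")

That works out to roughly 12, 30 and 60 Gbps respectively, which, as far as I understand the spec, is why the top 8K modes also lean on chroma subsampling or DSC compression.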
 
Stop buying $299 TVs at Wal-Mart.

Every LG OLED that BB has listed has at least 3 HDMI ports, and the majority have 4.

Samsung KS8000: 4
Samsung KU7000: 3
Sony X850D: 4
Vizio D3: 4
Sharp 7000U: 4
Toshiba 621I: 3

My $300 Samsung 6290 (basically the bottom of the 4K range) has 3 HDMI ports. Granted, only one supports HDR, but all 3 will take 4K@60Hz 4:2:2, which is enough for most sources.
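
The benefit of chroma subsampling is just sample-count arithmetic: carrying less chroma per pixel leaves headroom for things like higher bit depth at the same link rate. HDMI packs these formats differently on the wire, so the numbers below show proportions only, not exact link rates:

    # Average samples carried per pixel (1 luma sample plus averaged chroma samples).
    formats = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}
    base = formats["4:4:4"]
    for name, spp in formats.items():
        print(f"{name}: {spp} samples/pixel, {spp / base:.0%} of full-chroma bandwidth")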
 