lostin3d
[H]ard|Gawd
Joined: Oct 13, 2016
Messages: 2,043
For those who say 4K is niche, well, I read the exact same comments 10 years ago about 1080p, and a few years before that about 720p. I also remember explaining to my friends with consoles the visual difference between 30fps and 60fps, and most didn't get it until they saw it on my rig (at the time a hot-rodded P4 (3.4GHz) connected via gold-contact S-video to a 32" Sony Trinitron @ 1024x768). As hypocritical as it was, I said the same about 120Hz (based on outdated info from high school claiming human vision was limited to ~60fps), and then I got a 3D laptop with 120Hz. Since then I've been off the deep end with display tech. I've reached the limit of what I can perceive at 144Hz in terms of FPS, but I can still make out differences in color depth, though I can occasionally be fooled by pixel manipulation (checkerboarding/mirrors/or even AA).
HDMI 2.1 looks to be a cool thing. I've commented in a number of places that I feel we're really at a point where we need to be able to test the throughput of our ports/cables. I've had to do a lot of rearranging in our living room to accommodate new stuff and keep cables hidden, which has been an education in cable length thresholds vs. HDR10/4K/60Hz/4:4:4 or 1440p/144Hz. I can tell you from experience that from DP 1.2 to HDMI 2.0 we've reached the limits anytime you use a cable past 3-4'. Dolby Vision and the next couple of HDR specs on the way, plus higher refresh rates, look set to fill up HDMI 2.1 even before it gets released. When 2.0 came out, many experts said it was fake and that any high-speed 1.4 cable would do. It took me about 3 months to find out how untrue that was if you wanted all the bells and whistles of 4K over 3 feet. DP vs. HDMI doesn't matter to me anymore; I just want something that works and can hold its own for more than a couple of years. These days the options are evolving quicker than the ports or GPUs that drive them.
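If you want to sanity-check why those modes hit the wall, here's a rough back-of-the-envelope sketch in Python. The link figures (HDMI 2.0 ~14.4 Gbit/s, DP 1.2 ~17.28 Gbit/s, HDMI 2.1 ~42.6 Gbit/s of usable data rate after line coding) are the nominal published numbers, and the 20% blanking overhead is my own simplifying assumption, so treat the output as ballpark only:

```python
# Rough check: does an uncompressed video mode fit in a given link's data rate?
# Link caps are nominal max data rates after line coding; blanking overhead
# is approximated at a flat 20% (assumption, real timings vary per mode).

LINKS_GBPS = {
    "HDMI 2.0": 14.4,
    "DP 1.2": 17.28,
    "HDMI 2.1": 42.6,
}

def mode_gbps(width, height, hz, bits_per_channel, chroma="4:4:4", blanking=1.20):
    """Approximate bit rate of an uncompressed video mode in Gbit/s."""
    chroma_factor = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    bits_per_pixel = bits_per_channel * chroma_factor
    return width * height * hz * bits_per_pixel * blanking / 1e9

MODES = {
    "4K/60 8-bit 4:4:4":      (3840, 2160, 60, 8),
    "4K/60 10-bit HDR 4:4:4": (3840, 2160, 60, 10),
    "1440p/144 8-bit 4:4:4":  (2560, 1440, 144, 8),
}

for name, (w, h, hz, bpc) in MODES.items():
    need = mode_gbps(w, h, hz, bpc)
    fits = [link for link, cap in LINKS_GBPS.items() if need <= cap]
    print(f"{name}: ~{need:.1f} Gbit/s -> fits: {', '.join(fits) or 'none'}")
```

Run it and you see the pattern I ran into: 4K/60 8-bit just squeaks into HDMI 2.0, 4K/60 10-bit HDR at full chroma blows past both HDMI 2.0 and DP 1.2, and 1440p/144 wants DP rather than HDMI 2.0. And that's before a long or marginal cable shaves the real-world throughput down further.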
Back to the root of this thread. People who want to stay at 1080p, 1440p, or even ultrawide, that's great and there's nothing wrong with it. I game mostly at 1440p/144Hz, sometimes at 4K/60 with or without HDR, and occasionally test things at 1080p/120Hz, just because I remember when I first started gaming at 1080p and thinking that one day the textures should catch up to look more like a Blu-ray movie, and slowly (over a decade) they are. Ironically, the bigger issue isn't really display resolution but texture resolution (which is getting better, but also shows how even modern GPUs can struggle at 1080p: KCD, looking at you) and actual feature support. That goes back to the fact that PC gaming HDR support is a hot mess. Throw in the two currently dominant standards (HDR10 and Dolby Vision) plus the half dozen or so HDR monitor spec tiers and it gets even worse. Add the fact that most devs don't even know how to properly implement it in their games, and it becomes a nightmare. From anal-retentive engineers claiming a few dots on one screen look better than on the other version, to money-hungry marketing execs looking for the next big labeled payload, I think they need to just slow down and let the market settle into something that can earn greater consumer confidence. It's not that the goalposts are moving so much as that every other day a new one is added and no one really knows which to aim for. Hence, PC has an HDR support problem.