And, your point, as it relates to my post, is..... what again?
Per the tests conducted by [H], the highest average framerates for the 1080 Ti at 2560x1440 were 140 FPS in Sniper Elite 4 and 159 FPS in Doom. Those were the numbers I was looking at, and what concerned me under my current setup. I didn't say "2K" or "4K".
I also stated that the performance of the card is monitor dependent. Where am I wrong on this?
And, what is the actual resolution you are talking about when you say "4K"? Because, when I hear "4K", I think of those "4K" television sets where the actual resolution is more like 2Kish.
If a thread goes on long enough, it always seems to go off the rails. I guess this one is no different.
4K TVs and 4K monitors are the same resolution. They aren't "2Kish" as you put it; "3.8K-ish" would be more accurate. You're thinking of the vertical pixel height of a 4K TV, because for some stupid reason that's how resolutions have been described in the past. Calling 1920x1080 "1080P" was always stupid in my opinion. A 4K TV has an actual native resolution of 3840x2160, which is often erroneously called "2160P" for the same stupid reason. What's changed is that the industry decided to name the resolution after the first half of the numbers (the horizontal pixel count) and called it "4K". The name comes from the DCI 4K (Digital Cinema Initiatives) standard for 4K movie projection, which has a native resolution of 4096x2160. That's where the "4" in "4K" really comes from. Many TVs, like the Samsung JU6700, JS9000, and KS8500, will accept that as an overscan resolution despite having a native resolution of 3840x2160. It doesn't matter whether we give a resolution a short name based on its horizontal pixel count or its vertical pixel height, but alternating between those two methods is what creates the confusion.
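To put rough numbers on the naming thing, here's a quick sketch (the resolution figures are the standard ones; the labels and the little script are just mine for illustration):

```python
# Illustration of the two naming conventions: "1080p"/"2160p" come from the
# vertical pixel count, "4K" comes from the roughly-4000 horizontal pixel count.
RESOLUTIONS = {
    "Full HD ('1080p')":       (1920, 1080),  # named after the vertical count
    "WUXGA":                   (1920, 1200),  # the '2Kish' 16:10 monitors mentioned below
    "QHD ('1440p')":           (2560, 1440),
    "UHD ('4K' TV/monitor)":   (3840, 2160),  # ~3.84K horizontal, so "3.8K-ish"
    "DCI 4K (cinema)":         (4096, 2160),  # where the "4" in "4K" actually comes from
}

for name, (w, h) in RESOLUTIONS.items():
    # Show how close each horizontal count is to a round "K" figure.
    print(f"{name:24s} {w}x{h}  horizontal ~ {w / 1000:.2f}K  vertical = {h}p")
```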
We were at "2Kish" before the "1080P" standard came into play. 1920x1200 monitors appeared on the market before 1920x1080 models did.
Also, video card performance has fuck all to do with monitors. The video card outputs a given number of FPS based on the application, the settings, and ultimately the resolution it's running at. Your monitor only matters in that it limits what your resolution and refresh rate options are. Further limits come after the fact from enabling V-Sync, using G-Sync or FreeSync, or not using such settings at all. If you could simulate a monitor signal and configure benchmarks and games to run via RDP, you would see the same results on identical systems based on how those systems are configured. Your monitor simply controls what you see and defines the video card's resolution. Refresh rate doesn't even come into play beyond what you see, unless you use a technology that syncs the output of the graphics card to the refresh rate of the monitor.
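If it helps, here's a rough sketch of that point (simplified: real double-buffered V-Sync can drop to divisors of the refresh rate when the card can't keep up, and the 159 FPS figure is just the Doom number quoted earlier, reused as a made-up constant):

```python
def displayed_fps(raw_render_fps: float, refresh_hz: float, vsync: bool) -> float:
    """Effective on-screen frame rate for a fixed-refresh monitor (simplified)."""
    if not vsync:
        # Without any sync the card pushes every frame it renders (with tearing).
        return raw_render_fps
    # With V-Sync the displayed rate can't exceed the monitor's refresh rate.
    return min(raw_render_fps, refresh_hz)

# Same card, same game and settings (159 FPS raw render rate), different monitors:
for hz in (60, 144, 165):
    print(f"{hz:>3} Hz monitor, V-Sync on : {displayed_fps(159, hz, True):.0f} FPS shown")
print("    Any monitor, V-Sync off: 159 FPS rendered either way")
```

The card renders the same 159 FPS in every case; the monitor and the sync setting only decide how much of that you actually see.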