So, obviously, the higher-number cards perform better. But do they perform enough better that it matters?
Meaning, if I want to game at 1080, does the 1070 give enough of a performance increase over the 1060 that the price premium (~$200) is warranted?
Similarly, if I want to game at 1440, is the cost of the 1080 reflected in that much better a gaming experience?
****************
The "market" for displays is broken into three segments: FHD, 1920x1080 (including 1920x1200); QHD, 2560x1440; and UHD (4k) 3840x2160.
Each display segment only "needs" to be driven at ~30fps. (YES, I know. It can go up to 144Hz for that smooooooth feeling.)
Let's say the "steps" for refresh rate are 30fps, 60fps, 120fps and 144fps.
Image quality settings can, obviously, be a huge driving factor.
All of the above are simplifications, but they do characterize the display "market".
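The "step" idea above can be sketched in a few lines of Python. To be clear, the tier values are just the ones from this post, not any official standard: a card only "earns" a step if it can sustain that step's frame rate, otherwise it falls back to the step below.

```python
# Sketch of the "step" model: snap a sustained fps to the highest
# refresh tier it actually clears. Tiers are the ones assumed in this
# post (30/60/120/144), not an industry-defined set.

STEPS = [30, 60, 120, 144]  # fps tiers, lowest to highest

def effective_step(sustained_fps):
    """Return the highest step the card sustains, or 0 if it can't hold 30fps."""
    step = 0
    for s in STEPS:
        if sustained_fps >= s:
            step = s
    return step

# A 1070 holding 42fps and a 1060 holding 31fps land on the same step:
print(effective_step(42))   # 30
print(effective_step(31))   # 30
print(effective_step(144))  # 144
```

Under this model, raw fps differences inside a tier don't buy you anything; only crossing into the next tier does.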
****************
In an apples-to-apples comparison (meaning all the image quality settings are the same), I'd love to see how the 1060 fares against the 1070 at FHD. Is the 1070 able to "step" performance above the 60fps rate consistently enough to matter? If it cannot maintain 60fps, then it falls back to the 30fps step...which is where the 1060 sits.
Or, can the 1080, at FHD, drive the same game at 144 (or better) consistently, thereby gaining several steps of "smoothness"?
Likewise at QHD and UHD.
I've been reading reviews, and it is rare to find a 1060/1070 comparison which utilizes the same settings for each card.
TL;DR: It doesn't matter if a 1070 drives a game at 42fps. If it's above 30 but below 60, then its performance is roughly the same as a 1060 at 31fps. Similarly, a 1080 at 72fps is not enough of a performance premium over a 1070 at 60fps.
Ken