1080>1070>1060...but by how much?

So, obviously, the higher-numbered cards perform better. But do they perform enough better that it matters?

Meaning, if I want to game at 1080, does the 1070 give enough of a performance increase over the 1060 that the price premium (~$200) is warranted?

Similarly, if I want to game at 1440, is the cost of the 1080 reflected in a correspondingly better gaming experience?

****************

The "market" for displays is broken into three segments: FHD, 1920x1080 (including 1920x1200); QHD, 2560x1440; and UHD (4k) 3840x2160.

Each display segment only "needs" to be driven at ~30fps. (YES, I know. It can go up to 144Hz for that smooooooth feeling.)

Let's say the "steps" for refresh rate are 30fps, 60fps, 120fps and 144fps.

Image quality settings can, obviously, be a huge driving factor.

All of the above are simplifications, but they do characterize the display "market".

****************

In an apples-to-apples comparison (meaning all the image quality settings are the same), I'd love to see how the 1060 fares against the 1070 at FHD. Is the 1070 able to "step" performance above the 60fps rate consistently enough to matter? If it cannot maintain 60fps, then it falls back to the 30fps step...which is where the 1060 sits.

Or, can the 1080, at FHD, drive the same game at 144 (or better) consistently, thereby gaining several steps of "smoothness"?

Likewise at QHD and UHD.

I've been reading reviews, and it is rare to find a 1060/1070 comparison which utilizes the same settings for each card.


TL;DR: It doesn't matter if a 1070 drives a game at 42fps. If it's above 30 but below 60, then its performance is roughly the same as a 1060 at 31fps. Similarly, a 1080 at 72fps is not enough of a performance premium over a 1070 at 60fps.

Ken
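
For what it's worth, the "bucket" idea above can be written down as a tiny sketch (Python; the 30/60/120/144 thresholds are the ones proposed in the post, and the fps values in the example run are made up purely for illustration):

    # Sketch of the "performance bucket" idea: an average framerate only
    # "counts" for the highest tier it can clear. Thresholds are the ones
    # proposed above; the sample fps numbers are made up.
    TIERS = (144, 120, 60, 30)

    def bucket(avg_fps):
        """Return the highest fps tier this framerate clears, or 0 if below 30."""
        for tier in TIERS:
            if avg_fps >= tier:
                return tier
        return 0

    for fps in (31, 42, 61, 72, 144):
        print(f"{fps:>3} fps -> {bucket(fps)} fps tier")
    # By this logic 31 and 42 land in the same 30 fps tier,
    # while 61 and 72 both land in the 60 fps tier.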
 
For me, 31fps is nowhere near playable.

I bought a 1080 because it's the most powerful GPU I could buy (the Titan XP was not on sale; it would have been my first choice, by far), and I was running a single 970 on a 1440p screen, which was annoying me GREATLY because it severely limited the games I could play: in some higher-end games, the details I would need to turn down were so much that I would rather not touch the game at all than get playable framerates at a huge sacrifice of visuals.

I let my games, and the way I play, decide my hardware, not the other way around. Games are the things you actually play; your hardware only serves to drive them, so I have far fewer qualms about spending more money just so I can turn everything up as high as possible (higher GPU power also means more room for modding).

I would have considered a 1070 if I were only running a 1080p screen, but at 1440p, no. I wasn't interested in the 980 Ti when it was released, and the 1070 definitely didn't spark my interest (money isn't an issue for me: no expenses, and I work long hours, so I want my gaming to be as flawless as possible).

If money is your concern, then generally speaking, the value argument nearly always ends at the x70 GPUs, as at the x80 level the performance difference is often smaller than the price difference.
 
...TL;DR: It doesn't matter if a 1070 drives a game at 42fps. If it's above 30 but below 60, then its performance is roughly the same as a 1060 at 31fps. Similarly, a 1080 at 72fps is not enough of a performance premium over a 1070 at 60fps.

Ken
Performance reviews trump blanket statements.
Wonder where you can find them...
 
A 1070 is about 15% slower than a 1080 when they are all overclocked.

In some games the spread is as high as 30%, and in some games it is smaller than 10%. On average it is around 15%.

Is that worth it? Worth is relative, and only you can answer that for yourself.
 
A 1070 is about 15% slower than a 1080 when they are all overclocked.

In some games the spread is as high as 30%, and in some games it is smaller than 10%. On average it is around 15%.

Is that worth it? Worth is relative, and only you can answer that for yourself.
Also, there are the people with the means and desire to drop $400-500 on a card, compared to those spending $300 or even just $250.
 
A 1070 is about 15% slower than a 1080 when they are all overclocked.

In some games the spread is as high as 30%, and in some games it is smaller than 10%. On average it is around 15%.

Is that worth it? Worth is relative, and only you can answer that for yourself.

Well, this is nice. Really. But.

Who cares about 30%? Let me explain. Is 30% enough to jump from 60fps to 120fps? (Err, no.) So there is no benefit to having that extra fps. Really, can you tell 76fps vs 88fps? By creating performance "buckets" at 30, 60, 120, and 144 fps, you can more easily discern what you get for your purchase.

30% fps increase keeps you in the same bucket. You can trade off some of that performance increase and gain some image quality settings. But, how good is that? AA at 4k on a 24" screen is worthless. (Totally made up example. Really, who'd spend 4k money on a 24" screen?)

That's my point in asking about 30/60/120/144 buckets, at each of the three primary display resolutions of FHD, QHD, and UHD.

Ken
 
Well, this is nice. Really. But.

Who cares about 30%? Let me explain. Is 30% enough to jump from 60fps to 120fps? (Err, no.) So there is no benefit to having that extra fps. Really, can you tell 76fps vs 88fps? By creating performance "buckets" at 30, 60, 120, and 144 fps, you can more easily discern what you get for your purchase.

30% fps increase keeps you in the same bucket. You can trade off some of that performance increase and gain some image quality settings. But, how good is that? AA at 4k on a 24" screen is worthless. (Totally made up example. Really, who'd spend 4k money on a 24" screen?)

That's my point in asking about 30/60/120/144 buckets, at each of the three primary display resolutions of FHD, QHD, and UHD.

Ken

And since you desire such specific information, may I suggest this:

Go to google.com

Type in the name of the game and then the name of the GPU as well as "benchmark"

Look over the data from multiple sources and then decide for yourself.
 
These are purely subjective questions when it comes to gaming experience, because many people can easily game at lower FPS and are comfortable with that. I gamed on a Radeon HD 7770 (yes, crazy old) for years, mostly at medium-to-low settings, and I've also gamed on a GTX 970 at high settings, and there was little to no impact on my gaming experience.
If you are asking about value, then yeah, the GTX 1060 is probably the best value in terms of dollars per CUDA core.
 
Well, this is nice. Really. But.

Who cares about 30%? Let me explain. Is 30% enough to jump from 60fps to 120fps? (Err, no.) So there is no benefit to having that extra fps. Really, can you tell 76fps vs 88fps? By creating performance "buckets" at 30, 60, 120, and 144 fps, you can more easily discern what you get for your purchase.

30% fps increase keeps you in the same bucket. You can trade off some of that performance increase and gain some image quality settings. But, how good is that? AA at 4k on a 24" screen is worthless. (Totally made up example. Really, who'd spend 4k money on a 24" screen?)

That's my point in asking about 30/60/120/144 buckets, at each of the three primary display resolutions of FHD, QHD, and UHD.

Ken
You're more than welcome to go out and create your own hardware review site to do this. I think in general that people spending money on hardware are informed enough to make appropriate decisions based on their needs without the explicit handholding.

And to your third rhetorical question: yes, I can, which is why I own the video card that I do.
 
"Each display segment only "needs" to be driven at ~30fps"

Fuck that noise. If I wanted to be limited to 30fps I'd play games on my PS4 more often.

As for the rest of your post, if that's your opinion then cool, but this is a hardware enthusiast website. I went from a 980 Ti to a 1080 because I wanted that extra 15-25% performance improvement. It wasn't required: I could have played on my 144Hz 1080p screen instead of the 4K TV, or I could have gotten my ultrawide back from my brother and used that instead; I could have dealt with 30-40fps in some games; I could have lowered settings; but I didn't want to do any of those things. I want to play games at 4K at 60fps with the highest settings I can possibly push. If I had had the money for a Titan XP, I would have bought that instead.
 
I'm not sure where all the anger is coming from. Where did I say that [H] should change its methodology? Where did I put a value on YOUR experience? Sheesh. A lot of touchy folks.

I still maintain that there are discrete tiers of performance. My numbers are 30/60/120/144.

30 is a minimum framerate. You ever watch a movie? Yeah...24fps. Get over yourself. Now, is a movie (in a theater) the same as a game? Of course not. 30fps is widely recognized as a minimum acceptable game framerate. Notice the words I used. (Check some of the [H] reviews: there's a funny colored line at 30fps for some reason...)

60 is desired.

120 is better.

144 is smooth as butter.

Where did I say that 72 was not better than 68? If you can tell the difference, well, I'd like to test that statement...but it doesn't matter. Can you tell 88 is better than 31? Of course.

Silliness.

And my title itself is, wait for it: "1080>1070>1060". Anyone know what the ">" symbol means?

Sheesh.
 
I'm not sure where all the anger is coming from. Where did I say that [H] should change its methodology? Where did I put a value on YOUR experience? Sheesh. A lot of touchy folks.

I still maintain that there are discrete tiers of performance. My numbers are 30/60/120/144.

30 is a minimum framerate. You ever watch a movie? Yeah...24fps. Get over yourself. Now, is a movie (in a theater) the same as a game? Of course not. 30fps is widely recognized as a minimum acceptable game framerate. Notice the words I used. (Check some of the [H] reviews: there's a funny colored line at 30fps for some reason...)

60 is desired.

120 is better.

144 is smooth as butter.

Where did I say that 72 was not better than 68? If you can tell the difference, well, I'd like to test that statement...but it doesn't matter. Can you tell 88 is better than 31? Of course.

Silliness.

And my title itself is, wait for it: "1080>1070>1060". Anyone know what the ">" symbol means?

Sheesh.

Your posts (including this one) come off as if you are talking down to people. Your initial posts made it seem like you were telling people that bought more expensive cards because they wanted that "small" boost that they were wrong and were wasting their money.

Also, don't make generalized statements. 30fps is the absolute bare minimum for SOME genres. Anything that requires quick reaction time and precise inputs is horrible at anything under 60. And don't forget that a console and a PC are very different beasts. Input lag tends to be a lot more noticeable on PCs due to the super fast response time of monitors and the fact that most of us use devices with instantaneous response to our inputs. There is very little input lag when it comes to PC gaming, which makes the slower response time of 30fps vs 60 or higher a lot more obvious.
 
I never said I can tell the difference between 72 and 68, and if you continue to make such false accusations, then this discussion is over and you're on my ignore list.

I ORIGINALLY said I can tell the difference between 70 and 45, because those are the FIRST TWO AVERAGE FRAME RATE TARGETS I MENTIONED. That's 55 percent higher!

If you think I can't do that, you're an idiot. I have this monitor refreshing at 74 Hz and I play with VSYNC off, so YES, I see everything that gets rendered up to 74 Hz with no stutter. No, I don't need to jump all the way up to 120fps to see a difference.

And meanwhile, you make up a bullshit test case of "72 and 68," a five percent difference that's nowhere near the massive 35% performance increase of the 1070 over the 1060.

I SPECIFICALLY HIGHLIGHTED THE ACTUAL FRAME RATE DIFFERENCES IN MY POST so you could understand we're talking about sizable differences here. Anything over 30 percent increase tends to be VERY noticeable, which is why it's the minimum I recommend for people to upgrade (if they are selling their current card, so the differential upgrade cost is small).

Hmm. It may be best for you to go ahead and do that ignore list thing. Really. It'd help a lot of us out.

Thanks.
 
The "I don't need it so why should anyone else?" is strong with Ken.
 
Your posts (including this one) come off as if you are talking down to people. Your initial posts made it seem like you were telling people that bought more expensive cards because they wanted that "small" boost that they were wrong and were wasting their money.

Also, don't make generalized statements. 30fps is the absolute bare minimum for SOME genres. Anything that requires quick reaction time and precise inputs is horrible at anything under 60. And don't forget that a console and a PC are very different beasts. Input lag tends to be a lot more noticeable on PCs due to the super fast response time of monitors and the fact that most of us use devices with instantaneous response to our inputs. There is very little input lag when it comes to PC gaming, which makes the slower response time of 30fps vs 60 or higher a lot more obvious.

The essential point of my initial post was to get away from all the "21% better framerate" reviews. If, using made-up numbers, my rig gives me 42fps, a 21% increase would be about 50fps. That would be inside the same tier: better than 30fps, but not up to 60. Therefore, it seems to me that the 21% really wouldn't be worthwhile. Now, some may claim to be able to see that. Shrug. Their leet skillz are better than my poor abilities. Good for them. (I really hope they don't need my approval to feel better. Really. Or is that how low our world has slid?)

While I've seen a lot of these comparison charts, I really haven't seen enough apples-to-apples comparisons. I'd love to see the same image quality settings across the full gamut of display resolutions.

Shrug. It was a thought...
 
The "I don't need it so why should anyone else?" is strong with Ken.

LOL! Where does THAT come from???

I'd love better performance. I just don't see 73fps being that much better than 61fps...for another totally random example.

Seriously...where does THAT come from?
 
Well, even with your "tiers" of performance, you could choose to increase your settings without sacrificing performance by jumping card to card. Let's use your made-up numbers from before, just for the example. Our fake X1070XT puts out 42fps at medium settings (within your 30fps performance bracket), and our 480 GX2 puts out 50fps at medium settings (same bracket). What if the jump from medium to high costs 15fps? Well, the X1070XT would be unplayable and would need to stay on medium, while the 480 GX2 would be able to roll with the high settings while staying in the same tier of performance. So basically, the 480 GX2 is providing a better experience outside of raw fps.

Or is it? Even with this example you need to form an opinion on your own. Is the jump from 42 to 50fps worth the extra money to you? Maybe. How about the difference in settings? Does going from medium to high in "random game here" REALLY make that big a difference? Well, that's up to each person to decide. Reviews can only provide you with the information; you must choose how to interpret it on your own.
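
A quick sketch of that trade-off, using the same made-up numbers (the X1070XT and 480 GX2 are the fake cards from the post above, and the 15fps cost of going from medium to high is equally made up):

    # Made-up cards and numbers from the example above: fps at medium
    # settings, and an assumed 15 fps cost for moving to high settings.
    TIERS = (144, 120, 60, 30)

    def bucket(avg_fps):
        for tier in TIERS:
            if avg_fps >= tier:
                return tier
        return 0  # below the 30 fps floor

    HIGH_SETTINGS_COST = 15  # fps lost going medium -> high (made up)
    cards = {"X1070XT": 42, "480 GX2": 50}  # fps at medium settings (made up)

    for name, medium_fps in cards.items():
        high_fps = medium_fps - HIGH_SETTINGS_COST
        if bucket(high_fps):
            print(f"{name}: can run high at {high_fps} fps, still in the {bucket(high_fps)} fps tier")
        else:
            print(f"{name}: must stay on medium ({medium_fps} fps); high would fall to {high_fps} fps")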
 
Also, when you (the OP) calculated whether it's worth it or not, did you go by "X runs this game at Y fps" or did you go by "X runs this game at Y fps HIGHER than what I have"?

Because the former is completely the wrong metric to use when comparing 'value'; it's an entirely BS metric, in fact.

When you buy a 1060, are you buying for every frame it churns out? Nope (unless you are buying a brand new computer and you never had one before).

Whenever any normal consumer buys an 'upgrade' for their computer, whatever they are paying for is the 'upgrade', IE the difference between what you have now and what you will get.

Let us, for the sake of argument, have a fake card, a 950 Ti, that runs the game at 25fps. When you decide whether or not the 1070 is worth it over the 1060, do you still look at 31 vs 42 fps, or do you look at (31-25) vs (42-25)? That becomes 6 vs 17, which means the 1060 offers only a little over a third of the upgrade the 1070 does, making the 1060 an extremely shoddy purchase, since a 1060 does NOT cost only a third of a 1070.

My point here is, sometimes, whether a GPU is worth the upgrade or not is HEAVILY dependent on what GPU you are coming from. If you are coming from a relatively close-performing GPU, then the 'best' GPU upgrade for you isn't necessarily the bang-for-the-buck GPU. In fact, I'd go out and argue that whoever invented the concept of 'best bang for the buck' for upgrading by using a card's ENTIRE performance as the baseline should be beaten to death with his own shoes, because you NEVER buy ALL of the fps the GPU churns out; you pay for the DIFFERENCE.

Because of this difference, the 'best' GPU can change, depending on what GPU you are coming from.

Personally, because I want both visuals and performance (with the latter weighted more heavily), I go high, because a lower-tiered GPU will simply not give me the level of enjoyment I would get otherwise.

Sure, some games would look fairly similar between medium and high details, some games would look drastically different.

The performance of a 1060 may not be that different from a 1070, and a 1070 may not be that different from a 1080, but using those two comparisons to conclude that a 1060 is not that different from a 1080 is like saying that 7 million is not that different from 8 million because 7,000,000 is almost equal to 7,000,001, and 7,000,001 is almost equal to 7,000,002, and so on until 7,999,999 is almost equal to 8,000,000.

For me, coming from a 970, the performance-to-cost ratio of upgrading to a 1070 is actually very similar to upgrading to a 1080 (within about 10% of each other; this is caused by higher-than-usual 1070 prices and lower-than-usual 1080 prices), and both are hands down better than the 1060. From my PoV that made the 1080 make a LOT more sense, because at the same perf/price ratio, higher performance is better value, and around this level the perf/price usually increases with increasing performance.
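
The "you pay for the difference" point is easy to put into a sketch. The 25/31/42 fps figures are the made-up numbers from the post above; the dollar figures are rough, assumed street prices for illustration only:

    # Marginal-upgrade arithmetic from the post above. Fps figures are the
    # made-up 950ti/1060/1070 numbers; prices are assumed, illustrative only.
    current_fps = 25  # the fake "950ti" you already own

    upgrades = {
        # card: (avg fps in the same game, rough assumed price in USD)
        "1060": (31, 250),
        "1070": (42, 400),
    }

    for card, (fps, price) in upgrades.items():
        gained = fps - current_fps  # the frames you are actually paying for
        print(f"{card}: +{gained} fps for ${price} -> ${price / gained:.0f} per gained fps")

    # 1060: +6 fps for $250  -> ~$42 per gained fps
    # 1070: +17 fps for $400 -> ~$24 per gained fps
    # Judged on the marginal frames, the 1070 comes out the better value here,
    # which is the argument the post is making.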
 
Also, when you (the OP) calculated whether it's worth it or not, did you go by "X runs this game at Y fps" or did you go by "X runs this game at Y fps HIGHER than what I have"?

Because the former is completely the wrong metric to use when comparing 'value'; it's an entirely BS metric, in fact.

When you buy a 1060, are you buying for every frame it churns out? Yes, only if you start off with no computer, having 0 fps.

Whenever any normal consumer buys an 'upgrade' for their computer, whatever they are paying for is the 'upgrade', IE the difference between what you have now and what you will get.

Let us, for the sake of argument, have a fake card, a 950 Ti, that runs the game at 25fps. When you decide whether or not the 1070 is worth it over the 1060, do you still look at 31 vs 42 fps, or do you look at (31-25) vs (42-25)? That becomes 6 vs 17, which means the 1060 offers only a little over a third of the upgrade the 1070 does, making the 1060 an extremely shoddy purchase, since a 1060 does NOT cost only a third of a 1070.

My point here is, sometimes, whether a GPU is worth the upgrade or not is HEAVILY dependent on what GPU you are coming from. If you are coming from a relatively close-performing GPU, then the 'best' GPU upgrade for you isn't necessarily the bang-for-the-buck GPU. In fact, I'd go out and argue that whoever invented the concept of 'best bang for the buck' for upgrading by using a card's ENTIRE performance as the baseline should be beaten to death with his own shoes, because you NEVER buy ALL of the fps the GPU churns out; you pay for the DIFFERENCE.

Because of this difference, the 'best' GPU can change, depending on what GPU you are coming from.

Personally, because I want both visuals and performance (with the latter weighted more heavily), I go high, because a lower-tiered GPU will simply not give me the level of enjoyment I would get otherwise.

Sure, some games would look fairly similar between medium and high details, some games would look drastically different.

The performance of a 1060 may not be that different from a 1070, and a 1070 may not be that different from a 1080, but using those two comparisons to conclude that a 1060 is not that different from a 1080 is like saying that 7 million is not that different from 8 million because 7,000,000 is almost equal to 7,000,001, and 7,000,001 is almost equal to 7,000,002, and so on until 7,999,999 is almost equal to 8,000,000.

For me, coming from a 970, the performance-to-cost ratio of upgrading to a 1070 is actually very similar to upgrading to a 1080 (within about 10% of each other; this is caused by higher-than-usual 1070 prices and lower-than-usual 1080 prices), and both are hands down better than the 1060. From my PoV that made the 1080 make a LOT more sense, because at the same perf/price ratio, higher performance is better value, and around this level the perf/price usually increases with increasing performance.

Absolutely agree with you. I'm actually going to be stepping from my 970 to a 1080 soon enough as well.
 
1070 prices are bloated as hell. Without any significant competition from AMD, Nvidia almost doubled the price of enthusiast-level cards in a very short time. I don't like it, not at all.
 
This is click bait, yes?

No. I was thinking out loud. Some folks (I guess) took it to mean that I was devaluing their $650 purchase or that I don't want an improvement in framerate. I was trying to be precise in the verbiage I used in my initial post. It was supposed to show that 30fps is not desired, but is, rather, the minimum deemed usable. Shrug. Let me try again...

If I have a given display resolution (of which there are three main ones: FHD, QHD, and UHD), then my target fps must be balanced with my image quality. I can choose 1080p resolution (FHD), minimum image quality and maximum fps. Let's say that gives me 90fps. Well, sure, 90 is better than 60. BUT, (I use caps for emphasis...in case that got missed somewhere in the internet), BUT, what is the image quality difference if I increase all the post-processing adjustments? If I give up 90fps of "raw" frames for 60fps of "refined" frames, will I be able to see the difference?

Card 1 gives me 60fps "raw".
Card 2 gives me 90fps "raw", or 60fps "refined".
Visually, is the card 2 "refined" image really that much better than the card 1 "raw" image?

Some reviews have screenshots with sliders which show differences between textures, for example. I'd love to see more of that. In some cases, it takes a very discerning eye to tell a difference between that type of slider comparison.

The genesis of this thought was that a 15% framerate increase from 70 to 80 is barely noticeable...if at all. But what could I gain in image quality? Taking that 80fps card, could I go from 2xAA at 70fps as a base, to 2xAA at 80fps with the new card, and then tweak that IQ up to 4xAA at 70? Would that 2x-to-4x jump really be a noticeable improvement? I'd like to see that...on the same screenshot, with a slider.

Then, if 15% really can't improve IQ to a noticeable level, what good is it? Sure, 80fps is better than 70fps...but it's not as useful as a jump up to 120fps. (And, based on gsync/freesync, anything less than the monitor refresh rate really doesn't do anything, does it?)

So, not clickbait. Rather, an actual thought about whether small improvements (say, less than 50%) in GPU rendering are really useful or noticeable...for the vast majority of users and displays.
 
The official figures show inflated gaps because the 1080 has higher clocks than a 1070, which in turn has higher clocks than a 1060. In reality they will all clock to around the same average speeds, meaning the actual gaps are smaller, e.g. the 15% gap quoted in an earlier post.
In addition, Samsung-VRAM 1070s can clock to speeds very close to the GDDR5X memory speeds on the 1080s.
 
The official figures show inflated gaps because the 1080 has higher clocks than a 1070, which in turn has higher clocks than a 1060. In reality they will all clock to around the same average speeds, meaning the actual gaps are smaller, e.g. the 15% gap quoted in an earlier post.
In addition, Samsung-VRAM 1070s can clock to speeds very close to the GDDR5X memory speeds on the 1080s.

I'll wager that you meant to say stock GDDR5X, because no GDDR5 can clock to 11,114MHz like my Micron GDDR5X VRAM can. And my VRAM OC isn't even on the very high end.
 
1070 prices are bloated as hell. Without any significant competition from AMD, Nvidia almost doubled the price of enthusiast-level cards in a very short time. I don't like it, not at all.
Prices for the x70 cards for the past 6 years look pretty stable to me, especially if you take inflation into account.
  • GTX 470 (2010) = $349
  • GTX 570 (2010) = $349
  • GTX 670 (2012) = $399
  • GTX 770 (2013) = $399
  • GTX 970 (2014) = $329
  • GTX 1070 (2016) = $379 - $429
There are a multitude of 1070s available right now from several retailers between $370 and $430 USD.
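
The inflation point is easy to check roughly. A small sketch, assuming an average ~2% annual US inflation rate (an assumption for illustration, not official CPI data), applied to the launch MSRPs listed above:

    # Rough inflation adjustment of the MSRPs listed above into 2016 dollars,
    # assuming ~2% average annual inflation (illustrative, not CPI data).
    msrps = {
        "GTX 470":  (2010, 349),
        "GTX 570":  (2010, 349),
        "GTX 670":  (2012, 399),
        "GTX 770":  (2013, 399),
        "GTX 970":  (2014, 329),
        "GTX 1070": (2016, 379),
    }
    INFLATION = 0.02

    for card, (year, price) in msrps.items():
        adjusted = price * (1 + INFLATION) ** (2016 - year)
        print(f"{card}: ${price} in {year} is roughly ${adjusted:.0f} in 2016 dollars")
    # e.g. $349 in 2010 works out to about $393 in 2016 dollars,
    # which is why the list looks "pretty stable" once inflation is counted.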
 
Prices for the x70 cards for the past 6 years look pretty stable to me, especially if you take inflation into account.
  • GTX 470 (2010) = $349
  • GTX 570 (2010) = $349
  • GTX 670 (2012) = $399
  • GTX 770 (2013) = $399
  • GTX 970 (2014) = $329
  • GTX 1070 (2016) = $379 - $429
There are a multitude of 1070s available right now from several retailers between $370 and $430 USD.

That might be the case in the US. I can't comment on that. But for us here in the EU, it's very different.

For the sake of saving time, I only looked at the prices of non-Strix, non-OC ASUS models from the same retailer (which is one of the cheapest in my country). Here are the results, roughly translated to USD (I don't have data from before 2011; they only started logging prices in 2011):

  • GTX 570 (mid 2011) $330
  • GTX 670 (mid 2012) $420
  • GTX 770 (mid 2013) $410
  • GTX 970 (mid 2015) $400
  • GTX 1070 (mid 2016) $540
  • GTX 1070 (right now) $510

And I could show the same trend for the XX80 cards as well. The cheapest 1080 right now is the Inno3D at ~$700.
 
In the EU, the 10-series cards have insane prices.

I paid £420 for my card, and this was one of the cheaper cards; that was approximately $580 USD.
 
In the EU, the 10-series cards have insane prices.

I paid £420 for my card, and this was one of the cheaper cards; that was approximately $580 USD.

I just bought an MSI GTX 1080 Gaming for 820 euros :) in Finland. It is cold here and no one wants to bring cards to us for cheap.
 
Card 1 gives me 60fps "raw".
Card 2 gives me 90fps "raw", or 60fps "refined".
Visually, is the card 2 "refined" image really that much better than the card 1 "raw" image?

Some reviews have screenshots with sliders which show differences between textures, for example. I'd love to see more of that. In some cases, it takes a very discerning eye to tell a difference between that type of slider comparison.

The genesis of this thought was that a 15% framerate increase from 70 to 80 is barely noticeable...if at all. But what could I gain in image quality? Taking that 80fps card, could I go from 2xAA at 70fps as a base, to 2xAA at 80fps with the new card, and then tweak that IQ up to 4xAA at 70? Would that 2x-to-4x jump really be a noticeable improvement? I'd like to see that...on the same screenshot, with a slider.

So you just wish more reviews were like Geforce.com's? They do a lot of what you mentioned, such as interactive screenshot comparisons between almost every possible setting you can adjust/toggle, alongside traditional bar graphs to showcase the performance impact of each setting, and then they make GPU recommendations based on a target resolution (but only assume maximum settings).

Sure, they don't go as in-depth as you described here, but close enough. Is this basically what you were trying to get at?
 