Wanting to get a 1060 for 1440p instead of a 1070; a lot of naysayers, though.

Today, you can barely tell the difference between Medium and Ultra settings. There are some people who think that running anything less than Ultra is somehow inadequate, and many of them congregate here.

Ignore that kind of snobbery.

A 1060 will be fine; just turn down the quality settings a bit.

Check out this video and others like it. You practically need to freeze frame and pixel hunt to tell the difference between adjacent quality levels.


IMO, you need a 1070 mainly for bragging rights, not gameplay.


You know, it's funny. You say a 1070 is for bragging rights and then tell someone to turn down settings. That, my friend, is what we call a contradiction. It's not bragging rights if it actually gets you better visuals and higher frame rates.
 
Get an RX 480 or RX 580; right now they're better value than a 1060. They have the memory to run at higher resolutions -- it'll be a huge upgrade over a GTX 660. And if in a year you want an upgrade, you can buy an even better card with the money you didn't spend now.
 
Today, you can barely tell the difference between Medium and Ultra settings. There are some people who think that running anything less than Ultra is somehow inadequate, and many of them congregate here.

Ignore that kind of snobbery.

A 1060 will be fine; just turn down the quality settings a bit.

Check out this video and others like it. You practically need to freeze frame and pixel hunt to tell the difference between adjacent quality levels.


IMO, you need a 1070 mainly for bragging rights, not gameplay.


By that logic, spending $200-250 on a GPU is for bragging rights when you can buy an entire Xbox One for $250. In general, YouTube videos provide poor comparisons. I can tell the difference between high and low settings in most games. I've never played The Witcher 3, though.
 
I've decided to get a 1060 and stay at 1080p for now. The monitor I want is like $450.
Today, you can barely tell the difference between Medium and Ultra settings. There are some people who think that running anything less than Ultra is somehow inadequate, and many of them congregate here.

Ignore that kind of snobbery.

A 1060 will be fine; just turn down the quality settings a bit.

Check out this video and others like it. You practically need to freeze frame and pixel hunt to tell the difference between adjacent quality levels.


IMO, you need a 1070 mainly for bragging rights, not gameplay.



Good video. I always knew that going for Ultra wasn't worth it. People spend big money on new cards when they already have last year's top model, just because it can't play the latest game on Ultra settings.
 
I think it has more to do with this:

If you get a video card that can play, say, High for this year's games, there is a good chance that for games released next year you may need to drop back down to Medium, which would decrease IQ far more than going from Ultra to High, for example.

If you get a video card that can play Ultra for this year's games, then the chance of having to drop visuals to unacceptable levels drops significantly.

I didn't buy a 1080 for bragging rights (I already removed my computer's specs from my signature). In fact I wanted to buy a Titan X, but I couldn't because it was never sold here, and my 970 was p-ing me off (it was literally the only 970 on the market not to come with 3 DP / 1 HDMI, so I couldn't use both of my monitors fully when I went down to one 970), so I didn't wait for the 1080 Ti.

I was using "TW3 at a minimum of 60fps at 1440p on Ultra" as a benchmark for the performance I was looking for in the GPU, but it didn't have much to do with playing TW3 at a locked 60fps; rather, that performance is going to be used for modding games, especially Bethesda ones. I haven't started TW3 yet, as I still need to run through TW1 again before I go through TW2 and finally TW3.

If you do not have that kind of expectation of your GPU, a 1070 would be my choice for 1440p, a 1060 for 1080p. I wouldn't go above those resolutions for either card, personally.

If you don't see a visual difference between Medium and Ultra, more power to you. Personally, I like sharp images and textures; blurry images detract quite a lot for me (I'd rather have less grass than the same amount of grass with blurry blades, for example), and sharpness is usually the first thing to go when moving from High to Medium.
 
I think it has more to do with this:

If you get a video card that can play, say, High for this year's games, there is a good chance that for games released next year you may need to drop back down to Medium, which would decrease IQ far more than going from Ultra to High, for example.

If you get a video card that can play Ultra for this year's games, then the chance of having to drop visuals to unacceptable levels drops significantly.

I didn't buy a 1080 for bragging rights (I already removed my computer's specs from my signature). In fact I wanted to buy a Titan X, but I couldn't because it was never sold here, and my 970 was p-ing me off (it was literally the only 970 on the market not to come with 3 DP / 1 HDMI, so I couldn't use both of my monitors fully when I went down to one 970), so I didn't wait for the 1080 Ti.

I was using "TW3 at a minimum of 60fps at 1440p on Ultra" as a benchmark for the performance I was looking for in the GPU, but it didn't have much to do with playing TW3 at a locked 60fps; rather, that performance is going to be used for modding games, especially Bethesda ones. I haven't started TW3 yet, as I still need to run through TW1 again before I go through TW2 and finally TW3.

If you do not have that kind of expectation of your GPU, a 1070 would be my choice for 1440p, a 1060 for 1080p. I wouldn't go above those resolutions for either card, personally.

If you don't see a visual difference between Medium and Ultra, more power to you. Personally, I like sharp images and textures; blurry images detract quite a lot for me (I'd rather have less grass than the same amount of grass with blurry blades, for example), and sharpness is usually the first thing to go when moving from High to Medium.

I think it is still more about bragging that you run ULTRA. Also, games aren't jumping up in requirements very much year to year.

You also greatly exaggerate the difference in how modern quality levels look. ULTRA is mainly for bragging rights: "I can run Ultra at xx FPS...".

But when it comes down to it, I doubt many could even tell the difference between Ultra and Medium. Someone complained that the video was no good. How about still shots right off Nvidia's page:
http://www.geforce.com/whats-new/gu...-hunt-graphics-performance-and-tweaking-guide

Here is the TW3 foliage shot, Ultra vs Medium. Should be easy to spot, according to you:
http://international.download.nvidi...interactive-comparison-1-ultra-vs-medium.html

Anyone saying there is some kind of big difference there is either lying to us or to themselves. The differences are so subtle that you need to stack the images and flip back and forth to spot any difference. IMO, very likely completely undetectable in actual gameplay.
 
Strictly from the image: no, not the grass. (I am not viewing that image at its native resolution, so it will not look like that when I play the game.)

The thing that IS noticeable is the foliage rendering distance; for that alone I would increase the foliage detail.

Also, bear in mind that:

1. Nowhere in my post am I trying to push the OP away from his 1060 toward a 1070 needlessly. I am merely stating my opinion, and I fully agree with the OP's choice of a 1060; that would have been my exact choice, but I would personally not use it for 1440p, for the reasons I already discussed.

2. Not every graphical detail has the same impact on IQ as the foliage you have depicted. In the very article you have quoted, I could equally point out that having HairWorks on or off has dramatic IQ effects. Every visual effect has a different impact. View distance is another setting you can turn up that also affects image quality.

3. If you think everyone on these forums uses a 1080 Ti or Titan X for mere bragging rights and that no one can notice the difference between Ultra and Medium, I suggest you actually look at the IQ settings individually and compare each one before reaching that conclusion, instead of picking out just one setting. I find it equally condescending that you are trying to prove that Medium settings are as good as Ultra when in a lot of cases they are not, and it's SUBJECTIVE.

Like the foliage example you gave: the biggest thing that stood out for me wasn't the individual grass blades but the draw distance of the foliage, and THAT is something I want to turn up as high as possible because it avoids pop-in.
 
As an addendum, the foliage setting you have quoted is the foliage DISTANCE setting, not the foliage QUALITY setting, so the foliage difference not being visible could be because that particular setting has nothing to do with it.
 
As an addendum, the foliage setting you have quoted is the foliage DISTANCE setting, not the foliage QUALITY setting, so the foliage difference not being visible could be because that particular setting has nothing to do with it.

True, I didn't read closely enough. I thought it was just the overall settings, but it wasn't. I refer you back to the video, then.
 
I think it is still more about bragging that you run ULTRA. Also, games aren't jumping up in requirements very much year to year.

You also greatly exaggerate the difference in how modern quality levels look. ULTRA is mainly for bragging rights: "I can run Ultra at xx FPS...".

But when it comes down to it, I doubt many could even tell the difference between Ultra and Medium. Someone complained that the video was no good. How about still shots right off Nvidia's page:
http://www.geforce.com/whats-new/gu...-hunt-graphics-performance-and-tweaking-guide

Here is the TW3 foliage shot, Ultra vs Medium. Should be easy to spot, according to you:
http://international.download.nvidi...interactive-comparison-1-ultra-vs-medium.html

Anyone saying there is some kind of big difference there is either lying to us or to themselves. The differences are so subtle that you need to stack the images and flip back and forth to spot any difference. IMO, very likely completely undetectable in actual gameplay.
Wut? I can tell a big difference. If you look at your feet all the time, no, but look out towards the horizon at the vegetation. If you are moving in the game, that will be popping in and out as you travel - I would definitely notice that.

In Rise of the Tomb Raider, High vs. Very High textures is one setting where I really could not tell the difference in-game at all, so yeah, there are some games where you will need a big 4K monitor to start telling the difference. If you are at 1080p, the need to max out settings becomes less, except for things like AA and AF. Plus, with a small image like that versus the full size of your monitor, like my 34" 3440x1440, even more differences become apparent. Between Doom's Ultra shadows and Nightmare shadows I can tell a significant difference. Games differ in their settings, and that leads to good optimizations in some games that can help performance without hurting IQ; in fact, turning some settings off gives me better IQ. To each their own on this.
 
You don't plan to use Medium; you just use Medium when you have to. Sure, Medium can be great, but having the extra horsepower isn't going to hurt you.

IMO, get the GPU with the most power that you can afford, enjoy the hell out of it, adjust settings accordingly, and eventually upgrade CPU platform or GPU when an enjoyable experience is no longer possible.

That said, the OP already made their decision: to stay at 1080p and enjoy the heck out of their GTX 1060. Good luck and enjoy, Subzerok11.
 
If you can afford the 1070, get it. Even if you're sticking with 1080p it's still worth it, as you can max everything and keep the average frame rate up.

Hell, I'm using a 980 Ti at 1440p and it eats most things at High/Ultra settings.
 
I can barely tell the difference between Medium and High textures in Doom when you're actually fighting.

Why do we have Ultra settings again? I definitely can't tell the difference between those and High, even when I stop to stare.

I have a lot better eyesight than most people I know.
 
A 1070 is certainly what you'd want for 1440p. I also don't get this fascination with the backplate. Yeah, it's nice to have, but not to the point where having one puts you over budget and you significantly downgrade your GPU for the sole purpose of having one.
 
I bought an Omen 32-inch 1440p FreeSync monitor for 300 bucks and an R9 Fury Nitro for 225. Easily the best upgrade I've purchased in a very long time, especially considering the price. I'd factor in the cost of the monitor; generally G-Sync is more expensive. You could probably rock an RX 580 with high clocks and a FreeSync monitor for cheaper and have the same experience. The 1070 is a good bit more performance, and if you're planning on getting a higher-end G-Sync monitor it would be worth it, I'm sure. If you're not concerned with more than 60 FPS, FreeSync and an RX 580 might be the cheaper route when you package them together.
 
I can barely tell the difference between Medium and High textures in Doom when you're actually fighting.

Why do we have Ultra settings again? I definitely can't tell the difference between those and High, even when I stop to stare.

I have a lot better eyesight than most people I know.

Ultra settings don't mean much anymore; they're tacked on after a game is ported. Most games are designed around a spec to run on consoles, and the extra effects don't make a readily discernible difference in motion. Having said that, I still run everything at mostly High/Ultra at 1440p, haha. Then again, I haven't had a super graphics-intensive game that interests me since DOOM and The Witcher 3.
 
Running two 980s in SLI for 3440x1440 gaming. Most games run at 60 FPS+ (High to Ultra).

But I'm moving to Vega to cover my 1440p gaming. Mainly, my monitor is FreeSync and hates Nvidia -- it won't even game at the monitor's 75Hz refresh rate without severe frame-skipping. Only at 60Hz does it game well, which is adequate for my games currently. Even so, I'm going to Vega to get away from SLI issues, or rather the lack of SLI support.
 
Running two 980s in SLI for 3440x1440 gaming. Most games run at 60 FPS+ (High to Ultra).

But I'm moving to Vega to cover my 1440p gaming. Mainly, my monitor is FreeSync and hates Nvidia -- it won't even game at the monitor's 75Hz refresh rate without severe frame-skipping. Only at 60Hz does it game well, which is adequate for my games currently. Even so, I'm going to Vega to get away from SLI issues, or rather the lack of SLI support.

I have the same screen as you, and with an Nvidia card as well. Here's hoping Vega is good and that AMD supports it and doesn't go bankrupt.
 
I could tell a big difference with the 1080 over the 1070 for 1440p. I should have just bought the 1080 first and saved myself some money in the long run. The 1070 is still a bitchin' card for 1440p, but it wasn't going to last.

Btw, I agree with the comments about the differences in graphics settings. In DOOM, when I went from Medium to the highest settings, I couldn't tell the difference while playing.
 
YouTube videos are overly compressed garbage for games most of the time; you can't really see the differences well enough in a YouTube video to make an accurate judgment about visual fidelity.
 
YouTube videos are overly compressed garbage for games most of the time; you can't really see the differences well enough in a YouTube video to make an accurate judgment about visual fidelity.

Most people aren't smart enough to notice that.
 
I think it is still more about bragging that you run ULTRA. Also, games aren't jumping up in requirements very much year to year.

You also greatly exaggerate the difference in how modern quality levels look. ULTRA is mainly for bragging rights: "I can run Ultra at xx FPS...".

But when it comes down to it, I doubt many could even tell the difference between Ultra and Medium. Someone complained that the video was no good. How about still shots right off Nvidia's page:
http://www.geforce.com/whats-new/gu...-hunt-graphics-performance-and-tweaking-guide

Here is the TW3 foliage shot, Ultra vs Medium. Should be easy to spot, according to you:
http://international.download.nvidi...interactive-comparison-1-ultra-vs-medium.html

Anyone saying there is some kind of big difference there is either lying to us or to themselves. The differences are so subtle that you need to stack the images and flip back and forth to spot any difference. IMO, very likely completely undetectable in actual gameplay.

Maybe your eyesight is poor. This is not meant as an insult, but the difference in draw distance is huge; it's what makes a forest look like a forest rather than a patch of bush. I can tell the difference in most games. Some effects, such as HBAO, aren't too noticeable - you can see the difference, but not enough to be worthwhile, and that's always the first thing to go down if my frame rates are low. Otherwise I certainly can see the difference between a game set to Low with no AA and one set to High/Ultra with lots of AA.
 
I don't think anyone is comparing Low with no AA to Ultra with lots of AA and saying they can't tell a difference.
 
Considering you can still find an RX 470 4GB for around $160, I'd go with that over a 1060 6GB.

But, if you can afford it: buy the 1070.
 
Dude, you can get a 1060 3GB and it will work flawlessly on your 1440p monitor for the next four years. There is not much difference between the two except the amount of RAM. Yes, the 1060 is a little slow, but as you mentioned, you are fine with mid-to-high graphics, so it is fine. Plus it will cost less than the other one.
 
Dude, you can get a 1060 3GB and it will work flawlessly on your 1440p monitor for the next four years. There is not much difference between the two except the amount of RAM. Yes, the 1060 is a little slow, but as you mentioned, you are fine with mid-to-high graphics, so it is fine. Plus it will cost less than the other one.

I would LOVE to know what you're basing this on. A 1060 3GB won't run many games TODAY at mid-high settings at 1440p flawlessly, much less 4 years from now.
 
I was in the same boat as you and decided to just go for the 1070. It's basically a $100 step up per level going from 1060 to 1070 to 1080 if you're looking at buying the card used.
 
Which game is it not running? It works fine in Battlefield, Warhammer, and Fallout 4.

'Working fine' has different meanings to different people, and is extremely vague without describing what you consider fine.

One might consider Medium @ 30fps fine; others only consider High @ 60fps fine.

In the specific case of Fallout 4, I don't know your expectations, but I could not find settings that gave me satisfactory performance at satisfactory quality on a single 970 (in the ballpark of a 1060 3GB), much less when trying to mod it, especially considering some of the object textures in this game look rather outdated.

If you have no such expectations, then the ideal GPU would be different. For me, I go by the rule of "just enough isn't enough".
 
Which game is it not running? It works fine in Battlefield, Warhammer, and Fallout 4.

I didn't say any games "aren't running." Check out the reviews for the card. Most modern AAA games, with the exception of DOOM and a small handful of others, run below 60fps @ 1440p. That's average FPS, which means minimums are much lower. And that's right now, not 4 years from now.
 
A used 980 Ti would probably be perfect for OP. Runs 1440p like a champ, can be had cheap, roughly as fast as a 1070.
 
Video cards, unlike CPUs, are still growing in raw power at a rate that makes the current model "mid-grade" in 2 years and obsolete in 4. Purchasing a 1060 with 3GB or 6GB today and hoping it will work for the next 4 years is wishful thinking no matter the resolution - not because of the memory, but because of the horsepower behind it.

Always purchase the highest-end version of a video card you can afford if you want to get the maximum lifespan of usefulness out of it. Any other choice will result in more compromises as time goes on.
 
I think it has more to do with this:

If you get a video card that can play, say, High for this year's games, there is a good chance that for games released next year you may need to drop back down to Medium, which would decrease IQ far more than going from Ultra to High, for example.

If you get a video card that can play Ultra for this year's games, then the chance of having to drop visuals to unacceptable levels drops significantly.

I didn't buy a 1080 for bragging rights (I already removed my computer's specs from my signature). In fact I wanted to buy a Titan X, but I couldn't because it was never sold here, and my 970 was p-ing me off (it was literally the only 970 on the market not to come with 3 DP / 1 HDMI, so I couldn't use both of my monitors fully when I went down to one 970), so I didn't wait for the 1080 Ti.

I was using "TW3 at a minimum of 60fps at 1440p on Ultra" as a benchmark for the performance I was looking for in the GPU, but it didn't have much to do with playing TW3 at a locked 60fps; rather, that performance is going to be used for modding games, especially Bethesda ones. I haven't started TW3 yet, as I still need to run through TW1 again before I go through TW2 and finally TW3.

If you do not have that kind of expectation of your GPU, a 1070 would be my choice for 1440p, a 1060 for 1080p. I wouldn't go above those resolutions for either card, personally.

If you don't see a visual difference between Medium and Ultra, more power to you. Personally, I like sharp images and textures; blurry images detract quite a lot for me (I'd rather have less grass than the same amount of grass with blurry blades, for example), and sharpness is usually the first thing to go when moving from High to Medium.



The guy who could rock a "GTX 660 for the last 5 years" can rock a 1060 for the next four years easily. Besides, I am a budget-conscious person, and average (mid-to-high) graphics are really fine with me. The above reply was obviously based on my perception.
 
How much difference is there in gaming on 1080p vs 1440p? Running an R9 390 at this point.

Thanks
 
How much difference is there in gaming on 1080p vs 1440p? Running an R9 390 at this point.

Thanks


IIRC it's about 1.78 times as many pixels to drive (1920 x 1080 = 2,073,600 pixels vs. 2560 x 1440 = 3,686,400 pixels).
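For anyone who wants to sanity-check that number, here's a quick sketch in Python (purely illustrative) that computes both pixel counts and the ratio between them:

# Rough pixel-count comparison between 1080p and 1440p (illustrative only).
def pixel_count(width, height):
    # Total pixels the GPU has to render for each frame at this resolution.
    return width * height

p1080 = pixel_count(1920, 1080)  # 2,073,600 pixels
p1440 = pixel_count(2560, 1440)  # 3,686,400 pixels

print(f"1080p: {p1080:,} pixels")
print(f"1440p: {p1440:,} pixels")
print(f"Ratio: {p1440 / p1080:.2f}x")  # prints ~1.78x as many pixels to drive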

Dude, you can get a 1060 3GB and it will work flawlessly on your 1440p monitor for the next four years. There is not much difference between the two except the amount of RAM. Yes, the 1060 is a little slow, but as you mentioned, you are fine with mid-to-high graphics, so it is fine. Plus it will cost less than the other one.

Coming from someone who tried a 1060 3GB on my 1440p@120Hz monitor: he's better off going with a 1070 if he can swing it. Four years of life from the 1060 is reaaaaaaaaaaally pushing it; I could see 1-2 years perhaps, but just trying one out put a lot of strain on that GPU, and I felt that going forward with a lot of the newer titles I would need something with a bit more power to maintain a stable experience.

Now, if he's looking for 30/60 fps I could see that, but if he wants a longer-term card, the 1070 would be the better option. When it comes to GPUs, I try to find the best value at my top budget and then milk that card as long as possible; that usually means shelling out $300-400 and the card lasting me about 3.5 years before I sell it and upgrade.
 
Which game is it not running? It works fine in Battlefield, Warhammer, and Fallout 4.
You probably have lower standards than most here. Running Fallout 4 at Ultra @ 1440p on my 980 Ti, I still had dips into the 30-40fps range in areas with a lot of shadows, and the 980 Ti is leagues ahead of a 1060.
 
The guy who could rock a "GTX 660 for the last 5 years" can rock a 1060 for the next four years easily. Besides, I am a budget-conscious person, and average (mid-to-high) graphics are really fine with me. The above reply was obviously based on my perception.

I ran a GT640 for 4 years or so and didn't feel the need to upgrade until I tried to play Paragon.

Different people use their cards differently and have different expectations.

The guys saying a 1060 3GB won't run 1440p now and will be obsolete in 2 years have much different expectations than many of us.

As I said earlier, I love my 1060 3GB on my Monoprice 1440p monitor, and for me, anything more brings significantly diminishing returns.
 
You probably have lower standards than most here. Running Fallout 4 at Ultra @ 1440p on my 980 Ti, I still had dips into the 30-40fps range in areas with a lot of shadows, and the 980 Ti is leagues ahead of a 1060.
So you judge standards by GPU? Good to hear that. Stay out of my way please.
 