Single GTX770 at 2560x1440 opinions

Is anybody playing at this res with a single 770? Is the 770 powerful enough to push it? My current card is the 2GB model, btw.
 
Do you only play the most demanding games? Extra screen real-estate is always awesome.

If you play older titles or not extremely demanding games, higher resolution is the best way to get a better experience out of those titles.

It will work well enough for most titles.
 
I have a Catleap 2560x1440 monitor and a GTX 650 Ti 2GB. It can run Battlefield 4 on low/medium settings pretty well.
 
Not if you wanna max out BF4. I'm right up against the limit with max graphics settings at 1080p/60 with my single GTX770 4GB.

It does occasionally dip below 60.
 
I'm not running a GTX770, but I am running an HD7990 (pretty much 2x 7970s, which is similar in performance to 2x 770s), and I sometimes have problems running games at 1440p. I run a 120Hz monitor though, so I try to aim for as close to 120fps as I can.

As long as you don't mind turning some settings down or can deal with sub-60fps, a single 770 should be OK for most newer games. Older games shouldn't be much of an issue.
 
All you have to do is look at reviews to get a good idea of what to expect. It's just common sense that you will have to reduce settings, but only you can decide what level of performance is acceptable. If you want 60fps, then of course you will be reducing lots of settings in demanding games.
 
Yeah, I have 2x 680s, which are basically 770s, at 1440p, and I have to turn down a few settings (mostly AA) if I want a constant 60fps. But I find 30+fps perfectly fine, so all my games are on max settings with AA. A single 770 will struggle at 1440p unless you turn things down a bit.

If your standard is a constant 60fps, then you'll have to turn down quite a few settings; at 30fps you'll be just fine.
 
I was thinking that with the higher res I could drop the AA down a bit anyway. I'm still considering adding a second card for SLI, but I've only got the 2GB version, so I may hit a wall a bit down the line anyway. Is micro-stuttering still an issue with SLI? Any other weird issues with SLI?
 
You can make a single 770 work. Everyone here seems to operate under "maxed out or nothing", which, of course, is stupid. You don't have to do this. If you don't mind lowering settings, you will be fine. In fact, I've tried this in many games and I can more or less assure you that AAA titles do not look any different at very high vs high settings. Random examples: you can make Metro 2033 run absolutely great with AAA, DX11, and ADOF + PhysX turned off, but if you turn those on (ADOF or PhysX), it will literally chop your framerate in half. BUT THE GAME DOESN'T look different. Same for Crysis 3: to my 20/20 vision, at 1600p, high looks 100% identical to very high. You can also lower shadows and water another notch for a higher framerate. Try it for yourself. These games do not look different with a few dials turned down. You can also use FXAA instead of MSAA, which is also a huge framerate boost.

Regardless of what anyone states here, you CAN make it work. Just use common sense. Common sense means you won't max everything out with ridiculous levels of 8x MSAA or anything along those lines. Use high instead of ultra, and the games WILL NOT look different. This I assure you. Crysis 3 looks exactly the same at very high and high, but the framerate difference is immense.

And there's also the fact that not all games are super-demanding AAA titles; all of those will run great.

It all boils down to whether you're willing to spend another $330+ for ultra settings. It's up to you. I'd say give it a try with a single card, and if you aren't ridiculous about maxing stuff out, you will enjoy it just fine. Use common sense about game settings. I personally think the maxed-out-or-nothing mindset is pretty dumb, but whatever; it's all subjective. And if you try a single card and feel like it isn't doing it for you, buy another card later. My suggestion is to try a single card first and see how you like it. I've tried it at various times and found it perfectly workable. PERFECTLY do-able without much compromise at 1440/1600p.
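For a rough sense of the jump from 1080p, you can estimate it from pixel counts alone. This is a deliberate simplification (games are also CPU- and bandwidth-bound), so treat the numbers as a ballpark sketch, not a benchmark:

```python
# Rough sketch: estimate how framerate scales with resolution, assuming
# performance is roughly proportional to pixels drawn. Real games won't
# follow this exactly; it's only a ballpark.

def pixels(w, h):
    return w * h

def estimated_fps(fps_at_base, base=(1920, 1080), target=(2560, 1440)):
    """Scale a measured framerate by the ratio of pixel counts."""
    return fps_at_base * pixels(*base) / pixels(*target)

ratio = pixels(2560, 1440) / pixels(1920, 1080)
print(f"1440p draws {ratio:.2f}x the pixels of 1080p")          # 1.78x
print(f"60 fps at 1080p -> ~{estimated_fps(60):.0f} fps at 1440p")  # ~34 fps
```

That ~1.78x pixel ratio is why a card that just holds 60fps maxed out at 1080p tends to land in the 30s at 1440p with the same settings, which is exactly the "turn 2-3 dials down" territory described above.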
 
Yeah, that was kind of my thinking. I usually back off ultra settings, and run with vsync disabled anyway. I'll (generously!) let my wife buy me a new card for Christmas, and get the monitor now, I think.
 
I have a 670 and 1440p. I max out every game I play.

I don't play BF4 though, so no advice there. I'm also not bothered by sub 60 fps, as long as it doesn't stutter (generally starts happening sub 30).
 
You can make a single 770 work. Everyone here seems to operate under "maxed out or nothing". Which, of course, is stupid. [...]

<slow clap>. Good post.

My thoughts exactly

Yeah, that was kind of my thinking. [...]

I downgraded about 4 months ago from a 7970 to a 660 Ti because I wasn't gaming a lot. I can still run a lot of games just fine.

It comes down to, as he said above, what you are comfortable running. If you want to run the newest games on ultra at 1440p, you might as well buy a 780/290. But if you are able to make concessions, you will be just fine.


btw, once you get a 1440p monitor you will be ruined. You will never be able to use a low res/non-ips monitor again. You have been warned.
 
I have a 670 and 1440p. I max out every game I play.

I don't play BF4 though, so no advice there. I'm also not bothered by sub 60 fps, as long as it doesn't stutter (generally starts happening sub 30).
Then you certainly are not playing some of the really demanding games and/or have your own definition of maxed out.
 
I downgraded about 4 months ago from a 7970 to a 660 Ti because I wasn't gaming a lot. [...] btw, once you get a 1440p monitor you will be ruined. You will never be able to use a low res/non-ips monitor again. You have been warned.

Good to hear! My new monitor (an Asus PB278Q) landed this evening, but I live in Barbados in the Caribbean, so I have to wait another week or longer for it to clear Customs. That's the price of living in summer all year round, so it all balances out :) I'm not joking when I say that there's only ONE place in the ENTIRE country that has a SINGLE 1440 monitor for sale, an HP model, and it goes for $900 USD. I've literally never seen a 1440 panel in action, so I'm pretty excited after reading the reviews, even for an average panel such as this. You guys are so lucky that you can read some reviews, hop in a car, and actually buy whatever you want, whenever. With me, it's order from Amazon, wait, organize my freight forwarder, add more shipping costs, wait some more, product arrives, wait even longer, pay duty, and pick up. I was in Fry's in Escondido in 2012, and it damn near killed me!
 
Then you certainly are not playing some of the really demanding games and/or have your own definition of maxed out.

I love how utterly wrong everyone is about 1440p. It is incredibly demanding. There is no point in using all your resources on resolution alone. There are more factors to image quality than that, and that's what you give up for raw resolution.

what's the point of 1440 with low IQ settings?

EXACTLY. If 1440p were so easy to drive, everyone would do it.
 
I HATE jaggies. I'd rather step back a bit on IQ than notice them. For as many pro-1440 articles, you'll find the same number of anti-1440 ones. I've got an older 1080 monitor: what I'll lose in IQ, I'll gain in contrast, viewing angles, response, etc. My 770 may not push this to the max, but the card I'll buy a few months down the line will split the expense up a bit. Sure, I could splurge on a 780 or an SLI setup now (last time I had a multi-card setup was a Matrox Mystique + 2x Voodoo2 8MB SLI PCI setup), but I've got the patience to wait a while for the next gen and enjoy the older games I've got queued up to play right now, at decent FPS. FWIW, this system has been updated incrementally over the last 21 years from a DOS-based 486 DX2/66MHz VL-Bus system in 1993, where I used to have to run Mo'Slo to slow games down to a playable speed. My first monitor was a generic 0.39mm-dot-pitch 640x480 CRT, so I'm at ease with compromise :) AND GET OFF MY LAWN!
 
I HATE jaggies. I'd rather step back a bit on IQ than notice them. [...]

aliasing is noticeable at 1440p, even 1600p. also, you came here (HARD forum, i remind you) seeking opinions, but instead, only accept validation on your soft purchasing decision? :confused:
 
You can always use FXAA, which has basically no performance hit at any resolution. Also, the hard vs soft stuff is long in the tooth. Let's not be silly and condense every budgetary issue into that, Christ. Not everyone here spends thousands on GPUs. I mean, it's fine if you want to do that; I've done it MANY times myself. But when you have real budgetary concerns, sometimes budgetary concessions have to be made. Sometimes when you have a massive mortgage and multiple car payments, you're not always going to want to spend $600 on another GPU. And sometimes even when you can, the woman will raise too many issues with it. ;) And let's face it, having an angry wife around the house is NOT fun stuff.

That's just how it is for many folks in the real world. And when that is the case, you can easily make a single GPU work at 1440p or 1600p - I have done it. Simply lowering 2-3 settings in most games while using FXAA will result in the games NOT looking different at all (I have examples of this in a prior post) while running significantly better.

If you want to max everything out across the board, I'm not knocking that decision. It's all in the eye of the beholder. Either decision is a valid one, and you CAN in fact make a single GPU work fine at 1600p. Period. The minimal concessions you do make don't affect image quality to any significant extent, but if you want to spend more money for ultra settings, you're free to do so. I will say that I believe the "maxed out or nothing" mindset is kinda silly. Think about it: even a GTX 780 Ti or 290X cannot max Metro: LL or Crysis 3 out at 1080p ultra and maintain 60fps 100% of the time. When the best of the best can't max games out and maintain the best framerate, something is wrong. Sure, you can spend $600 more for SLI or Xfire with another GPU; then you can run ultra to your heart's content. But is that the best decision for 100% of buyers? Nah. Sometimes lowering those 2-3 settings is preferable.

I mean, I've played so many games where I turned settings down and the resulting image quality was absolutely no different @ 1600p. Yes, if you go to the lowest settings there is a difference. But you can generally gain 30-40 fps by simply lowering some of the overkill settings, using high instead of ultra and FXAA instead of MSAA, and get a huge framerate boost without any real image quality difference. Like I said, I've experimented with this and couldn't really find IQ differences when I tried.
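To see why heavy MSAA in particular gets singled out, here's a back-of-the-envelope sketch of raw multisampled render-target sizes at 1440p. It assumes 4 bytes of color plus 4 bytes of depth/stencil per sample; real drivers compress and add overhead, so this is only the naive math:

```python
# Naive VRAM cost of a multisampled render target (color + depth/stencil),
# ignoring driver-side compression and any other buffers or textures.

def msaa_buffer_mb(width, height, samples, bytes_per_sample=8):
    """Raw size in MiB of one MSAA color+depth target at the given sample count."""
    return width * height * samples * bytes_per_sample / (1024 ** 2)

for s in (1, 4, 8):
    print(f"{s}x at 2560x1440: ~{msaa_buffer_mb(2560, 1440, s):.0f} MB")
```

Going from no MSAA to 8x multiplies that one render target from roughly 28 MB to 225 MB before any textures or other buffers, which is a real squeeze on a 2GB card at 1440p.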
 
aliasing is noticeable at 1440p, even 1600p. also, you came here (HARD forum, i remind you) seeking opinions, but instead, only accept validation on your soft purchasing decision? :confused:

Thank you for your interest in the position of thread Troll. Unfortunately, we are not hiring at this time, but best of luck in your future endeavors.
 
Thank you for your interest in the position of thread Troll. Unfortunately, we are not hiring at this time, but best of luck in your future endeavors.

not trolling. grow up and face the facts. a 770 @ 1440 is a compromised experience unless you're playing older games. are you asking for opinions or seeking validation?... :rolleyes:
 
You can always use FXAA, which has basically no performance hit at any resolution. [...]

i don't disagree with anything you've said. i often use fxaa over msaa, as the latter is impractical due to what i perceive as diminishing returns on IQ vs performance;
 
FXAA makes things look like muddy water though, turning everything into a blurry mess.

Heck, even on a 28" 4K monitor you need AA to stop things having jaggy edges.

If you want no jaggy edges and an image that doesn't look blurry, then you want to inject an AA such as SMAA.
 
Yes and no; not entirely correct. FXAA 1.0 was blurry, but there are three versions of FXAA (IIRC), and the one that has been used for some time now has sharpness filters which are automatically applied (I think) through the NV control panel. If that doesn't do it for you, you can instead use an FXAA injector, which lets you directly manipulate the sharpness filters; it has all sorts of tweaks and sliders, and you can make FXAA very sharp if you want to. So if you want FXAA without blurring, you can get that. Using the override FXAA in the NV control panel isn't blurry to my eyes; I do remember FXAA 1.0 from years ago being blurry, but it isn't like that anymore. I do think FXAA is far less blurry than MLAA. I don't know if AMD has updated MLAA since 2012/early 2013, but it was pretty blurry. Then again, you get the same option with AMD: you can use an FXAA or SMAA injector, bypass MLAA entirely, and use whatever sharpness filter you want.

The main point here isn't to argue the merits of which shader-based AA to use; SMAA is in fact very good as well, and it has sharpness filters just as FXAA does. SMAA or FXAA, both work pretty well and without a huge performance drop, whereas MSAA definitely will give you one. Personally speaking, MSAA isn't very sharp either when you get down to it. But it's all up to preference.

I guess what I'm saying is, you can change the AA type from MSAA to a shader-based AA without a performance penalty if you make the effort to do so. That effort can be as simple as enabling it in the control panel with a profile, or you can use an FXAA/SMAA injector and apply whatever sharpness filter you want. Common-sense changes like these will let you use a single GPU at 1600p and still maintain a great 60fps+ gameplay experience. Yes, if you want ultra, you'll need more. But then again, I've done direct before-and-after comparisons of ultra maxed-out games versus games with a few dials turned down and really couldn't spot an IQ difference.

In the end, either decision is a valid one. Get SLI and crank everything up without thinking, or use a single card, turn 2-3 dials down, and get great framerates without a real image-quality change. SLI will let you turn the dials way up with SGSSAA and that sort of thing, whereas you wouldn't want to do that on a single card. Either decision works; you can make both a single- and dual-card setup work, and I guess that's the main thing I wanted to get across. It isn't the case that a dual GPU is a prerequisite at 1600p. Not unless you want to simply max everything out, and I don't personally think there's a need to do that. I mean, if you want to, go for it. But it isn't a prerequisite.
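For anyone curious why shader-based AA is so cheap compared to MSAA, here is a toy sketch of the core idea: flag pixels whose local luma contrast crosses a threshold, then blend them with their neighbours. This is NOT the real FXAA/SMAA algorithm (those do directional edge searches and sub-pixel handling), and the threshold name is made up for illustration:

```python
# Toy illustration of shader-based AA: one pass over already-rendered
# pixels, blending only where local contrast marks an "edge". No extra
# samples are ever rendered, which is why the cost is near zero.

LUMA_THRESHOLD = 0.25  # hypothetical tuning knob, like an injector slider

def smooth_row(lumas, threshold=LUMA_THRESHOLD):
    """Blend each pixel with its neighbours where local contrast is high."""
    out = list(lumas)
    for i in range(1, len(lumas) - 1):
        lo = min(lumas[i - 1], lumas[i], lumas[i + 1])
        hi = max(lumas[i - 1], lumas[i], lumas[i + 1])
        if hi - lo > threshold:  # treat as an edge pixel
            out[i] = (lumas[i - 1] + lumas[i] + lumas[i + 1]) / 3
    return out

# A hard black-to-white edge gets softened; flat areas are left untouched.
print(smooth_row([0.0, 0.0, 1.0, 1.0]))
print(smooth_row([0.5, 0.5, 0.5, 0.5]))
```

One linear pass over finished pixels, versus MSAA rendering extra samples for every pixel: that contrast is the whole performance story behind the FXAA/SMAA recommendation above.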
 
not trolling. grow up and face the facts. a 770 @ 1440 is a compromised experience unless you're playing older games. are you asking for opinions or seeking validation?... :rolleyes:

It's been a good run: I've been posting to this forum for over 12 years, and yours is the first needlessly obnoxious reply I've ever gotten. The guys here are awesome, taking time out to offer up opinions and advice, so you're an aberration. I normally don't respond to people like you online, but I will (this once) address your comment, just because you're acting like such a little girl: if you read back through the thread, you'll notice that I ask for advice, receive varying opinions, then make a decision and lay out my reasoning behind it. I didn't crap on anybody whose opinion I ultimately didn't share; I simply made a different choice. I'm baffled as to why you're fixated on me "justifying" (read: "explaining") my decision, but probably you're just lonely. Anyway, please take your little confused/rolleyes emoticons and your high blood pressure and go someplace else, 'cos you're not welcome here.
 
Anybody know when the next gen is coming? :) Seems like there's always going to be some compromise at 1440, but reading through the replies, I think I can make the single 770 work for a while. Next gen I'll go with the top-end model to avoid this in the future. Plus, I've got a 4-week-old baby; all my money seems to be going on diapers...
 
Anybody know when the next gen is coming? :) [...]

Seems like a good bet, honestly.

Another thing to think about: you can sell your 770 on eBay for between $300-350. For the same price you can buy a reference 290X. :D

Had a little one not that long ago. Diapers are expensive. lol
 
It's been a good run: I've been posting to this forum for over 12 years, and yours is the first needlessly obnoxious reply I've ever gotten. [...]

relax. you asked for opinions and you got one you didn't want to hear. no need for the personal attacks.
 
I have a GTX 770 at the same resolution and played Skyrim and Battlefield 4 on high settings, and it was perfectly smooth.
 
A 770 is not enough if you do not want to dial down settings. With adjustments, though, it will do.
 
Got my monitor over the weekend, and Watch Dogs plays perfectly well. Unfortunately, the gameplay itself is a bit of a dog! I'm glad I didn't get this monitor just for that one game. I got the DLC for Dishonored and BioShock Infinite, and they look amazing at this res with everything cranked. Watch Dogs is obviously NOT optimized for PC, even from purely a controller point of view, let alone the graphics. It's just really not all that fun. That's the weakness in hardware reviews: are the latest games they're being tested on really all that good when you disregard the graphics? My previous (old) 5850 played Saints Row IV / Just Cause 2 / Far Cry 3 at great framerates, and they're probably the most fun I've ever had with games. I know I'm not stopping during the action to admire the ambient occlusion.
 
GeForce Experience works, believe it or not. Try it, and it'll make the game work. And if you want turbo speed, just run it at 720p; it's still higher DPI than a console game on a TV!
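That DPI claim roughly checks out with simple geometry; the screen sizes below (a 27" 1440p monitor vs a 40" 1080p TV) are assumptions for illustration:

```python
# Pixels-per-inch from resolution and diagonal size:
# PPI = diagonal resolution in pixels / diagonal size in inches.
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 1440p monitor: {ppi(2560, 1440, 27):.0f} PPI')  # ~109
print(f'40" 1080p TV:      {ppi(1920, 1080, 40):.0f} PPI')  # ~55
```

Even the same 1440p panel dropped to a 720p render still beats a living-room TV on pixel density at typical monitor sizes, which is the point being made.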
 