Best Bang for Buck GPU Config to Achieve +100fps on a 1440p Monitor Set to Ultra

kill8r

Limp Gawd
Joined
Feb 27, 2014
Messages
172
I have ordered most of my components except for the GPU/s
I have ordered:
CPU - 4930k
RAM - Corsair Vengeance 2400 4x4GB
Motherboard - Asus Rampage Black Edition
GPUs - I want to play BF4, Titanfall and Elder Scrolls maxed out on a single ROG Swift
HD - Samsung Pro Series SSD 512GB
PSU - EVGA SuperNova 1300W Gold
Case - Corsair 900D

As my monitor will have G-Sync (ROG Swift), I would like some advice on what value setup will allow me to max out settings on ultra at 1440p and hopefully get close to 120fps. I will be upgrading to the new Maxwell cards at the end of the year, so this is a short/medium-term solution.

I am happy to purchase either a single high-end card (780ti or 290x) or two of the less expensive cards (290s or 770s). Also please note noise is an important factor for me, hence the MSI cards listed below.

Here are the UK prices:

3GB:
MSI 780ti
£542

MSI 780
£390

4GB:
MSI r9 290x
£436

MSI r9 290
£357

Gigabyte 4GB OC 770
£310

As you can see, 2 x 4GB 770s cost just a bit more than a 780ti. Would they outperform a 780ti for this purpose? The bus is only 256-bit on the 770s, though, compared to 512-bit on the 290s.

If dual cards are the way to go, would you choose the 2 x 770s over 2 x R9 290s, when considering G-Sync and price?

Would you choose 2 x 290s over a Titan Black?

Thanks for helping me on this journey!
 
You are not going to max out recent games at 1440p with a single card. If you want gsync, you need an Nvidia card. My advice is go with the 780ti to get the most performance in a single card. While the 770 SLI might have a little more performance, it has a smaller bus (as you mentioned) and SLI scaling is never 100%. In terms of bang for the buck on the Nvidia side, the 780 wins. But since you want to drive 1440p @ 120fps, I suggest the most powerful single card, hence the 780ti.
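The SLI-scaling point can be sketched with a quick back-of-envelope calculation. The single-card fps figures and the ~80% scaling factor below are illustrative assumptions, not benchmark results:

```python
# Rough comparison of one fast card vs. two slower cards in SLI.
# The baseline fps numbers and the 0.80 scaling factor are assumptions
# for illustration only, not measured results.

def sli_fps(single_card_fps, scaling=0.80):
    """Effective fps from two cards when the second adds `scaling` of a card."""
    return single_card_fps * (1 + scaling)

gtx_770_fps = 60    # assumed single 770 fps at 1440p ultra
gtx_780ti_fps = 85  # assumed single 780 Ti fps at 1440p ultra

print(f"2 x 770 SLI: ~{sli_fps(gtx_770_fps):.0f} fps")
print(f"1 x 780 Ti:  ~{gtx_780ti_fps} fps")
```

Under those assumptions the SLI pair still comes out ahead on raw fps, which is why the choice ends up being about noise, simplicity, and scaling consistency rather than peak numbers.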
 
Legit question:
Can we get away with turning gsync off at 100 or 120fps 1440p and using a frame limiter or vsync if we go 780x2 SLi instead?
I mean shit, he's got an x79 platform here, let's not limit the OP to overpriced single GPU configs. You will get a LOT more performance for the little bit of extra money going SLi, and your cards won't instantly depreciate at such a high % when Maxwell drops in bigger doses than the 750Ti lol.
 

So would dual msi 780s achieve my goal of 1440p @ 120fps and gsync?
 
So would dual msi 780s achieve my goal of 1440p @ 120fps and gsync?

In my experience, you will need to turn down some settings to get 120fps on a 1440p display in the more demanding new games. For instance, with my MSI 780 SLI, I get < 80fps in a game like Far Cry 3 completely maxed out. This is with a 1600p display and GPU utilization at ~90% on both GPUs. Although you will be driving fewer pixels at 1440p, the difference is still too small to push the fps to 120.

I too am planning to get the asus swift. Sadly, I am preparing myself to accept making compromises on graphics settings. But this might not be all bad. Arguably, the difference between 4x SMAA and 2x SMAA on a 1440p display might not be noticeable (depending on how good your eyesight is) and you will get a significant fps boost.
 
An important note for you...you're chasing 2 nearly mutually exclusive things, in name at least. If you were able to run your games at a solid 120Hz, GSync is pointless. It's meant to keep things looking nice when framerates are varying. That said, your Swift is going to support ULMB which should be great if you're running at those high framerates. You'll be turning off GSync.

See the end of the article HERE for a description of ULMB. (It's similar to lightboost, if you're familiar)


As for your videocard choice, that's a tough one. 2x770 or 1x780ti...I'd likely go with the 780ti (it would be so nice if you could get 2 regular 780's tho). This is not a firm recommendation, though...I think it's just simpler and more quiet with the single card. You're not going to get your 120fps but if you can squeeze out 90 with ULMB you'd likely still have a very nice experience. You're really not going to need AA, then you turn down some shadows and maybe one or two other options to Very High instead of Ultra and you'll be ok. Then when Maxwell comes out you can boost things up and reach another level.
 
To push the performance you are wanting for the least investment you'll need 2x 780. The 780 ti is never worth the additional cost over the 780. Both cards running at the same clocks will give you the same gaming experience. You will still need to occasionally reduce a setting or two in a few games.

If you are actually going to spend $800 on a TN panel to get GSync that leaves AMD out of the equation.
 
Doesn't the 780ti have 25% more texture units and shaders than the 780? How can they perform the same at the same clocks? It would make more sense for the 780ti to be 15-20% faster at the same clocks.
 
The point of Gsync is that you don't need 120fps. That said, I (and probably everyone who is responding to you) have not used Gsync first hand, so I don't know what the experience is like at 30fps versus 60fps versus 120fps, so I can't tell you. Were I in your position, I would boil it down to this:

A) Do I have enough money to do multi-GPU

and

B) Do I care if it really makes a difference or not

If B, then buy a single 780 now, and possibly add on a second one later. If you can afford it, go 780 Ti. That way you can see what the experience with Gsync is like, and if you aren't happy, just grab a second.
 
User impressions of G-Sync seem to suggest that it makes 60 Hz feel like 120 Hz (and so forth). I actually think that's a pretty dubious claim, but I've not had any first-hand experience with it. That NVIDIA G-Sync bus tour that should probably exist doesn't exist.

I do think, however, that chasing up 120 fps and beyond with G-Sync will end up being not all that important. Somewhere in the ballpark of 90-100 fps is going to look extremely smooth, with frame rates higher looking only negligibly (maybe imperceptibly) smoother. A pair of 780s will get you in that range at 1440 at least some of the time, and at least above 60 a lot of the time.
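The diminishing-returns point is easier to see in frame times than in fps, since each additional 30 fps buys fewer milliseconds per frame than the last 30 did:

```python
# Frame-time view of why gains above ~90-100 fps look small.

def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

for fps in (60, 90, 120):
    print(f"{fps:3d} fps -> {frame_time_ms(fps):.1f} ms per frame")

# Going 60 -> 90 fps saves ~5.6 ms per frame;
# going 90 -> 120 fps saves only ~2.8 ms more.
```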
 
I plan on getting the ROG Swift G-Sync as well, but my reasons are just the opposite. I have 2x 680s in SLI; at 1440p without G-Sync I would see frame-rate variation more so than at 1080p. So in theory I would get a smoother gaming experience at higher resolutions with little sacrifice in eye candy. I already turn off all AA in BF4 at 1080p for smoothness, but I hate tearing, so I play with v-sync on. So at 1440p it would look better once I turn off v-sync and let G-Sync do the work.

@OP I really don't have an answer to your question sorry.
 
I still don't understand the obsession with maxing everything out on ultra settings. That's asking for a lot at 1600p+ with a single GPU - the 780ti can't do it with every game, and the 290X can't either, because some games just have a GPU tax for "maxing" out. If time has shown anything, it is that most AAA titles have extraneous settings that do nothing in terms of image quality but have a substantial performance impact - Crysis 1 was the beginning of this, and it snowballed from there.

If someone is into that, cool. But keep in mind you can lower from 8X MSAA (complete OVERKILL setting by a MILE) to FXAA/2x MSAA, and lower to high instead of ultra, and get similar image quality with a huge performance boost in most cases. Heck, Crysis 3 gains an absolutely gigantic framerate boost using this method. And, unless my 20/20 fails me, I have played tons of AAA titles doing comparisons between maxed-out ultra versus 1-2 settings turned down with FXAA. Generally in 95%+ of games, there is absolutely no image quality difference aside from subtle AA differences. I personally like the sharpness of FXAA better than 8X MSAA, or even *IF* I use MSAA I'll use only 2X. 8X MSAA is complete overkill and a waste. Not to mention, 8X MSAA has a ridiculous performance hit without a corresponding image quality increase compared to 2X.

Lastly, even at 1080p there is no single GPU that can max every game out at a smooth, constant 120 fps at ultra quality. With that being the case, like I said, I absolutely don't understand why every game has an obligation to be maxed out - now if the game can be maxed without the performance loss, sure, why not? But Crysis 3? Sounds like a waste of time and money to gear a GPU upgrade solely towards maxing Crysis 3 out, because not a single GPU in existence can do that at a constant 120 fps. And at that point you're about to waste tons of money just for bragging rights.

Like I said, if someone is into that, cool. I don't quite get it though. There seems to be heavy diminishing returns or even no returns when enabling certain graphical settings in games like Crysis 3, Metro: LL, I could go on.
 
You will need CrossFire or SLI. You will not be able to play at ultra settings and get those fps with a single card.
 
I don't like how FXAA blurs textures in most titles, so I usually stick with 4X MSAA. To be honest, there isn't usually a huge difference between 2X and 4X, but I have the power to run 4X, so why not.

To tack on to your statements about image quality, I find that ambient occlusion is usually the biggest performance hog outside of traditional or super-sampling AA. Turning it from HBAO or HDAO down to standard SSAO is usually a big performance boost with a negligible difference in quality.
 
Guys, I have 2 x MSI 290s (meant to have the best cooling, and they run quiet) on order with Amazon, due to be shipped out Thursday. I could really use some advice on the following:

1) Are there major performance gains to be had water cooling this card even if it comes with one of the best air coolers? I can get 2 x MSI 290x for the same price as 2 x 290 with water blocks.

2) Would you go for 2 x WC 290 or 2 x MSI 290x?

3) Does a WC 290x far surpass a WC 290?

4) What is the best WC block?

5) Do I need a backplate?

6) Do these blocks work on cards with aftermarket coolers or do I need to go with the reference versions?

Thanks
 
If you have 2 cards, removing the coolers and setting up a liquid rig is NOT worth it. Yes it will be quieter, but consider how MSI treats removing the cooler as far as the warranty goes. Probably cheaper/easier to get a 3rd card, and you will have MUCH better FPS results. Granted, I do not know what MB/PSU you currently have, but add the cost up first.
 
Probably a couple more generations to go before I can afford to game at 1440 :(

If you have superfast hardware as you say in your sig, why are you gaming @ 1080p? Good 1440p monitors with an American warranty can be had for under $500.00. Are you a student?
 
If you have 2 cards, removing the coolers and setting up a liquid rig is NOT worth it. Yes it will be quieter, but consider how MSI treats removing the cooler as far as the warranty goes. Probably cheaper/easier to get a 3rd card, and you will have MUCH better FPS results. Granted, I do not know what MB/PSU you currently have, but add the cost up first.

The rig is already setup for water cooling.

What is considered the best 290 for water cooling and who makes the best waterblocks?
 
Guys, I have 2 x MSI 290s (meant to have the best cooling, and they run quiet) on order with Amazon, due to be shipped out Thursday.

Why did you order 290's if you plan on using G-SYNC? G-Sync is an Nvidia proprietary feature that only works with Nvidia Kepler cards and above. It doesn't work with AMD cards.
 
If you play a lot of BF4 then the choice is a no brainer IMO. I am absolutely killing this game with dual 290's, a 4770k and Mantle on my overclocked 1440p monitor. My current settings have the resolution slider at 140%, all Ultra, no AO, 2x AA, and the framerate is still way above my target of 96FPS most of the time. If you are getting a SWIFT you will absolutely be able to take full advantage of the 144Hz refresh rate!

With such a fast setup you aren't going to miss G-SYNC very much.
 
Why did you order 290's if you plan on using G-SYNC? G-Sync is an Nvidia proprietary feature that only works with Nvidia Kepler cards and above. It doesn't work with AMD cards.

This is a reasonable question :)
Unless he didn't read the responses to his own thread, he decided on 290s because he felt G-Sync wasn't going to be useful and will be using ToastyX Strobelight for LB? Not sure.
 
I have a reference XFX 290, playing mainly BF4 at 1600p. At ultra settings with 2x MSAA my temps hit 95c max (didn't check VRMs). I put in my watercooling gear over the weekend with the EK block and backplate for the 290. Temps hit ~50c after a few hours on BF4 at the same settings (VRMs were around 30c-40c). Overall very happy; I got almost a 50c difference with watercooling over reference.
 
I only play great games. Luckily for me, great games stopped being produced many years ago (early to mid 2000s), so new hardware is a non-issue, even at 4K resolution.
 
Why did you order 290's if you plan on using G-SYNC? G-Sync is an Nvidia proprietary feature that only works with Nvidia Kepler cards and above. It doesn't work with AMD cards.

1) The AMD cards are only a temporary option for me until the Maxwell cards are released.

2) I plan on running at over 100fps, at which point most people would disable G-Sync.

3) I wanted the 4GB of memory, which I couldn't get from Nvidia without going for a Titan Black.

4) The 290s were good value compared to the 780s here in London.
 