GTX 1080 vs. EVGA GTX 780Ti Kingpins in SLI

garetjax27

Limp Gawd
Joined
Feb 24, 2006
Messages
186
What kind of performance gains/increases will I see if I upgrade from two EVGA GTX 780Ti Kingpins in SLI to a single GTX 1080? Will the difference truly be night and day?
 
A single 1080 is roughly double the performance of a single 780Ti. So on non-SLI titles, you're going to see a huge difference. On really good SLI titles, you're going to see a small performance boost.

More importantly, you're going to see less than half the power use, heat and noise. And no SLI issues to deal with. Well, ymmv on that one...some people see lots of problems, other people run SLI with no complaints, but regardless, if you can get your performance goals without SLI, it's always better to go with the simpler solution.
 
Short answer: yes.

The 1080 uses less power, produces less heat and wouldn't impose the headaches caused by SLI. Also, it has more than double the available memory. Fan noise will vary, but even my Founders Edition 1080 with the squirrel cage fan is pretty quiet.
 
Thank you all for the replies.

Would my current system (EVGA X58 E760 Classified motherboard, 12GB Corsair Dominator RAM, Intel i7-950, Intel SSD) be a bottleneck for any reason while playing new titles, such as Battlefield 1 or Titanfall 2? Would using a GTX 1080 with this system present a bottleneck for the videocard?
 
I'll be in the minority, but I suspect the answer is yes. X58 is quite a ways back, and while the IPC improvement from generation to generation isn't huge, over several generations it starts to add up to a significant difference. With a 2500K and a GTX 970, I found that the last couple of Assassin's Creed games were CPU bound at 1080p. A 3770 mostly eliminated that bottleneck.

Most games will be perfectly playable with that setup, but you'll definitely find that in some, the CPU is holding the rest of the system back. I have no idea about Battlefield 1 or Titanfall 2 specifically, as those aren't the sorts of games I play.
 
Getting rid of SLI was the best thing I ever did. Went from SLI 680s to a single 970. Was pretty much a side-grade, but it just works so much better.

Also, if you don't have that i7-950 overclocked, then do it.
 
lol, I guess some things never change. I have been seeing these types of posts since the very first SLI... :)

I played with the first SLI (6800?), had 7800 SLI... too much of a PITA, doesn't work most of the time, costs too damn much. Save your money and upgrade the one card more often. IMO... yes, get 1x 1080.
 
I went from 3 680s to 2 1080s, from an old i7-980X system to the sig rig. Clearly a night and day difference. But yeah, SLI is just becoming annoying. It isn't so bad with a single monitor, since it's easy to disable, but there's NO way to disable SLI via software in Surround that I know of.
 
wow, I never ventured into the 3 or 4 way SLI... brave soul.

Now and then the results could be impressive. But like many others note, it seems like SLI support is getting worse, not better. But then there are those times when it does deliver well. I'd thought about selling my 1080s for a single Titan XP but decided otherwise. Overall I can't complain about performance with this setup. The Surround situation is tricky but I still like it. I think I'll add a secondary 4K for games where that might work better.
 
BF1 pushes my 3770K to 100% pretty much the entire time I'm playing. This is paired with a 980Ti on a 1440p screen. To put it in perspective: a more powerful CPU than yours, with a weaker GPU than what you're planning to get, and it's maxing out the CPU. So I'd say you will most certainly be CPU bottlenecked in BF1, and by a good amount too.
 
Is that on 64, 32, or sub-20 player servers, or single player?
 
I've only played on 64 player servers thus far

Ahh, yes, the 64 player servers seem to get the best of even new CPUs.

OP, I would get the BF1 demo through Origin and see how the performance goes.

What resolution are you running at and what FPS are you trying to achieve? It may or may not be worthwhile, depending.
 
A single powerful card is always better than two slower ones: less overhead and fewer driver headaches.
 
I will second some of the other comments here regarding SLI. I had 680s in SLI and then 290s in CrossFire, but my best setup to date has been my 980Ti paired with a G-Sync monitor. Even if the card can't push 60 FPS, it still looks smooth, free of the tearing and other issues I ran into with my other setups.
 
I too am running a 980Ti with a G-Sync monitor and yes, it's certainly my best setup. That said, the G-Sync legend that it makes 30 FPS feel like 60 FPS is complete BS. I know that isn't what you're saying; I just wanted to put that out there because it's been touted around on more than a couple of occasions.
 
run at 60 FPS and enable vsync, gsync not needed ;)

Big difference between the two. Standard vsync doesn't come close. If you think it does, you're just deluding yourself into thinking that if you don't have it, you don't want it.
 
...as long as you're actually maintaining 60 FPS, with no drops below that? Yeah, vsync at 60Hz and G-Sync at 60Hz are the same damn thing. No delusions involved. G-Sync and FreeSync are only useful when your setup can't maintain your display's ideal refresh rate at the settings you want to run.
 
980Ti with a 2K monitor; I maintain 60 FPS in most games on ultra. G-Sync might be helpful with a 4K monitor, but I don't plan on upgrading my Apple Cinema DP display any time soon.
 
Not quite. What about above 60? Most G-Sync monitors are 144Hz, so there are appreciable benefits to going above 60 FPS while still maintaining sync, even if you can't get all the way up to 144. Standard vsync won't do that.

The only time they're the same thing is if the only games you play are hard locked at 60 and never dip below 60. How many games are you playing that fall into that category?
 
Yeah, I wouldn't say it makes 30 feel like 60, but I have a hard time telling when it drops into the 40s. I went from 2560x1440 to 3440x1440 with G-Sync, and while I lost a good chunk of FPS, I gained a more pleasurable experience.
 