1080 SLI vs Titan X Pascal

1080 SLI or Titan X Pascal

• 2x 1080 in SLI: 38 votes (19.0%)
• The Titan!: 162 votes (81.0%)

Total voters: 200
What good are improved frametimes when you have those nasty microstutter spikes that Thief clearly shows? That's a case for avoiding SLI, not embracing it. You still haven't shown that 1080 SLI consistently gets higher minimum fps either. And you're forgetting that a lot of recent games don't support SLI, and no DX12 game does.

PS: I've used SLI for the past 5 or so years and it has always had perceptible microstutter. It was somewhat improved with my last pair of Titan Xs but was still there. It's just not worth it anymore unless you run above 4K.

" recent games don't support SLI and no dx 12 game does."

Huh? Yeah they do. What games? As for DX12, here are two new games that support it:
[attached: two benchmark screenshots]

And DX12 games will make multi-GPU setups better than ever. DX12 is brand new; just give developers time.

Thief is not the norm for SLI. It's bad there, but generally the microstuttering is better than in that game, and it has improved since the 780 Ti days.

The only reason Nvidia dropped 3- and 4-way SLI is that its scaling sucks past 2 cards, and it would be dumb to go with that. AMD still supports 3- and 4-way CrossFire because:
A) CrossFire scaling is better
B) 3-way CrossFire can be an option for some gamers, unlike 3-way SLI (albeit a very narrow market of gamers)
C) CrossFire can run over just 4x PCIe lanes, while SLI cannot.
 
Last edited:
I think I'm being forced out of running SLI cards. Pretty sad.

I've had a pair of Zotac 1080 AMP Edition cards. A single card cooled fine, but once I added the second with one slot of spacing between them, they were cooking each other. I wanted to water cool them, but no blocks are out for anything other than FE cards.

I exchanged them for 2 PNY 1080 FE cards with the idea of water cooling them. They ran well and cooled well, but only with 85-100% fans. PNY doesn't want you water cooling, so I exchanged them for EVGA.

2x EVGA 1080 FE cards with plans for water cooling. They overclocked great, but the single 8-pin power connector limits the power to 185 watts. The cards' clocks dance up and down basically constantly; if it's not from heat, it's from power! So no thanks. It's like that on all FE cards.

Then I bought 2 EVGA FTW 1080 Hybrids. Those I thought were perfect. Hybrid cooling for only $30 more than an FE card! No spending $125 for each water block, another $34 for the bridge, and another $34 for a backplate.

In the end, there should be a recall on the regular and Hybrid EVGA 1080 FTW cards.

EVGA states that a small number of cards have a hardware issue that will make the card's fan go to 100% and the screen go black!

It happens on one of the 2 I have; it's happened at least 4 times.

I bought them from EVGA, so I can either send the one back via free cross-shipping or return both cards. Members have said the new cards they got back were clocked lower. Yuck.

So now I don't know what direction to go!

A single Titan??? EVGA Classified SLI with homemade water cooling, or order blocks for it down the road?

I've been going through this upgrade phase for probably over 2 months now. The wife is sick of it and I'm about done myself.


"I've had a pair of Zotac 1080 AMP Edition cards. A single card cooled fine, but once I added the second with one slot of spacing between them, they were cooking each other."

You turned the fan speed up, right? I find it hard to believe they ran that hot unless you have poor case airflow, because I just came from dual R9 290Xs, which are probably designed to be the hottest-running cards on the planet, yet they still did CrossFire well within their thermal thresholds when I turned the fan speed up. Three years and heavy overclocking later, they still run perfectly, despite the long hours at high temps. They were reference blower-type cards, by the way. Do you know the exact temps?
 
"I've had a pair of Zotac 1080 AMP Edition cards. A single card cooled fine, but once I added the second with one slot of spacing between them, they were cooking each other."

You turned the fan speed up, right? I find it hard to believe they ran that hot unless you have poor case airflow, because I just came from dual R9 290Xs, which are probably designed to be the hottest-running cards on the planet, yet they still did CrossFire well within their thermal thresholds when I turned the fan speed up. Three years and heavy overclocking later, they still run perfectly, despite the long hours at high temps. They were reference blower-type cards, by the way. Do you know the exact temps?

I ran them both at max fan speeds. Couldn't even hear the fans at max. I guess those thin fans suck at pushing air through the fins.
 
SLI of course. These just showed up.

Who did you buy those from? I'd be pissed off that they not only didn't double-box something so pricey, but also that those display boxes aren't exactly known for being sturdy enough for shipping.
 
Right, but what were the temps?

Edited:

Actually, I think it was more around 68-71°C for the first and 10°C less for the second. Adding a 120mm fan between the cards, blowing down between them, lowered the first card to around a 4-5°C difference from the second. Basement at 70°F.
 
Last edited:
That is one hot basement

haha edited that ;)

Actually, I think it was more around 68-71°C for the first and 10°C less for the second. Adding a 120mm fan between the cards, blowing down between them, lowered the first card to around a 4-5°C difference from the second.

I've had many 1080s and am now testing 2 EVGA 1080 Classifieds. The first one looks like a good clocker. Going to need water cooling for these as well.
 
Edited:

Actually, I think it was more around 68-71°C for the first and 10°C less for the second. Adding a 120mm fan between the cards, blowing down between them, lowered the first card to around a 4-5°C difference from the second. Basement at 70°F.

You said you also tried 2 PNY FE cards; do you remember the temps on those? I have two as well, and with a +210/+400 OC running at 2070 boost they both seem to sit in the low 70s under load (around 71-72°C). I'm just curious how yours did.
 
" recent games don't support SLI and no dx 12 game does."

Huh? Yeah they do. What games? As for DX12, here are two new games that support it:
[attached: two benchmark screenshots]

And DX12 games will make multi-GPU setups better than ever. DX12 is brand new; just give developers time.

Thief is not the norm for SLI. It's bad there, but generally the microstuttering is better than in that game, and it has improved since the 780 Ti days.

The only reason Nvidia dropped 3- and 4-way SLI is that its scaling sucks past 2 cards, and it would be dumb to go with that. AMD still supports 3- and 4-way CrossFire because:
A) CrossFire scaling is better
B) 3-way CrossFire can be an option for some gamers, unlike 3-way SLI (albeit a very narrow market of gamers)
C) CrossFire can run over just 4x PCIe lanes, while SLI cannot.

So a game that is only used for benchmarking (Ashes) and Total War; that certainly makes the case for DX12 SLI, right? :D Microstuttering with SLI is still a big problem. Like I said, I recently got rid of my SLI setup, so it's not like I haven't used it for years. Check out this article; it benchmarked several games in CrossFire/SLI: The (sad) State of CrossFire and SLI Today - BabelTechReviews
 
I can't remember the temps but might have them in my notes. I remember one doing 2050 MHz stable and the other doing 2114 MHz. I bought the blocks for water cooling them and everything to redo the whole loop, then found out PNY doesn't want you to remove the cooler, so when I saw EVGA come into stock at MC, I exchanged them. Those did a little better, at 2088 and 2124 MHz.

I'm assuming the temps were better on the FE cards with the blower fans. I ran them full bore since I planned on water cooling, so I was only interested in the top clocks at the time. Probably in the 50-60s.
 
Huh? Yeah they do. What games? As for DX12, here are two new games that support it:
[attached: two benchmark screenshots]

And DX12 games will make multi-GPU setups better than ever. DX12 is brand new; just give developers time.

lol at the 26% and 36% scaling.

You just paid 100% for that second GPU but only get to use 26% or 36% of it. Now that is a good return.
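For anyone wondering how a figure like "26% scaling" is derived: it's just the extra FPS the second card adds, relative to a single card. A quick sketch, with FPS numbers made up for illustration (not taken from the charts):

```python
# Multi-GPU "scaling" = extra performance contributed by the second
# card, expressed as a percentage of a single card's performance.
def sli_scaling(single_fps: float, dual_fps: float) -> float:
    return (dual_fps - single_fps) / single_fps * 100.0

# Hypothetical numbers: one card at 50 fps, SLI at 63 fps.
print(f"{sli_scaling(50, 63):.0f}% scaling")  # -> 26% scaling
```

So paying double for 26% more frames is exactly the "good return" being mocked here.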
 
lol at the 26% and 36% scaling.

You just paid 100% for that second GPU but only get to use 26% or 36% of it. Now that is a good return.

Yeah, those benchmarks are awful. If anything, they highlight why SLI is not worth it. Buy a Titan XP if you need the extra performance.
 
lol at the 26% and 36% scaling.

You just paid 100% for that second GPU but only get to use 26% or 36% of it. Now that is a good return.

Lol, he also conveniently omitted the fact that a single Titan X beats 1080 SLI in that Warhammer DX12 benchmark:

[attached: benchmark screenshots]

Edit: Just saw they used FXAA in the Titan X benchmark, but the single 1080 only lost 1 fps with MLAA, so it's probably negligible in impact. Regardless, look at those results; they make a good case to avoid 1080 SLI and go with a single Titan X.
 
Last edited:
After 2 attempts at SLI systems, I hate it. Never again. Get the best single GPU you can afford; that's the best advice anyone can give.
 
SLI is worth it if you can afford two of the best cards available, because then if SLI doesn't work, oh well, you still have the best performance money can buy. Otherwise, you're rolling the dice that the game and drivers play nicely and you get good scaling, which has seemingly become more of a crapshoot.
 
I've been watching this thread and just have one thing to say:

Forget SLI and get one Titan X Pascal and be a happy person.

I'm waiting anxiously for my TXP, going from GTX 1080 FE SLI with the SLI HB bridge. Too many bugs; SLI is for beta testers and benchmarkers, LOL. If you want to play your games in peace, without stress, please go with one Titan XP. Without SLI you can enable Fast Sync with G-Sync, avoid lazy devs, and avoid innumerable bugs: the DX12 OSD overlay only works with SLI disabled, and there's a GPU memory usage bug with the Win10 Anniversary Update + 372.70 driver in the OC software OSD. You become a more peaceful, more reasonable person, no longer disturbed when you see a game that simply ignores multi-GPU implementation or optimization and your money becomes trash for the second card. There are so many other SLI negatives; the list would be giant.

So yeah, I get 13k points in TimeSpy with my SLI, and an OCed single TXP will give about 10k. So what?! I cannot play this s**t; these are, unfortunately, just numbers.

When a game is 100% optimized, hell yeah, things become beautiful. DX12 mGPU also seems to be getting started, and you've mentioned some games; I play ROTTR in DX12 with SLI, so beautiful. But if I pay for 2x cards, I want to see all games running flawlessly with SLI, and it appears that'll never happen.

SLI no more.
 
Lol, he also conveniently omitted the fact that a single Titan X beats 1080 SLI in that Warhammer DX12 benchmark:

[attached: benchmark screenshots]

Edit: Just saw they used FXAA in the Titan X benchmark, but the single 1080 only lost 1 fps with MLAA, so it's probably negligible in impact. Regardless, look at those results; they make a good case to avoid 1080 SLI and go with a single Titan X.


These are very early days for DX12. Again, give developers time. Few even know how to program for DX12 yet.
 
These are very early days for DX12. Again, give developers time. Few even know how to program for DX12 yet.

The typical "I'll believe it when I see it" argument applies here.

Right now it's pretty hard to expect DX12 EMA (explicit multi-adapter) to be a miracle for mGPU setups when that feature isn't used in any DX12 game except Ashes, and DX12 modes in general are either pretty badly implemented or DX12's benefits are overblown.
 
These are very early days for DX12. Again, give developers time. Few even know how to program for DX12 yet.

I think it's best to treat tech the way a bank treats your income for a loan: prove you can do it right now, or no deal.

I've been following tech long enough to know not to bank on something you want to happen actually happening.

If I were the OP, I'd get the Titan XP. If and when SLI actually works, it'll be time to upgrade anyway and you can reassess then. You don't go SLI and hope for a miracle. We have enough user stories in this thread...
 
Last edited:
I've been watching this thread and just have one thing to say:

Forget SLI and get one Titan X Pascal and be a happy person.

I'm waiting anxiously for my TXP, going from GTX 1080 FE SLI with the SLI HB bridge. Too many bugs; SLI is for beta testers and benchmarkers, LOL. If you want to play your games in peace, without stress, please go with one Titan XP. Without SLI you can enable Fast Sync with G-Sync, avoid lazy devs, and avoid innumerable bugs: the DX12 OSD overlay only works with SLI disabled, and there's a GPU memory usage bug with the Win10 Anniversary Update + 372.70 driver in the OC software OSD. You become a more peaceful, more reasonable person, no longer disturbed when you see a game that simply ignores multi-GPU implementation or optimization and your money becomes trash for the second card. There are so many other SLI negatives; the list would be giant.

So yeah, I get 13k points in TimeSpy with my SLI, and an OCed single TXP will give about 10k. So what?! I cannot play this s**t; these are, unfortunately, just numbers.

When a game is 100% optimized, hell yeah, things become beautiful. DX12 mGPU also seems to be getting started, and you've mentioned some games; I play ROTTR in DX12 with SLI, so beautiful. But if I pay for 2x cards, I want to see all games running flawlessly with SLI, and it appears that'll never happen.

SLI no more.

What will you do when your Titan XP can no longer play games at the settings you like?
Will you buy a brand-new top-of-the-line card again, or simply buy a second one for cheap after they've come down in price?

Because buying a new top-end card every time is not as cost effective.
 
What will you do when your Titan XP can no longer play games at the settings you like?
Will you buy a brand-new top-of-the-line card again, or simply buy a second one for cheap after they've come down in price?

Because buying a new top-end card every time is not as cost effective.

Nobody buys a Titan XP because it's cost effective, dude.
 
It's like he just discovered SLI and thinks it's the best thing since sliced bread.
 
"Nobody buys a Titan XP because it's cost effective, dude."

Yeah, but you are massively understating how un-cost-effective they really are. There is paying extra for the top-of-the-line card, and then there is literally throwing your money away. A Titan XP is, in a sense, throwing your money away IF you are not rendering with it.

[attached: AMD vs. Nvidia multi-GPU performance-per-dollar chart]


This is last generation, but you can see that four Fury Xs are more cost effective than a SINGLE Titan X. That is how bad Titans are for price vs. performance. So yes, obviously people who buy them have more money than sense, but they might as well be worshipping Nvidia if they are only using it for games.
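The chart's metric reduces to simple division: performance divided by price. A minimal sketch with entirely made-up FPS and price numbers (not the chart's data), just to show the shape of the argument:

```python
# Performance per dollar, scaled to "fps per $1000" for readability.
# All numbers below are hypothetical placeholders.
def fps_per_1000_usd(fps: float, price_usd: float) -> float:
    return fps / price_usd * 1000.0

single_flagship = fps_per_1000_usd(fps=60, price_usd=1200)
dual_midrange = fps_per_1000_usd(fps=68, price_usd=800)  # two 40 fps cards at ~70% scaling

print(f"flagship: {single_flagship:.1f}, dual mid-range: {dual_midrange:.1f} fps per $1000")
```

Even with imperfect scaling, cheaper cards tend to win on this metric, which is the chart's point; whether that matters depends on whether the games you actually play scale at all.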

I'll always buy a new high-end card.

SLI is not a reliable option. It would be a good option if all PC games were optimized for it with nice scaling, but that may never happen. The DX11 scenario was a little better for SLI, sure, not 100%, but a lot better than now with the new APIs.

I'm just too freaking tired of those Nvidia Inspector manual fixes to make SLI work better; it's clear that most game developers don't give a f**k about SLI or CF.

APIs vs. SLI support:

OpenGL = f**k SLI
Vulkan = SLI support not seen yet
DX11 = obsolete; medium to poor SLI support; lots of manual fixes by users via Nvidia Inspector
DX12 = new; poor SLI support; dev dependent; no more exclusive support via GPU drivers

I ask you, is this a good scenario?

I could talk all day long, just listing all the SLI negatives. The latest one for me was crashing in Project CARS with SLI enabled; with it disabled, the game ran flawlessly. As a bonus, I can use Fast Sync to tap more GPU power without tearing above my G-Sync panel's refresh rate, since Fast Sync is incompatible with SLI.

Just don't get tempted by those 3DMark/Heaven numbers in SLI. They're awesome, I know, but stay away from SLI and your problems are gone. Always get the most powerful single graphics card and enjoy your games without having to be a troubleshooting expert.



When DX11 came out, devs didn't know how to program for it either; that's to be expected. Nvidia's single cards aren't optimized for it either, especially considering Nvidia doesn't have async compute.

Why do you say "dev dependent"? It's like you expect DX12 to somehow be different from other APIs. Of course it's dev dependent; all programming is, all optimization is.

Everyone has their preferences, I guess, but personally I enjoy fast-twitch games, where SLI gamers will always dominate single-GPU gamers due to the improved latency.
 
Last edited:
Dude, I'm running GTX 770 SLI and it's a good system for my usage. I play at 1080p and this setup has lasted me a couple of years, though not as long as I'd have liked, to be perfectly honest. I built this system with a 5-year plan, and watching trends at the time, SLI was taking off. But as soon as I built it, 4K was the buzzword, with panels crashing down in price all over the place. 4K was the new nirvana, and I had no chance, especially with the new consoles coming out at the same time. 4K was the new goal.
I couldn't have picked a worse time to build a system.
But my system is still great. I love it. It runs flawlessly, and while not exactly world-beating on the performance front, it holds its own nicely.
I totally want to upgrade my system to a SINGLE card, but I'm waiting for the 1080 Ti or better before I upgrade.
 
When DX11 came out, devs didn't know how to program for it either; that's to be expected. Nvidia's single cards aren't optimized for it either, especially considering Nvidia doesn't have async compute.

Why do you say "dev dependent"? It's like you expect DX12 to somehow be different from other APIs. Of course it's dev dependent; all programming is, all optimization is.

Everyone has their preferences, I guess, but personally I enjoy fast-twitch games, where SLI gamers will always dominate single-GPU gamers due to the improved latency.

What can I say?

You are right about the latency, period. :)

I'm an SLI freak too, so...
 
What can I say?

You are right about the latency, period. :)

I'm an SLI freak too, so...

The latency difference is less than one frame in the best case, so it's completely irrelevant. What is relevant is the inherent microstutter while playing twitch FPS games in SLI. New engines like Unity and UE4 don't support SLI natively, so the onus is on developers, and there's a very high chance they won't devote resources to SLI given the multi-platform nature of most games. There will be SLI support in some AAA games, but it's pretty evident there will be a steep decline from DX11. In short, SLI and CrossFire are pretty much at the end of their usefulness.

It's always better to buy the single best card and then sell it in six months when a new one is released. I never understood why some people hang on to cards for years.
 
The latency difference is less than one frame in the best case, so it's completely irrelevant. What is relevant is the inherent microstutter while playing twitch FPS games in SLI. New engines like Unity and UE4 don't support SLI natively, so the onus is on developers, and there's a very high chance they won't devote resources to SLI given the multi-platform nature of most games. There will be SLI support in some AAA games, but it's pretty evident there will be a steep decline from DX11. In short, SLI and CrossFire are pretty much at the end of their usefulness.

It's always better to buy the single best card and then sell it in six months when a new one is released. I never understood why some people hang on to cards for years.


"The latency difference is less than one frame in the best case so it's completely irrelevant."

Latency is not measured in frames; it is measured in milliseconds. On average, modern SLI is around 5-7 ms faster to deliver each frame.
Granted, that's not much, but add it up over the course of an hour of game time and you lose about 120 seconds.

"What is relevant is the inherent microstutter"

The microstutter is so small you would barely ever notice it. As the FCAT charts show, only a few frames in all the examples spiked up toward the 25 ms mark.
The people who experience it have issues or bottlenecks elsewhere in their computer.

"New engines like unity and ue4 don't support SLI natively"

Unreal is not a new engine, but since you brought it up: I am a developer, and we do implement SLI in our Unreal games that need it (I've never used Unity). It has always been up to developers, and we have a massive incentive to include SLI/CrossFire support, because if we don't, we potentially lose out on the sales from gamers who have multiple cards.

"SLI and crossfire are pretty much at the end of their usefulness."

They are only less useful now because we can hit 4K (the new standard being pushed for 2017) with a single card. When a new standard comes along, 6K or 8K, multi-GPU will rise in popularity again.
Keep this in mind: parallel computing is becoming more popular, not less, and when used properly it provides some great advantages.
 
I'd go with a single 1080 before the Titan for 5 bills more. I mean, people tell people with a 980 Ti not to upgrade to the 1080 because it's just a little faster, yet they tell people it's OK to choose the Titan XP over the 1080?? It's also just a little faster.
 
"The latency difference is less than one frame in the best case so it's completely irrelevant."

Latency is not measured in frames; it is measured in milliseconds. On average, modern SLI is around 5-7 ms faster to deliver each frame.
Granted, that's not much, but add it up over the course of an hour of game time and you lose about 120 seconds.

"What is relevant is the inherent microstutter"

The microstutter is so small you would barely ever notice it. As the FCAT charts show, only a few frames in all the examples spiked up toward the 25 ms mark.
The people who experience it have issues or bottlenecks elsewhere in their computer.

"New engines like unity and ue4 don't support SLI natively"

Unreal is not a new engine, but since you brought it up: I am a developer, and we do implement SLI in our Unreal games that need it (I've never used Unity). It has always been up to developers, and we have a massive incentive to include SLI/CrossFire support, because if we don't, we potentially lose out on the sales from gamers who have multiple cards.

"SLI and crossfire are pretty much at the end of their usefulness."

They are only less useful now because we can hit 4K (the new standard being pushed for 2017) with a single card. When a new standard comes along, 6K or 8K, multi-GPU will rise in popularity again.
Keep this in mind: parallel computing is becoming more popular, not less, and when used properly it provides some great advantages.

When I say one frame I'm referring to this: FCAT GeForce GTX 1080 Framepacing review

Frametime: basically, the time it takes to render one frame can be monitored and tagged with a number; this is latency. One frame can take, say, 17 ms.
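To make that concrete, here's a small sketch relating a frametime capture to average fps and microstutter spikes. The values are invented, merely in the spirit of an FCAT trace:

```python
# Frametime = milliseconds spent on one frame; average fps is the
# reciprocal of the average frametime. Microstutter shows up as
# sudden frame-to-frame jumps rather than a low average.
frametimes_ms = [16.7, 17.1, 16.5, 24.9, 16.8, 16.6]  # hypothetical capture

avg_ms = sum(frametimes_ms) / len(frametimes_ms)
print(f"average: {avg_ms:.1f} ms -> {1000 / avg_ms:.0f} fps")

# Flag frames that took noticeably longer than the one before them,
# the kind of spike an FCAT chart makes visible.
spikes = [t for prev, t in zip(frametimes_ms, frametimes_ms[1:]) if t - prev > 5]
print("spike frames:", spikes)  # -> [24.9]
```

A decent average fps can hide exactly this kind of spike, which is why FCAT-style frametime plots matter more than fps counters in the SLI debate.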

Sorry, but I don't buy the excuse of developers being motivated to add SLI for financial reasons. SLI is such a tiny segment of the overall market that most devs wouldn't go the extra mile unless Nvidia sponsors it.
 
Last edited:
I'd go with a single 1080 before the Titan for 5 bills more. I mean, people tell people with a 980 Ti not to upgrade to the 1080 because it's just a little faster, yet they tell people it's OK to choose the Titan XP over the 1080?? It's also just a little faster.

Depends on which 1080 you're talking about; some of the better ones cost around $800. It also depends on what you need it for and how you use it. I use a 1440p display at 144 Hz, and the 1080 didn't deliver high enough framerates in some games to hit the 120 fps mark consistently. If someone is playing at 1080p, then yeah, a Titan X would be a waste. Those with 4K 60 Hz displays will also find it necessary over a 1080.
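For context on those targets, the per-frame rendering budget is just the reciprocal of the fps goal. A quick sketch:

```python
# To sustain a given framerate, every frame must be rendered within
# 1000/fps milliseconds; missing that budget drops you below target.
def frame_budget_ms(target_fps: float) -> float:
    return 1000.0 / target_fps

for fps in (60, 120, 144):
    print(f"{fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")
```

At 144 Hz there's less than 7 ms to render each frame, which is why a card that's comfortable at 60 Hz can fall short on a high-refresh 1440p panel.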
 
When I say one frame I'm referring to this: FCAT GeForce GTX 1080 Framepacing review



Sorry, but I don't buy the excuse of developers being motivated to add SLI for financial reasons. SLI is such a tiny segment of the overall market that most devs wouldn't go the extra mile unless Nvidia sponsors it.

The name of the game is reaching as wide a market as possible: the more sales, the more money. If the profit from multi-GPU owners outweighs the cost of implementation (which in our case it does, if the game is demanding), SLI/CrossFire will be implemented.
The same goes for any feature. A lot of thought goes into things like ratings; ideally you don't want a Mature-rated game, because it limits you to the over-17 crowd (not that it stops young kids from getting their hands on GTA anyway, lol). Or the cost/benefit of multiplayer, etc.

"add it unless Nvidia sponsors it."

And oftentimes you can get them (or AMD) to sponsor it, making the costs minimal. At the very least, you can pay them a percentage of revenue only after the game releases, etc.
It's generally not monetary costs that are the concern when implementing SLI/CrossFire, but time costs. Managers are always pushing deadlines, and not every feature gets in at the end, but trust me, SLI/CrossFire is close to the top of the pile IF the game needs it.

"When I say one frame I'm referring to this: FCAT GeForce GTX 1080 Framepacing review"

What you posted is a single 1080 vs. a single Fury.
 
Depends on which 1080 you're talking about; some of the better ones cost around $800. It also depends on what you need it for and how you use it. I use a 1440p display at 144 Hz, and the 1080 didn't deliver high enough framerates in some games to hit the 120 fps mark consistently. If someone is playing at 1080p, then yeah, a Titan X would be a waste. Those with 4K 60 Hz displays will also find it necessary over a 1080.

I too have a 1440p monitor that will run at 164 Hz. I didn't feel a single Titan XP would get me 144+ fps, so 1080 SLI was the no-brainer at 2 bills more. My fast-twitch online gaming uses SLI, so it was an easy choice. So yeah, monitor resolution and gaming type should play a role.
 
I too have a 1440p monitor that will run at 164 Hz. I didn't feel a single Titan XP would get me 144+ fps, so 1080 SLI was the no-brainer at 2 bills more. My fast-twitch online gaming uses SLI, so it was an easy choice. So yeah, monitor resolution and gaming type should play a role.

And this is why I tire of mGPU. You've got to stick to the proven SLI games or it runs like ass, so you have to disable it in those titles. I run a higher megapixel load than you and can make use of 144 Hz panels. At some point I got over my fondness for SLI and got real, sick of only being able to use a third of the GPUs I was running 90% of the time. With a 360-game-deep Steam library, maybe a handful truly have good SLI support. What, five? What kind of odds is that?
 