Name games that don't work with Multi-GPU

Archaea

[H]F Junkie
Joined
Oct 19, 2004
Messages
10,085
Name games that don't work with multi-GPU. Specifically, buggy ones, where you have to disable SLI or Crossfire in order to play the game without glitches, or ones with random other issues that cause frustration. I'm trying to figure out where all the multi-GPU negativity stems from. Is it all old frustrations?


I'd like recent examples:
Let's put a timestamp on this little inquiry: specifically, the last year. I don't care what happened with your multi-GPU setup 5 years ago. How is that at all relevant today, with different drivers, different cards, different DX, OpenGL, Vulkan, etc.?


Please post in the appropriate child thread:

Two child threads:

Name games that don't work in AMD Crossfire
https://hardforum.com/threads/name-games-that-dont-work-with-crossfire.1930286/

Name games that don't work in Nvidia SLI
https://hardforum.com/threads/name-games-that-dont-work-with-sli.1930282/
 
Last edited:

chenw

2[H]4U
Joined
Oct 26, 2014
Messages
3,977
A game supporting SLI or not is one question.

A game BENEFITTING from SLI or not is a completely different question. A game might support SLI out of the box, but the benefit will still be limited if it runs at 50 fps on a single GPU with an engine-built-in 60 fps limit.

Both affect how valuable SLI is for a particular game.

Hence, the higher the resolution, the higher the benefit of SLI. If you consistently play at 4K, you probably don't have to worry TOO much about the latter. At resolutions below that, with current GPUs, SLI becomes much more questionable.
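The fps-cap point can be sketched with some hypothetical numbers (the fps figures and scaling factors below are illustrative, not measurements from any review):

```python
def sli_benefit(single_gpu_fps, scaling, engine_cap=None):
    """Extra fps gained by adding a second GPU.

    scaling: fraction of a second GPU's throughput actually realized
             (1.0 = perfect scaling; real games are usually well below).
    engine_cap: optional engine-enforced fps limit.
    """
    sli_fps = single_gpu_fps * (1 + scaling)
    if engine_cap is not None:
        sli_fps = min(sli_fps, engine_cap)
    return sli_fps - single_gpu_fps

# 50 fps on one GPU with a built-in 60 fps cap: even perfect
# scaling only ever buys you 10 fps.
print(sli_benefit(50, 1.0, engine_cap=60))  # 10.0

# At 4K the single-GPU baseline is lower and the cap never kicks in,
# so the second card has room to pay off.
print(sli_benefit(30, 0.8))  # 24.0
```

The same formula shows why an uncapped 4K workload is where SLI looks best: the headroom between the single-GPU baseline and any engine limit is largest there.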
 

chenw

2[H]4U
Joined
Oct 26, 2014
Messages
3,977
Occasional zero scaling, negative or below 50% scaling seems pretty common.

http://www.babeltechreviews.com/gtx-1080-ti-sli-performance-25-games/
Inconsistency is the main theme here...

FO4, for example, seems to scale below 50% at 1080p and 4K, but somehow manages 100% at 1440p.

My statement came from here:

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_SLI/20.html

TPU had the 1080 scaling 33% on average at 1440p (45% if you remove the non-scaling games), less than that at 1080p, but at 4K it's still respectable.

I guess the inconsistent result further adds to the mGPU disdain in general.
 

cybereality

Supreme [H]ardness
Joined
Mar 22, 2008
Messages
5,902
I remember reading that review, but I wish they had shown the percentage result for negative scaling too. In some cases it is bad (look at Siege).

I have SLI on one machine and found I left it off for daily use, as it was more of a hassle to figure out which games worked and which did not. Of course, if I were playing just one game that needed it, I would enable it.
 

Shintai

Supreme [H]ardness
Joined
Jul 1, 2016
Messages
5,691
Why name games that don't support it, when that accounts for 99%+ of all games? How many games outside the AAA titles support it?

Subnautica, Conan Exiles, etc. Pretty much everything using the Unity engine, for example. And since indie games tend to sell more these days than AAA games, you get the rest.
 

Nebell

[H]ard|Gawd
Joined
Jul 20, 2015
Messages
1,769
Ubisoft games have shit multi-GPU support. I've experienced this with The Division, and it's also shown in the review. I was interested in Wildlands, but look at that performance gain... what a joke. Thanks but no thanks. If games won't support multi-GPU at high resolution, I won't buy them.
 
Joined
Apr 1, 2008
Messages
2,621
Back when I had a pair of GTX 1080s in SLI, Doom never worked in SLI mode. No matter what settings I tried, it never worked; with the default settings the game would run like a slideshow.

Homefront: The Revolution had poor SLI scaling at 4K as well: both GPUs would run around 60% usage, with only a small performance increase. I had FPS issues with Rise of the Tomb Raider, as GPU usage on both cards fluctuated constantly, giving me inconsistent framerates. Same issue with Far Cry 4.

Titanfall 2 doesn't work with SLI enabled. You can import the SLI profile from the first Titanfall via NVIDIA Inspector (or whatever it was called), but both GPUs would sit at half usage, which pretty much gives you the performance of a single card... don't bother. Not to mention I ran into crashing issues as well.

I'm done with SLI. I simply gave up after a month of poor support, sold both my GPUs, and waited for a decent card to arrive. I'm now using a Gigabyte GTX 1080 Ti Aorus and couldn't be happier.
 
Last edited:

macksomerville

[H]ard|Gawd
Joined
May 18, 2000
Messages
1,976
Back when I had a pair of GTX 1080s in SLI, Doom never worked in SLI mode. No matter what settings I tried, it never worked; with the default settings the game would run like a slideshow.

Homefront: The Revolution had poor SLI scaling at 4K as well: both GPUs would run around 60% usage, with only a small performance increase. I had FPS issues with Rise of the Tomb Raider, as GPU usage on both cards fluctuated constantly, giving me inconsistent framerates. Same issue with Far Cry 4.

Titanfall 2 doesn't work with SLI enabled. You can import the SLI profile from the first Titanfall via NVIDIA Inspector (or whatever it was called), but both GPUs would sit at half usage, which pretty much gives you the performance of a single card... don't bother. Not to mention I ran into crashing issues as well.

I'm done with SLI. I simply gave up after a month of poor support, sold both my GPUs, and waited for a decent card to arrive. I'm now using a Gigabyte GTX 1080 Ti Aorus and couldn't be happier.

Shit, nVidia gave up on SLI too... You're not the only one tired of the BS ;)
 

noobferguson

Limp Gawd
Joined
Oct 6, 2013
Messages
463
Damn, and here I was thinking of sinking some money into 1070 SLI (I rarely had problems with multi-GPU in the games I played). If this trend continues, it's just Nvidia getting away with doing less work for the same amount of money we give them.
 

Shintai

Supreme [H]ardness
Joined
Jul 1, 2016
Messages
5,691
Damn, and here I was thinking of sinking some money into 1070 SLI (I rarely had problems with multi-GPU in the games I played). If this trend continues, it's just Nvidia getting away with doing less work for the same amount of money we give them.
With DX12/Vulkan, CF/SLI support is 100% on the game developer, not the IHV. With DX11 it requires work from both parties, but it is a lot easier to do.
 
Joined
May 20, 2016
Messages
712
With DX12/Vulkan, CF/SLI support is 100% on the game developer, not the IHV. With DX11 it requires work from both parties, but it is a lot easier to do.
Also, the gaming ecosystem is getting seriously fractured compared to, say, 4-5 years ago, when pretty much 99.9% of games were DirectX. Nowadays games are released across multiple platforms, multiple DRM schemes, and multiple APIs. Trying to deal with that clusterf--- while releasing multi-GPU drivers seems like a waste of time. And with Nvidia so far ahead of the performance curve this console generation (the lowest common denominator for game development), single-card performance has gotten so good that SLI is far less important than it used to be.

I remember back in 2007 when you could drop $830 on a single GPU like the 8800 Ultra and look forward to 20 fps in Crysis and 30 fps in TES: Oblivion at 1080p. Single-card performance has been increasing multiplicatively for several generations now. Multi-GPU is probably, more and more, going to be the realm of professional work rather than gaming.
 
Joined
May 20, 2016
Messages
712
Too bad the table doesn't actually show negative scaling; it just shows everything as 0, which is quite misleading imo.
There are three columns in that table, two of which are the raw data, so you are free to peruse them and make your own calculation. Also, the third column specifically says "% increase", not "% change". Negative scaling IS 0 percent increase. It's not misleading at all.
 

M76

[H]F Junkie
Joined
Jun 12, 2012
Messages
11,134
There are three columns in that table, two of which are the raw data, so you are free to peruse them and make your own calculation. Also, the third column specifically says "% increase", not "% change". Negative scaling IS 0 percent increase. It's not misleading at all.
How do you think I noticed that there is huge negative scaling in many games, if not by reading the rest of the columns?

And technically, a change for the worse is a negative increase, not a zero increase. Even if we accept that -50% equals 0% increase (which is mathematically incorrect), I'd still say the table is misleading, because it hides the seriousness of the issue in the fine print. Most people will just glance at the highlighted column and say it's all fine and dandy, when obviously it is not.
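The "% increase" vs. "% change" distinction being argued here can be made concrete with a small sketch (the fps numbers are made up, not taken from either review):

```python
def pct_change(single_fps, sli_fps):
    """Signed percent change going from one GPU to two."""
    return (sli_fps - single_fps) / single_fps * 100

def pct_increase(single_fps, sli_fps):
    """'% increase' as the table reports it: negatives clamped to 0."""
    return max(0.0, pct_change(single_fps, sli_fps))

# A game that regresses under SLI: 80 fps single, 60 fps in SLI.
print(pct_change(80, 60))    # -25.0  <- the real severity
print(pct_increase(80, 60))  # 0.0    <- what the highlighted column shows
```

Both sides are describing the same clamp: the table's column is well defined, but it maps a -25% regression and a 0% no-op to the same cell.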
 

chenw

2[H]4U
Joined
Oct 26, 2014
Messages
3,977
Well, it makes sense if you treat % gain as "% gain of the best setup vs. a single-GPU solution": above 0%, SLI works better than a single GPU; at 0% or less, you wouldn't use SLI at all.

Negative scaling would just reinforce the decision to switch SLI off for that game.
 

cybereality

Supreme [H]ardness
Joined
Mar 22, 2008
Messages
5,902
Yeah, that makes sense, in that you could just disable SLI for negative-scaling games. But this is one of the reasons SLI is annoying: you have to figure out which games support it and constantly turn it off and on.
 

bman212121

[H]ard|Gawd
Joined
Aug 18, 2011
Messages
1,645
So out of the table that pippenainteasy posted, it looks like there are 3 titles I would consider getting an advantage from SLI at 4K: Fallout 4, Assassin's Creed Syndicate, and Crysis 3 are all able to break the 60 fps barrier at 4K resolution. Outside of that, it doesn't really make enough of a difference: even in games that do scale well, it's not getting you performance that will make or break a title. At 1440p, if you were running an adaptive-sync monitor, I could see a few more titles benefitting from it, which might be enough to justify the price. I think it'd be a hard sell for 4K; maybe there are enough titles you'd be playing at 1440p where it would provide a benefit.
 

noko

Supreme [H]ardness
Joined
Apr 14, 2010
Messages
5,190
Yeah, that makes sense, in that you could just disable SLI for negative-scaling games. But this is one of the reasons SLI is annoying: you have to figure out which games support it and constantly turn it off and on.
In the driver, in the game profile, turn it off; it will then be automatic. In Doom under Vulkan, a 1070 with everything maxed out runs consistently at 60+ fps for the most part at 3440x1440. So for games that naturally have high fps (a.k.a. indie games), not having SLI makes no difference to me. It's the games that really need SLI to perform halfway decently at higher settings that matter. DX12 and two 1070s run great in Rise of the Tomb Raider, for example: that gives you better-than-Titan X (Pascal) gameplay. Personally, I have mixed feelings about SLI'd 1070s in the end.
 

KazeoHin

Supreme [H]ardness
Joined
Sep 7, 2011
Messages
8,084
Nvidia wants to kill SLI. I've talked about why this is, and how it's actually much more beneficial to them to get rid of it than to promote it.

Long story short: when your biggest competitor is yourself from last year, you don't want to give the customer options to buy from your competition and get similar performance.
 

Shintai

Supreme [H]ardness
Joined
Jul 1, 2016
Messages
5,691
Nvidia wants to kill SLI. I've talked about why this is, and how it's actually much more beneficial to them to get rid of it than to promote it.

Long story short: when your biggest competitor is yourself from last year, you don't want to give the customer options to buy from your competition and get similar performance.
Nvidia doesn't want to kill SLI. It's just a reality of the future, due to developers. If you think an IHV wants to sell fewer cards on purpose... oh boy.

SLI/CF was invented by the IHVs for that reason. If you say Nvidia wants to kill SLI, then AMD wants to kill CF just as well.
 

KazeoHin

Supreme [H]ardness
Joined
Sep 7, 2011
Messages
8,084
Nvidia doesn't want to kill SLI. Its just a reality of the future due to developers. If you think an IHV wants to sell less cards on purpose....oh boy.

SLI/CF was invented by the IHVs for that reason. If you say NVidia want to kill SLI, then AMD wants to kill CF just as well.
Adobe, Autodesk, Microsoft and many other major companies (even outside of computing) are pushing towards smaller monthly/periodic payments instead of a big single purchase. This flies in the face of your argument: why would these massive companies want to take a smaller amount of money over the big purchase? Easy: a steady flow of cash is better than sporadic cash dumps. Nvidia would rather have a customer buy a $200 card every year than buy a $600 card every 3 years, given the exact choice between the two options.

Let's break down the reasons people would buy mGPU over upgrading/buying a new, beefier GPU:

1: You buy two top-end cards because you want the best

2: You buy two mid-range cards because you think you're beating the system and getting top-tier performance for less money

3: You have an older card and want to grab another to get your PC up to spec to keep performance up.


These are the only real scenarios I can think of. Let's break down the results:

1. This is a great initial source of quick cash for Nvidia. However, the number of people with the disposable income to take advantage of this scenario is small, a drop in the ocean compared to the average consumer. And many people who do go this route (not all, or even most) are dropping this serious money to future-proof their system long-term. Future-proofing is a bad thing for Nvidia, because it means the customer doesn't want to purchase a video card for a while; hence the term 'future-proofing'. That means their purchases are more sporadic, less constant, less reliable. In other words, scenario 1 has little benefit to Nvidia, though a tiny net positive to be sure.

2. This is bad for Nvidia, plain and simple. They make more margin off of one 1080 than they do off of two 1060s (less now because of recent price cuts, but still a bit). In a world where SLI scales perfectly, two 1060s would be as fast as or even faster than a single 1080; however, in the world that Nvidia is helping to build, SLI is broken and unreliable. This means people are more likely to go for the higher-margin big boys. In reality, if Nvidia wanted to kill this scenario totally, they would just completely disable SLI as a feature on mid-range cards... OH WAIT. This scenario is a net loss for Nvidia.

3. This is bad, BAD, SUPER BAD news for Nvidia. This is the opposite of good. Imagine a world with perfect scaling: someone with a 780 could easily grab another second-hand 780 on eBay and get 1070-like performance without handing Nvidia a damn dime. Nvidia sees NO money from this. One could argue that whoever sold that 780 obviously bought a new card, so money still goes to Nvidia, but that is ONE scenario of many: some people just stop gaming and sell their stuff to make a buck; some would have bought a competitor's product and sold off their old card. In other words, $100 on a second-hand 780 is NOT $100 in Nvidia's pocket. Nvidia would much rather you pony up and buy that 1070. This scenario is a HUGE net loss for Nvidia.


As you can see, the negatives outweigh the positives massively. Nvidia has little to gain, and so much to lose.


And to your comment that "Well, AMD does it too!": yeah, I agree. AMD would probably HATE the idea of mGPU for the exact same reasons Nvidia does, except ONE: advertising. "WE SUPPORT MGPU!" AMD likes being able to say they support it for the sake of being "Good Guy AMD" and winning over fanboys. Hell, CFX support is used as a bullet point in 1060-versus-480 debates online. AMD probably hates it just as much as Nvidia, but they can use their 'support' (I use that term nearly sarcastically) as marketing leverage over Nvidia: that is their only incentive to support it.

Also, claiming this is just 'a reality of developers' is a circular argument. Nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody ......



Do you know how much money Nvidia throws at developers to integrate Gameworks features? PhysX features? Do you know how much aid Nvidia gives developers in optimizing their products and timing driver releases? Nvidia hands out fistfuls of cash to get these features integrated into as many games as they can. Why is this relevant? Because if Nvidia wanted SLI to be optimized and accessible, it would be. They would make it happen. Nvidia is letting SLI die because, as I stated above, it has a net negative impact on their bottom line.
 
Last edited:

Shintai

Supreme [H]ardness
Joined
Jul 1, 2016
Messages
5,691
Adobe, Autodesk, Microsoft and many other major companies (even outside of computing) are pushing towards smaller monthly/periodic payments instead of a big single purchase. This flies in the face of your argument: why would these massive companies want to take a smaller amount of money over the big purchase? Easy: a steady flow of cash is better than sporadic cash dumps. Nvidia would rather have a customer buy a $200 card every year than buy a $600 card every 3 years, given the exact choice between the two options.

Let's break down the reasons people would buy mGPU over upgrading/buying a new, beefier GPU:

1: You buy two top-end cards because you want the best

2: You buy two mid-range cards because you think you're beating the system and getting top-tier performance for less money

3: You have an older card and want to grab another to get your PC up to spec to keep performance up.


These are the only real scenarios I can think of. Let's break down the results:

1. This is a great initial source of quick cash for Nvidia. However, the number of people with the disposable income to take advantage of this scenario is small, a drop in the ocean compared to the average consumer. And many people who do go this route (not all, or even most) are dropping this serious money to future-proof their system long-term. Future-proofing is a bad thing for Nvidia, because it means the customer doesn't want to purchase a video card for a while; hence the term 'future-proofing'. That means their purchases are more sporadic, less constant, less reliable. In other words, scenario 1 has little benefit to Nvidia, though a tiny net positive to be sure.

2. This is bad for Nvidia, plain and simple. They make more margin off of one 1080 than they do off of two 1060s (less now because of recent price cuts, but still a bit). In a world where SLI scales perfectly, two 1060s would be as fast as or even faster than a single 1080; however, in the world that Nvidia is helping to build, SLI is broken and unreliable. This means people are more likely to go for the higher-margin big boys. In reality, if Nvidia wanted to kill this scenario totally, they would just completely disable SLI as a feature on mid-range cards... OH WAIT. This scenario is a net loss for Nvidia.

3. This is bad, BAD, SUPER BAD news for Nvidia. This is the opposite of good. Imagine a world with perfect scaling: someone with a 780 could easily grab another second-hand 780 on eBay and get 1070-like performance without handing Nvidia a damn dime. Nvidia sees NO money from this. One could argue that whoever sold that 780 obviously bought a new card, so money still goes to Nvidia, but that is ONE scenario of many: some people just stop gaming and sell their stuff to make a buck; some would have bought a competitor's product and sold off their old card. In other words, $100 on a second-hand 780 is NOT $100 in Nvidia's pocket. Nvidia would much rather you pony up and buy that 1070. This scenario is a HUGE net loss for Nvidia.


As you can see, the negatives outweigh the positives massively. Nvidia has little to gain, and so much to lose.


And to your comment about "Well AMD does it too!" Yeah, I agree: AMD would probably HATE the idea of mGPU for the exact same reasons Nvidia does except ONE: advertising. "WE SUPPORT MGPU!" AMD likes being able to say they support it for the sake of being "Good Guy AMD" and winning over fanboys. Hell, CFX support is used as a bullet point on 1060 versus 480 debates online. AMD probably hates it just as much as Nvidia, but they can use their 'support' (I use that term nearly sarcastically) as marketing leverage over Nvidia: that is their only incentive to support it.

Also, claiming this is just 'a reality of developers' is a circular argument. Nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody ......



Do you know how much money Nvidia throws at developers to integrate Gameworks features? PhysX features? Do you know how much aid Nvidia gives developers in optimizing their products and timing driver releases? Nvidia hands out fistfuls of cash to get these features integrated into as many games as they can. Why is this relevant? Because if Nvidia wanted SLI to be optimized and accessible, it would be. They would make it happen. Nvidia is letting SLI die because, as I stated above, it has a net negative impact on their bottom line.
Your statement doesn't make sense. Equally, you could say Nvidia loses money from the cards they don't sell for SLI.

And every statement you make about Nvidia applies to AMD as well.

The reason is simply that mGPU has always been a tiny niche, and developers, especially in the DX12/Vulkan era, don't like to pay for it, since they sit on the entire implementation. That's why mGPU only exists when Nvidia and AMD pay the bill.

People don't really pick up old cards to SLI with, and for good reason, support or not. It's rarely worth it or economical compared to a faster card with new features and support. Either you SLI/CF with flagship products or you don't.

And that you even use software companies with an entirely different business model as a comparison is beyond me.

I get it: you hate the death of mGPU, and it's easier to blame a single entity than the entire software ecosystem that is letting it go.
 
Last edited:

chenw

2[H]4U
Joined
Oct 26, 2014
Messages
3,977
The 1060 is a bad example; it doesn't support SLI, period.

The "SLI two mid-range cards to beat a top-performing card for less" argument died along with Kepler; that was the last generation where the performance difference between the top 3 cards was small enough for it to be feasible. The 960 was the first GPU for which it was not feasible, and it still isn't now, even if the 1060 could have been SLI'ed.

Either nVidia deliberately stopped people from using two x60s to reach single-x80 performance, or the chips are now good enough that the gap between x80 and x60 is too big for x60 SLI to ever hope to close.

EDIT: there is also one issue with nVidia SLI that AMD's Crossfire currently does not suffer from: its dependence on the Z/X chipsets.

SLI deliberately requires your GPUs to run, at minimum, x8 lanes on every GPU participating in the SLI, which effectively removes every non-Z/X chipset from consideration: i.e., your average Joe and Jane. Essentially, unlike Crossfire, which is more 'friendly' to spur-of-the-moment decisions, SLI is not, meaning that those going SLI must have planned for it from the very beginning. That alone shrinks its market dramatically.

Conversely, it doesn't help matters that the latest generation of AMD GPUs is more power hungry than nVidia's, and thus more likely to require a PSU upgrade, and the PSU is often the most overlooked part of your average Joe/Jane's computer.
 

KazeoHin

Supreme [H]ardness
Joined
Sep 7, 2011
Messages
8,084
Your statement doesn't make sense. Equally, you could say Nvidia loses money from the cards they don't sell for SLI.

And every statement you make about Nvidia applies to AMD as well.
Sorry, I addressed this. I also never mentioned AMD in my original post.

The reason is simply that mGPU has always been a tiny niche, and developers, especially in the DX12/Vulkan era, don't like to pay for it, since they sit on the entire implementation.
See my Gameworks/PhysX argument; your point is irrelevant.

People don't really pick up old cards to SLI with, and for good reason, support or not.
Please explain. Try to explain without using "poor scaling" or "poor support" as justification for paying $400+ rather than $100 for similar enough performance.

And that you even use software companies with an entirely different business model as compare is beyond me.
How? Please explain. "Different business models" is irrelevant, because years ago the business models weren't that different. Software used to be one massive purchase every couple of years; now, one by one, everyone is shifting to periodic payments. Name one way Nvidia (or AMD, as you seem to think I'm leaving them out of this) can keep as many customers buying cards as regularly as possible, with the supposed death of Moore's law making generational gaps smaller and smaller... I'll wait while you figure that one out.
 

Shintai

Supreme [H]ardness
Joined
Jul 1, 2016
Messages
5,691
See my Gameworks/PhysX argument; your point is irrelevant.
Gameworks/PhysX/TressFX/etc. apply to all buyers, while SLI/CF applies to sub-1%.

Please explain. Try to explain without using "poor scaling" or "poor support" as justification for paying $400+ rather than $100 for similar enough performance.
Show me an example of paying $100 to get $400+ performance.

How? Please explain. "Different business models" is irrelevant, because years ago the business models weren't that different. Software used to be one massive purchase every couple of years; now, one by one, everyone is shifting to periodic payments. Name one way Nvidia (or AMD, as you seem to think I'm leaving them out of this) can keep as many customers buying cards as regularly as possible, with the supposed death of Moore's law making generational gaps smaller and smaller... I'll wait while you figure that one out.
I don't think you understand how these companies worked in the first place. MS, for example, has been "leasing" out software for over 10 years.

If anything, what you want to compare with is AMD's and Nvidia's attempts to stream games for a monthly payment.
 

KazeoHin

Supreme [H]ardness
Joined
Sep 7, 2011
Messages
8,084
Gameworks/PhysX/TressFX/etc. apply to all buyers, while SLI/CF applies to sub-1%.
Nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody ......

Chicken/egg. As I said, if Nvidia wanted SLI to be mainstream, it would be. Nvidia STILL allows and supports people using dedicated PhysX cards. How niche of a market is that?


Show me an example of paying $100 to get $400+ performance.
You didn't answer my question; I'll take that as an admission that you can't answer it without conceding my point.

However, it does not take a lot of effort to see that cheap second-hand cards are available. Some sell for $100, some closer to $150. Does that extra $50 make my original argument irrelevant? If so, please explain.

I don't think you understand how these companies worked in the first place. MS, for example, has been "leasing" out software for over 10 years.
Office 365 is a relatively new thing. Same with Adobe Creative Cloud. These are companies that previously sold very expensive software as one-time purchases (much like hardware). Please explain what aspect of their business I don't understand.
 

Shintai

Supreme [H]ardness
Joined
Jul 1, 2016
Messages
5,691
Nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody supports SLI because nobody uses SLI because nobody ......

Chicken/Egg. As I said, if Nvidia wanted SLI to be mainstream, it would be. Nvidia STILL allows and supports people to use dedicated PhysX cards. How niche of a market is that?
You are trying to defend a concept that, at best, only works with centralized payment from those earning the money. A developer gets nothing from adding SLI/CF support; you don't pay extra or buy more copies of the game. And this is the reason SLI and CF are dead. You don't work pro bono either, do you?

If SLI/CF were a DLC, or required two game copies, maybe we could talk about it.

You didn't answer my question; I'll take that as an admission that you can't answer it without conceding my point.

However, it does not take a lot of effort to see that cheap second-hand cards are available. Some sell for $100, some closer to $150. Does that extra $50 make my original argument irrelevant? If so, please explain.
Why not just buy a single 980 Ti then, for barely more? And omg, it's outcompeting the 1070, right? You forgot that angle, didn't you? A 980 Ti is even cheaper than a 1060. There goes your entire case!

Office 365 is a relatively new thing. Same with Adobe Creative Cloud. These are companies that previously sold very expensive software as one-time purchases (much like hardware). Please explain what aspect of their business I don't understand.
And the equivalent of that is monthly subscriptions to stream games, not SLI or CF.

While Office 365 is new, the concept isn't.
 
Last edited:

KazeoHin

Supreme [H]ardness
Joined
Sep 7, 2011
Messages
8,084
You are trying to defend a concept that, at best, only works with centralized payment from those earning the money. A developer gets nothing from adding SLI/CF support; you don't pay extra or buy more copies of the game. And this is the reason SLI and CF are dead. You don't work pro bono either, do you?
Adding Gameworks or PhysX to a game does not merit more copies sold either; please try again.



Is GTX 780 SLI equal to a GTX 1070? I think not. Not to mention all the other issues. So you simply made up a fairy tale that didn't hold.
Reading comprehension is a wonderful thing. Please read my previous posts and come back to this one. For the sake of time, I'll help you out:

2x 780s would be roughly the same as a 980 Ti/1070 in raw FLOP performance; in a world where SLI scales near perfectly, grabbing a second 780 would be the no-brainer option. This was my original point. I have asked you to justify buying the 1070 over grabbing a second 780 (in a situation wherein you already have one 780) in this world of perfect scaling. Highly hypothetical, I know, but the real world shows SLI support as quite bad, dying even. In this real world we live in, grabbing two 780s would be a fool's errand. Quick question: do you think Nvidia is happy about that, or sad? In other words, yes: SLI 780s would suck because the support isn't there, proving my point that Nvidia has nothing to gain by supporting SLI.
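The raw-FLOP claim can be roughed out from approximate board specs (the core counts and boost clocks below are ballpark figures, and FP32 TFLOPS is a crude proxy for game performance, not a benchmark):

```python
def fp32_tflops(cuda_cores, boost_ghz):
    # 2 FLOPs per core per cycle (fused multiply-add).
    return cuda_cores * 2 * boost_ghz / 1000

gtx_780 = fp32_tflops(2304, 0.90)    # ~4.1 TFLOPS
gtx_1070 = fp32_tflops(1920, 1.68)   # ~6.5 TFLOPS

# Two 780s at a realistic ~80% SLI scaling land in 1070 territory;
# on paper, perfect scaling would even exceed it.
print(gtx_780 * 1.8)
print(gtx_1070)
```

Which is the crux of both sides: the paper math favors the second-hand card, and only real-world scaling and support decide whether it ever materializes.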

You still haven't answered my question.



And the equivalent of that is monthly subscriptions to stream games. Not SLI or CF.

While Office 365 is new, the concept isn't.
Please explain the first line. Just because one product is software and one is hardware does not make the business any different. One-time purchase software can be moved from one system to another, just like hardware. One-time purchase software competes with the newer versions in pesky ways, just like hardware. Now, you can't get people to 'rent' their video cards, so what could you do to make sure they buy each new generation?

Subscription-based software is not new in the enterprise market. It is relatively new in the consumer market. Interestingly enough, Nvidia actually has a GeForce rental service in the enterprise market, showing once again that business practices across the two industries can be parallel.
 

Shintai

Supreme [H]ardness
Joined
Jul 1, 2016
Messages
5,691
Adding Gameworks or PhysX to a game does not merit more copies sold either; please try again.
It sells more cards, and the tools from Nvidia make it easier for the developer. See the different case?


Reading comprehension is a wonderful thing. Please read my previous posts and come back to this one. For the sake of time, I'll help you out:

2x 780s would be roughly the same as a 980 Ti / 1070 in raw FLOP performance. In a world where SLI scales near perfectly, grabbing a second 780 would be the no-brainer option. This was my original point. I have asked you to justify buying the 1070 over grabbing a second 780 (in a situation wherein you already have one 780) in this world of perfect scaling. Highly hypothetical, I know, but the real world shows SLI support as quite bad, dying even. In this real world we live in, grabbing two 780s would be a fool's errand. Quick question: do you think Nvidia is happy about that, or sad? In other words, yes: SLI 780s would suck because the support isn't there, proving my point that Nvidia has nothing to gain by trying to support SLI.

You still haven't answered my question.
You forgot that you can buy used GTX980TI cards for $150 or less. Your entire case is flawed. But I assume you missed it because all your focus is on mGPU. And the used GTX980TI doesn't have the flaws of your mGPU setup.

Raw FLOPs don't equal gaming performance either.

Again, for the nth time: Nvidia and AMD like SLI/CF. Developers don't. And with DX12/Vulkan, developers pay the full bill; there is no easy way like with a high-level API. And with high-selling indie titles killing AAA titles, not wasting money or resources is more important than ever. The engines used by indie developers also tend to lack mGPU support even if they wanted it.

You could instead advocate that mGPU should be a DLC for a game, or maybe even require two game copies. You'd be better off getting support that way, because it creates an incentive for the developers.

Please explain the first line. Just because one product is software and one is hardware does not make the business any different. One-time purchase software can be moved from one system to another, just like hardware. One-time purchase software competes with the newer versions in pesky ways, just like hardware. Now, you can't get people to 'rent' their video cards, so what could you do to make sure they buy each new generation?

Subscription-based software is not new in the enterprise market. It is relatively new in the consumer market. Interestingly enough, Nvidia actually has a GeForce rental service in the enterprise market, showing once again that business practices across the two industries can be parallel.
Nvidia and AMD have tried monthly revenue streams by leasing you hardware and games, like O365 etc. SLI/CF has nothing in common with that business case.
 
Last edited:

KazeoHin

Supreme [H]ardness
Joined
Sep 7, 2011
Messages
8,084
It sells more cards, and the tools from Nvidia make it easier for the developer. See the different case?
I'll give you that: the selling-more-cards argument can be made for supporting SLI too. But you have a valid point about some features being easier for devs to integrate. The same could be said if Nvidia bundled SLI support into their Gameworks packages, i.e., it would make supporting SLI easier and less of an investment, yielding a higher return on investment. Why doesn't Nvidia do that?


You forgot that you can buy used GTX980TI cards for $150 or less. Your entire case is flawed. But I assume you missed it because all your focus is on mGPU.

Really!? sign me up!


* this is the fourth time you haven't answered the original question.

Nvidia and AMD have tried monthly revenue streams by leasing you hardware and games, like O365 etc. SLI/CF has nothing in common with that business case.
I know they don't. That's my original point. I'm glad you finally agree with my idea.
 

Shintai

Supreme [H]ardness
Joined
Jul 1, 2016
Messages
5,691
I'll give you that: the selling-more-cards argument can be made for supporting SLI too. But you have a valid point about some features being easier for devs to integrate. The same could be said if Nvidia bundled SLI support into their Gameworks packages, i.e., it would make supporting SLI easier and less of an investment, yielding a higher return on investment. Why doesn't Nvidia do that?
Because it can't be done with mGPU. AMD can't do it either; they send engineers and developers to do the implementations. But it's on a per-game basis and a very expensive method.

Maybe AMD don't want you to CF old cards either... or something? ;)

Really!? sign me up!


* this is the fourth time you haven't answered the original question.
Yep, just check the listing.

What question do you feel is unanswered?

I know they don't. That's my original point. I'm glad you finally agree with my idea.
So your comparison with Office 365 etc. was just plain BS? Good that you admit it.
 
Last edited:

KazeoHin

Supreme [H]ardness
Joined
Sep 7, 2011
Messages
8,084
Because it can't be done with mGPU.
What makes you say this? SLI/CFX is something that can be added to any game engine, the same as Gameworks features. What would stop Nvidia from supporting such a product?

Yep, just check the listing.
Don't really see them... most are $250+

What question do you feel unanswered?
Please explain. Try to explain without using "poor scaling" or "poor support" as justification for paying $100 versus paying $400+ for similar enough performance.
The point being that if Nvidia supported SLI flawlessly (or nearly) it would harm new GPU sales. This is impossible to refute.


So your comparison with Office 365 etc. was just plain BS? Good that you admit it.
Yes, CF/SLI does not fit with Nvidia's move to pushing customers towards more frequent purchases.
 

Shintai

Supreme [H]ardness
Joined
Jul 1, 2016
Messages
5,691
What makes you say this? SLI/CFX is something that can be added to any game engine, the same as Gameworks features. What would stop Nvidia from supporting such a product?
It requires engine support etc. What did you imagine, some type of checkbox flip?

AMD has developer tools as well; why don't they deliver it that way instead of sending out teams to do the implementation? How hard can it be, right?

Welcome to the world of the developers.
https://developer.nvidia.com/explicit-multi-gpu-programming-directx-12

Don't really see them... most are $250+
http://www.ebay.com/itm/Manli-NVIDI...438381?hash=item3616d73c2d:g:4ggAAOSw53NY~PQS
http://www.ebay.com/itm/Asus-GeForc...651093?hash=item25d4d70455:g:VJ0AAOSwN6JY-gj8
http://www.ebay.com/itm/Evga-Gtx-98...804441?hash=item3d39079999:g:0B8AAOSwWiBY~Lm8

The point being that if Nvidia supported SLI flawlessly (or nearly) it would harm new GPU sales. This is impossible to refute.
So you can't prove it. And now you want me to prove otherwise? You have to do better than that.

Yes, CF/SLI does not fit with Nvidia's move to pushing customers towards more frequent purchases.
So in your world, AMD and Nvidia want to kill mGPU to sell more cards? Despite it being created to sell more cards by those same two IHVs?
 

AlexisRO

Limp Gawd
Joined
Feb 27, 2014
Messages
172
Just some of my thoughts on SLI: while I enjoyed it when it worked, and it has a very high cool factor, it never was and never will be flawless. It was never mainstream, although some of us might think it is, with the whole buy-a-second-card-later idea. I see it like this: at a later point you can buy a single card that has at least the performance of the SLI setup you were aiming for (when it works), while not having to deal with SLI issues (power, heat, and in my case added waterblock costs). Not to mention that every time SLI is working well, all it takes is a fart from the developer and you've got issues (for example BF3, BF4, BF1: after some major patches you get flickering). And on this note, moving forward with DX12 and developers having more responsibility for these kinds of things, I don't see it as a good sign for mGPU (SLI/CF). Well, not unless mGPU is integrated into the engine to the level that it takes a checkbox to have it working, preferably auto-checked, that's how lazy some devs are.
 

harmattan

Supreme [H]ardness
Joined
Feb 11, 2008
Messages
4,401
Copying from the prior SLI thread as I think it bears repeating...

This is my general conclusion after 12+ years of dealing with mGPU setups, both SLI and Crossfire: when considering mGPU, you need to ask yourself two primary questions (apart from the usual cost and power/heat aspects):
  • Do you absolutely need the extra horsepower to run your settings/resolution and can no single GPU get you there?
  • Do most of the games you play have good mGPU support?
If either of those questions can be answered as "no", then stick with a single GPU.
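For what it's worth, that two-question rule is mechanical enough to write down. A minimal sketch of the decision logic, where the function and parameter names are mine and purely illustrative:

```python
def single_gpu_is_enough(no_single_gpu_suffices: bool,
                         games_with_good_mgpu: int,
                         games_total: int) -> bool:
    """Apply the two-question mGPU rule above: if either answer
    is 'no', stick with a single GPU. Names are illustrative."""
    # Q1: do you absolutely need the extra horsepower?
    if not no_single_gpu_suffices:
        return True          # a single GPU already gets you there
    # Q2: do most of the games you play have good mGPU support?
    if 2 * games_with_good_mgpu < games_total:
        return True          # most of your library won't benefit
    return False             # mGPU may be worth considering

# e.g. you need the horsepower, but only 3 of your 10 games scale well:
print(single_gpu_is_enough(True, 3, 10))   # True
```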

The issue is, from what I'm seeing, there are more and more games where mGPU support is either bad or non-existent, or doesn't arrive until months after a game's release. I couldn't care less what the reasons are, although I do agree with Kazeo that Nvidia has significant reasons for wanting to kill SLI, mostly in my view because the "cool" factor is fast being outweighed by bad press and support costs. That last Batman game, where Rocksteady basically said "go suck an egg" to mGPU users, was really the canary in the coal mine.

I do fondly look back on a time when I drooled over NVidia's 7800 GTX quad-SLI demos, but now look at mGPU as frustration and wasted money.

I honestly think the best solution -- for whoever is listening, nV or AMD -- would be to put all efforts into supporting mGPU for only the most graphically-intensive titles (API permitting). That way, you reduce support costs significantly and still maintain a banner marketing technology. While I can certainly use mGPU in envelope-pushing titles, having the driver team spin cycles on mGPU for Barbie and Friends isn't doing anyone any favors.
 
Last edited:

elvn

2[H]4U
Joined
May 5, 2006
Messages
3,761
Convince me not to SLi 1080 Ti (thread)

How good is SLI support in newer games?(thread)

Pretty interesting article.

One thing of note: if you run Doom in non-Vulkan (OpenGL) mode, it does support SLI. I am not sure if the person doing that article knew that or not, though.
I'd throw that whole review out. It's obviously biased.

If your performance were a car, it would be more like:
It goes the speed of other cars on 20% of the roads, but on 80% of the roads it goes 25% to 90% faster.

That Babel site refused to run DirectX 11 mode on many of those games which require it for SLI scaling support, and they also refused to use simple, known SLI-enabling workarounds/profiles on others... and they also seemed to have missed some SLI-enabling patches on others. It seems very biased considering this. Google the Titan Pascal SLI and GTX 1080 SLI benchmark sites to get a fairer look.
http://www.techspot.com/review/1195-palit-geforce-gtx-1080-sli/page4.html

"Though we mostly focused on games that support Nvidia’s multi-GPU technology, these also happen to be some of the most popular PC games released as of late. On average SLI enabled 63% more performance -- or 71% if we ignore Ashes of the Singularity. We'd call that a solid score for SLI scaling.

Granted, this may not convince many to spend another $600+ on a second GPU but in a good number of scenarios at 4K, it pushed performance from playable to smooth-as-silk.

Overall, while I'm not generally a big advocate of multi-GPU technology, it does make sense here assuming that you can afford it.
The 4K gaming experience delivered by these cards is the best I have seen yet. For those willing and able to drop four figures on this GPU configuration, you won’t be disappointed with the results."
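As a quick way to sanity-check scaling figures like these (the 63%/71% averages above, or per-game numbers elsewhere in the thread), scaling is just the percentage gain of the SLI result over the single-card result. A throwaway sketch, not taken from any review's methodology:

```python
def sli_scaling(fps_single: float, fps_sli: float) -> float:
    """Percentage fps gain from the second GPU: 100 = perfect
    scaling, 0 = no benefit, negative = SLI actually hurt."""
    return (fps_sli / fps_single - 1.0) * 100.0

# Civilization VI figures quoted later in this post: 54 fps on one
# 780 Ti vs 92 fps in SLI works out to roughly 70% scaling.
print(round(sli_scaling(54, 92)))   # 70
```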

-----------------------------------------------

http://techbuyersguru.com/4k-gaming...1070-sli-vs-1080-sli-vs-titan-x-pascal?page=4

"All told, both Titan X Pascal and modern-day SLI make playable 4K framerates the rule rather than the exception. But alas, TBG's motto is "bringing tech to light," and sometimes that means shining a harsh light on chasing the limits of technology for technology's sake. As we've shown, you can certainly game at 4K with SLI or the Titan X Pascal, and enthusiasts no doubt consider 4K benchmarks to be the ultimate test of a modern gaming PC's mettle. The problem inherent in this approach is that 4K gaming may in fact not be the end-all-and-be-all of modern PC gaming. That's because 4K monitors are currently limited to 60Hz, and throughout our testing, it was painfully obvious that our overall gaming experience on our 27" 4K monitor wasn't nearly as good as it is on the Acer 27" 1440p 144Hz G-Sync monitor we typically test with. Yes, the added resolution is nice when you're sitting still admiring the scenery, but once the action starts up, the tearing, lag, and just plain sluggishness of a 60Hz monitor rears its ugly head."

------------------------------------------------------

Practically every game I play on the GoTY and top pc games of 2016, 2015 and top pc games of all time lists supports SLI once the dust settled on patches and drivers. The only one I miss it on is forza 3 personally.

If you aren't getting at least a 100fps-Hz average (a ~70 - 100 - 130+ fps-Hz band), you aren't getting any appreciable benefit out of a high-Hz monitor. DP 1.4 144Hz 3440x1440 and 3840x2160 monitors are due out later this year.
Pretty meaningless unless you feed it new, unique frames of action anyway.
We are just getting to the point (since Titan Pascal) where, on the most demanding games, a top-end single GPU can do a 100fps-Hz average or so at 2560x1440, sometimes requiring some features to be turned off in the game settings at that. And that average is a sort of vibrating blend, +/- 30fps all over the place from second to second, so half of the mix is 70 to 100 or so depending on the game (see example graph). You don't get really appreciable blur reduction and motion-definition increases until you are at around at least a 100fps-Hz (fps and Hz) average, so much below that you get no appreciable benefit out of a high-Hz monitor even if it were 300Hz. Some old Source games, other simpler games, and some isometrics can get well over 200fps average, though, of course.
Both Unreal Engine and the work on Final Fantasy XV are implementing SLI support, and I'm assuming the sequel to Shadow of Mordor will too... so there should be a lot of titles going forward.

With Unreal Engine going for HDR and SLI support, there should be a lot of games in the future that support SLI on that engine alone.

Also
http://www.tweaktown.com/news/56156/final-fantasy-xv-pc-tech-test-uses-dual-gtx-1080s-sli/index.html


Another SLI game list from top games (incomplete list)

The Witcher 2, 3
Dishonored 1, 2
Half-Life games
Portal 2
Orange Box games
Middle Earth: Shadow of Mordor (assuming the sequel "Shadow of War" will too)
The Elder Scrolls V:Skyrim
Borderlands 2 games
Mass Effect games
Fallout Games
Far Cry Games
Metro Games
S.T.A.L.K.E.R games
Overwatch
DOOM 2016
Alien Isolation

Path of Exile
World of Warcraft
Guild Wars 2
The Elder Scrolls Online
Diablo III
Torchlight games
Tomb Raider Games
Bioshock Games
Assassin's creed games
Grand Theft Auto Games
Star Wars games
Civilization games
Quake games
Doom games
Homeworld2 -(some SLI problems at 4k remaster release, then patched and fixed among other gameplay bugs)
Thief games
Resident Evil Games
Evolve
Starcraft 2
Doom 3
Dirt Rally

Metal Gear solid V
Hitman
Crysis
Batman Ark
Battlefield games
Call of Duty games (incl Black Ops 2)
Dark souls games
Dota 2
Pillars of Eternity
Wasteland
Grid
Saint's Row
Watch Dogs 2
Shadow Warrior 2
Max Payne
The Talos Principle
Dying Light

titanfall 2: Use TF1 SLI profile + forum config fix. some flickering looking at sun in multiplayer

https://forums.geforce.com/default/topic/973578/sli/titanfall-2-sli-profile/3/

Battlefield 1: SLI fixes and general gameplay fixes from DICE and drivers Dec 6, 2016

Some reports of occasional flickering remain in forums for some people

"at the start of round for 2 - 3 seconds, occasionally in chat menu in round, slight flickering of numbers and words at end of round screens.Situation is much better than before the December patch.." - suspect overlays - some recommend using dx11 rather than dx 12

-many others report no sli problems after patch and driver updates


edit:
http://www.tweaktown.com/news/56156/final-fantasy-xv-pc-tech-test-uses-dual-gtx-1080s-sli/index.html
Upcoming/Recent Game Releases 2017
--------------------------------------------------------------
Unreal Engine 4.15 HDR, SLI .. could be implemented in a lot of UE games
Final Fantasy XV development showcased 1080 sli

Unity Engine now has support for VRWorks, supports VR SLI and more - March 17, 2016 <-- tweaktown
GDC 2016: Unity Game Engine to Add VRWorks Support <- nvidia blogs
(recent unity games with potential? .. P.A.M.E.L.A. , Vikings:Wolves of Midgard, Syberia 3, System Shock remake, Strafe)

Mass Effect Andromeda ... SLI confirmed by Dev
MiddleEarth: Shadow of War (SLI likely)

Prey:Arkane ... CryEngine , so potentially SLI support.. nothing confirmed
..................." it’s a different engine so the constraints are different. In the case of Dishonored 2 we created a new engine really, even though it’s based on idTech most of it has been redone. In the case of Prey we’re using CryEngine and it’s an engine that’s already shipped stuff before, so it’s not the same configuration. "
..................

Warhammer 40k (sli manual methods, UnrealEngine 4 game has sli potential in new UE4 HDR/SLI)
Resident Evil 7 (sli manual method working, last several RE's had sli patched in so potentially dev patched)
Conan Exiles (manual sli fix.. Unreal Engine 4 game, could implement new SLI and HDR UE framework)
Tekken 7 ... UnrealEngine 4 game so has sli potential
Dead Island 2 ... UnrealEngine 4 game

Dead Rising 4 (sli patched/nvidia driver support ... state of game??)

South Park: The Fractured But Whole .. same Snowdrop Engine as Tom Clancy's: The Division which supports SLI

Sniper Ghost Warrior 3 .. CryEngine
..................................(beta version.. CI Games Dev "Chatia" : We're working on providing the SLI support to the game.)
..................................http://www.tweaktown.com/news/55090/ghost-sniper-warrior-3-multi-gpu-support-dx12/index.html

Ghost Recon: Wildlands ... official SLI support by devs , ongoing via patches (latest early march 2017)
...........
Patch 1.1.5 notes are below, and the update is around 4GB depending on your client.
  • Multi-GPU improved support: Worked with partners to improve both SLI and Crossfire support. Various stuttering and graphical corruption issues have now been fixed.
  • Quad-core CPUs improved support: Several freezes and performance issues have been fixed.
The Surge ... Deck 13's proprietary "Fledge Engine" .. ???

Dirt 4 .....http://wccftech.com/dirt-4-appears-nvidia-drivers-sli-3d-support-announcement-incoming/
.............http://www.dsogaming.com/news/dirt-...rs-rated-excellent-for-3d-vision-sli-profile/



Most Played Games 2016 (general)
-----------------------------------------------------

CS:GO
World of Warcraft ....SLI
Dota 2
Overwatch .... SLI
Hearthstone
League of Legends
Far Cry Primal ... SLI
Euro Truck Simulator ......
Tom Clancy's The Division ... SLI

Steam's Most Played 2016
---------------------------------
Dota 2........
CS:GO.......
Team Fortress 2 .............
Grand Theft Auto V ......... 2560 x1440 ~ 65% increase, 4k 53% increase
Civilization V SLI ... ......... 2560 x1600 .. 70 to almost 100% scaling
No Man's Sky ................ console port w/ poor optimization in general, patches, issues
........................................reddit user 'how to' sli after nvidia driver release support .. planned fixes by developer stated Dec 2016
XCOM 2 .......................... SLI here originally manually and later officially in patch nov 2016
Dark Souls III .................. SLI profile day 1. dsogaming post patch "SLI scaling was amazing and we did not notice any stutters while gaming" . "fps to be essential in such a game. Therefore, those interested in such high resolutions will need to invest in a high end sli system"
Paladins.......

Peak Player Count Steam
------------------------------
Dota 2
CS:Go
No Man's Sky
XCOM 2
Dark Souls III
Paladins (overwatch-like free to play)

Selection of games from Metacritic Top PC games 2016
----------------------------------------------------------------------------------

Witcher 3: Blood and Wine SLI 64%+ scaling (hairworks disabled)
Overwatch SLI
Dark Souls III SLI
XCOM 2 SLI
Civilization VI SLI, CPU bottleneck .. e.g. here.. 780ti 54fps.. 780ti sli 92fps .. gtx 1080 single, 980 sli, 1080sli all 92fps.
Battlefield 1 SLI , scaling varies from patch to patch
World of Warcraft Legion SLI
Forza Horizon 3 - nope
Rise of the Tomb Raider SLI - great scaling
Total War: Warhammer SLI , small scaling .. examples
Dishonored 2 general performance issues being patched out.. SLI working official patch
F1 2016 ... general bugs in game. crossfire officially, no sli officially
Titanfall 2 ... originally - Titanfall 1 profile works... some flicker staring at sun, text flicker in menus (probably overlay issue)
...................new - Feb 2017 patch
...................Improved SLI/Crossfire compatibility. Players are recommended to not use TSAA when running in multi-GPU mode because TSAA can negatively affect performance scaling.
..................Fixed various UI scaling issues when running in Eyefinity/Surround ultrawide resolution.
DOOM SLI , initially very poor scaling ~ 23%. Patched Nov 2016 near 100% scaling
Final Fantasy IX
Grim Dawn .. sli manual fix, near 100% scaling
Final Fantasy X/X-2 HD remaster ??
Deus Ex: Mankind Divided .. SLI patched nov 2016 .. scaling ??, (crossfire scales 100%)
Hitman ??
Fallout 4 : Far Harbor SLI scaling ~ 70%
Tom Clancy's The Division SLI patched.. Dec 2016 e.g. gtx 1070 max detail AF 1440p = 59fps, gtx 1070 sli = 90 fps
Pillars of Eternity :The white march part 2 ... manual sli implementation, CPU bound game
Hitman: Episode 3
Shadow Warrior 2 SLI, great scaling



Selection of games from Metacritic Top PC games 2015*
----------------------------------------------------------------------------------

GTA V.... SLI .. 2560 x1440 ~ 65% increase, 4k 53% increase
The Witcher 3...... SLI 64%+ scaling (hairworks disabled)
Undertale
Metal Gear Solid V
Pillars of Eternity ... manual sli implementation, CPU bound game
Starcraft II .... SLI
Ori and the Blind Forest
Tales from Borderlands ... SLI
Final Fantasy XIV.. SLI
Homeworld Remastered.. SLI
Heroes of the storm
Dirt Rally.... SLI
Fallout 4.... SLI
Project Cars .... SLI
Guild Wars 2: Heart of Thorns ... SLI
FIFA 16 ... nvidia sli profile, SLI unconfirmed
Warhammer: End Times - Vermintide ... manual sli workaround
Dark Souls II: Scholar of the First Sin .... SLI
Evolve ... SLI

Top Grossing Steam Games 2016
----------------------------------

(Platinum sales numbers)
Dota 2
The Division
The Witcher 3
GTA V
Civilization VI
Fallout 4
No Man's Sky
Rocket League
Dark Souls III
CS:GO
XCOM2
Total War Warhammer

(Gold sales numbers)
Dead by Daylight
Stardew Valley
Rise of the Tomb Raider
Warframe
Call of Duty: Black Ops
Stellaris
Team Fortress 2
Arma III
Ark: Survival Evolved ...... SLI manual fix.. current UnrealEngine's HDR/SLI code could be implemented in future
H1Z1: King of the Kill
Tom Clancy's Rainbow Six Siege
Doom

(Silver Sales)
Watch Dogs 2
Hearts of Iron IV
Cities Skylines
Europa Universalis IV
War Thunder
Dying Light: The Following Enhanced Edition
Subnautica
Smite
Deus Ex: Mankind Divided
Civilization V
Far Cry Primal
Payday 2
Rust
The Elder Scrolls Online: Tamriel Unlimited
Planet Coaster
Skyrim
There is a difference between whether SLI performance is worthwhile and viable, and whether a particular person's parameters make SLI a good choice for them.

To name a few considerations: budget, existing rig, upgrade path, monitor resolution, whether you demand high-Hz gameplay/aesthetics, whether you're OK with dialing down settings (and by how much) to get high fps and Hz (100+ average) on a high-rez, high-Hz monitor, favorite types of games and titles, and your game library/backlog.
 
Last edited:

NeoNemesis

2[H]4U
Joined
Mar 10, 2004
Messages
2,455
Ubisoft games have shit multi-GPU support. I've experienced this with The Division and it's also shown in the review. I was interested in Wildlands but look at that performance gain... what a joke. Thanks but no thanks. If games won't support multi-GPU for high resolution, I won't buy them.
Really? From the above list it looks like Ubisoft is the only company still doing anything with multiple GPUs.
 

AltTabbins

Fully [H]
Joined
Jul 29, 2005
Messages
19,928
Too bad the table doesn't actually show negative scaling; it just shows everything as 0, which is quite misleading imo.
The numbers don't look right either. I tested GTA 5 at 3440x1440 and saw negative results in the benchmark with SLI on (only a few points, but it was still something). At best, in all of the games I was playing, I was seeing about a 10% increase. At worst, a negative gain. On average, no difference in anything except power draw. Mix all this in with the inherent headaches that come with SLI and it's just not worth it, in my opinion. Maybe if I was running 4k or 8k.

I returned my 2nd GTX 1080 this morning and will be perfectly happy until the Nvidia 20 series comes out... and I'll get a single card then too.
 
Top