Any SLI / 2080 TI owners feeling completely ripped off by NVIDIA?

JCD3ntonX · n00b · Joined: Mar 2, 2017 · Messages: 19
So here's the thing: I've built SLI systems every generation dating back to the 8800 GT, and in general I've been pleased with them. Yes, the complaints in general were valid (although often overstated), but on the whole I've always found it worth it to pay for depreciation on a second GPU to get a decent performance boost in a good number of the most graphically intensive games.

Now fast forward to this generation. I'm a veteran of early adopter stuff, I knew not to expect to be blown away by the RTX stuff, and that it would be a massive performance hit. Of course, the way NVIDIA sold this was that now SLI would be even more important than ever . . . increased support, a new bridge with new link technology. In theory you drop the extra money on the second 2080TI to get playable framerates with the RTX stuff. Whether it's worth it to pay that much for a bit of extra eye candy is of course a questionable value proposition (for me it is because I make plenty of money and gaming is my main hobby), but it's out there for everyone to make an educated decision.

My problem is that NVIDIA has basically pulled a bait and switch: out of every single game with RTX support, literally none of them support SLI or mGPU. Not a single game as far as I can tell, not even with user hacks. The games that support SLI/mGPU and RTX effects won't run with both enabled at the same time.

Beyond that, there has always been talk of "sli has no support," dating back a decade or more. But this is really the first time that the hyperbole has been reality. Lately it has been AAA game after AAA game with none of them having any support. I just keep it turned off now for the most part. The only utility at all I've gotten from it has been older games . . . there are some GTA V graphics mods that can actually stress a dual RTX 2080 Ti system, and Ghost Recon can benefit from two cards.

But I have to say, I'm really disappointed to see a new SLI bridge come out with promises of renewed support, and then right after, basically all support gets dropped. Division 2, Metro Exodus, etc, I could go on. And many games that have no support now had support in their prior iterations.
 
I skipped SLI this generation. My last several systems over the past 10 years have all been mGPU, but the 2080 Ti is the first time I am skipping it (mainly because of lackluster 1080 Ti SLI support).
 
SLI and Crossfire are dead and pretty much not being used. Those two markets are so small that no developer will waste resources coding for them. That's also why, when anyone asks this question before buying two cards, I tell them the same thing: sell your other card and get most of your cash back.
 
I bought two 2080 Tis at launch. I found, as you have, that support is not there. Especially for ray-tracing, which is the main selling point of the cards.

So I ended up just selling the second card, and honestly, performance has not dropped by much. There were a few games with good scaling, like Tomb Raider with mGPU, but in most games the second card did not have a substantial impact.

Do I feel ripped off? Well, not really. I bought sight unseen with no reviews, so I only have myself to blame. And everyone here was saying SLI was dead; I didn't listen.

That said, it was a good run, but I would consider SLI over unless there is a huge shift in developer support for mGPU, and that seems unlikely.
 
I also had been doing SLI most of the last decade.

1. 2x PNY 560 Ti's
2. 2x Gigabyte 970 G1's, and then I added an EVGA SC 780 I had lying around for PhysX when I built my 4930K rig. For me this era was the peak: 1080p @ 120Hz was easy, I could get reasonable 1440p, and for games with hardware-accelerated PhysX I could even do 4K. Good times!
3. 2x MSI 980M's in a Titan laptop (still have it)
4. 2x Gigabyte 1080 G1's. Originally intended to be my 4K solution, but the 8GB of VRAM became an issue far too often.

So yeah, I've seen the decline too. I was actually happy to deal with all the extra details of SLI. I even took pride in making it work, to the point that I honestly couldn't perceive the microstutter effect so many complain about.

I was considering it this round until I saw the prices for the 2080 Ti. My budget is tight, so I decided that since I was going to spend close to the same amount of money either way, I'd just get the best of the best and went for a single Strix 2080 Ti. Happy with it. Even though NV held out the carrot with the new bridge, I had my doubts. I've recently been hearing the same thing you stated, that enabling DXR features disables any SLI options. F'd up! Good news is that it sounds like you're used to dealing with the other issues (bits/profiles/etc.), and there are still plenty of those popping up for games that are not getting support from the devs. Sounds like someone needs to update NV Profile Inspector to include some RTX stuff to bridge the gap between the two.
 
I've gotten the impression Nvidia hasn't cared much about SLI for awhile now. Both AMD and NV keep winding down support. The NVLink tech was built more around compute scenarios (it first appeared on their Tesla workstation GPUs iirc).

I would agree it is misleading that they are selling a new bridge and everything while SLI support for new games is almost nonexistent. At the same time, I wouldn't have bought two GPUs without seeing some SLI benchmarks of games with the RTX features.
 
I like the concept of this rig, but I have to admit the plumbing looks a bit too complicated for my tastes. It does go with your thread, though.


https://www.guru3d.com/articles_pages/guru3d_rig_of_the_month_march_2019,2.html

 
I've had 3 generations of SLI: GTX 470s, GTX 670s, and GTX 1070s. Up until the 670s, SLI was well supported, but when I bought the 1070s I was really disappointed with the lack of support.

On the flip side, I really like the way 2 cards look in my case :p
 
Of course, the way NVIDIA sold this was that now SLI would be even more important than ever . . . increased support, a new bridge with new link technology. In theory you drop the extra money on the second 2080TI to get playable framerates with the RTX stuff.

I'm not sure where you got that from, honestly. It has always been very clear that the switch from DX11 to DX12 meant SLI support went from being something NVIDIA supported via the driver (DX11) to something each game developer has to code for in each individual game (DX12). There may have been some (NVIDIA marketing?) who sidestepped that issue by taking the ignorantly optimistic view that game developers would embrace DX12 multi-GPU and do a great job coding for it in all new games, but that obviously is not happening, just like no one really expected it to. In many games you can still choose DX11 instead of DX12, and in some cases this will enable SLI, but that obviously won't work with any RTX content.

I was using 3x GTX 680 prior to getting an RTX 2080, and I'm still using two of those GTX 680's in SLI in my secondary rig. I'm still finding DX11 SLI support to be adequate for now at least, but we'll see how long that lasts.
 
I'm pretty happy with my 2 founders editions cards. I have toyed with the thought of selling one of them since single card performance is still great. Do I feel "ripped off"? Not at all. SLI hasn't changed fundamentally for a very long time in terms of gaming performance and which games support it. I knew what I was getting into when I bought the cards.
 
I'm not sure where you got that from, honestly.
If there ever was a reason to support SLI, RTX was it. If RTX isn't enough for developers to care about SLI, nothing is. However, at this point the cost of buying two RTX cards isn't in the realm of reality for many (most?) gamers.

I've been using SLI for the last few builds, but I doubt my next one will have it. Current game support for SLI is worse than it ever was. Far Cry 5 and New Dawn are less smooth with it enabled, at least on my rig. Thankfully, or rather unfortunately, the graphics in those games have been dumbed down and simplified enough compared to Far Cry 2, 3 and 4 (likely thanks to consoles) to still maintain a solid and smooth 60 fps w/vsync under Ultra settings with a single card. But I'd be pissed if I actually needed SLI for 60fps and couldn't get it to run smoothly.

Interestingly playing Far Cry 5 and New Dawn on my old GTX 690 with AFR1 forced for SLI is actually quite playable and smooth even with the 2GB memory constraints (oh how I wish they made the 690 have 4GB per GPU). To me it seems that SLI support on older cards like the 680 or 690 is somehow "better" than last and current-generation cards, which doesn't make much sense to me but whatever.
 
I'd probably be at 4k120Hz by now if SLI were still a thing.

I'd probably be the same, other than the combined GPU and display prices. I have to admit, though, that some of those YouTube vids out there of people playing games at 8K using two 2080 Tis are amusing, if not just ridiculous.

As it stands IMO the best gaming experience is to be had going with a single 2080Ti and a 1440p165Hz monitor w/ variable refresh rate.

Pretty much agree with you here. It just pisses me off that it's next to impossible to get any HDR above 400 nits in this class. If they put one out closer to 1000 nits, I'd start setting money aside right now.
 
I'm a die-hard SLi user.
My first set was two GTX 680s, and I upgraded every generation, more or less, from there.
I even had GTX 480s or 580s, whichever were the ones that ran hotter than hell.

I have never been disappointed, until the 1080 Ti.
Support is pretty much gone.

I have Metro Exodus running in SLi with DX11 via an NVIDIA Profile Inspector hack. But it sucks to have to go through all that.

I don't think I'll be doing anything like SLi again. I stopped Crossfire back about 5 years ago due to absolutely no support from anywhere.
 
It is sad to see since the wild west days of Tesla. Being able to run Crysis when it came out in 2007 with almost every detail cranked up at 1600x1200 and getting around 70 FPS was something to see.
I've gotten the impression Nvidia hasn't cared much about SLI for awhile now. Both AMD and NV keep winding down support. The NVLink tech was built more around compute scenarios (it first appeared on their Tesla workstation GPUs iirc).

I would agree it is misleading that they are selling a new bridge and everything while SLI support for new games is almost nonexistent. At the same time, I wouldn't have bought two GPUs without seeing some SLI benchmarks of games with the RTX features.
It's the developers that are not supporting SLI, not NVIDIA themselves. Most games simply aren't being made multi-GPU friendly.
I'm not sure where you got that from, honestly. It has always been very clear that the switch from DX11 to DX12 meant SLI support went from being something NVIDIA supported via the driver (DX11) to something each game developer has to code for in each individual game (DX12). There may have been some (NVIDIA marketing?) who sidestepped that issue by taking the ignorantly optimistic view that game developers would embrace DX12 multi-GPU and do a great job coding for it in all new games, but that obviously is not happening, just like no one really expected it to. In many games you can still choose DX11 instead of DX12, and in some cases this will enable SLI, but that obviously won't work with any RTX content. I was using 3x GTX 680 prior to getting an RTX 2080, and I'm still using two of those GTX 680's in SLI in my secondary rig. I'm still finding DX11 SLI support to be adequate for now at least, but we'll see how long that lasts.
No, SLI always needed to be supported at both ends. SLI support has never faltered on NVIDIA's side. See above. Forcing a game to use SLI through Inspector doesn't mean that the engine or specific use of that engine in a game supports SLI. Forcing bits from different games, even ones using the same engine, can cause issues that are not always visible. A common visible issue when I was still using SLI and forcing various bits was lighting leaking through polygons that could be so bad you could see the light source through walls.

You can still use the "DX11" way of implementing SLI by using implicit mGPU in DX12. But again, it's the developers that need to have their feet held to the fire.
 
SLI was always up to the developer to code support, so I don't really blame nvidia. SLI was always just a gimmick IMO, especially so on lower end cards.
 
An issue here is that the DX12 & Vulkan way of implementing SLI/mGPU requires more than just a simple hack. SLI as we know it is probably dead; multi-GPU, I would say, is stalled. Software development has always been about reusing code, and implementing what amounts to a substantial paradigm shift is not going to be a light undertaking.
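To make that shift concrete, here's a toy sketch (Python, purely a model; no real graphics API involved, and the function names are made up for illustration) of the scheduling an engine has to do itself under explicit DX12/Vulkan multi-GPU. Classic AFR hands frames round-robin across GPUs, and any effect that reads the previous frame (TAA, ray-tracing denoisers) then needs an explicit cross-GPU copy, which is plausibly part of why RTX effects and AFR don't mix:

```python
# Toy model of alternate-frame rendering (AFR) scheduling. Under DX11 the
# driver handled this transparently; under explicit DX12/Vulkan multi-GPU,
# the engine itself must decide which GPU renders which frame.

def assign_frames_afr(num_frames, num_gpus):
    """Round-robin each frame index to a GPU, as classic AFR does."""
    return [frame % num_gpus for frame in range(num_frames)]

def frames_needing_copy(schedule):
    """Frames whose previous frame rendered on a *different* GPU.
    Any temporal effect reading last frame's data would need an
    explicit cross-GPU transfer for each of these frames."""
    return [i for i in range(1, len(schedule))
            if schedule[i] != schedule[i - 1]]

schedule = assign_frames_afr(8, 2)
print(schedule)                       # [0, 1, 0, 1, 0, 1, 0, 1]
print(frames_needing_copy(schedule))  # every frame after the first
```

With two GPUs, every single frame depends on data living on the other card, so each temporal effect turns into synchronization and copy work the developer has to write and tune per game.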
 
Being able to run Crysis when it came out in 2007 with almost every detail cranked up at 1600x1200 and getting around 70 FPS was something to see.
QFT. I was using SLI 7900GTs at the time on a 1600x1200 Dell LCD. That scene at the start of the game where the sun rises as you are standing on the side of a cliff is still in my memory as one of the best "this is why I am a PC gamer" moments.
 
I have had SLI/Crossfire since 2007: 7800 GTXs, 8800 GTXs, GTX 680s, AMD 5870s, original Titans, R9 290X/295X2 (Trifire), R9 Fury X and Radeon Pro Duo (Trifire), Vega FEs, and now 2080s in SLI. This is the first time that I feel left behind with my purchase. My next card will be the highest-end card I can buy, and I'll leave it at that.
 
QFT. I was using SLI 7900GTs at the time on a 1600x1200 Dell LCD. That scene at the start of the game where the sun rises as you are standing on the side of a cliff is still in my memory as one of the best "this is why I am a PC gamer" moments.
Had a colleague in university who was jelly when I bought a pair of 8800GTX when they were brand spanking new. He said he was waiting for reviews of the HD 2900 XT :ROFLMAO:.
 
QFT. I was using SLI 7900GTs at the time on a 1600x1200 Dell LCD. That scene at the start of the game where the sun rises as you are standing on the side of a cliff is still in my memory as one of the best "this is why I am a PC gamer" moments.

For me it was World in Conflict that brought my machine to a halt when three nukes would go off at the same time.
 
SLI was always just a gimmick IMO, especially so on lower end cards.
Totally disagree with this statement. It had its place for a very long time.

An issue here is that the DX12 & Vulkan way of implementing SLI/mGPU requires more than just a simple hack. SLI as we know it is probably dead; multi-GPU, I would say, is stalled. Software development has always been about reusing code, and implementing what amounts to a substantial paradigm shift is not going to be a light undertaking.
Serious Sam games have been using it, and it has been pushed forward by VR. As we see more high-end VR titles in the shooter genre, I think we will see more mGPU successfully implemented.
 
I've been doing SLI for over 20 years. I remember back in the day there being no question that you got double the performance.

I am also remembering buying a pair of Voodoo 2 cards, 12MB each for a whopping 24MB and feeling like king shit with my system. I'm pretty sure those cards were $399 each, so I would have spent over $800 for the pair ... this was 20 years ago. That was a lot of money then and a nice chunk of change today.

In fact, I am pretty sure that HardOCP had a few little news postings on the Voodoo 2 cards. Back then tech news on HardOCP was short and sweet.

SLI is definitely a rich man's game. If you have the money, then you shouldn't really worry about it. That's the way it generally works. All you have to do is go to YouTube to see first-hand accounts along with those results. It's hit or miss with the 2080 Ti... some games in SLI actually run a bit slower from what I saw, which is unbelievable.

The 1080 Ti is also hit or miss.

The last time I ran SLI was with a pair of 980 Ti's and I remember having a fair amount of success in the SLI dept. Seems to me that games since then have supported SLI less and less.

Def do not blame nVidia.

You also have to understand how game companies work. They have budgets, and within those budgets they have separate monies for art, music, levels, features, etc., and yes, even SLI implementation, which I'm sure takes a fair bit of time and effort (money).

For those of you not paying attention, SLI on 2070s was actually removed. You can only SLI 2080 and 2080 Ti cards. So if you have a budget for a game and money is tight, do you provide SLI for the less than 1% of the kids out there that are going to use that feature?
 
I've SLI'ed twice, 2x 7950GT's and 2x 9800GTX's. The 7950GT's were a waste of time IMO, but the 9800GTX's granted me a type of GPU longevity that I had never experienced before, allowing me to skip the 200 and 400 series. When I bought my GTX 570, I was blown away by how much faster it was than my 9800GTX's. I wanted to buy another one for SLI, but by that time, the 600 series was about to drop and I got a GTX 670 instead which was a drastic increase in performance. I looked into buying a second one but the prices never dropped enough to make it worthwhile like the previous series cards.

These days I couldn't care less about SLI, especially with how much video cards cost.
 
It's the developers that are not supporting SLI, not NVIDIA themselves. Most games simply aren't being made multi-GPU friendly.


Sure but if Nvidia wants to see SLI / mGPU grow, they could certainly help out developers and fund the inclusion of SLI support, as they do with other features. Unless they have and developers refused, but I doubt that. They don't need to do it with all games, but it would make sense to do it with the titles they do marketing deals with.

Without a big push from the GPU vendors I don't see it ever catching on again tbh, that's why I think Nvidia shoulders a lot of the blame for this.
 
I've SLI'ed twice, 2x 7950GT's and 2x 9800GTX's. The 7950GT's were a waste of time IMO, but the 9800GTX's granted me a type of GPU longevity that I had never experienced before, allowing me to skip the 200 and 400 series.

As someone with a pair of 7900 GTX cards, I can say they are severely CPU-limited on anything AM2, and especially 939. I am not sure what you ran them on, but this is my personal experience. Adding a second card to a 5600+ only netted me around an additional 1000 or so points in 3DM06. You really need a very fast Core 2 to make these cards sing. A 7950 GT is basically a cooler, shorter 7800 GTX 512, so again, CPU limitation.
 
As someone with a pair of 7900 GTX cards, I can say they are severely CPU-limited on anything AM2, and especially 939. I am not sure what you ran them on, but this is my personal experience. Adding a second card to a 5600+ only netted me around an additional 1000 or so points in 3DM06. You really need a very fast Core 2 to make these cards sing. A 7950 GT is basically a cooler, shorter 7800 GTX 512, so again, CPU limitation.

I had them paired with a 3700+ @ 2.8GHz and a 4200+ @ 2.75GHz, the 9800GTX's were on a C2Q 9550 @ 3.833GHz. That would explain my woeful performance.
 
I'm just unhappy my thousand dollar investment didn't even come with a blow up doll... Sometimes I have to think about how many women I could have seduced with the money I spent on this purchase.... 5 at most, if current bar wench trends continue unabated.

I used to love going mGPU, though I haven't touched it since my paired 970's. Once I went 4K on my GTX 1080, and later the 1080 Ti, I never really looked back. I'm barely playing anything that needs it anymore. The 2080 Ti alone will suit my needs for a while, until it Space Invaders on me.
 
Since I've been PC gaming this debate has never ended... every generation, the actual game benchmarks show that some games get better performance, but never enough of an improvement to drop double $$$ on an identical card. The other thing I've always seen is that SLI at times can worsen game performance compared to one card. In fact, I remember years ago (can't remember which generation it was) when single-card GPUs became powerful enough that even reviewers were saying "such-and-such card is so powerful now that SLI isn't even needed anymore." Some of the folks here have had good experiences with SLI and Crossfire with older generations, but these days the GPUs are so incredibly powerful I never even considered a 2nd GPU...
 
Since I've been PC gaming this debate has never ended... every generation, the actual game benchmarks show that some games get better performance, but never enough of an improvement to drop double $$$ on an identical card. The other thing I've always seen is that SLI at times can worsen game performance compared to one card. In fact, I remember years ago (can't remember which generation it was) when single-card GPUs became powerful enough that even reviewers were saying "such-and-such card is so powerful now that SLI isn't even needed anymore." Some of the folks here have had good experiences with SLI and Crossfire with older generations, but these days the GPUs are so incredibly powerful I never even considered a 2nd GPU...
I have to agree; at current card costs vs. the benefit of two cards, it's not really worth it. Whatever happened to mGPU support in DirectX 12, where you could run two totally different video cards and still see a benefit?

To me, the entire purpose of SLI after the Voodoo era was to get SLI/CF running on cheap video cards to deliver playable performance for those who couldn't afford expensive graphics cards. For a long time there, 1080p was the target to shoot for. I recall a couple of rigs with low-end boards that were very adequate in terms of performance, and you were getting the video cards cheap. Like I had this one rig that had paired 7600's in it; the cards were sub-$100 each, and it delivered solid FPS at high resolutions and graphical settings. Then NVIDIA started pulling the option off the lower-end boards, while I think AMD still had it on most of theirs. I understand it added complexity... and when NVIDIA made it a high-end thing, that was just pure greed, but how hard is it really? Especially with DX12 having shown it functional at one point in time.
 
when single-card GPUs became powerful enough that even reviewers were saying "such-and-such card is so powerful now that SLI isn't even needed anymore."
SLI will always be "needed" in the sense that new (possibly premature) display standards are always coming out without the GPU power to back them up.

I mean, you can say we finally have cards that can run 4K 60fps with no problem (RTX 2080 and Radeon VII, and certainly the 2080 Ti) but then they go and release 8-friggin-K TVs. It never ends.

Now, whether that "need" is actually fulfilled is a better question. I think SLI at this point causes more problems than it solves.
 
Still running my SLI 1080 Tis, and surprisingly they run fine in BF5 without any of the tricks most have had to employ to get SLI running. I almost jumped on the 2080 Ti bandwagon... but it seems mGPU and SLI are going the way of the dodo. I'd go with a single 2080 Ti just because of the "upgrade bug," but it would not outperform my 2x 1080 Tis, and I'm pushing 3x 32" monitors. I'm in the same boat, basically, as the 4K~8K guys. Wish they would just formally announce SLI/mGPU dead and build stinkin' fast single cards. I'm willing to endure some pain in the wallet for the performance.
 
The problem with SLI is it always comes with huge frame time variation: if the frame pacing doesn't work, or you wind up in some part of the game that doesn't scale well, you can suffer up to 2x higher frame times. Right now you will almost certainly have a better overall experience by turning the settings down slightly and enjoying a reliable 4K60 on a single 2080 Ti versus using SLI to try to push the settings higher and dealing with stuttering.
 
Right now you will almost certainly have a better overall experience by turning the settings down slightly

This has never been acceptable to me. I only do it if I can't throw more money at the problem and make it go away or mitigate it to acceptable levels.
 
The problem with SLI is it always comes with huge frame time variation: if the frame pacing doesn't work, or you wind up in some part of the game that doesn't scale well, you can suffer up to 2x higher frame times. Right now you will almost certainly have a better overall experience by turning the settings down slightly and enjoying a reliable 4K60 on a single 2080 Ti versus using SLI to try to push the settings higher and dealing with stuttering.

I use SLI and leave the same cushion. I usually get perfect frame pacing by using RTSS's frame cap. To use SLI properly (smoothly), you really need to set a frame cap.

For me, the SLI 2080Ti allows me to run 8xSGSSAA in older games, sometimes adding OGSSAA to the mix. In newer titles that have a blurry TAA implementation, I'll often run 5k via DSR downsampled to 1440P with Reshade SMAA.
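For what it's worth, the smoothing effect of a frame cap can be shown with a toy model (Python; this is not RTSS's actual algorithm, just the basic idea of holding each present until a minimum interval has elapsed):

```python
def pace(present_times, cap_fps):
    """Delay each present so consecutive frames are at least 1/cap_fps
    apart -- a toy version of what an external frame-rate cap does to
    even out AFR's bursty frame delivery."""
    interval = 1.0 / cap_fps
    paced = []
    for t in present_times:
        earliest = paced[-1] + interval if paced else t
        paced.append(max(t, earliest))
    return paced

# Bursty AFR delivery (seconds): frames arrive in close pairs,
# i.e. raw intervals alternate 5 ms / 35 ms -- the classic microstutter.
raw = [0.000, 0.005, 0.040, 0.045, 0.080, 0.085]
smooth = pace(raw, 30)  # cap at 30 fps -> minimum ~33.3 ms spacing
deltas = [round(b - a, 4) for a, b in zip(smooth, smooth[1:])]
print(deltas)  # [0.0333, 0.0333, 0.0333, 0.0333, 0.0333]
```

Average frame rate barely changes, but the frame-to-frame intervals become uniform, which matches the experience that a capped SLI setup feels smoother than an uncapped one running nominally faster.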
 