Any SLI / 2080 Ti owners feeling completely ripped off by NVIDIA?

The Tomb Raider engine supports DX12 mGPU, which is not SLI in the traditional sense.

DX12 (or Vulkan) mGPU works well but requires developers to specifically code support for it (unlike SLI/Crossfire which was enabled on a driver level from GPU vendors).
 
This is the impasse we're at today: engine developers have more or less implemented mGPU in DX12 and Vulkan, but widespread adoption by game developers hasn't happened yet.
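To make "specifically code support for it" concrete, here's a minimal sketch (C++, error handling trimmed) of D3D12's explicit multi-adapter model: the application has to enumerate every GPU itself and create and drive a separate device per adapter, where driver-level SLI used to hand it a single linked device for free.

```cpp
// Minimal sketch of explicit multi-adapter setup in D3D12.
// Link against d3d12.lib and dxgi.lib; error handling trimmed for brevity.
#include <d3d12.h>
#include <dxgi1_6.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicePerAdapter()
{
    ComPtr<IDXGIFactory6> factory;
    CreateDXGIFactory2(0, IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip WARP / software adapters

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                                        D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    // Everything past this point is on the developer: splitting work across
    // the devices (AFR, split-frame, or dedicating a GPU to ray tracing),
    // duplicating resources, and fencing between queues. That burden is
    // exactly why so few games ship mGPU support.
    return devices;
}
```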

My bet is that it will likely take off again as both games and output devices get more demanding: ray tracing will be in everything and used to greater effect, while VR and 4K 120 Hz displays (and higher) become more mainstream.
 
I've gone with SLI of the top-end cards up until the 10xx series. It just seemed like support was waning and not worth it anymore. I don't regret dropping SLI one bit.
 
So here's the thing: I've built SLI systems every generation dating back to the 8800 GT, and in general I've been pleased with them. Yes, the usual complaints were valid (although often overstated), but on the whole I've always found it worth paying for depreciation on a second GPU to get a decent performance boost in a good number of the most graphically intensive games.

Now fast forward to this generation. I'm a veteran of early-adopter stuff, so I knew not to expect to be blown away by the RTX features, and I knew they would come with a massive performance hit. Of course, the way NVIDIA sold this was that SLI would now be even more important than ever . . . increased support, a new bridge with new link technology. In theory you drop the extra money on the second 2080 Ti to get playable framerates with the RTX effects on. Whether it's worth paying that much for a bit of extra eye candy is of course a questionable value proposition (for me it is, because I make plenty of money and gaming is my main hobby), but it's out there for everyone to make an educated decision.

My problem is that NVIDIA has basically pulled a bait and switch -- out of every single game with RTX support, literally none support SLI or mGPU. Not a single game as far as I can tell, not even with user hacks. The games that support SLI/mGPU and RTX effects won't work with both enabled at the same time.

Beyond that, there has always been talk of "SLI has no support," dating back a decade or more. But this is really the first time the hyperbole has been reality. Lately it has been AAA game after AAA game with no support at all. I just keep it turned off now for the most part. The only utility I've gotten from it has been older games . . . there are some GTA V graphics mods that can actually stress a dual RTX 2080 Ti system, and Ghost Recon can benefit from two cards.

But I have to say, I'm really disappointed to see a new SLI bridge come out with promises of renewed support, and then, right after, basically all support gets dropped. The Division 2, Metro Exodus, the list goes on. And many games that have no support now had it in their prior iterations.
 
NEVER buy 2 cards when you can use the money to buy 1 faster card. You should have bought the Titan RTX.
 
Agreed. I never understood the appeal of running dual mid-range GPUs, as you added system complexity and made the performance of your system conditional on support that might or might not be there. You typically had less VRAM as a result of such a configuration as well. I've always felt SLI was best when used to get next-generation performance now, on the high end. I used it as a way to drive extremely high resolutions and multi-monitor gaming because no single card could do it. I used it to max out games that otherwise would have to be run on medium settings or worse.
 
This was the first GPU upgrade where SLI didn't even cross my mind.

I did use 1080 Ti SLI for a month or so, because I could, for Rise of the Tomb Raider at 4K.

Yeah, I've been doing SLI since SLI was first a thing, back with the Voodoo cards.

I did quad SLI with GTX 580s and quad CrossFire with 7970s. Waste of money.

I feel bad every time I hear someone talk about SLI with 2080 Ti cards.

Just imagine buying two 2080 Ti Kingpin cards! Yeah, I read about that yesterday.


SLI is really only good for benchmarks in today's world, unless you're running older games, but then why bother, given the power of a single 2080 Ti?
 
I thought there was always a downside of added input lag and some extra tearing? I know there are benchmarks where some games get way higher FPS, but FPS isn't everything. I thought it was almost always better to just get one really good card over two mediocre cards.

That was my experience with SLI. I had 680s and 970s in SLI, went to a single card with the 1080. Not likely to go back to dual cards unless DX12 mGPU becomes prevalent.
 
Same, but with 670s, and both setups worked pretty well with the games I played at the time. With respect to the 970s, though, if I'd known a 980 Ti was on the way I'd have just waited for that instead.

And yeah, SLI works when the game support is there. Benchmarks show frametimes (or 99% FPS) scaling with overall framerates in good implementations.
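For anyone wondering what that "99% FPS" number is, here's a rough sketch of how it's typically derived from a per-frame time capture (the frameTimesMs input is a stand-in for whatever your logging tool exports):

```cpp
// Rough sketch: average FPS and "1% low" (99th-percentile frame time)
// derived from a capture of per-frame render times in milliseconds.
#include <algorithm>
#include <numeric>
#include <vector>

struct FpsSummary { double averageFps; double onePercentLowFps; };

FpsSummary Summarize(std::vector<double> frameTimesMs)
{
    if (frameTimesMs.size() < 2) return { 0.0, 0.0 };

    const double totalMs =
        std::accumulate(frameTimesMs.begin(), frameTimesMs.end(), 0.0);
    const double averageFps = 1000.0 * frameTimesMs.size() / totalMs;

    // Sort ascending; the 99th-percentile frame time bounds the slowest 1%.
    std::sort(frameTimesMs.begin(), frameTimesMs.end());
    const size_t idx = static_cast<size_t>(0.99 * (frameTimesMs.size() - 1));
    const double p99Ms = frameTimesMs[idx];

    // In a good SLI implementation this number scales roughly in step with
    // the average; in a bad one the average climbs while this barely moves.
    return { averageFps, 1000.0 / p99Ms };
}
```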
 
I'm still running two 1080 Ti hybrids in SLI with a high-bandwidth bridge, and a lot of my favorite games support it. Without hitting at least a 100 fps (at matching Hz) average, you really aren't getting appreciable gains out of high refresh rates, and even a 100 fps average usually means something like a 70 - 100 - 130 fps graph. At 120 fps on a 120 Hz display you get roughly 50% less sample-and-hold smearing blur than at 60 fps/Hz, plus double the motion definition and pathing of the entire viewport moving in relation to you in 1st/3rd-person games while mouse-looking, movement-keying, or controller panning.
A 100 fps average (roughly 70 - 100 - 130 with variable refresh rate) is about as low as I'm willing to go.
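Those blur numbers fall straight out of sample-and-hold persistence: on a non-strobed display each frame sits on screen for roughly 1000/fps milliseconds, and perceived motion blur scales with that hold time. A quick sketch of the arithmetic:

```cpp
// Quick sketch of the sample-and-hold arithmetic behind the claim above.
#include <cstdio>

int main()
{
    const double holdAt60 = 1000.0 / 60.0;   // ~16.7 ms per frame
    const double holdAt120 = 1000.0 / 120.0; // ~8.3 ms per frame

    // Halving persistence is the ~50% blur reduction; the doubled frame
    // count is the doubled "motion definition" (unique positions per
    // second) of everything moving across the viewport.
    std::printf("blur reduction at 120 vs 60: %.0f%%\n",
                100.0 * (1.0 - holdAt120 / holdAt60));
    return 0;
}
```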

Personally I'm waiting for a die shrink and HDMI 2.1 outputs on GPUs. For now it seems they released a whole GPU generation without HDMI 2.1, and they're going to milk a whole series of proprietary monitors at 4K that can only do 120 Hz and variable refresh rate over their included DisplayPort connection, instead of supporting 120 Hz with VRR (variable refresh rate) on any HDMI 2.1 monitor or TV that comes out. It's a slow release roadmap with a lot of monitors pushed back, too. Upgrading isn't worth it to me until 7 nm + HDMI 2.1, and I usually wait for the Ti hybrids at that. If a single top-tier card could do a 100 fps (at matching Hz) average or better at 4K in very demanding games at very high to ultra settings (even with a few over-the-top settings turned off), I'd consider a single one instead of dual. It would also depend on what games support mGPU at that time.


 
I'm pretty sure my 2080 Ti FTW3 Ultra was doing 100+ at 4K the other night when testing, hehe. I had vsync off just to see the frame rate; that was BFV, at high to ultra settings, but I'm guessing high.

I normally play maxed out, though, on my 34" ultrawide 1440p 120 Hz monitor.

The Samsung 40" 4K screen on my desk is a 2nd monitor, or for Netflix viewing.

I'm pretty sure, as I remember being in shock over it, but it was screen tearing like crazy, so I had to exit, turn vsync back on, limit it to 60 Hz, and lock the frame rate to 60 in game.

Too laggy with input, though, to be doable for a fast game, and I was glad to swap back to the 1440p screen with G-Sync and 120 Hz.
 
I'll have to see what the performance difference is between single and dual once an HDMI 2.1 GPU comes out, hopefully with a die shrink.

This quote still applies to a lot of games. Of course you can dial down the graphics settings to gain motion clarity (blur reduction) and motion definition (smoothness, pathing, animation). I like to play at very high to ultra settings if I can afford to, but a 100 fps/Hz average (a 70 - 100 - 130 graph for the most part) is about as low as I'm willing to go.

Considering the graphics ceiling is an arbitrarily set point, and that the challenge for devs is to whittle games down to fit "real time", the ceiling can be pushed much further than it is now. Some games can even be modded way past what is playable (a few screenshot forums are dedicated to this type of thing): increased view distances and more animated objects visible in the distance, downsampling from 8K or 16K resolution, modded textures, etc. Anyway, we just got to the point where a single powerful GPU like a 1080 Ti or Titan can do 100 fps or better in demanding games at 2560x1440, sometimes with some over-the-top settings turned off in the most demanding ones. It will likely be several years before GPUs make a big enough jump for the same to be true at 4K, considering graphics complexity and ceilings will continue to rise along with GPU generations. That is, unless some exponential leap in GPU speed happens.

https://www.gamersnexus.net/guides/3419-sli-nvlink-titan-rtx-benchmark-gaming-power-consumption

 
One thing to keep in mind: a higher FPS number does not necessarily translate into a better experience, especially with multi-GPU.

In my case I was running Vega 64 Crossfire in Far Cry 5. CF did improve the framerate number, but at the cost of smoothness in the game.

I believe this may have been microstutter; after replacing the two cards with a Radeon VII, the stutter went away.

Now I get a lower FPS number (it was around 90 fps with two cards and 70 fps with one), but I have a much better experience.
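That matches how microstutter usually shows up in the numbers: the FPS counter rises, but AFR tends to deliver frames in alternating short/long pairs, so frame-to-frame variation explodes even while the average looks great. A rough sketch of the kind of check you could run on a frame-time log (the input is a stand-in for whatever your capture tool exports):

```cpp
// Rough sketch: quantifying microstutter as the average frame-to-frame
// delta in a frame-time log. Steady pacing yields deltas near zero.
#include <cmath>
#include <vector>

double AvgFrameToFrameDeltaMs(const std::vector<double>& frameTimesMs)
{
    if (frameTimesMs.size() < 2) return 0.0;
    double sum = 0.0;
    for (size_t i = 1; i < frameTimesMs.size(); ++i)
        sum += std::fabs(frameTimesMs[i] - frameTimesMs[i - 1]);
    return sum / (frameTimesMs.size() - 1);
}

// Illustrative numbers: a steady 70 fps card logs ~14.3 ms frames with
// deltas near zero, while "90 fps" AFR alternating 5 ms / 17 ms frames
// averages ~12 ms of delta per frame: higher FPS on paper, visibly less
// smooth in motion.
```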
 
I can understand that. It could have been straight-up microstutter, but some types of AA stutter in SLI too. It's not a perfect solution, and dropping to anything near a 30 - 60 - 90 fps graph is like molasses to me and reintroduces smearing blur during viewport movement. Not that it's unplayable; to be fair, I play Dark Souls 3 at 60 fps since I have no other choice.

Hopefully a die shrink on an HDMI 2.1 GPU will come along in the Ti tier eventually, so I can see what a single GPU like that can do at 4K, in HDR where available, and against that GPU generation's game graphics ceilings.

Luckily I'm a pretty patient gamer nowadays, so by the time I buy a game there have been a bunch of NVIDIA driver updates, which sometimes improve that game, as well as patches from the game's developers: both overall fixes and, for some of the top games, SLI support being added or improved. It's nice to have where it works.

==========================================================

Quoting myself:

Some games are much worse than others, and if your frame-rate graphs are lower, the effects of microstutter can appear worse. Think of the rapidity of the frame delivery: at higher frame rates a stutter becomes more of a quiver, and a quiver becomes not much of anything. Again, it's dependent on the game engine and the graphics settings.

-----------------------

There are games where microstutter can be horrible, and if you aren't using a high-bandwidth SLI bridge, or if you run lower fps graphs, you'll experience it even more across the board. In addition to running DX11, several SLI-capable games also require you to turn AA down to minimal levels, or off, to avoid obnoxious microstutter, e.g. "you have to turn AA down to low so that TAA gets turned off; TAA uses the last frame to anti-alias the next frame, which causes a problem with AFR and SLI."

SLI doesn't work in all games, and some of the games it does work in require workarounds/DIY fixes.

SLI scaling varies: even when it's working, some games scale at 30% or 60% while others still manage ~90% (see the sketch after this list).

SLI requires DX11 (unless a newer game supports NVLink).

Official SLI support can take months of game patches and NVIDIA driver updates before the wrinkles are ironed out, though some top games work great near or at launch.

SLI microstuttering can be overt on some game engines (usually ones poorly optimized to start with), when running frame-rate graphs with low bottoms, and when not using a high-bandwidth SLI bridge.

SLI is very expensive, and you can get by just fine, probably better served on price/performance, with a single card and a lower-resolution monitor like 1440p for 100 fps/Hz+ gameplay, in order to get appreciable gains out of higher Hz.

It's definitely not for everyone, and it's unnecessary at 1440p for the most part with the top-tier modern GPUs (even if you'd still have to dial some over-the-top settings down on the most demanding or otherwise poorly optimized games to keep a 100 fps/Hz+ average).
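To put those scaling figures in perspective, here's a quick sketch of what they mean for the framerate you actually see (numbers are illustrative, not from any particular game):

```cpp
// Quick sketch: effective framerate at various SLI scaling levels,
// i.e. effectiveFps = singleCardFps * (1 + secondCardScaling).
#include <cstdio>

int main()
{
    const double singleCardFps = 60.0;
    const double scalings[] = { 0.30, 0.60, 0.90 };

    for (double s : scalings)
        std::printf("%.0f%% scaling: %.0f fps -> %.0f fps\n",
                    100.0 * s, singleCardFps, singleCardFps * (1.0 + s));
    // 30% scaling turns 60 fps into just 78; paying double the GPU cost
    // for that is why per-game support matters so much.
    return 0;
}
```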

-------------------------------

So it's far from a perfect solution for every game and every game's graphics/AA settings, but playing beneath a 100 fps/Hz average, with sample-and-hold blur and a lack of motion definition on a high-Hz monitor, is useless to me. SLI gives me that at higher resolutions and in higher-demand games that have supported it adequately in the last few years: Witcher 3, GTA V, Dishonored 2, Prey, Overwatch, Dark Souls 3 (able to maintain 60 fps with cranked settings, even modded graphics), Shadow of Mordor/War, Far Cry Primal, Vermintide 2, etc. I haven't tried Black Ops 4 yet, but I've heard it runs well in SLI, and the Dirt series works with SLI too. I wouldn't even consider buying into 4K 120 Hz for some games without it. 1440p has now moved into the sweet spot, more or less, for a single top-tier GPU, even if you have to dial some of the over-the-top settings down on the most demanding or otherwise unoptimized games to hit a 100 fps/Hz+ average.

===========================================================
 
Another old-timer here. I'm currently running SLI 1080 Tis and I'm completely satisfied gaming at 1440p. I do think mGPU may be going the way of the dodo, but for now, no complaints: good support on the titles I play, with occasional tweaking. My next upgrade will be when I can get one card that beats my 1080 Tis by at least 20% across the board. (Or maybe when direct neural interfaces are a thing.)
 
NEVER buy 2 cards when you can use the money to buy 1 faster card. You should have bought the Titan RTX.

I should have bought a card over twice the price of a 2080 Ti that offers a general 5% performance gain (and would get decimated in the multi-GPU games), that wasn't out at the time I upgraded, that would have given me a shorter system lifespan, and that will absolutely collapse in resale value the second it is no longer "the fastest card"? No, I'm pretty sure I shouldn't have bought one of the worst-value cards of all time. If we were talking about the original GTX Titan, which wrecked the GTX 680 and traded blows with multi-GPU systems, you would have a point, but the Titan RTX is a joke. Actually, with the original Titan I made a build with three so I could finally run Crysis . . . at 7680x1600.


Agreed. I never understood the appeal of running dual mid-range GPUs, as you added system complexity and made the performance of your system conditional on support that might or might not be there. You typically had less VRAM as a result of such a configuration as well. I've always felt SLI was best when used to get next-generation performance now, on the high end. I used it as a way to drive extremely high resolutions and multi-monitor gaming because no single card could do it. I used it to max out games that otherwise would have to be run on medium settings or worse.

The 2080 Ti is not a "mid-range GPU," nor does VRAM really have much relevance, given that I've never seen even a single game VRAM-limited on that card. My complaint, right now, is that SLI cannot be used for the exact purpose you say it is for: maxing out games that could not otherwise be maxed out. We should be using it to max out ray-traced games. Instead, they only use one card, and they chug. After I made the original post, Tomb Raider became the first game to actually do it right. Although the ray-tracing effects themselves in that game aren't exactly impressive, it at least allows a dual 2080 Ti build to run full throttle and more or less max the game at 4K 60 fps (there are some sections where the RTX effects have to be turned down one notch from the top to keep 60 fps).
 

I never said the RTX 2080 Ti was a mid-range card. I am simply saying that, comparatively, dual mid-range cards rely on SLI support, and mid-range cards generally have less VRAM than their high-end counterparts. In an SLI configuration, each card's VRAM holds a duplicate of the other's data, so even when the smaller pool isn't a problem (and I've certainly seen it be an issue), the second card's VRAM isn't additive in the traditional sense. A single, more powerful card can potentially offer more capability than dual mid-range cards in SLI can. The reverse is also true in certain situations, but those are fewer and further between all the time. Again, I stand by the statement that if your choice is a single high-end GPU or two mid-range cards, I'd opt for the former. You will generally get a better gaming experience out of the single-GPU configuration.

I realize that with the latest generation of cards, using SLI to achieve results no single card can is a poor option now; there are only a handful of games where this is a valid approach. However, I was referring to the last ten years or so, when that was absolutely viable. I often started with a single card each generation and added the second one within a day or two, due to being unsatisfied with the performance of that single GPU or single graphics card. I've been doing this since dual GPUs became a thing.
 
I have been really disappointed with where SLI has been the last few years and will probably go single-card only from this point forward.

I want to play the upcoming Rage 2 on my 4K TV, but it appears there will probably be no SLI support for my two GTX 1080 cards. I am trying to decide if it is worthwhile to upgrade to a single RTX 2080; the RTX 2080 Ti is a bit too expensive for me. I would be content if the framerate minimum stays above 30 fps.
 
A single RTX 2080 is barely any faster than a single GTX 1080 Ti. Given the cost of one, it isn't an efficient upgrade at all; I think the difference is around 10% at best. That hardly makes up for the loss of the second GTX 1080 Ti, which is one of the reasons this generation sucked. While the RTX 2080 Ti is a worthy successor to the GTX 1080 Ti, it's just too expensive for people who bought at the $700 price point: an increase of around $400-$500 after taxes, which is massive in a single generation. I've explained my thoughts on this many times, but in short, I think it really replaced the Titan, and people at the 1080 Ti level didn't really get an upgrade at all. As a 1080 Ti owner at the time, I saw no upgrade besides a single RTX 2080 Ti, which isn't always an upgrade compared to dual 1080 Tis. Just looking at the numbers alone, the only upgrade from 1080 Ti SLI is RTX 2080 Ti SLI. Unfortunately, that doesn't work either, as SLI doesn't work worth a damn in very many games.

The way I see it, the RTX 2080 Ti is a great card, but one that's hurt by being priced too high. The rest of the RTX line is even worse off, as it's virtually no better than what it replaced.
 
I think RAGE 2 might be using the same engine as Mad Max, which was really well optimized.

I bet that if you tweaked the settings you could run at 4K with a single GTX 1080.
 
Thanks, Dan. I think I will buy this game for the Xbox One X and wait to see how the rest of the year pans out for video cards. I would buy an RTX 2080 Ti now if it were the price of a regular (non-Ti) RTX card, or at least consider it if it were under $1000.
 
Ugh... I am spending too much time obsessing over PC parts, lol. This is a "free" paycheck month for me, since I get paid three times, so I am considering getting an RTX 2080 Ti now, even at that price point; I'm just wondering how much my dated 4770K would hold it back. I miss the days when SLI worked for the majority of games.
 
The way GPU prices have been, it'd be a good idea to take that free-to-blow check and grab a GPU while you can.

The CPU/MB/RAM might even be cheaper than the card total, depending on what parts you go with.

Get the card out of the way; as long as you're not at 1080p screen res, you'll be fine enough.

Plan a new PC upgrade later.

I won't even mention my crazy setup, and sure, I need to update my sig.
 