Any SLI / 2080 TI owners feeling completely ripped off by NVIDIA?

Discussion in 'nVidia Flavor' started by JCD3ntonX, Mar 27, 2019.

  1. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,326
    Joined:
    Mar 22, 2008
    The Tomb Raider engine supports DX12 mGPU, which is not SLI in the traditional sense.

    DX12 (or Vulkan) mGPU works well but requires developers to specifically code support for it (unlike SLI/CrossFire, which was enabled at the driver level by the GPU vendors).
     
    lostin3d and IdiotInCharge like this.
  2. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    9,606
    Joined:
    Jun 13, 2003
    This is the impasse we're at today: engine developers have more or less implemented mGPU in DX12 and Vulkan, but widespread adoption by game developers hasn't happened yet.

    My bet is that it will likely take off again as both games and output devices get more demanding. Ray tracing will be in everything and used to greater effect while VR and 4k120 displays (and higher) become more mainstream.
     
    Manny Calavera and lostin3d like this.
  3. DooKey

    DooKey [H]ard DCOTM x4

    Messages:
    7,859
    Joined:
    Apr 25, 2001
    I've gone with SLI of the top-end cards up until the 10xx series. It just seemed like support was waning and not worth it anymore. I don't regret dropping SLI one bit.
     
    ShuttleLuv likes this.
  4. rinaldo00

    rinaldo00 [H]ard|Gawd

    Messages:
    1,419
    Joined:
    Mar 9, 2005
    NEVER buy 2 cards when you can use the money to buy 1 faster card. You should have bought the Titan RTX.
     
  5. Dan_D

    Dan_D [H]ard as it Gets

    Messages:
    53,425
    Joined:
    Feb 9, 2002
    Agreed. I never understood the appeal of running dual mid-range GPUs, as you added system complexity and made the performance of your system conditional on support that may or may not be there. You typically had less VRAM as a result of such a configuration as well. I've always felt SLI was best when it was used to get next-generation performance now on the high end. I used it as a way to drive extremely high resolutions and multi-monitor gaming because no single card could do it. I used it to max out games that otherwise would have had to be run on medium settings or worse.
     
    Manny Calavera, lostin3d and Azrak like this.
  6. Bawjaws

    Bawjaws Limp Gawd

    Messages:
    434
    Joined:
    Feb 20, 2017
    I agree with this, too. Buy the fastest single card that you can afford.
     
  7. lostin3d

    lostin3d [H]ard|Gawd

    Messages:
    1,962
    Joined:
    Oct 13, 2016
    cybereality likes this.
  8. eclypse

    eclypse 2[H]4U

    Messages:
    3,043
    Joined:
    Dec 7, 2003
    This is the first GPU upgrade where SLI didn't even cross my mind.

    I did use 1080 Ti SLI for a month or so, because I could, for Rise of the Tomb Raider at 4K.

    Yeah, I've been doing SLI since SLI was a thing, back with the Voodoo cards.

    I did quad SLI with GTX 580s and quad CrossFire with 7970s. Waste of money.

    I feel bad every time I hear someone talk about SLI with 2080 Ti cards.

    Just imagine buying two 2080 Ti Kingpin cards! Yeah, I read about that yesterday.


    SLI is really only good for benchmarks in today's world, unless you're running older games, but then why bother, with the power of a single 2080 Ti?
     
    lostin3d and cybereality like this.
  9. MavericK

    MavericK Zero Cool

    Messages:
    28,636
    Joined:
    Sep 2, 2004
    That was my experience with SLI. I had 680s and 970s in SLI, went to a single card with the 1080. Not likely to go back to dual cards unless DX12 mGPU becomes prevalent.
     
    DLGenesis likes this.
  10. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    9,606
    Joined:
    Jun 13, 2003
    Same, but with 670s, and both worked pretty well with the games I played at the time. With respect to the 970s, though, if I'd known that there was a 980 Ti on the way I'd just have waited for that instead.

    And yeah, SLI works, when the game support is there. Benchmarks show frametimes (or 99% FPS) scaling with overall framerates in good implementations.
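    A quick way to see the frametime side of that is to compute percentile "lows" from a frametime log. Here's a minimal sketch (the frametime values are made up): a good SLI implementation keeps the 1% low close to the average framerate, while a poorly paced one drags it down even when the average looks great.

```python
def fps_stats(frametimes_ms):
    """Average FPS and '1% low' FPS from a list of frametimes in ms.

    The 1% low is the FPS equivalent of the 99th-percentile frametime:
    sort the frametimes and report the framerate of the slowest 1%.
    """
    times = sorted(frametimes_ms)
    avg_fps = 1000.0 / (sum(times) / len(times))
    p99 = times[min(len(times) - 1, int(len(times) * 0.99))]
    return avg_fps, 1000.0 / p99

# Even pacing: steady 10 ms frames, the well-scaling case
even = [10.0] * 100
# Uneven pacing: same 100 FPS average, but alternating 5/15 ms frames
uneven = [5.0, 15.0] * 50

print(fps_stats(even))    # (100.0, 100.0)
print(fps_stats(uneven))  # 100 FPS average, but a 1% low of only ~66.7
```

Both runs average 100 FPS, but only the evenly paced one feels like it.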
     
    lostin3d likes this.
  11. elvn

    elvn 2[H]4U

    Messages:
    3,097
    Joined:
    May 5, 2006
    I'm still running two 1080 Ti hybrids in SLI with a high-bandwidth bridge. A lot of my favorite games support it. Without hitting at least a 100fps+Hz average you really aren't getting appreciable gains out of high Hz, and even a 100fps average is usually something like a 70 - 100 - 130fps graph. At 120fps+Hz you get 50% less of 60fps-Hz's smearing blur, and double the motion definition and pathing of the entire viewport moving in relation to you in 1st/3rd person games while mouse-looking, movement-keying, or controller panning.
    A 100fps average (~70 - 100 - 130 with variable refresh rate) is about as low as I'm willing to go.
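    For reference, that 50% blur reduction follows from sample-and-hold persistence: each frame is held on screen for the full refresh interval while the eye keeps tracking, so perceived smear is roughly tracking speed times frame time. A rough sketch (the panning speed is a hypothetical example value):

```python
def sample_and_hold_blur(pixels_per_second, fps):
    """Approximate smear width in pixels on a sample-and-hold display:
    the eye tracks the moving image while each frame stays static for
    one full frame time, so blur ~= tracking speed * frame duration."""
    return pixels_per_second / fps

speed = 960  # px/s, hypothetical panning speed
blur_60 = sample_and_hold_blur(speed, 60)    # 16 px of smear
blur_120 = sample_and_hold_blur(speed, 120)  # 8 px: half the blur
print(blur_60, blur_120)
```

Doubling the sustained framerate-and-refresh halves the persistence blur, which is why the 60 vs. 120fps+Hz difference is so visible during viewport movement.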

    Personally I'm waiting for a die shrink and HDMI 2.1 outputs on GPUs. For now it seems they released a whole GPU generation without HDMI 2.1, and are going to milk a whole series of proprietary 4K monitors that can only do 120Hz and variable refresh rate over their included DisplayPort connection, instead of supporting 120Hz with VRR over HDMI 2.1 on any HDMI 2.1 monitor or TV that comes out. It's a slow release roadmap, with a lot of monitors pushed back too. Upgrading is not worth it to me until 7nm + HDMI 2.1, and I usually wait for the Ti hybrids at that. If a single top-tier card could do a 100fps-Hz average or better at 4K on very demanding games at very high+ to ultra settings (even with a few over-the-top settings turned off), I'd consider a single one instead of dual. It would also depend on what games support mGPU at that time.




     
    Last edited: Apr 10, 2019
  12. eclypse

    eclypse 2[H]4U

    Messages:
    3,043
    Joined:
    Dec 7, 2003
    I'm pretty sure my 2080 Ti FTW3 Ultra was doing 100+ at 4K the other night when testing, hehe. I had vsync off just to see the frame rate; that was BFV, at high-to-ultra settings... but I'm guessing high.

    I normally play maxed out though on my 34" ultrawide 1440p 120Hz monitor.

    The Samsung 40" 4K screen on my desk is a 2nd monitor for Netflix viewing.

    I'm pretty sure, as I remember being in shock over it, but it was screen tearing like crazy, so I had to exit, turn vsync back on, limit it to 60Hz, and lock the frame rate to 60 in game.

    Too laggy with input though to be doable for a fast game, and I was glad to swap back to the 1440p screen with G-Sync and 120Hz.
     
    Last edited: Apr 10, 2019
  13. elvn

    elvn 2[H]4U

    Messages:
    3,097
    Joined:
    May 5, 2006
    I'll have to see what the performance difference is between single and dual once an HDMI 2.1 GPU comes out, hopefully with a die shrink.

    This quote still applies to a lot of games. Of course you can dial in (down) the graphics settings to gain motion clarity (blur reduction) and motion definition (smoothness, pathing, animation). I like to play at VeryHigh+ to Ultra- if I can afford to, but a 100fps+Hz average (a 70 - 100 - 130 graph for the most part) is about as low as I'm willing to go.

    https://www.gamersnexus.net/guides/3419-sli-nvlink-titan-rtx-benchmark-gaming-power-consumption

     
    Last edited: Apr 17, 2019
  14. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,326
    Joined:
    Mar 22, 2008
    One thing to keep in mind: a higher FPS number does not necessarily translate into a better experience, especially with multi-GPU.

    In my case I was running Vega 64 Crossfire on Far Cry 5. CF did improve the framerate number, but at the cost of smoothness in the game.

    I believe this may have been microstutter; after replacing the 2 cards with a Radeon VII the stutter went away.

    Now I get a lower FPS number (it may have been around 90 fps w/ 2 cards and 70 fps w/ 1), but I have a much better experience.
     
    Chimpee likes this.
  15. elvn

    elvn 2[H]4U

    Messages:
    3,097
    Joined:
    May 5, 2006
    I can understand that. It could have been straight-up microstutter, but some types of AA stutter in SLI too. It's not a perfect solution, but dropping anywhere near -30 << 60 >> 90- graphs is like molasses to me and reintroduces smearing blur during viewport movement. Not that it's unplayable; to be fair, I play Dark Souls 3 at 60fps since I have no other choice.

    Hopefully a die shrink on a hdmi 2.1 gpu will come along in Ti tier eventually so I can see what a single gpu like that can do at 4k and in HDR where available, and in that gpu's generation of game graphics ceilings.

    Luckily I'm a pretty patient gamer nowadays, so by the time I buy a game there have been a bunch of NVIDIA driver updates, which sometimes improve that game, as well as patches from the game's developers: both overall and, in some top games' cases, SLI additions/improvements. It's nice to have where it works.

    ==========================================================

    Quoting myself:

    Some games are much worse than others, and if your frame rate graphs are lower, the effects of microstutter can appear worse. Think of the rapidity of the frame delivery: a stutter becomes more of a quiver, and a quiver becomes not much of anything. Again, it depends on the game engine and the graphics settings.

    -----------------------

    There are games where microstutter can be horrible, and if you aren't using a high-bandwidth SLI bridge, or if you run lower fps graphs, you'll experience it even more across the board. In addition to running DX11, several SLI-capable games also require you to turn AA down to minimal levels or off to avoid obnoxious microstutter, e.g. "you have to turn AA down to low so that TAA gets turned off. TAA uses the last frame to AA the next frame, which causes a problem with AFR and SLI."

    SLI doesn't work on all games, and some of the ones it works on require workarounds/DIY fixes.

    SLI scaling varies: even when it's scaling, some games do 30% or 60%, while some still do ~90%.

    SLI requires DX11 (unless a newer game supports NVLink).

    SLI official support can potentially take months of game patches and nvidia driver updates before the wrinkles are ironed out - though some top games work great near/at launch.

    SLI microstuttering can be overt on some game engines (usually ones poorly optimized to start with), when running frame rate graphs with low bottoms, and when not using a high-bandwidth SLI bridge.

    SLI is very expensive, and you can get by just fine, probably better served in price/performance, with a single card and a lower-resolution monitor like 1440p for 100fpsHz+ gameplay, in order to get appreciable gains out of higher Hz.

    It's definitely not for everyone, and it's unnecessary at 1440p for the most part with the top-tier modern GPUs

    (even if you'd still have to dial some over-the-top settings down on the most demanding or otherwise poorly optimized games to keep a 100fpsHz+ average).

    -------------------------------

    So it is far from a perfect solution for every game and every game's graphics/AA settings. But playing beneath a 100fpsHz average, with its sample-and-hold blur and lack of motion definition, on a high-Hz monitor is useless to me, so SLI gives me that at higher resolutions and in higher-demand games that have supported it adequately in the last few years (Witcher 3, GTA V, Dishonored 2, Prey, Overwatch, Dark Souls 3 (able to maintain 60fps with cranked settings, even modded graphics), Shadow of Mordor/War, Far Cry Primal, Vermintide 2, etc.). I haven't tried Black Ops 4 yet, but I heard it runs well in SLI, and the Dirt series works with SLI too. I wouldn't even consider buying into 4K 120Hz on some games without it. 1440p has now moved into the sweet spot, more or less, for a single top-tier GPU though, even if you have to dial some of the over-the-top settings down on the most demanding or otherwise unoptimized games to hit a 100FpsHz+ average.

    ===========================================================
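    To put the scaling percentages quoted above (30%, 60%, ~90%) in concrete terms: a second card at X% scaling multiplies the single-card framerate by (1 + X/100). A tiny sketch, with a hypothetical 60fps single-card baseline:

```python
def sli_fps(single_card_fps, scaling):
    """Effective FPS with a second card at a given scaling factor:
    1.0 (100%) would double the framerate; 0.0 adds nothing."""
    return single_card_fps * (1 + scaling)

base = 60  # hypothetical single-card average at 4K
for s in (0.30, 0.60, 0.90):
    print(f"{int(s * 100)}% scaling -> {sli_fps(base, s):.0f} FPS")
# 30% -> 78 FPS, 60% -> 96 FPS, 90% -> 114 FPS
```

Which is why scaling quality matters so much: at 30% the second card barely earns its cost, while at ~90% it nearly doubles your framerate.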
     
    Last edited: Apr 17, 2019
  16. schoenda

    schoenda Gawd

    Messages:
    860
    Joined:
    Apr 8, 2003
    Another old-timer here. I am currently running SLI 1080 Ti's and I am completely satisfied gaming at 1440p. I do think mGPU may be going the way of the dodo, but for now, no complaints: good support on the titles I play, with occasional tweaking. Next upgrade will be when I can get one card that beats my 1080 Ti's by at least 20% across the board. (Or maybe when a direct neural interface is a thing.)
     
  17. JCD3ntonX

    JCD3ntonX n00b

    Messages:
    19
    Joined:
    Mar 2, 2017
    I should have bought a card over twice the price of a 2080 Ti, that offers a general 5% performance gain (and would get decimated in the multi-GPU games), that wasn't out at the time I upgraded, that would have given me a shorter system lifespan, and that will absolutely collapse in resale value the second it is no longer "the fastest card"? No, I'm pretty sure I shouldn't have bought one of the worst-value cards of all time. If we were talking about the original GTX Titan, which wrecked the GTX 680 and traded blows with multi-GPU systems, you would have a point, but the Titan RTX is a joke. Actually, with the original Titan I made a build with 3 so I could finally run Crysis . . . at 7680x1600.


    The 2080ti is not a "mid-range GPU," nor does VRAM really have much relevance given that I've never seen even a single game VRAM limited on that card. My complaint, right now, is that SLI cannot be used for the exact purpose you say it is for: maxing out games that could not otherwise be maxed out. We should be using it to max ray-traced games. Instead, they only use one card, and they chug. After I made the original post, Tomb Raider became the first game to actually do it right. Although the ray-tracing effects themselves in that game aren't exactly impressive, it at least allows a dual 2080ti build to run full-throttle and more or less max the game at 4k 60fps (there are some sections where the RTX effects have to be turned down one notch from the top to keep 60fps).
     
    GoldenTiger likes this.
  18. Dan_D

    Dan_D [H]ard as it Gets

    Messages:
    53,425
    Joined:
    Feb 9, 2002
    I never said the RTX 2080 Ti was a mid-range card. I am simply saying that, comparatively, dual mid-range cards rely on SLI support, and mid-range cards generally have less VRAM than their high-end counterparts do. In an SLI configuration, half of it is used to duplicate the other card's data. Even if less VRAM isn't a problem, which I've certainly seen be an issue, the second card's VRAM isn't helpful in the traditional sense. A single, more powerful card can potentially offer more capability than dual mid-range cards in SLI can. The reverse is also true in certain situations, but those are fewer and further between all the time. Again, I stand by the statement that if your choice is a high-end single-GPU card or two mid-range cards, I'd opt for the former. You will generally get a better gaming experience out of the single-GPU configuration.

    I realize that with the latest generation of cards, using SLI to achieve results no single card can is a poor option now. There are only a handful of games where this is a valid approach. However, I was referring to the last ten years or so where that was absolutely viable. I often started with a single card each generation and added the second one within a day or two due to being unsatisfied with the performance of that single GPU or single graphics card. I've been doing this since dual GPUs became a thing.
     
    cybereality likes this.
  19. Arioch

    Arioch Limp Gawd

    Messages:
    382
    Joined:
    Jun 5, 2004
    I have been really disappointed with where SLI has been the last few years and will probably only go single-card from this point forward.

    I want to play the upcoming Rage 2 on my 4K TV, but it appears there will probably be no SLI support for my 2x GTX 1080 cards. I am trying to decide if it is worthwhile to upgrade to a single RTX 2080; the RTX 2080 Ti is a bit too expensive for me. I would be content if the framerate minimum is above 30fps.
     
  20. Dan_D

    Dan_D [H]ard as it Gets

    Messages:
    53,425
    Joined:
    Feb 9, 2002
    A single RTX 2080 is barely any faster than a single GTX 1080 Ti. Given the cost of one, it isn't an efficient upgrade at all. I think the difference is around 10% at best. That hardly makes up for the loss of the second GTX 1080 Ti. That was one of the reasons why this generation sucked. While the RTX 2080 Ti is a worthy successor to the GTX 1080 Ti, it's just too expensive for people who bought at the $700 price point. It's an increase of around $400-$500 after taxes. That's massive in a single generation. I've explained my thoughts on this many times, but in short, I think it really replaced the Titan, and people at the 1080 Ti level didn't really get an upgrade at all. As a 1080 Ti owner at the time, I saw no upgrade besides a single RTX 2080 Ti, which isn't always an upgrade compared to dual 1080 Ti's. Just looking at the numbers alone, the only upgrade from 1080 Ti SLI is RTX 2080 Ti SLI. Unfortunately, that doesn't work either, as it doesn't work worth a damn in very many games.

    The way I see it, the RTX 2080 Ti is a great card, but one that's hurt by being priced too high. The rest of the RTX line is even worse off, as it's virtually no better than what it replaced.
     
    lostin3d, noko and Arioch like this.
  21. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,326
    Joined:
    Mar 22, 2008
    I think RAGE 2 might be using the same engine as Mad Max, which was really well optimized.

    I bet if you tweaked the settings you could run at 4K with a single GTX 1080.
     
  22. Arioch

    Arioch Limp Gawd

    Messages:
    382
    Joined:
    Jun 5, 2004
    Thanks Dan... I think I will buy this game for the Xbox One X and wait to see how the rest of the year pans out for video cards. I would buy an RTX 2080 Ti now if it was the price of a regular RTX, or at least consider it if it was under $1000.
     
  23. Arioch

    Arioch Limp Gawd

    Messages:
    382
    Joined:
    Jun 5, 2004
    Ugh... I am spending too much time obsessing over PC parts, lol. This is a "free" paycheck month for me since I get paid 3 times, so I am considering getting an RTX 2080 Ti now, even at that price point; I'm just wondering how much my dated 4770K would hold it back. I miss the days when SLI worked for the majority of games.
     
  24. eclypse

    eclypse 2[H]4U

    Messages:
    3,043
    Joined:
    Dec 7, 2003
    The way GPU prices have been... it'd be a good idea to take that free-to-blow check on a GPU while ya can.

    The CPU/MB/RAM might even be cheaper than the card total, depending on what parts you go with.

    Get the card out of the way... as long as you're not at 1080p screen res you'll be fine enough.

    Plan a new PC upgrade later.


    I won't even mention my crazy setup, and I'm sure I need to update my sig.