2080 Ti / 2080 Ownership Club

I had 2080 Ti SLI for a bit when they first came out.

SLI is dying, for the most part, but it does still work in some games. If you're lucky you can expect around a 50% boost when a game is supported; some titles scale better, some worse.

Ultimately I found there was too much BS with SLI support and sold the second card. But when it worked, it was nice.
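For reference, that ~50% figure is just the uplift you measure between single- and dual-card runs. A quick sketch (Python; the framerates are hypothetical, purely to illustrate the math):

```python
# Express SLI scaling as the percentage uplift of two cards over one.
# The framerates below are hypothetical, just to illustrate the ~50% case.
def sli_scaling(fps_single: float, fps_dual: float) -> float:
    return (fps_dual / fps_single - 1) * 100

print(f"{sli_scaling(60, 90):.0f}% boost")  # 60 fps -> 90 fps with the second card
```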
 
I had 2080 Ti SLI for a bit when they first came out.

SLI is dying, for the most part, but it does still work in some games. If you're lucky you can expect around a 50% boost when a game is supported; some titles scale better, some worse.

Ultimately I found there was too much BS with SLI support and sold the second card. But when it worked, it was nice.

The funny thing is I moved off 1080 Ti SLI to the 2080 Ti because I hated the SLI support. I honestly don't expect much difference, but for the games that do support it, I'll take advantage of it as much as possible. Gaming at 4K/98Hz (love that 10-bit color, let alone 4K/120Hz) has been difficult in some games at max settings.
 
Yes, I understand. I was on triple 1440p for a while (7680x1440 @ 144Hz) and it was a struggle even with the best hardware.

I had 2080 Ti SLI and hitting 100 fps (let alone 144 fps) in new games was very hard without turning down settings, and sometimes just not possible.

So I ended up selling the monitors and getting one 1080p ultrawide. Seems crazy to drop to 1080p, but performance is much better and I don't have to worry anymore.
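The raw pixel counts explain most of that struggle. A quick back-of-the-envelope sketch (Python; I'm assuming the ultrawide is a 2560x1080 panel):

```python
# Rough pixel-count comparison: why triple 1440p is so much harder to drive
# than a single 1080p ultrawide. Illustration only.
resolutions = {
    "triple 1440p (7680x1440)": (7680, 1440),
    "4K (3840x2160)": (3840, 2160),
    "1440p (2560x1440)": (2560, 1440),
    "1080p ultrawide (2560x1080)": (2560, 1080),
}

base = 2560 * 1080  # the ultrawide as the baseline
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.1f} MP, {pixels / base:.2f}x the ultrawide")
```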
 
I've had a 2080 Ti FE for a while now, and was using it to game on a 27 inch 1440p 165Hz monitor. Well, stupid me decided to upgrade to the Asus PG27UQ 4K 144Hz monitor a few weeks back, and I noticed a few games were struggling to maintain decent frames at max settings. So how do I decide to solve this? By paying $1000 for a second, used 2080 Ti FE off eBay, and an NVLink bridge off Amazon. Looks like sometime this week I'll have dual 2080 Tis.

Depending on performance numbers, I'll probably either resell it, or do it proper and get them both under water. I'll share my numbers once I get the cards tested and compare them with newer games.
 
What games were you struggling with? There are very few games that are unacceptable on my PC at 4K. I can't even think of any recently where I had to touch the settings. The last one was Quake II RTX, where I turned the resolution down to 1920x1080 for ray tracing in its full glory.
 
You can see the video I did last year with 2080 Ti NVLink.



Tomb Raider was the best case (native DX12 mGPU), and even then I was just breaking 100 fps. Project Cars was really fast with two cards, but it's still decent with one.

Far Cry 5 was around 90 fps with texture flickering I couldn't fix. Watch Dogs 2 was hovering around 60 - 75 fps.

If you are talking about 4K high refresh, there is nothing you can buy to hit those framerates, outside of e-sports or old titles.

Probably the best you can expect realistically is 90 - 100 fps and hope you have G-Sync/FreeSync to smooth it out.
 
Personally I drop my 4k to 1440p if I want more frames. It’s a 55” screen at around 6’ so I can barely tell the difference.

Alternatively, there are usually one or two settings that tank frames, like shadows on ultra instead of high, and I can't tell the difference anyway.
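That lines up with the pixels-per-degree math, for what it's worth. A rough sketch (Python; the 16:9 geometry and the ~60 PPD rule of thumb for 20/20 acuity are my assumptions, not anything measured):

```python
import math

def pixels_per_degree(diagonal_in, horizontal_res, distance_in, aspect=(16, 9)):
    """Approximate horizontal pixels per degree of viewing angle."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)  # physical screen width
    angle_deg = math.degrees(2 * math.atan(width_in / (2 * distance_in)))
    return horizontal_res / angle_deg

# 55" screen viewed from about 6 feet (72"), 4K vs 1440p
for name, res in [("4K (3840 wide)", 3840), ("1440p (2560 wide)", 2560)]:
    print(f"{name}: ~{pixels_per_degree(55, res, 72):.0f} pixels per degree")
# Around 60 PPD is a common 20/20-acuity rule of thumb, and both land near or
# above it at this distance, which is why the drop is hard to spot.
```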
 
You can see the video I did last year with 2080 Ti NVLink.



Tomb Raider was the best case (native DX12 mGPU), and even then I was just breaking 100 fps. Project Cars was really fast with two cards, but it's still decent with one.

Far Cry 5 was around 90 fps with texture flickering I couldn't fix. Watch Dogs 2 was hovering around 60 - 75 fps.

If you are talking about 4K high refresh, there is nothing you can buy to hit those framerates, outside of e-sports or old titles.

Probably the best you can expect realistically is 90 - 100 fps and hope you have G-Sync/FreeSync to smooth it out.


If I were getting 90-100 FPS in Star Wars Battlefront 2 maxed out with PCSS shadow quality instead of 50 fps, or in Control with ray tracing enabled, I'd probably stick with it. I'm hoping they've improved scaling some in a year, but we'll see. The second card arrives this week, so here's hoping. Hahaha
 
Personally I drop my 4k to 1440p if I want more frames. It’s a 55” screen at around 6’ so I can barely tell the difference.

Alternatively, there are usually one or two settings that tank frames, like shadows on ultra instead of high, and I can't tell the difference anyway.
This. In many games, dropping a couple of settings will look virtually identical in-game compared to max settings. It's certainly doable to hit the marks you want at 4K without SLI. You won't see the difference at all in motion.
 
This. In many games, dropping a couple of settings will look virtually identical in-game compared to max settings. It's certainly doable to hit the marks you want at 4K without SLI. You won't see the difference at all in motion.
It does seem that the settings that have the most impact on GPU performance are those that have the smallest impact on visual quality in motion. PCSS typically tanks framerate and is one of the first settings I lower if I'm having issues. Soft particles is another. Ambient occlusion is one that does have a big visual impact in addition to tanking performance, and I'll only touch it as a last resort.
 
Yeah, I normally don't play with ultra settings. I just did that for the video cause I know people ask for ultra.

Most of the time I will tweak settings down to a custom high, so I can get the fastest framerates.

I have no problem putting some settings on medium or low if they don't make a big visual impact and will net me more frames.
 
Long time reader of these forums... first time poster! lol. :)

Anyway, I joined the 20 series club about 2 weeks ago. I snagged an EVGA 2080 Ti FTW3 Ultra. Microcenter had them on "sale" for $1398 plus a $100 rebate. I was looking for the larger PCB for better cooling and the larger power limit (373W) to maintain higher boost clocks and, in theory, get a decent overclock. Not sure it was worth the extra $200 over the XC Ultra, but I'm a sucker for extra performance, no matter what.

Right now, I am 100% game and benchmark stable at 2130MHz core and 8000MHz memory. The larger cooler does its job for sure, as temps on the core never get above 64C and the memory under full load is about 75C. I can maintain the boost clock the entire time. Sadly, even with the power slider at 124%, I think I'm still limited. If I up my OC to 2145MHz or higher, I'll occasionally crash in Control, but not in other games. Still, that indicates it isn't stable that high for me with RT in use.

I came from two 1080s in SLI, which still worked great in games that supported SLI (or could be hacked to support it with Nvidia Inspector). However, I PC game to be a graphics whore, so not being able to ray trace was eating away at me and I couldn't wait for the next generation. Plus, dwindling SLI support all but sealed the deal for me on newer games.

First impressions are that on older games, or DX12 games that fully support mGPU and SLI, it's a lateral move. Negligible increase for me over two 1080s for the price... maybe a ~10 to 15% increase. However, in games that only support one GPU, or games with DXR, it's obviously a massive increase in performance (the 10 series obviously chokes with DXR).

One important note: after running SLI for 6 or more years, I forgot how bad microstutter could be. Even if my frames are slightly lower now in some older SLI-supported games, I have noticed an overall smoothness; the experience just feels so much more fluid and solid. I never really noticed it until now...

To me, DXR is underrated. I've been playing Control, Metro, BFV and SOTTR to test this new GPU out, and while DXR is more "detail" oriented, I don't think I could ever go back. It just looks so much better once you have used it fully.

I'm excited to see what's next, as even my 2080 Ti at 1440p will choke down to 55~60 FPS in some spots in Control with RT fully enabled. Very playable, but as I'm used to 100+ FPS on my 144Hz G-Sync monitor, I'd love to be able to hit that again someday with RT on.

Overall, I am pleased thus far. A bit more than I would have cared to pay, but such is the price of PC gaming and wanting to be on the forefront of a brand new technology. Plus, with dwindling SLI support, I felt buying the best single video card one could buy was a worthwhile jump to make again. Hopefully Intel and AMD can help keep Nvidia honest again, though. But considering the technology and die size, perhaps it's not as much of a gouge as I perceive... who knows!
 
Jesus Christ, 2080 Ti SLI is awful. I tested it in three applications: 3DMark (which worked perfectly), Star Wars Battlefront 2, and Black Ops 4. In Star Wars BF2 I was getting significantly better frames, but two things: weird auras showed up around the characters at times, and I'd get some stuttering. In BO4, it was weird shakiness around the player model and a lot of stutter. Maybe I need to play around with it for a bit, but if THIS is it, I'm probably just going to sell the second card VERY quickly. So far, this has been a much worse experience than 1080 Ti SLI...
 
Jesus Christ, 2080 Ti SLI is awful. I tested it in three applications: 3DMark (which worked perfectly), Star Wars Battlefront 2, and Black Ops 4. In Star Wars BF2 I was getting significantly better frames, but two things: weird auras showed up around the characters at times, and I'd get some stuttering. In BO4, it was weird shakiness around the player model and a lot of stutter. Maybe I need to play around with it for a bit, but if THIS is it, I'm probably just going to sell the second card VERY quickly. So far, this has been a much worse experience than 1080 Ti SLI...

That's sad to hear, considering that if these RT games had native mGPU support, the boost in frames would make a much larger difference in perceived gameplay smoothness. The last few years we have been spoiled with very high frame rates, but with the arrival of ray tracing, the idea of more power through multiple GPUs seems somewhat important again! Even the massive 2080 Ti is brought to its knees at 4K with RT, and even at 2K, full-on ray tracing pulls you well under 100 FPS in most games.
 
I would try more games, some do work (Tomb Raider, for example), but overall SLI is more hassle than it's worth.

I had Vega 64 Crossfire for like 2 years before I got the Radeon VII and realized I was seeing stutter the whole time (despite higher fps numbers). That is when I sold my 2nd 2080 Ti.
 
I finally jumped to a 2080 Ti FTW3 about 3 weeks ago. Haven't even tried overclocking yet; its factory boost clock is 1770MHz, pretty sure. Should I just move the power slider up to 121% and let it do its own thing?
 
I finally jumped to a 2080 Ti FTW3 about 3 weeks ago. Haven't even tried overclocking yet; its factory boost clock is 1770MHz, pretty sure. Should I just move the power slider up to 121% and let it do its own thing?

Pretty much. And set a fan profile for yourself if you want. I think they usually boost up to around 1900MHz stock; OCs are usually around 2000. Nvidia took Turing much closer to the edge than they traditionally have.
 
I finally jumped to a 2080 Ti FTW3 about 3 weeks ago. Haven't even tried overclocking yet; its factory boost clock is 1770MHz, pretty sure. Should I just move the power slider up to 121% and let it do its own thing?

Throw the power and voltage sliders to max... and set a fan profile to keep temps in the low 60s so you never have to worry about throttling. The 3 fans are not as loud (to me anyway) compared to 2-fan solutions, so I have mine set to run at 100% at 65C, since I care more about the performance and can't hear them over my games anyway.

Once you're at that point, just raise the core until instability and back off (the average is between 2000 and 2100 for most people). Next, you can probably safely go to 7500 on the memory right away and tweak up as required. I recommend running Valley in windowed mode and watching FPS to see where the memory speed/timing cutoff is for you in terms of performance. Keep an eye on the GDDR6 temps in the ICX window. I'd try to stay below 80C on the memory; I personally decided to keep mine under 75C. Compared to stock 2080 Ti FTW3 speeds (which are already faster than reference), I noticed a nice performance jump from overclocking. :)

Have fun OC'ing; it's almost impossible to hurt video cards by overclocking unless you decide to voltage mod past Nvidia's set limits.
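That "raise the core until instability and back off" loop is simple enough to sketch. A minimal sketch (Python), where apply_core_offset() and run_stress_test() are hypothetical stand-ins for whatever OC tool and stability test you actually use (Precision X1 plus a Valley loop, for example):

```python
def find_stable_core_offset(apply_core_offset, run_stress_test,
                            start_mhz=0, step_mhz=15, max_mhz=200):
    """Step the core offset up until the stress test fails, then back off one step.

    apply_core_offset and run_stress_test are hypothetical callables standing in
    for your OC tool and your stability test (a benchmark loop, a demanding game).
    """
    last_stable = start_mhz
    offset = start_mhz
    while offset <= max_mhz:
        apply_core_offset(offset)
        if not run_stress_test():   # first failure found
            break
        last_stable = offset        # this offset passed
        offset += step_mhz
    apply_core_offset(last_stable)  # settle on the last offset that passed
    return last_stable
```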
 
Seeing the EVGA 2080 Ti with the AIO going for under US$1300, that's what I'd go with if I made the jump. Love my 1080 Ti with one.

Also revived an aging two-fan GTX 970 for an ITX rig, and while stable, holy hell is that thing loud when you choke off the airflow. Have a Noctua on the CPU set to 100% fan in the BIOS, 8700K with MCE boosting to 4.7GHz on all cores, and I can't even hear it from the seating position and it doesn't skip a beat.

I'd throw an AIO on the GPU if there was room; as it stands, there isn't room for a three-fan version, and I'll likely take to dremeling out some more venting to let it breathe. The ITX case is the Cryorig Taku, for reference. It's fairly dumbly laid out for performance / gaming, but I'm making it work, and it does indeed look sharp in the living room.
 
So far my card can do an 850 offset on memory and 100 on core, with no voltage offset and only a 124% power target (as far as my slider goes in Precision X1). At 1000 on the memory, or 140 on the core, it's unstable without the voltage offset.

After adding the 100 offset on voltage, it can do a 1000 offset on the memory and 100 on the core (all I've tried so far) and it's still stable. Seeing 2055MHz after it settles down in Battlefront II; it initially starts around 2090ish. Tried 140 on the core and the game CTDs; just tried 120 and it was fine. The game now runs at 2085, so somehow it's 10MHz more stable? (Got +30 in-game from a +20 offset bump.)

At 3440x1440 getting 130+ fps. Monitor caps out at 100 so I guess I need a faster display :)

Nice how easy these oc :)
 
I'm joining the club with a 2080 Ti Gaming X Trio. Beast of a card. Currently testing it out at 2100MHz (950mV) and 8000 memory in BF V. It's spring here, and my temps under this load are around 45 to 50 degrees. The card is not loaded 100% all the time since I'm on a 7700K at 5GHz.
I'm really impressed with this card!!!
 
I'm joining the club with a 2080 Ti Gaming X Trio. Beast of a card. Currently testing it out at 2100MHz (950mV) and 8000 memory in BF V. It's spring here, and my temps under this load are around 45 to 50 degrees. The card is not loaded 100% all the time since I'm on a 7700K at 5GHz.
I'm really impressed with this card!!!

Don't forget DSR is a thing if there are games where your GPU is underutilized. I find it a great way to burn extra FPS.

I also think a little bit of DSR + 2xMSAA looks better than 4xMSAA, and it's possible to get slightly higher FPS that way.
 
Don't forget DSR is a thing if there are games where your GPU is underutilized. I find it a great way to burn extra FPS.

I also think a little bit of DSR + 2xMSAA looks better than 4xMSAA, and it's possible to get slightly higher FPS that way.
The card is usually at 9x%. I mean, with my 1080 Ti I got 99% almost all the time, but with this beast I guess my CPU is bottlenecking. Can you tell me more about DSR?
Thanks!
 
Don't forget DSR is a thing if there are games where your GPU is underutilized. I find it a great way to burn extra FPS.

I also think a little bit of DSR + 2xMSAA looks better than 4xMSAA, and it's possible to get slightly higher FPS that way.
I've read up on what that acronym means, and yes, I knew about super resolution before, but don't you think DSR will make my CPU bottleneck even worse? I'm playing at 1440p 165Hz right now...
 
I've read up on what that acronym means, and yes, I knew about super resolution before, but don't you think DSR will make my CPU bottleneck even worse? I'm playing at 1440p 165Hz right now...

Yeah, it’s basically SSAA with a filter. It renders the image at a higher resolution which mainly puts more load on your GPU.

If you're in the 9x% range and are aiming for high Hz, you're fine, but if you ever have fps to burn, DSR works in the vast majority of games.
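In spirit it really is just "render big, smooth it, scale it back down." A toy sketch of that idea (Python with NumPy/SciPy; the small random array stands in for a real frame, and the sigma value is arbitrary):

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

# Toy illustration of the "supersample, filter, downscale" idea behind DSR.
# frame_hi stands in for a 2x-per-axis supersampled frame; values are just noise.
frame_hi = np.random.rand(432, 768, 3)                     # pretend render target
smoothed = gaussian_filter(frame_hi, sigma=(0.8, 0.8, 0))  # DSR-style smoothing pass
frame_out = zoom(smoothed, (0.5, 0.5, 1), order=1)         # downscale to native res
print(frame_out.shape)                                     # (216, 384, 3)
```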
 
You can add me to the club. Sold my Radeon VII (for a decent $800 CAD) and picked up a Gaming X Trio 2080 Super. Haven't used it much yet, but in the small amount of gaming I have done, the GPU ran whisper quiet. Especially compared to the jet engine that was the VII's cooler.
 
Yeah, it’s basically SSAA with a filter. It renders the image at a higher resolution which mainly puts more load on your GPU.

If you're in the 9x% range and are aiming for high Hz, you're fine, but if you ever have fps to burn, DSR works in the vast majority of games.
Yes, 165 is good at my resolution with my hardware, I guess, but I'll try that tech when I get the chance, to see if it bottlenecks my CPU or just ends up using more of the GPU...
 
It renders the image at a higher resolution which mainly puts more load on your GPU.

This seems backward a bit to me. While I understand that some visuals do increase CPU usage with resolution, framerates are also going to go down a bit too, right?

Misread
 
DSR is great to get better image quality for older or less intensive titles.

Keep in mind that the "virtual" resolution you set will be what the game is rendering at.

So if you set a virtual 4K res, the game performance will be that of 4K res (even if you have a 1080p monitor).

I use DSR for old games like Half-Life 2 or Left 4 Dead. I'm running them at a 5K ultrawide resolution (5120x2160) at 166Hz. It's amazing.

It does not look quite as sharp as a real 4K or 5K monitor, obviously, but it looks way better than 1080p. Worth looking into if you have performance left on the table.
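To put rough numbers on that, here's a sketch (Python) mapping DSR factors to render resolution and relative pixel load from a 1080p native res. The factor list is the usual set I remember NVIDIA exposing, so treat it as illustrative rather than exact:

```python
import math

# The "virtual" DSR resolution is what the game actually renders, so GPU cost
# roughly scales with the DSR factor (which multiplies total pixel count).
native_w, native_h = 1920, 1080
factors = [1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00]  # quoted from memory

for f in factors:
    scale = math.sqrt(f)  # per-axis scale that gives an f-times pixel count
    w, h = round(native_w * scale), round(native_h * scale)
    print(f"{f:.2f}x DSR -> ~{w}x{h} ({f:.2f}x the pixel load of 1080p)")
```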
 
This seems backward a bit to me. While I understand that some visuals do increase CPU usage with resolution, framerates are also going to go down a bit too, right?

I mentioned that if you have fps to burn, DSR could be a good way to do it. It will lower your fps in trade for IQ. If he's CPU bottlenecked and his GPU is sitting at low utilization with settings cranked, it would be a good place for DSR, etc.
 
I mentioned that if you have fps to burn, DSR could be a good way to do it. It will lower your fps in trade for IQ. If he's CPU bottlenecked and his GPU is sitting at low utilization with settings cranked, it would be a good place for DSR, etc.

Got that part - just not sure how it would increase CPU usage?

Misread
 
Anyone tried BFV with ray tracing on? If so, do you get noticeably lower FPS vs RT off?
 
Anyone tried BFV with ray tracing on? If so, do you get noticeably lower FPS vs RT off?

I play BFV at 1440p with ray tracing set to Ultra. Frames are obviously lower than with RT off, but I'm still able to maintain 80~100 FPS on average. Sometimes it's higher, near 120~130; sometimes when it's raining on a large map it drops into the 60~70 range. However, for me, the details are well worth it and it is still extremely playable. TBH, I couldn't go back to having it off even if I had to, because I'm a huge fan of max graphics and details in games.
 
What monitor would you guys suggest for a dual 2080 Ti setup?
 