How Does the GTX 1080 Ti Stack Up in 2020?

The way I see it is this: if you currently own a 1080Ti and play at 1080p or 1440p, there really is no reason to upgrade at all.

You're only doing it to have the latest and greatest. That's how good the card still is.

The only reason would be if you want to try ray tracing, I guess.
 


HUB also showed how well it holds up. Other than Control (and I don't even know who plays that game), it is still a great card!
 
The only reason would be if you want to try ray tracing, I guess.

That and higher frame rates. Depending on the game, the 1080Ti can't push things all that fast at 2.5k. You'll almost always get at least around 60, but of course faster is nicer when you have a nice high-refresh monitor. That's the biggest reason I want a new GPU (well, OK, second biggest; just liking new toys is the biggest). I have a 2560x1440 144Hz monitor and I'm a big fan of framerates above 60. It really does look and feel smoother. For some games, no problem, it'll push near the limit, or even be refresh-limited. For others, though, it just can't handle it. So it would be nice to have a beefier GPU that could regularly handle 120fps+ on new titles.

Now that said, the 2080Ti just isn't impressive enough for the price, which is why I don't have one. I was ready to buy one when it launched, but they can fuck right off with a $1200 price. Hence my 1080Ti continues to serve faithfully. But even without 4k, there is reason to want more. Keep in mind that 2.5k120 needs roughly the same amount of GPU power as 4k60.
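Rough math on that, if anyone's curious (a quick back-of-the-envelope Python sketch, assuming GPU load scales roughly with pixels pushed per second):

# pixels per second at each target (rough proxy for GPU load)
qhd_120 = 2560 * 1440 * 120   # 1440p @ 120 fps -> 442,368,000 px/s
uhd_60  = 3840 * 2160 * 60    # 4k @ 60 fps     -> 497,664,000 px/s
print(uhd_60 / qhd_120)       # 1.125, so 4k60 is only ~12.5% more demanding than 1440p120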
 
Now that said, the 2080Ti just isn't impressive enough for the price, which is why I don't have one. I was ready to buy one when it launched, but they can fuck right off with a $1200 price. Hence my 1080Ti continues to serve faithfully. But even without 4k, there is reason to want more. Keep in mind that 2.5k120 needs roughly the same amount of GPU power as 4k60.

I thought the same thing until I went 4K. There are plenty of titles that the 2080ti lets me run substantially better at 4k: AC Odyssey, GR Breakpoint, etc. You're correct, a 2080ti won't even get you to 4k 60 FPS in a lot of these newer games, but I can tell you that the 1080ti was barely getting me 30fps in AC Odyssey. When the difference is 30-40 FPS versus 40-50 FPS, along with substantially higher minimums, the 2080ti is worth it. Further, a number of older titles get boosted to the point where I'm hitting 100-120 at 4k.

Further, people will downplay it because they are haters, but in games like Control or MW5, ray tracing + DLSS 2.0 is amazing. DLSS 2.0 in particular really needs to be implemented in more games. It roughly doubles the frame rate with practically no visual quality loss, so games that are sitting under 60 FPS at 4k end up well over it.
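For anyone wondering where the "doubles the frame rate" comes from, here's a quick sketch of the pixel counts involved, assuming DLSS Performance mode's 50%-per-axis internal render scale (the exact scale varies by quality mode):

# DLSS-style upscaling: render internally at a lower resolution, then upscale to the output
output_px   = 3840 * 2160                # 4k output
internal_px = (3840 // 2) * (2160 // 2)  # 1920x1080 internal at 50% per axis
print(output_px / internal_px)           # 4.0: a quarter of the pixels to shade per frame

Shading a quarter of the pixels per frame is why the gain can be so large, minus whatever the upscaling pass itself costs.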

I agree though, had I never gotten my Acer X27 and just stuck with my 1440 monitor, the 1080ti would have been just fine. People bashing on the 2080ti need to understand there is absolutely a market for it. If I spent $2k on a monitor, I have no problem spending $1100 on a GPU that lets me use the technology in that monitor better.
 
I paid an effective $530 or so (tax and all) for my 1080Ti 3 years ago, essentially an open-box type purchase on a ZOTAC AMP edition. I got a lot of longevity out of her. Kind of went off on a console tangent for a while there with a new TV. Back in the PC space with a Corsair Lapboard purchase, and glad this thing is still very much relevant. I guess you could argue it's kind of depressing that deal hunting in 2017 gets you the same performance you can get today for ~$500?

My score was a bit of an outlier, but they were only supposed to be ~$700 + tax, IIRC. Maybe a bit more for a non-reference card like this one. It was an Amazon Warehouse deal: tax return funds converted to Amazon bucks +10%, and they sent a coupon for 10% off any one Amazon Warehouse item. I just hoped I'd get to use the latter on something good before blowing it on some crap before the expiration, and therrrre she was. The box was maybe scuffed. I MIGHT have cared if it had truly been purchased "new." Probably a year later, you could fetch $1200 for 'em when Nvidia cards were actually good at one of the mining crazes. Anyway, as you can tell, I'm quite proud of that purchase, and I <3 the 1080Ti.
 
I agree though, had I never gotten my Acer X27 and just stuck with my 1440 monitor, the 1080ti would have been just fine. People bashing on the 2080ti need to understand there is absolutely a market for it. If I spent $2k on a monitor, I have no problem spending $1100 on a GPU that lets me use the technology in that monitor better.

It's not that there isn't a market, it is just that they decided to charge so much without real justification. I was 100% ready to buy it at $700-800 like the 1080Ti, even though it wasn't going to be all that big an upgrade... but then they hiked the price a ton because "fuck you" basically. So I kept my money, even though I wanted a new toy.

We'll see what happens with the 3080Ti. It'll depend on price, performance, and just how bad the economy is screwed when it comes out.
 
So it looks like the 1080ti is going to go down as one of the great cards in GPU history, alongside the likes of the 9800 XT and the 8800 GTX.


Certainly looks that way. I bought an original 6GB Kepler GTX Titan in 2013 for $1000. It lasted me two years before I had the upgrade itch. That was unheard of for me before that time. In the past I would have gone through 4 GPU upgrades in that time.

Now I'm sitting in year 4 of my Pascal Titan, which was essentially a more expensive, seven-months-early preview of the 1080ti. I would like to upgrade to get more 4K performance, but I still don't really NEED it.

That's pretty amazing.

Or shitty, depending on how you see it, because it is an indication that GPU tech is advancing much more slowly than it used to.


I am answering all 3 of you. But please tell me why, because I don't understand.

Your last sentence, Zarathustra, is the one that describes the situation best. The 1080Ti was released 3 years ago, but if you discount the number of years, there have been no new GPU releases since then apart from Turing. The 8800GTX stands out because it lasted through 3-4 generations of GPUs. The proof of how good/bad the 1080ti is will come after Ampere is released.

I just don't understand why the number of years matters, because if no new technology is released, then games are going to be created and optimised for the currently available technology. And don't forget that Turing cards weren't focused on rasterized performance, which makes the 1080ti's longevity even less impressive.

I think the 1080Ti is a great card, but I don't think it's anything super special like the 8800GTX.
 
Wish I'd gotten mine earlier, but I had less money in 2016 and had to make do with a 1060 for a while. Of course, I was still using a Phenom II and a 1080p monitor back then.

I actually just this week finally upgraded the cooler on my 1080ti to the Arctic Accelero Xtreme III (what a name), so it's running faster, cooler, and quieter. I know that even when I'm done with this card - which will likely be when I can afford a 3080ti or similar - it will live in my fiance's PC for quite a few years.

I'm actually CPU restricted at high refresh rates now.
 
It's not that there isn't a market, it is just that they decided to charge so much without real justification. I was 100% ready to buy it at $700-800 like the 1080Ti, even though it wasn't going to be all that big an upgrade... but then they hiked the price a ton because "fuck you" basically. So I kept my money, even though I wanted a new toy.

We'll see what happens with the 3080Ti. It'll depend on price, performance, and just how bad the economy is screwed when it comes out.
I think it was more along the lines of them having the 2080ti ride the coattails of the 1080ti. They added a 30% bump in performance and added RT. This was also right off the downward slope of the bitcoin craze, where they saw people paying $1200 for 1080tis for an extended period. They figured that between that and 1080ti sales and demand, the 2080ti would be bought no problem at the $1200 price point.

I think the 1080Ti is a great card, but I don't think it's anything super special like the 8800GTX.
 
I got a 1080 Ti FE for a crazy bargain. It ran fine but was at 90C under load. It just needed new paste.

The cooler sucks. At 4k60 in RDR2 it is by far the loudest GPU I have owned since an EVGA 6800 GT blower I had.

It is able to stay below 80C at 2GHz at 100% fan speed though. I really want to replace the cooler, but the noise is only an issue without headphones.

The 1080 Ti itself is a wonderful GPU, though I am starting to see its age now. A lot of newer games are pushing it to its limits at 4k. I do want to move to 4k144 soon!
 
That and higher frame rates. Depending on the game, the 1080Ti can't push things all that fast at 2.5k. You'll almost always get at least around 60, but of course faster is nicer when you have a nice high-refresh monitor. That's the biggest reason I want a new GPU (well, OK, second biggest; just liking new toys is the biggest). I have a 2560x1440 144Hz monitor and I'm a big fan of framerates above 60. It really does look and feel smoother. For some games, no problem, it'll push near the limit, or even be refresh-limited. For others, though, it just can't handle it. So it would be nice to have a beefier GPU that could regularly handle 120fps+ on new titles.

Now that said, the 2080Ti just isn't impressive enough for the price, which is why I don't have one. I was ready to buy one when it launched, but they can fuck right off with a $1200 price. Hence my 1080Ti continues to serve faithfully. But even without 4k, there is reason to want more. Keep in mind that 2.5k120 needs roughly the same amount of GPU power as 4k60.


That seems like a low estimate for the 1080ti at 1440p.

I feel like my stepson is running 1440p at 120+fps using a 2060 Super, and a 1080ti should be a good deal faster than a 2060 Super.

Granted, he mostly runs Fortnite, which isn't the heaviest of games, but still....
 
It's not that there isn't a market, it is just that they decided to charge so much without real justification. I was 100% ready to buy it at $700-800 like the 1080Ti, even though it wasn't going to be all that big an upgrade... but then they hiked the price a ton because "fuck you" basically. So I kept my money, even though I wanted a new toy.

We'll see what happens with the 3080Ti. It'll depend on price, performance, and just how bad the economy is screwed when it comes out.

Yep, but cost and value have never scaled linearly with performance.

Even a relatively small percentage increase in performance can mean the difference between enjoyable framerates and barely playable ones, especially at 4k. If you have the cash to throw around, getting those enjoyable framerates might be worth it to you.
 
We'll see what happens with the 3080Ti. It'll depend on price, performance, and just how bad the economy is screwed when it comes out.

I had seen some rumors of a 40% increase over the previous gen, but I don't really believe it. If true, it would be an extreme outlier compared to recent generation-over-generation gains. Then again, it is their biggest process node jump in a while, going from 12nm down to 7nm, and in GPUs the process node is more scalable than in CPUs because of their inherently highly threaded architecture and workloads. So, maybe?

In CPUs, when you shrink the node from 12nm to 7nm, you start hitting clock speed limitations. It allows you to add more cores in the same thermal envelope, but CPU loads are a mixed bag when it comes to threading, so in many loads you don't benefit much from adding those extra cores.

GPU rendering loads are always 100% threaded. They scale out to as many compute units as you want them to, just by their very nature. So if you have to sacrifice some clock speed by moving to 7nm, but that allows you to vastly increase your core count, it's no big deal.
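To put rough numbers on that tradeoff, here's an Amdahl's-law style sketch; the 70% parallel fraction, core counts, and 10% clock penalty are made-up illustrative values, not real chip specs:

# speedup from adding cores when part of the work is serial and the clock drops a bit
def speedup(parallel_fraction, cores, clock_ratio=1.0):
    serial = 1.0 - parallel_fraction
    return clock_ratio / (serial + parallel_fraction / cores)

# CPU-ish load (~70% parallel): doubling 8 -> 16 cores at 90% clock gains almost nothing
print(speedup(0.70, 16, 0.9) / speedup(0.70, 8))       # ~1.01x

# GPU rendering (~100% parallel): doubling shader count at 90% clock is still a big win
print(speedup(1.00, 8192, 0.9) / speedup(1.00, 4096))  # ~1.8x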
 
Loved it enough that I bought two of 'em: an EVGA FE that went under water and an MSI Gaming X (grabbed from a miner in Q1 2018 for $400). Both still perform well without issue.
 
That seems like a low estimate for the 1080ti at 1440p.

I feel like my stepson is running 1440p at 120+fps using a 2060 Super, and a 1080ti should be a good deal faster than a 2060 Super.

Granted, he mostly runs Fortnite, which isn't the heaviest of games, but still....

Ya, well, Fortnite is very low impact. Some games I've played that don't run at all that high an FPS:

--Mankind Divided: I recall it being between 60-90fps in actual gameplay; Guru3D has the benchmark at 59fps in high quality mode.
--Subnautica: No benchmark here, and I saw FPS in the 120+ range in parts, but dropping as low as 40 in others.
--Hitman 2: Just ran the Miami benchmark; 36 minimum, 86 average, settings more or less maxed.
--Metro Exodus: 70-80fps in its benchmark.

With modern high-detail games, a 1080Ti just can't refresh-cap a fast monitor. It still does well, don't get me wrong, they are all more than playable, but you aren't getting FPS at or near the max your monitor can handle most of the time.
 
8800 gtx would still be an ok card for 1080p but unfortunately doesn’t get driver updates anymore.


Yeah, the 768MB of VRAM would hold up fine!

Yeah, the VRAM would likely be a serious limitation today.

That, and you wouldn't have DX11 or 12.

...and didn't all of these self-destruct due to Nvidia's bad solder interconnect issue?

I mean, it started with the mobile 8400M and 8600M chips, but I thought we later found out that all of the GeForce 8 and 9 series were impacted; it just showed up more quickly on mobile parts due to the more extreme temperature cycling.

I remember people complaining that their 9800 series GPUs were failing in 2010 when they started playing Civilization V. The dumbasses in the 2K forums were trying to blame Firaxis for killing their GPUs.
 
1080p and lower settings? That amount of VRAM should be enough.

It’s a 14 year old card that can still run modern stuff at low settings. That’s pretty impressive to me.
 
Ya, well, Fortnite is very low impact. Some games I've played that don't run at all that high an FPS:

--Mankind Divided: I recall it being between 60-90fps in actual gameplay; Guru3D has the benchmark at 59fps in high quality mode.
--Subnautica: No benchmark here, and I saw FPS in the 120+ range in parts, but dropping as low as 40 in others.
--Hitman 2: Just ran the Miami benchmark; 36 minimum, 86 average, settings more or less maxed.
--Metro Exodus: 70-80fps in its benchmark.

With modern high-detail games, a 1080Ti just can't refresh-cap a fast monitor. It still does well, don't get me wrong, they are all more than playable, but you aren't getting FPS at or near the max your monitor can handle most of the time.
Do you have links for these numbers?
 
1080p and lower settings? That amount of VRAM should be enough.

It’s a 14 year old card that can still run modern stuff at low settings. That’s pretty impressive to me.

Fair. I haven't run anything at 1080p in a very long time, so I don't know what the VRAM requirements are. I assumed they were higher.

How many modern titles can still run on DX10 or lower though?
 
True, and as of 2019 a lot more titles are DX12/Vulkan only, which is going to hard-kill a lot of this older stuff.
 
1080p and lower settings? That amount of VRAM should be enough.

It’s a 14 year old card that can still run modern stuff at low settings. That’s pretty impressive to me.

That said, I think vram requirements are higher today.

I was just running The Outer Worlds at 4k, and it was sitting at 8GB to 9GB VRAM.

1080p is a quarter of the size of 4k, so I would expect that to translate to 2GB - 2.25GB or so.

Possibly even more, as I imagine some of that is overhead and wouldn't scale with resolution.
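A rough way to model that, as a sketch: assume some of the footprint is fixed overhead that doesn't scale with resolution. The 4GB overhead figure below is a made-up guess (real numbers are engine-specific); the ~8.5GB is just the midpoint of the 8-9GB reading above.

# naive model: VRAM = resolution-independent overhead (assets, driver) + per-pixel buffers
pixels_4k    = 3840 * 2160
pixels_1080p = 1920 * 1080                         # exactly 1/4 of 4k

overhead_gb  = 4.0                                 # assumed fixed cost (guess)
per_pixel_gb = (8.5 - overhead_gb) / pixels_4k     # back out from the ~8.5GB seen at 4k

print(overhead_gb + per_pixel_gb * pixels_1080p)   # ~5.1GB, well above a straight quarter of 8.5GB

The bigger the fixed overhead, the less the VRAM use shrinks as you drop resolution.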
 
That said, I think vram requirements are higher today.

I was just running The Outer Worlds at 4k, and it was sitting at 8GB to 9GB VRAM.

1080p is a quarter of the size of 4k, so I would expect that to translate to 2GB - 2.25GB or so.

Possibly even more, as I imagine some of that is overhead and wouldn't scale with resolution.
Use =/= need. A lot of games use all the VRAM they can to cache data. Look at GoW 5. It looks far better than The Outer Worlds and only consumed around 6GB at 3440x1440 maxed out with ultra textures. Excessive amounts of memory make developers lazy.
 
I'm willing to bet a game like Outer Worlds or Borderlands 3 would run with only 768MB of VRAM at low settings, but the game would likely have some areas where the textures never load beyond the very low-detail stuff that renders at a distance. Given how UE handles streaming, it should run, but likely just look like garbage.
 
Still rocking my GTX 1080 and I love it, but I feel its age now with newer games and my new 1440p 144Hz monitor. I end up using around 80% render resolution to get frames to average closer to 120ish and peak around 150. Sounds bad, but 80% render still looks better than trying to run at 1080p. Looking forward to the new 3080/3080ti; it should be a good upgrade, and it lets me move my GTX 1080 to my HTPC to replace my old-ass R9 290 so I can start playing games at 4k on my TV lol.
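Quick sanity check on that 80% render scale, assuming the slider scales each axis (some games scale total pixel count instead):

# 80% render scale on 1440p vs. native 1080p
scaled       = int(2560 * 0.8) * int(1440 * 0.8)   # 2048 x 1152 = 2,359,296 px
native_1080p = 1920 * 1080                         # 2,073,600 px
print(scaled / native_1080p)                       # ~1.14: still ~14% more pixels than native 1080p

Which lines up with 80% of 1440p looking better than dropping all the way to 1080p.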
 
Use =/= need. A lot of games use all the VRAM they can to cache data. Look at GoW 5. It looks far better than The Outer Worlds and only consumed around 6GB at 3440x1440 maxed out with ultra textures. Excessive amounts of memory make developers lazy.

You know, I should have known that. I'm becoming forgetful in my ripe old age...

Yeah, it makes sense to pre-fill all the textures you'll need into VRAM to avoid having to load textures to the GPU mid-render, as that is likely to cause stutter, etc. How much of that is really necessary? I don't know.
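Roughly the idea, as a toy sketch (hypothetical names, not any real engine's API):

# toy model of why engines pre-fill VRAM: misses during gameplay are what show up as stutter
class TextureCache:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = {}                     # texture name -> size in MB

    def preload(self, level_textures):
        # load everything the level lists up front, while a loading screen hides the cost
        for name, size_mb in level_textures.items():
            if sum(self.resident.values()) + size_mb <= self.budget_mb:
                self.resident[name] = size_mb  # pretend "upload to VRAM"
            # else: keep a lower mip resident and stream the rest later

    def fetch(self, name, size_mb):
        # on-demand path: uploading mid-frame here is the potential hitch
        if name not in self.resident:
            self.resident[name] = size_mb
        return name

# a card with more VRAM just gets a bigger budget, so more ends up resident up front
cache = TextureCache(budget_mb=11264)          # e.g. 11GB on a 1080Ti
cache.preload({"rock_diffuse": 64, "rock_normal": 64, "terrain_atlas": 512})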

The minimum spec for the title is a GeForce 650 Ti, which had 1GB of VRAM. Maybe 768MB would work.
 
8800 gtx would still be an ok card for 1080p but unfortunately doesn’t get driver updates anymore.
Yeah, the VRAM would likely be a serious limitation today.

That, and you wouldn't have DX11 or 12.

...and didn't all of these self-destruct due to Nvidia's bad solder interconnect issue?

I mean, it started with the mobile 8400M and 8600M chips, but I thought we later found out that all of the GeForce 8 and 9 series were impacted; it just showed up more quickly on mobile parts due to the more extreme temperature cycling.

I remember people complaining that their 9800 series GPUs were failing in 2010 when they started playing Civilization V. The dumbasses in the 2K forums were trying to blame Firaxis for killing their GPUs.
Ja, lack of DirectX 11 & 12 support and the 768 MB framebuffer means it would absolutely not be an "okay" card for 1080p today. I recall the last game I played with mine was Fallout: New Vegas, which came out at the end of 2010, and it did not do well at all at 1920x1080. I think I was running consistently in the 40 FPS range after tweaking settings and running no mods whatsoever. It ran about as well as Crysis did with working SLI, let's put it that way.

And, yes, both of my 8800 GTX cards died to weak solder joints from the amazing lead-free solder that California still slaps a P65 warning on even though the whole point of making it lead-free in the first place was for health and the "environs."
 
I'm playing most of the games I play at 4k60 with near-maxed settings on a 1080ti: Witcher 3, Doom Eternal, Monster Hunter World, AssCreed Odyssey, Final Fantasy XV, and pretty much every single "indie" game I throw at it.
Yeah, I know they aren't "max settings," but I don't care about 16k shadows. Hell, I purposely download ultra-intensive, higher-res texture packs and lighting mods because my card handles it.

Monster Hunter World at 4k60 is all I really care about, as I have hundreds of hours in that game and expect hundreds more.

I play my faster FPS games at 1440p, but that's because my 1440p monitor has 165hz while my 4k is only 60. But even then I'm usually more CPU limited than GPU limited, not that I care about 100+fps competitive play.

Only game I've had trouble getting to 4k60 is No Man's Sky, and I don't have a clue what's up with that title; shit's not good-looking enough for the performance problems it has.

Also a few other poorly coded games like city builders and the like. But those don't even hit 30% gpu utilization usually.

Hitman 2 averaged 52-60fps for me during the Miami benchmark at 4k. I think I turned down a single setting, but I'd have to double check which one. Performance was weirdly almost identical at 1440p. Haven't done anything else with the game since I just got it in this month's Humble Bundle.
 
I felt the same way about my Titan X Pascal. I'd still have it if I hadn't gotten a sweet deal on this 2080 Ti.

Same! My Titan XP was my most expensive GPU purchase, but at the same time I still think it was great value, and it's still running 4K on the latest titles.
 
Hitman 2 averaged 52-60fps for me during the Miami benchmark at 4k. I think I turned down a single setting, but I'd have to double check which one. Performance was weirdly almost identical at 1440p. Haven't done anything else with the game since I just got it in this month's Humble Bundle.
[Attached screenshot: Hitman 2 benchmark results]

Hitman 2, 4k, exclusive fullscreen, DX12, every setting maxed except for shadows, which are at "high" instead of "ultra" (the difference is about 4fps on average).
The "low" FPS is right at the beginning of the benchmark, where there appears to be some weird loading quirk, and the only noticeable FPS drop is during the car crash scene, where all of the transparency effects bring the fps down a bit - but they bounce right back.
 
It is the best card I ever owned. I might even just skip the 3080ti if the price is still ridiculously high. Which it will be.
The problem is the price will not go down anytime soon. I don't think AMD has an answer, unlike they did with their CPUs. Leaked specs of Ampere are showing it to be a behemoth while using less power. This is going to be $$$. I'm selling my 1080ti to upgrade and run with my 3950x.
 