Question about an old 1080TI and 11 gig

narsbars

2[H]4U
Joined
Jan 18, 2006
Messages
2,779
Surprise, I couldn't find a 3070 for $499.00 at Best Buy, so I grabbed a used 1080 Ti FE. How did the argument about the usefulness of having 11 gig onboard work out in the long run? I am comparing the current 3070s with 8 gig of GDDR6 against the ancient 1080 Ti with 11 gig of GDDR5X. Obviously no direct competition, but will the 11 gig help at all? I am just going to do a conservative OC with MSI Afterburner. I will be running a 4K TV over HDMI 2.1, as I can't afford a big enough real monitor to compensate for my lousy eyesight.
 
I believe it made no difference for a 1080 Ti to have 11GB. I don't think I even hit 10GB with any game I played while I had a 1080 Ti. Sure, there are outliers and games that just allocate all available memory, but I never saw it limiting me in any way.
 
If you hit 10GB, then having 8GB would've limited you. So you drew the exact opposite conclusion from the one you should have.
 
I never hit 8GB in anything that ran at playable frame rates, for the majority of the time. More than likely it was just allocation rather than the game actually needing the memory.
 
Love the quantifier "majority of the time". But that still means it matters sometimes. This is like saying it's not worth having a GPU that can do 100fps the majority of the time because it only dips below 60 a few times.

As for whether the software actually needed the memory or not, LOL, how could you tell? That it didn't "need it"?

Anyway, having 11GB of memory does help in certain bottleneck situations if you play at 4K or close to it with very texture-heavy games. It's not like you can have a 1080 Ti with less memory anyway, so it is a purely academic question. I'd rather have it and rarely need it than not have it at all.
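For what it's worth, you can watch the number yourself while playing. Here's a rough Python sketch that asks `nvidia-smi` for used vs. total VRAM (assumes an NVIDIA card with `nvidia-smi` on the PATH, and keep in mind even this reports allocation as the driver sees it, not what the game strictly needs per frame):

```python
import subprocess

def parse_meminfo(csv_line):
    """Parse one 'used, total' CSV line from nvidia-smi,
    e.g. '4242 MiB, 11264 MiB' -> (4242, 11264)."""
    used, total = (field.strip().split()[0] for field in csv_line.split(","))
    return int(used), int(total)

def query_vram(gpu=0):
    """Return (used_mib, total_mib) for one GPU via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        text=True,
    )
    return parse_meminfo(out.splitlines()[gpu])
```

Calling `query_vram()` in a loop while spinning the camera around in a game is a crude but effective way to see whether you're brushing up against the limit.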
 
I'm running a 5700 XT that has only 8GB. Granted, I play at 1440p and not 4K, but my VRAM consumption has always been well under 8GB, like in the 4-6GB range. You should be just fine, man.
 
Hoping to get through to late summer, when I guess I might be able to grab a 3070 with more than 8 gig at a reasonable price. At that point I should be the proud owner of a $75.00 1080 Ti. Yeah, I can hear the laughter o_O
 
Love the quantifier "majority of the time". But that still means it matters sometimes.
The majority is what matters; the few instances where it might matter are normally outliers, like 1,000 graphics mods for shitty games like Skyrim, which I don't care for. For normal gaming it is enough.
 
4K has 2.25x the pixels of 2560x1440, so you cannot make a direct comparison. Trust me when I say that you will quickly miss the extra VRAM if you game at 4K.
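Quick sanity check on that pixel math (3840x2160 vs 2560x1440):

```python
# UHD "4K" vs QHD pixel counts
uhd = 3840 * 2160   # 8,294,400 pixels
qhd = 2560 * 1440   # 3,686,400 pixels
ratio = uhd / qhd
print(f"4K pushes {ratio:.2f}x the pixels of 1440p")  # 2.25x
```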
 
With high-res texture mods I've definitely gone over 8GB at 4K with my 1080 Ti.

I wouldn't really call it a rare scenario, as it's a very common one for the types of games I play (sim/management and RPG games with tons of mods).
 
As for whether software actually needed the memory or not, LOL, how could you tell? That it didn't "need it"?
It's easy to tell. Have you ever run out of VRAM before? Because I did all the time playing modded Skyrim on a GTX 670 in 2012, and the result is that it slows down to a slideshow; it usually ended with me hard resetting the system and then dropping texture mods until I got a more powerful card with more VRAM.

Unless your game suddenly became a slideshow, you didn't run out of VRAM.
 
With how fast regular RAM has become as a buffer (a bit like how running out of RAM on Windows has gotten much more subtle over time, with memory compression and the ever-faster SSD cache acting as a backstop), I could imagine cases where you only lose a little performance when you get too close to the limit or just past it, instead of the frame rate dropping like a rock and making it obvious once you overshoot by a significant amount.


I am thinking about the current 3070s with 8 gig DDR6 vs the ancient 1080TI with DDR5 and 11 gig.
Even for memory-hungry titles, even at 4K, I do not think there is a case where a 1080 Ti gets close to a 3070:
https://www.gpucheck.com/compare/nv...ore-i9-10900k-vs-intel-core-i7-7700k-4-20ghz/

I am not sure there are many cases in recent games where, at playable frame rates on a 1080 Ti, having more than 8 gig would be of great use.

Take a very memory-hungry title, even going outside playable-on-a-1080Ti scenarios:
https://www.guru3d.com/articles-pages/geforce-rtx-3070-founder-review,20.html

Flight simulator at 4K

3070 vs 1080TI ratio
1080p: 109%
1440p: 127%
2160p: 141%

As scenarios require more VRAM, the gap seems to get bigger and bigger in favor of the 3070, and I do not remember seeing a case of a sweet spot where the 1080 Ti gains over it because of having more RAM; maybe such cases exist too.
 
Unless your game suddenly became a slideshow, you didn't run out of VRAM.
Yes, I've run out of VRAM, and it doesn't necessarily mean a slideshow. It can mean 90% of the scene is in memory and everything is fine, but when you do a 180-degree turn it pauses for 100-200ms while it swaps in textures for the objects behind you. It gets really annoying when it does that every time you turn around in a game, yet the AVG FPS will remain virtually unchanged.

That's why looking just at average frame rate doesn't show the whole picture. I'm not even sure this type of hitch would be visible in 1% lows, maybe not even 0.1% lows. So the only thing that might reliably show you're running out of VRAM is your good old eyeballs, when you are balancing right at the edge of your VRAM capacity. That's a bit different than trying to play a game with 8GB of textures in a scene on a 2GB card.
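To illustrate with made-up numbers: here's a toy Python sketch of one minute at a locked 60 fps, with a handful of hypothetical 150 ms texture-swap hitches dropped in. The average barely moves even though the game visibly stutters every few seconds; whether the 1% low catches it depends entirely on how many hitch frames land in that bottom slice:

```python
def fps_stats(frame_times_ms):
    """Average FPS and '1% low' FPS from a list of frame times in ms."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    k = max(1, n // 100)                      # the worst 1% of frames
    worst = sorted(frame_times_ms, reverse=True)[:k]
    low_1pct_fps = 1000.0 * k / sum(worst)
    return avg_fps, low_1pct_fps

frames = [1000 / 60] * 3600                   # 60 s of steady 60 fps
smooth_avg, smooth_low = fps_stats(frames)

frames[::720] = [150.0] * 5                   # five 150 ms VRAM-swap hitches
hitchy_avg, hitchy_low = fps_stats(frames)

print(f"smooth: avg {smooth_avg:.1f} fps, 1% low {smooth_low:.1f} fps")
print(f"hitchy: avg {hitchy_avg:.1f} fps, 1% low {hitchy_low:.1f} fps")
```

With these particular numbers the average drops by well under 1 fps, which is exactly why an average-FPS bar chart can hide this kind of stutter.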
 
Looks like I can expect a little more functionality and life out of my 1080 Ti with its 11 gig. Maybe it is just a placebo effect, but hopefully it can last long enough for the madness to be over and for 3070s with both more memory and a reasonable price to come out.
Maybe the high-gig 3xxx-series cards will kick off the craziness again, and I will consider myself lucky if I can buy a used 3070 with 8 gig. Target: this coming summer, by July :) or :( ?
 
Yes, I've run out of VRAM, and it doesn't necessarily mean slideshow.
I guess it's gotten better, then, because back then it totally was a slideshow trying to move around while out of VRAM.
 
I have a 1080ti running dual 4K screens, and I haven't as of yet found something I couldn't play with decent frames. Might not be playing the most demanding games, but it's still an excellent card.
 
1080 Ti is still a very good card.
Yeah, every time I look at the benchmarks for the new cards that compare to a 1080 Ti, I'm reminded of why "upgrading" to something worthwhile is gonna be an $800 investment. I coasted by on XX60-series upgrades forever, because every two generations it was $200 to get something that absolutely trounced the previous XX60 card. Right now it's great that the 3060 is finally the same power* as a 1080 Ti, but it's also not so great that the 3060 is still the same COST as a 1080 Ti, even secondhand, and with much less VRAM.

*Rasterization, I know that DLSS and raytracing and such absolutely do matter, but not enough to spend $400 for a rasterization side-grade.
**assuming that you can find a 3060, 3060ti or 1080ti at all. 1080Tis are more expensive now than when I got mine 1.5 years ago.
 
Someone on this forum was an absolute gem and sold me a 1080ti for a very fair price when I posted a WTB for a specific card model. People on here are pretty awesome - maybe you can find a fair buy/trade ¯\_(ツ)_/¯
 
Updating this ancient post: finally got an LHR 3070 at MSRP, $679.00. I guess I will be finding out the difference. Time to sell the 1080 Ti.
 