3080 10GB and 4K

killroy67

[H]ard|Gawd
Joined: Oct 16, 2006
Messages: 1,583
So I was all set on the 3080, until I thought: what if I go from 2K to 4K, will 10GB of memory be enough? Does anyone gaming at 4K find they have any problems with the 1000 or 2000 series? I don't see myself going to 4K anytime soon, but I am curious how it performs on current-generation cards.
 
So I was all set on the 3080, until I thought: what if I go from 2K to 4K, will 10GB of memory be enough? Does anyone gaming at 4K find they have any problems with the 1000 or 2000 series? I don't see myself going to 4K anytime soon, but I am curious how it performs on current-generation cards.
That is a good question. It will be very interesting to see how the 10GB will hold up. It's weird for me: I bought a Titan X in 2015, which had 12GB, then I bought a 1080 Ti with 11GB in 2017, and now I want to buy the 3080 with 10GB in 2020... maybe I should just get the 3090 ;)
 
I haven't seen any evidence of the 2080 Ti being VRAM limited. Plus it sounds like nVidia has some newer memory compression techniques that are supposed to make better use of VRAM with less utilization overall. But as always, we shall see. I don't think having 20 GB will be necessary.
 
So I was all set on the 3080, until I thought: what if I go from 2K to 4K, will 10GB of memory be enough? Does anyone gaming at 4K find they have any problems with the 1000 or 2000 series? I don't see myself going to 4K anytime soon, but I am curious how it performs on current-generation cards.
With a 2080 Ti at 4K, I haven't found any instances where its 11GB of memory obviously isn't enough.

I assume that will change eventually, but it's not presently the case, as far as I know. I've heard some rumblings that maybe the new MS Flight Simulator calls for more than that if you go ham with the settings, but I've not had the pleasure of testing this.
 
Not a single game out right now even uses 8GB at 4K without mods (the exception being Microsoft Flight Simulator 2020, but that game is a massively scaled, strange exception, as they release a new version of it once a decade or more). A few come close, though. 10GB should be able to hold the line; VRAM usage hasn't been increasing much year on year.
 
It shouldn't matter with the 30 series gen. But as the consoles age and they start adding more bells and whistles for PC ports, it may be an issue since the current consoles are going to have 4K assets by default.
 
I would have felt a lot more comfortable if the 3080 was 12GB instead of 10GB. I may just hold off and wait for the next version, which may be a 3080 Ti at 20GB.
 
FWIW, I game at 4K, always max settings, and I keep RT on (sometimes DLSS 2.0 on, sometimes off) and have a 2080 Ti. The most VRAM I have seen used is 9-10GB, which is close to the 11GB present. I can see future games pushing this more frequently, but on average I notice my usage is more in the 6-7GB range at 4K. But as graphics progress and RT use increases, I expect the need for more VRAM to show itself.
 
My situation is that I have an HP Reverb G2 coming soon (a true 4K VR headset) and I really want to play MS Flight Sim 2020 on it. I'm eyeing the 3080 10GB, but am worried it won't be enough VRAM for FS2020 at 4K. I guess we can only wait for benchies...
 
FWIW, I game at 4K, always max settings, and I keep RT on (sometimes DLSS 2.0 on, sometimes off) and have a 2080 Ti. The most VRAM I have seen used is 9-10GB, which is close to the 11GB present. I can see future games pushing this more frequently, but on average I notice my usage is more in the 6-7GB range at 4K. But as graphics progress and RT use increases, I expect the need for more VRAM to show itself.

What games use that much Vram? I was only aware of one that broke 8gb from watching benchmarks and my own testing. If my post was wrong earlier in the thread I’d like to know so I can correct it.
 
What games use that much Vram? I was only aware of one that broke 8gb from watching benchmarks and my own testing. If my post was wrong earlier in the thread I’d like to know so I can correct it.
So far Shadow of The Tomb Raider at 4K with RT on, HDR on and all settings max will hit around 9.5GB for me in spots.

Also, Microsoft Flight Simulator 2020 at 4K Ultra I heard will actually go past 11GB on cards that have more. I do own MSF2020, but have not actually monitored vram usage in it yet.
 
So far Shadow of The Tomb Raider at 4K with RT on, HDR on and all settings max will hit around 9.5GB for me in spots.

Also, Microsoft Flight Simulator 2020 at 4K Ultra I heard will actually go past 11GB on cards that have more. I do own MSF2020, but have not actually monitored vram usage in it yet.

MSF2020 was the one I knew about. I saw a video where it got up to just a tick under 18gb usage flying over New York. 😦
 
MSF2020 was the one I knew about. I saw a video where it got up to just a tick under 18gb usage flying over New York. 😦
But damn, does the game look amazing though! Surprisingly playable at 4K on Ultra at 45-50 FPS, with dips into the 30s every now and then on a 2080 Ti, probably because it's a sim. I get crazy if the FPS drops below 80 on me... haha.

Can't wait for real 3rd party 3000 series benchmarks on some of these most demanding titles.
 
I'm in the same boat. I have an LG CX OLED that can do 4k120. I was planning on getting a 3090, but the difference in performance between the 3080 and 3090 is very small, and for double the price, I can't justify the cost associated with the 3090.

I've been doing some 4k testing with my 2070 Super @ 8GB VRAM. What I have found is that I can't max out the VRAM currently in any game, including Control and CoD:MW. This has put my mind at ease for the time being and makes me feel better about the 3080.

Honestly, unless you're planning on doing stupid resolutions (8K), the 3090's 24GB VRAM just seems like extreme overkill. Only time will tell.
 
I'm in the same boat. I have an LG CX OLED that can do 4k120. I was planning on getting a 3090, but the difference in performance between the 3080 and 3090 is very small, and for double the price, I can't justify the cost associated with the 3090.

I've been doing some 4k testing with my 2070 Super @ 8GB VRAM. What I have found is that I can't max out the VRAM currently in any game, including Control and CoD:MW. This has put my mind at ease for the time being and makes me feel better about the 3080.

Honestly, unless you're planning on doing stupid resolutions (8K), the 3090's 24GB VRAM just seems like extreme overkill. Only time will tell.
I have a C9 OLED and was considering the 3090, but I just don't think the performance between the two is going to warrant the massive price difference.
 
I've been gaming at 4k60 since 2014, and am not concerned about the 10gb vram on the rtx 3080. Just because a game allocates it doesn't mean it needs it. Worst comes to worst, I dial down shadows by one notch in a couple of years. I have always found that I run out of gpu horsepower before vram anyway. Looking forward to going 4K120 at some point.
 
I've been gaming at 4k60 since 2014, and am not concerned about the 10gb vram on the rtx 3080. Just because a game allocates it doesn't mean it needs it. Worst comes to worst, I dial down shadows by one notch in a couple of years. I have always found that I run out of gpu horsepower before vram anyway. Looking forward to going 4K120 at some point.

I was going to say exactly this.
 
The cynical side of me thinks Nvidia has a 16GB 3080 Ti for $899 to launch around the Big Navi launch, or at least announce it then.

They will piss off everyone who bought a 3080 by launching anything above it so close to launch. Also, I really don't think they are going to be calling them Ti anymore; it's all Super now. That's why they came up with the 3090 series: to go Super across the lineup if they want.

But I might see AIBs do 20GB models. I don't see them releasing anything else so close to launch, though. Maybe sometime next year as a refresh.
 
The cynical side of me thinks Nvidia has a 16GB 3080 Ti for $899 to launch around the Big Navi launch, or at least announce it then.
Eh, I am assuming that with the memory bus it will use, we will get an 11GB or 22GB 3080 Ti. I don't expect anything less than $1,000.
 
Of course, what might push me to the 3090 is the 3080 being out of stock while I'm sitting there staring at a 3090 in stock...
The 3090 will be even harder to get I bet. It will be a lot lower volume part.
 
I've been gaming at 4k60 since 2014, and am not concerned about the 10gb vram on the rtx 3080. Just because a game allocates it doesn't mean it needs it. Worst comes to worst, I dial down shadows by one notch in a couple of years. I have always found that I run out of gpu horsepower before vram anyway. Looking forward to going 4K120 at some point.

Plus, if you aren't obsessive about nothing but max settings in modern titles, high shadows vs. ultra are basically indistinguishable in motion. I always leave them down one notch now, after some research, because I can barely tell the difference in uncompressed screenshots while poring over them on my big-ass TV.
 
The other transformative tech that Nvidia has introduced is DLSS, which is able to take a lower resolution image and upscale it using AI. This will help to reduce VRAM load as well.
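For a rough sense of why that helps, you can compare the pixel counts of DLSS's internal render resolutions against native 4K. This is just back-of-the-envelope arithmetic for a single RGBA8 render target, using the commonly cited DLSS 2.0 per-axis scale factors; actual VRAM savings depend on how many buffers an engine keeps at the internal resolution, so treat it as a sketch, not a measurement:

```python
# Rough sketch: per-frame buffer size at DLSS internal resolutions vs. native 4K.
# Scale factors are the commonly cited DLSS 2.0 per-axis ratios; real engines
# keep many buffers, some at output resolution, so real savings will differ.

def buffer_mb(width, height, bytes_per_pixel=4):
    """Size of one RGBA8 render target at the given resolution, in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

native = buffer_mb(3840, 2160)  # one 4K RGBA8 target, ~31.6 MiB
modes = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

for name, scale in modes.items():
    w, h = int(3840 * scale), int(2160 * scale)
    print(f"{name}: {w}x{h} -> {buffer_mb(w, h):.1f} MiB "
          f"(native 4K target: {native:.1f} MiB)")
```

Performance mode renders a quarter of the pixels of native 4K, so any buffer held at internal resolution shrinks by the same factor.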
 
It shouldn't matter with the 30 series gen. But as the consoles age and they start adding more bells and whistles for PC ports, it may be an issue since the current consoles are going to have 4K assets by default.

Right, but by the time it matters, the video card will be too underpowered to play modern titles maxed out, because just like with the PS4 Pro, we will see a midlife upgrade of these current consoles (unlocking the true potential of that massive 16GB of VRAM). Until then, most PC ports will be underwhelming.

You can find corner cases (high-res mod packs, or single games like Flight Sim 2020), but most gamers will be fine for the next 5 years with 8GB of VRAM (let alone 10).

You have to pay extra for all the bandwidth improvements required to power the roughly 250% performance increase since the GDDR5X-powered GTX 1080 Ti replaced a long line of GDDR5 cards! As a result, it's going to be another year before we see double VRAM become universal.

As someone still playing less-demanding recent 1080p games: Borderlands remastered, Jedi: Fallen Order, Dragon Quest 11, and Life is Strange 2 all run at near max, quite comfortably, on my GTX 960 2GB card. I even played the FFXV demo at mixed medium/high settings on my HTPC.


You will be able to use spare main memory as a VRAM cache (so it doesn't all have to fit). I can even play older games (2012 or earlier) at native 4K resolution on that card! Having five times the VRAM of my GTX 960 (and many generations of improved compression) should make modern AAA 4K gaming easy :D

Also, even though the consoles have 16GB of RAM, it's not all reserved for the frame buffer:
the Xbox Series X has 10GB of fast VRAM, exactly the same as the 3080, and the PC has the advantage of having AT LEAST 16GB of main memory to work with on top of that. The PS5's memory is fully shared, but you're going to end up dedicating a similar 10/6 split between graphics, game engine plus level data, and multitasking.

Much like 8GB of RAM in 2013 sounded like overkill for a console, 16GB is going to get tiny quick!
 
What games use that much Vram? I was only aware of one that broke 8gb from watching benchmarks and my own testing. If my post was wrong earlier in the thread I’d like to know so I can correct it.

RAM usage numbers alone are useless.
They need to be broken down, like with system RAM:


That 1GB of "standby" is both in use... and then really not.
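If anyone wants to log this themselves, `nvidia-smi` can dump the allocated numbers in CSV form with `--query-gpu=memory.used,memory.total --format=csv,noheader,nounits` (values in MiB). Keep in mind that's still allocation, not what the game actually touches, so it has the same caveat as any overlay readout. A minimal sketch of a parser for that output, with the actual polling loop left out:

```python
# Sketch: parse one per-GPU line emitted by
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv,noheader,nounits
# Values are in MiB. Note this reports *allocated* VRAM, not the working set.

def parse_vram_line(line):
    """Return (used_mib, total_mib, percent_used) from one CSV line."""
    used, total = (int(field.strip()) for field in line.split(","))
    return used, total, round(100.0 * used / total, 1)

# Example with a made-up reading from an 11GB card:
print(parse_vram_line("9216, 11264"))  # (9216, 11264, 81.8)
```

You could run that line through a loop every few seconds during a play session and log the peak, which is roughly what the overlay tools people quote in this thread are doing.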
 
You’ll be fine although I’d be curious to find out how much vram future titles coming this generation will utilize. The Unreal Engine rocks demo they showed on PS5 was very visually impressive but I’d imagine it also eats a lot of vram, especially at 4K.
 
RAM usage numbers alone are useless.
They need to be broken down, like with system RAM:

That 1GB of "standby" is both in use... and then really not.

The only way you can show an accurate measure is by benchmarking a game with two different GPU memory configs, like an RX 570 8GB vs. 4GB, or a GTX 1060 6GB vs. 3GB (there the shader cut is small enough that you would still notice the much-greater-than-10% performance falloff).
 
videocardz.com and a few other sites were showing that Lenovo had leaked there would be 3070 and 3080 variants with double the RAM, though with the caveat that Lenovo has a history of leaking erroneous things.
 
10 GB vram may only become an issue when the system has 'only' 16 GB of system Ram.

Your vram can help out if you are short on system ram.

This was shown by HUB when they tested a 6GB gtx 1060 and it was fine with both 8GB and 16GB system ram.

Under the same conditions, the 3GB GTX 1060 suffered with only 8GB of system RAM.

I can see a similar scenario, but scaled up, with the 3080.
 
I'd rather save hundreds now on VRAM and have a good chunk of the next card saved up by the point when more VRAM might actually be of any use.


Spare me the "FS2020 uses 18.5GB in NYC"; I don't play it.
 
The 10GB of memory seems to be one of the more hotly debated aspects of this card. I don't understand why. Nvidia has been pretty clear on this: the 3080 only has 10GB because that's all it will effectively be able to utilize. It was a cost-saving measure, and a smart one. The past two generations had more, and I don't ever recall an example of a 1080 Ti or 2080 Ti using up anywhere near 11GB. They had it because they could. It was a flex on Nvidia's part, and a flex people were willing to pay for.

In a few years, games will need more VRAM. What people fail to realize is that even if in 3-4 years we see games topping 10gb, that doesn't mean that the 3080 will be fast enough to perform at whatever settings level more than 10gb would require. It's not as if a GPU would forever stay a top performer, if only it had enough VRAM.

Nvidia knows what they are doing. They built the card the way they did for a reason. I'll trust Nvidia over some keyboard warrior claiming 10GB won't be enough.
 
The 10GB of memory seems to be one of the more hotly debated aspects of this card. I don't understand why. Nvidia has been pretty clear on this: the 3080 only has 10GB because that's all it will effectively be able to utilize. It was a cost-saving measure, and a smart one. The past two generations had more, and I don't ever recall an example of a 1080 Ti or 2080 Ti using up anywhere near 11GB. They had it because they could. It was a flex on Nvidia's part, and a flex people were willing to pay for.

In a few years, games will need more VRAM. What people fail to realize is that even if in 3-4 years we see games topping 10gb, that doesn't mean that the 3080 will be fast enough to perform at whatever settings level more than 10gb would require. It's not as if a GPU would forever stay a top performer, if only it had enough VRAM.

Nvidia knows what they are doing. They built the card the way they did for a reason. I'll trust Nvidia over some keyboard warrior claiming 10GB won't be enough.
Sigh. Things like textures that need a lot of VRAM have almost zero impact on performance, so I get tired of this nonsense claiming that a GPU has to be a certain speed to take advantage of more VRAM. Let's discuss some actual facts: Wolfenstein Youngblood, right now, today, requires more than 8GB of video RAM to run max settings with ray tracing, even at just 1440p. If you max all the settings like that and try to run the game, it will tell you that you have exceeded the VRAM, and if you run it anyway it will hitch and eventually lock up. You have to turn down texture streaming one notch and also lower the DLSS setting to at least Balanced. With more VRAM that is not an issue, and having more GPU power would be irrelevant, as again it's VRAM that is the limitation there. I have tested this on a 2080 Super and a 2060 Super, both of which have 8GB of VRAM.

And this other nonsense people are saying about plenty of system RAM offsetting the VRAM does not apply to plenty of games: if they run out of VRAM, they will hitch and stutter, and in the case of Wolfenstein Youngblood eventually just lock up.
 
The 10GB of memory seems to be one of the more hotly debated aspects of this card. I don't understand why. Nvidia has been pretty clear on this: the 3080 only has 10GB because that's all it will effectively be able to utilize. It was a cost-saving measure, and a smart one. The past two generations had more, and I don't ever recall an example of a 1080 Ti or 2080 Ti using up anywhere near 11GB. They had it because they could. It was a flex on Nvidia's part, and a flex people were willing to pay for.

In a few years, games will need more VRAM. What people fail to realize is that even if in 3-4 years we see games topping 10gb, that doesn't mean that the 3080 will be fast enough to perform at whatever settings level more than 10gb would require. It's not as if a GPU would forever stay a top performer, if only it had enough VRAM.

Nvidia knows what they are doing. They built the card the way they did for a reason. I'll trust Nvidia over some keyboard warrior claiming 10GB won't be enough.

Exactly this. Game performance demands have been exceeding VRAM demands the last few years. A good comparison is Far Cry Primal vs. Far Cry 5: FC5 needs far more GPU power, but both only need about 4GB of VRAM.
 
Mates, you can't future-proof in electronics. By the time you need more RAM, there will be other options available cheaper, and your old card won't be able to hold its weight by then anyway.

If you want to future-proof, save money to upgrade in a few years.
A very true statement.
 
Mates, you can't future-proof in electronics. By the time you need more RAM, there will be other options available cheaper, and your old card won't be able to hold its weight by then anyway.

If you want to future-proof, save money to upgrade in a few years.
Actually, you can, if there are options to do so. The 320MB 8800 GTS was obsolete long before the 640MB version was. The 256MB 8800 GT was essentially DOA compared to the 512MB version. The 6GB 1060 can run higher settings at much smoother performance than the 3GB version. The 8GB 570/580 allows much higher textures and such than the 4GB version.
 
The cards you mention were budget, stripped-down versions.
And that changes nothing about what I said. He made a general statement that you can't future proof yourself in electronics. The point is that you can future-proof yourself if there are other options available.
 