Just another "is 10GB enough for 4K gaming" thread?

sblantipodi

2[H]4U
Joined
Aug 29, 2010
Messages
3,759
As title.

We've had 11GB cards since the 1080 Ti; do you think 10GB will be enough for 4K gaming in 2021 and 2022?

Will next-gen games use more VRAM, since the new consoles have 16GB?

Is the 3080 a safe bet, or just a card to skip due to the "low memory"?
 
If you are going to upgrade next generation then I would say the 3080 is a safe bet to get you there. If you are going to keep it for a while then the jury is still out.

Yeah, if I had to guess I'd say 8GB will be fine for the next year. Beyond that, hard to say.
 
If you are going to upgrade next generation then I would say the 3080 is a safe bet to get you there. If you are going to keep it for a while then the jury is still out.

I would like to use the card for 1.5 years, two at the maximum.
 
I guess it’ll be enough for the next 2 years in the vast majority of games. But I’m sure others will disagree with me.

It’s all speculation at this point. Hard to argue one way or another without having actual input from developers who are already producing next generation games.
 
Yes, it will be fine.
Streaming direct from SSDs will become normal.
 
As title.

We've had 11GB cards since the 1080 Ti; do you think 10GB will be enough for 4K gaming in 2021 and 2022?

Will next-gen games use more VRAM, since the new consoles have 16GB?

Is the 3080 a safe bet, or just a card to skip due to the "low memory"?
I use a 2080 Super (came from a 1080 Ti, but got the Super for free) to game at 4K without any issues with VRAM being too low. When I run into issues, it's because the 2080 Super doesn't have the GPU power to get me max settings, and I'll turn down shadows / AA as those are usually the biggest hits to performance.

There is one game that some report as using more than 8GB, causing a 20% slowdown on a 2080, but with the latest driver I do not see it myself: Doom Eternal at 4K Ultra Nightmare settings. I've tested it and not seen the issue. Again, this is using a 2080 Super (not a 2080) and with the latest drivers.

There is one other game, a shitty console port (something about super heroes?), where using a texture pack that didn't come with the game causes high VRAM use and tanks performance.

If outliers make you scared, pay the $1500 for 10% more performance and 24GB VRAM with a 3090. Or wait to see what AMD brings. More 3080s for the rest of us (yeah, I wish - frikken bots buying up everything).
 
Time will tell.
I find the 3080 a "meh" upgrade from my 2080 Ti, for the memory reason.
 
Time will tell.
I find the 3080 a "meh" upgrade from my 2080 Ti, for the memory reason.
If you game at 1440P or less, I agree it isn't a great upgrade. It's definitely a 4K card, since those other resolutions get CPU-bound (1080P very much so, and 1440P only seeing about a 24% increase over a 2080 Ti, whereas 4K is 34% higher on average).
 
If you game at 1440P or less, I agree it isn't a great upgrade. It's definitely a 4K card, since those other resolutions get CPU-bound (1080P very much so, and 1440P only seeing about a 24% increase over a 2080 Ti, whereas 4K is 34% higher on average).

I have a 4k 120Hz monitor
 
As title.

We've had 11GB cards since the 1080 Ti; do you think 10GB will be enough for 4K gaming in 2021 and 2022?

Will next-gen games use more VRAM, since the new consoles have 16GB?

Is the 3080 a safe bet, or just a card to skip due to the "low memory"?
No for max-type settings.

Yes if one does not mind reducing texture load, RT settings, etc.

  • RT adds 1-2GB -> BVH, compute shaders, more geometry in the scene
  • Game texture resolution and quality continue to go up => more VRAM needed
  • Geometry goes up with more complex scenes and increased draw distances; more objects, which feeds back into more textures and shaders, and more objects for RT to calculate from => more VRAM needed
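For a rough sense of how those factors stack, here's a back-of-the-envelope sketch. Every number in it is an illustrative guess, not a measurement from any real game:

```python
# Back-of-the-envelope VRAM budget sketch. All figures are made-up
# placeholders for illustration, not measurements from a real title.
def vram_estimate_gb(textures_gb, geometry_gb, framebuffers_gb, rt_enabled):
    total = textures_gb + geometry_gb + framebuffers_gb
    if rt_enabled:
        total += 1.5  # rough midpoint of the 1-2GB RT overhead noted above
    return total

# Hypothetical "next-gen 4K ultra" load-out:
print(vram_estimate_gb(textures_gb=6.0, geometry_gb=1.5,
                       framebuffers_gb=1.5, rt_enabled=True))  # -> 10.5
```

Even with generous guesses, textures plus RT overhead alone push past 10GB pretty quickly.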

Some of the above can be alleviated using the Tensor cores, but by how much?

No one has a crystal ball, but some developers do like to have options that really push things. Whether those options are significant or not isn't the point; your ability to try them as an option may well be.

12GB would have been a sweet spot, giving some leeway, in my opinion. Still, a 10GB 3080 is a very potent card and should be good for years to come; at present there are not too many memory issues I know of at 10GB. Tomorrow, as in 2021/22? Probably.
 
Regarding consoles - aren't they both 16 GB of total memory, with 8 GB dedicated to VRAM? Something like that, I believe. I don't see them pushing 10 GB any time soon.
 
Regarding consoles - aren't they both 16 GB of total memory, with 8 GB dedicated to VRAM? Something like that, I believe. I don't see them pushing 10 GB any time soon.
Correct, but the consoles also have crazy bandwidth available to them from their new SSDs (especially the PS5). Wouldn't surprise me if developers will be able to stream some assets straight from the SSD instead of loading them into RAM/VRAM.
 
I'd say 10GB will be fine for at least this generation of GPUs. First of all, the new consoles have shared memory, so not all of it can be used as VRAM. And secondly, how many console games are actually ported to the PC?
And using ray tracing at 4K will bring your GPU to its knees anyway, so does it matter whether it's out of breath because of ray-tracing performance or because of memory limitations?
 
Yes, it will be fine.
Streaming direct from SSDs will become normal.

760GB/s for GDDR6X on the 3080 vs 8GB/s for an NVMe Gen 4 SSD. Latency measured in nanoseconds for GDDR vs milliseconds for an SSD.

Have fun with that.
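To put those two numbers side by side, here's the arithmetic for moving a 3080's worth of data (10GB) over each link, using the rough figures above:

```python
# Time to move 10GB at each link's rough bandwidth figure:
# ~760 GB/s for the 3080's GDDR6X, ~8 GB/s for a Gen 4 NVMe SSD
# (sequential, best case). Latency differences come on top of this.
def seconds_to_move(gigabytes, bandwidth_gb_s):
    return gigabytes / bandwidth_gb_s

print(f"GDDR6X: {seconds_to_move(10, 760) * 1000:.1f} ms")  # ~13.2 ms
print(f"NVMe:   {seconds_to_move(10, 8) * 1000:.1f} ms")    # 1250.0 ms
```

Roughly two orders of magnitude apart, which is the point being made.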
 
760GB/s for GDDR6X on the 3080 vs 8GB/s for an NVMe Gen 4 SSD. Latency measured in nanoseconds for GDDR vs milliseconds for an SSD.

Have fun with that.
Streaming is not direct to screen; it's to prepare for upcoming frames without needing as much memory.
Latency doesn't factor in: all display writes are from VRAM.
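That prefetch-ahead-of-the-frame idea can be sketched as a toy LRU asset cache: the renderer only ever reads from "VRAM", and streaming just tries to have the right assets resident in time. All names and sizes here are invented for illustration:

```python
from collections import OrderedDict

class ToyAssetCache:
    """Toy model of streamed assets: SSD -> VRAM, evicting least-recently-used.
    Illustrative only; real engines stream at tile/mip granularity."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.resident = OrderedDict()  # asset_id -> loaded data

    def prefetch(self, asset_id):
        """Called ahead of the frame that will need asset_id."""
        if asset_id in self.resident:
            self.resident.move_to_end(asset_id)  # refresh LRU position
            return
        if len(self.resident) >= self.capacity:
            self.resident.popitem(last=False)    # evict LRU to make room
        self.resident[asset_id] = f"data:{asset_id}"  # stand-in for an SSD read

    def sample(self, asset_id):
        """The renderer only ever reads from 'VRAM', never the SSD directly."""
        return self.resident.get(asset_id)  # None == pop-in / fallback mip

cache = ToyAssetCache(capacity=2)
cache.prefetch("rock"); cache.prefetch("tree"); cache.prefetch("cliff")
print(cache.sample("rock"))   # None: evicted; a real engine shows a low-res fallback
print(cache.sample("cliff"))  # data:cliff
```

The capacity is the VRAM budget: streaming shrinks how much has to be resident at once, but whatever the current frame samples still has to be in VRAM.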
 
Streaming is not direct to screen; it's to prepare for upcoming frames without needing as much memory.
Latency doesn't factor in: all display writes are from VRAM.

Streaming assets from storage that are going to be used for a later frame is nothing new, and it's never been a substitute for VRAM.
 
Regarding consoles - aren't they both 16 GB of total memory, with 8 GB dedicated to VRAM? Something like that, I believe. I don't see them pushing 10 GB any time soon.

I don't think there is a real differentiation between system memory and VRAM there; there is 16GB of fast RAM and the game can use it as it needs.
Am I wrong?
 
I don't think you've gotten that far in your explanation.
I'm not your mother, look it up!
You're the one bugging me over something you aren't aware of.

PS: I've had lots of beer. I don't mean to be rude, but I'm not up for finding links for you atm.
Tomorrow, if you still want?
 
I don't think there is a real differentiation between system memory and VRAM there; there is 16GB of fast RAM and the game can use it as it needs.
Am I wrong?

I think that is true of one of them, the other I believe has a certain amount dedicated to VRAM. Can't remember which is which, though.
 
If 10GB is allocated to the GPU on console, 10GB is not going to be enough for PC. What a mess of a launch.

Something else to consider: consoles tend to last around 7 years, and games have to be designed within their limits. So it may be excessive now, but 4-5 years down the line it would be beneficial.
 
If 10GB is allocated to the GPU on console, 10GB is not going to be enough for PC. What a mess of a launch.

Sure it is. People act like if a game releases that can utilize more than 10GB VRAM, their system is going to self destruct if they don't have it.

Let's think about this logically. Even if a game could use more than 10GB of VRAM... a 10GB 3080 is still going to vastly outperform an 11GB 1080 Ti. And certainly, that 3080 will still be able to provide a perfectly playable gameplay experience; you just might not be able to max out every single setting. You know, the way every game works the longer you hold onto a GPU. Given that a higher-VRAM model is likely to cost significantly more, and likely to provide a tangible benefit in only a small handful of games over the life of the card, the vast majority of people are probably going to be perfectly happy saving a few bucks with the 10GB variants.

If that's not you, if you need the guarantee that you'll not have to make any sacrifices in the handful of games that'll top 10GB of VRAM in the next few years, you'll be happy to know that nobody is forcing you to buy a 3080. In a few months, you'll be able to spend an additional $200-$300 on a 20GB variant with all the VRAM you could ever need. Most of us will be happy with the 10GB model and a few extra bucks in our pockets. Our cards will certainly be fine.

There is a wealth of things to find fault with in the 3080's launch. VRAM, IMO, isn't one of them.
 
I've been trying to stress the crap out of my 2070 Super and its lowly 8GB VRAM on my 4k 120Hz TV, and I have yet to find a game that can do it. I'm not terribly worried at the moment.

You can't max out Ghost Recon Breakpoint and even Destiny 2 will offer subpar performance maxed out from time to time.
 
I've been trying to stress the crap out of my 2070 Super and its lowly 8GB VRAM on my 4k 120Hz TV, and I have yet to find a game that can do it. I'm not terribly worried at the moment.

Control, HZD, Breakpoint, AC: Odyssey, MechWarrior 5, Destiny 2, Remnant: From the Ashes, Total War: Warhammer II, Shadow of the Tomb Raider, etc.

None of those titles will hit 120Hz at 4K with a 2070 Super without IQ reduction. However, I doubt it's a VRAM issue.
 
Control, HZD, Breakpoint, AC: Odyssey, MechWarrior 5, Destiny 2, Remnant: From the Ashes, Total War: Warhammer II, Shadow of the Tomb Raider, etc.

None of those titles will hit 120Hz at 4K with a 2070 Super without IQ reduction. However, I doubt it's a VRAM issue.

Destiny 2 and Breakpoint will kill a 2070 Super at 4K. At max settings the latter will barely hit 60FPS under Vulkan on very high using a 2080 Super. You have no chance at Ultimate settings. 120Hz? Not possible even on a 2080 Ti. Destiny 2 can't sustain 120FPS 100% of the time at 3440x1440. At 4K it occasionally can drop just below 60FPS on a [email protected] with a 2080 Ti. 120FPS? Not sustainable.

The 2070 Super isn't really a 4K card. Sure, it can do it on some titles but it will struggle on many games at that resolution.
 
People have said Flight Simulator 2020 requires a lot of VRAM to run properly, but the 24GB 3090 is only about 15% faster than the 3080 in FS 2020, the same margin as in other games. So again, allocated VRAM is not the same as VRAM that is actually needed.
 
I thought Nvidia's RTX I/O was/is going to make the need for huge VRAM buffers redundant?

https://www.nvidia.com/en-gb/geforce/news/rtx-io-gpu-accelerated-storage-technology/

Object pop-in and stutter can be reduced, and high-quality textures can be streamed at incredible rates, so even if you’re speeding through a world, everything runs and looks great. In addition, with lossless compression, game download and install sizes can be reduced, allowing gamers to store more games on their SSD while also improving their performance.
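The compression half of that claim is easy to demonstrate with a CPU-side stand-in: if assets live compressed on disk and are decompressed on the fly (RTX IO proposes doing this on the GPU), the SSD only has to deliver the compressed bytes. This uses zlib purely as a toy example; real game asset compressors and RTX IO's lossless path behave very differently:

```python
import zlib

# Toy "texture": 1 MiB of repetitive data, so it compresses very well.
# Real textures compress far less; this only illustrates the mechanism.
texture = bytes(range(256)) * 4096
compressed = zlib.compress(texture, level=6)

# Whatever the ratio is, the SSD only has to move the compressed bytes,
# so effective streaming bandwidth scales up by roughly that factor.
ratio = len(texture) / len(compressed)
print(f"on-disk size cut by {ratio:.1f}x")

assert zlib.decompress(compressed) == texture  # lossless round trip
```

The decompression work has to happen somewhere, which is why pushing it onto the GPU (rather than burning CPU cores on it) is the interesting part of the pitch.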
 
Sure it is. People act like if a game releases that can utilize more than 10GB VRAM, their system is going to self destruct if they don't have it.

Let's think about this logically. Even if a game could use more than 10GB of VRAM... a 10GB 3080 is still going to vastly outperform an 11GB 1080 Ti. And certainly, that 3080 will still be able to provide a perfectly playable gameplay experience; you just might not be able to max out every single setting. You know, the way every game works the longer you hold onto a GPU. Given that a higher-VRAM model is likely to cost significantly more, and likely to provide a tangible benefit in only a small handful of games over the life of the card, the vast majority of people are probably going to be perfectly happy saving a few bucks with the 10GB variants.

If that's not you, if you need the guarantee that you'll not have to make any sacrifices in the handful of games that'll top 10GB of VRAM in the next few years, you'll be happy to know that nobody is forcing you to buy a 3080. In a few months, you'll be able to spend an additional $200-$300 on a 20GB variant with all the VRAM you could ever need. Most of us will be happy with the 10GB model and a few extra bucks in our pockets. Our cards will certainly be fine.

There is a wealth of things to find fault with in the 3080's launch. VRAM, IMO, isn't one of them.

On paper, a memory-capacity-deficient card may have better average FPS, but the lows and stuttering you'll see once you hit the memory buffer are a turd in your drink. Running a game with too little memory doesn't "self destruct" your system, but it does feel like going up the hill on the Coney Island Cyclone: jerky as all hell.

My 2080 Super was notably faster computationally than my Radeon VII, but running FS 2020 at 4K ultra settings shows how detrimental hitting the memory buffer is, and the benefit of having sufficient VRAM.

I'm not suggesting most users will need >10GB for most games immediately, but that point is coming over the next few years. If you plan on holding on to your new card for a while, I'd plan accordingly and get something with >10GB.
 
I'm not suggesting most users will need >10GB for most games immediately, but that point is coming over the next few years. If you plan on holding on to your new card for a while, I'd plan accordingly and get something with >10GB.

I feel like my point holds. The 3090 is currently the only card faster than a 3080 with more memory. The cost is so high that anyone worried about future performance would be better off with a 3080 now and its successor next year, still likely spending less and, long term, having a faster GPU.

Anything else with more VRAM is outclassed by the 3080. Sure, the 3080 might show its VRAM weakness in time, but any other higher-VRAM options are already starting to show their weaknesses now. The idea of buying a less powerful GPU with more VRAM and thinking it's more future-proof doesn't resonate with me.
 
I feel like my point holds. The 3090 is currently the only card faster than a 3080 with more memory. The cost is so high that anyone worried about future performance would be better off with a 3080 now and its successor next year, still likely spending less and, long term, having a faster GPU.

Anything else with more VRAM is outclassed by the 3080. Sure, the 3080 might show its VRAM weakness in time, but any other higher-VRAM options are already starting to show their weaknesses now. The idea of buying a less powerful GPU with more VRAM and thinking it's more future-proof doesn't resonate with me.

I see your side of it, but I would in fact argue for buying a card with a lesser GPU, within a certain range (e.g. 10%), if you're going to keep the card for a while. Looking back, I really wish I had gone with a 290X (4GB) in lieu of a 780 Ti (3GB; now my "backup" card). The extra 1GB in the 290X makes a world of difference, even at 1080p, in most games after 2015.

To put it in relevant and practical terms: if Navi 21 (16GB) is within 10% of the 3080's GPU performance, and I only had $600-700 to burn (I'm guessing Navi 21 will certainly be cheaper than a 3080 20GB), that's the option I'd pick if I were going to keep it for more than a year.
 
Yep: in FS 2020 the 5700 XT has better FPS than the Vega FE, but in San Francisco and other places, while the FPS is higher, it's a stuttery, jerky mess. To get the 5700 XT smooth I have to go to medium settings, while the FE will play at very high settings without stutters; its FPS is low, but it's superior overall to the 5700 XT. 8GB is definitely not enough, even at 1440p, without having to lower settings in some games. I don't see 10GB as enough for a card I'd want as top end for two years and would keep for 3-5 years while also upgrading to a higher-resolution monitor -> dead-end card. If Nvidia had 12GB+ that would be much better; AMD's 16GB is the sweet spot to me, if RDNA2 has that amount. A 3080 20GB would also be golden.
 
Yep: in FS 2020 the 5700 XT has better FPS than the Vega FE, but in San Francisco and other places, while the FPS is higher, it's a stuttery, jerky mess. To get the 5700 XT smooth I have to go to medium settings, while the FE will play at very high settings without stutters; its FPS is low, but it's superior overall to the 5700 XT. 8GB is definitely not enough, even at 1440p, without having to lower settings in some games. I don't see 10GB as enough for a card I'd want as top end for two years and would keep for 3-5 years while also upgrading to a higher-resolution monitor -> dead-end card. If Nvidia had 12GB+ that would be much better; AMD's 16GB is the sweet spot to me, if RDNA2 has that amount. A 3080 20GB would also be golden.

3080 20GB will likely be the best-balanced choice in the $800-900 range this generation. That said, I'd rather save a few hundred (my guess is Navi 21 will cost $600) for 10% less performance (again, a guess at where I think Navi 21 will land vs. the 3080). That $200 will pay for a new Ryzen 4600 and part of a new mobo to support it.
 
3080 20GB will likely be the best-balanced choice in the $800-900 range this generation. That said, I'd rather save a few hundred (my guess is Navi 21 will cost $600) for 10% less performance (again, a guess at where I think Navi 21 will land vs. the 3080). That $200 will pay for a new Ryzen 4600 and part of a new mobo to support it.
$800-900? The 10GB AIB variants already go for $700-850. By doubling the VRAM I doubt you'll see them for less than $1000.
 