Is 10 GB of VRAM really enough for 1440p?

Actually, if you do some research, you'll find that Nvidia limited its own memory speed and bandwidth; from memory, it ended up only about 3 Gbps faster than AMD's. Having faster memory won't buy you more VRAM space.
Yes, the modules are 3 Gbps faster, but the bus is also wider.

6800/6800 XT/6900 XT: 512 GB/s
RTX 3080: 760 GB/s
RTX 3090: 936 GB/s

You have to factor in both module speed and bus width to get actual memory bandwidth, and the 3080 and 3090 are significantly faster than the RX 6000 cards.
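
To spell out the arithmetic (a quick sketch in Python; the bus widths and per-pin data rates are the cards' public specs):

# Peak memory bandwidth (GB/s) = bus width in bits / 8 * per-pin data rate in Gbps
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(256, 16.0))   # RX 6800/6800 XT/6900 XT (GDDR6)  -> 512.0
print(bandwidth_gb_s(320, 19.0))   # RTX 3080 (GDDR6X)                -> 760.0
print(bandwidth_gb_s(384, 19.5))   # RTX 3090 (GDDR6X)                -> 936.0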

More memory bandwidth = better performance at higher resolutions.

The RX 6000 cards will constantly be limited because of their crippled memory bus.



Also, you do realize that the RTX 3080 is faster than the RX 6800 XT in Doom Eternal, Cyberpunk 2077, and Flight Sim 2020, right? That puts a cork in your theory, doesn't it?
 

And yet, all these numbers seem to help ONLY with 4K, not 1440p. On average, the 3080 is either tied with, or a bit slower than, the 6800xt, so all things considered, that super-fast mem is only good for 4K. Ironically though, at 4K you would need MORE VRAM, not less. Guys playing 4K with a 3080 will hit the limits of VRAM, as well as the limits of the GPU itself. Sad story for them.

At least on the 3090 VRAM is not an issue.

the 3090 is really a good card, price aside
 
I think the sad story is this joke of a thread.
 
No use in waiting for the next refresh; they're just going to be more expensive and more scarce. PC gaming fucking sucks nowadays.
 
This thread sounds like a bunch of RTX fanboys not admitting their brand new GPUs shipping with 8/10 GB are not even enough. Time to close the thread indeed

Who cares if it isn't enough two or three years down the line, and who cares about resale value in 2022? I could crack my 3080 in half when I upgrade to the next-gen cards and it will have cost me $1 or less per hour of usage. Dirt cheap compared to all of my other hobbies. Right now it's faster at 4K, has better features, and isn't vaporware (at least in comparison to the AMD cards).
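
The back-of-the-napkin math, in case anyone wants to check it (assuming roughly an hour of gaming per day; your hours may differ):

# $700 card kept for 2 years at ~1 hour of gaming per day
cost_usd, years, hours_per_day = 700, 2, 1.0
print(cost_usd / (years * 365 * hours_per_day))   # -> ~0.96 dollars per hour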
 
And yet, all these numbers seem to help ONLY with 4K, not 1440p. On average, the 3080 is either tied with, or a bit slower than, the 6800xt, so all things considered, that super-fast mem is only good for 4K. Ironically though, at 4K you would need MORE VRAM, not less. Guys playing 4K with a 3080 will hit the limits of VRAM, as well as the limits of the GPU itself. Sad story for them.

At least on the 3090 VRAM is not an issue.

the 3090 is really a good card, price aside
Sooo the 3080 does not have enough VRAM for 4K but outperforms the 6800xt at 4K.

Congrats on the gold star in mental gymnastics! :)

10 gigs is enough for 4K gaming right now. It might be an issue near the end of life of the cards in 2024... though it likely still won't be, because the mainstream 5060 will probably have 12 gigs of VRAM.
 
Who cares if it isn't enough two or three years down the line, and who cares about resale value in 2022? I could crack my 3080 in half when I upgrade to the next-gen cards and it will have cost me $1 or less per hour of usage. Dirt cheap compared to all of my other hobbies. Right now it's faster at 4K, has better features, and isn't vaporware (at least in comparison to the AMD cards).

Ah, you are one of those guys who spend $700 every 2 years on a GPU? Not everyone is like you though
 
Sooo the 3080 does not have enough VRAM for 4K but outperforms the 6800xt at 4K.

Congrats on the gold star in mental gymnastics! :)

10 gigs is enough for 4K gaming right now. It might be an issue near the end of life of the cards in 2024...

Outperforming in GPU power is one thing; not having enough VRAM down the line in a year or more is a totally different thing.

10GB is enough for 4K? Sure, whatever makes you feel better I guess. Maybe for the games you play, def not for all games out there.
 
FWIW, I have several games that use 12+ GB of VRAM. Cyberpunk 2077, MSFS2020 and SOTTR all push into the 11~13GB range at 4K.

At 1440p though? You're probably fine with 10GB, especially if you factor in DLSS, which renders at a lower resolution.
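
For a sense of how much lower DLSS renders: a quick sketch (the per-axis scale factors are the commonly cited defaults and can vary by title and mode):

# DLSS internal render resolution for a 2560x1440 output
modes = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
out_w, out_h = 2560, 1440
for mode, scale in modes.items():
    print(mode, round(out_w * scale), "x", round(out_h * scale))
# Quality -> 1707 x 960, so the GPU is rendering well below native 1440p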
 
Thanks. First reply that looks like something I was looking for.

Good to know, though I was told or read somewhere that the difference in VRAM use between 4K and 1440p is not as big as people think, maybe 1 GB less.
 
FWIW, I have several games that use 12+ GB of VRAM. Cyberpunk 2077, MSFS2020 and SOTTR all push into the 11~13GB range at 4K.

At 1440p though? You're probably fine with 10GB, especially if you factor in DLSS, which renders at a lower resolution.
Is this VRAM utilization or allocation? I have yet to run into a game that completely saturates 10GB VRAM at 4K and still has enough performance to be playable. Case in point... Cyberpunk 2077. Even on an RTX 3080, Cyberpunk 2077 feels sluggish at 4K without DLSS.

At this point, the 16GB VRAM on the RX 6800, XT, and 6900 XT is nothing more than a marketing point because by the time you break the 10GB VRAM barrier, you're into "unplayable" territory. So the question is "16GB of unused VRAM" or "10GB of used, faster VRAM"? The latter is a better fit for most enthusiasts.

Ah, you are one of those guys who spend $700 every 2 years on a GPU? Not everyone is like you though
Many people on this forum are, though. You're just rationalizing your own preference at this point.

You've already said that RT/DLSS is not important to you and you will be playing at 1440P. The 6800 XT is a better fit for you. Problem solved.
 
And yet, all these numbers seem to help ONLY with 4K, not 1440p. On average, the 3080 is either tied with, or a bit slower than, the 6800xt, so all things considered, that super-fast mem is only good for 4K. Ironically though, at 4K you would need MORE VRAM, not less. Guys playing 4K with a 3080 will hit the limits of VRAM, as well as the limits of the GPU itself. Sad story for them.
Not in the games you listed.

Tell me about a game where 10GB VRAM is the bottleneck, so much so that an RX 6800 outperforms the RTX 3080 at 4K. Go ahead. I'll wait.
 
In what? VRAM usage is going to vary widely depending on the game/application.

CP2077 already allocates over 9GB of VRAM at 4K today, and that is an early next-gen (technically cross-gen) game. Sure, that's allocation, but that doesn't mean it isn't potentially using most of it; we don't know the buffer.
But it also allocates 21.6 GB of VRAM on my RTX 6000, so....
 
Is this VRAM utilization or allocation? I have yet to run into a game that completely saturates 10GB VRAM at 4K and still has enough performance to be playable. Case in point... Cyberpunk 2077. Even on an RTX 3080, Cyberpunk 2077 feels sluggish at 4K without DLSS.

At this point, the 16GB VRAM on the RX 6800, XT, and 6900 XT is nothing more than a marketing point because by the time you break the 10GB VRAM barrier, you're into "unplayable" territory. So the question is "16GB of unused VRAM" or "10GB of used, faster VRAM"? The latter is a better fit for most enthusiasts.


Many people on this forum are, though. You're just rationalizing your own preference at this point.

You've already said that RT/DLSS is not important to you and you will be playing at 1440P. The 6800 XT is a better fit for you. Problem solved.

Faster VRAM does not substitute for VRAM capacity. Why do you think they are adding 20 GB on the 3080ti if 10 is enough? Just for giggles? Wasting memory? Marketing as well?
 
Marketing. Many potential customers are saying they want more and are looking at AMD's cards because they have more memory. Hard stop. None of the consoles have more than 10GB of VRAM, so it is highly unlikely any developer is going to seriously do anything that uses more than that outside a few select titles.
 
Simply because I can get my hands on a 3080 easily, but not the 6800xt.

The 3080ti looks very promising. If I can wait 2-3 months, I can probably grab one, but it's 2-3 months of waiting. I don't have any GPU at the moment.
3080ti is rumored to be $999, so quite a bit more expensive than a 3080!
Also remember that a 3080ti may well take months and months to acquire, just like the current-gen cards. Now that scalpers and bots have had a taste of the profits, this trend may continue a while longer. And covid isn't over yet either; with the card due out in less than 2 months if I'm not mistaken, the supply situation might not only not improve, but get worse, as new waves of scalpers show up after hearing about the scalping successes of 2020.

I was/am in the same dilemma as you. I looked for a 6800XT for 3 straight weeks post-launch, checking the AMD site at the drop time each morning, and staying on Discord, available to pounce at any time. I got a few alerts, but even though I clicked on them and was ready to order as soon as they popped up, they always went out of stock.
Finally I told myself ok, I'm going to try for a 3080 too in the meantime, and managed to grab one relatively quickly.

I'm still worried about the future-proofing potential of 10GB, but I figure it's not a problem for now, and at worst I could do a straight trade for a 6800XT if/when 10GB does become an issue, with someone who's interested in nVidia's DLSS and RTX. I think you could go the same route in the future.

FWIW, I have several games that use 12+ GB of VRAM. Cyberpunk 2077, MSFS2020 and SOTTR all push into the 11~13GB range at 4K.

At 1440p though? You're probably fine with 10GB, especially if you factor in DLSS, which renders at a lower resolution.
DLSS in what, though? The ~5 serious titles that support it? If DLSS were available at the driver level for every game, and not something developers had to find ways to work into their games, it would be gold! But sadly, as great a feature as DLSS is, it just isn't widely adopted, and to be honest I can't imagine it being adopted on a much wider scale as time goes by.
Developers don't typically care to put that much resource, effort, and manpower into a feature such a small percentage of their userbase can use :(

Don't get me wrong, I love DLSS. I've only used it in Control so far (I don't have CP and the other triple-A games), and it's pretty great. I just wish it were widely adopted. Right now it's not even worth considering as a purchasing point, same with RT, to be honest. They're two features that work in only a handful to a dozen games as of today, and they've been out for a while. I doubt the majority of games released over the next 2-4 years will suddenly support them in much greater numbers.

If AMD and nVidia can cooperate on a DLSS-like standard that both card makers can use, that's a different story. But this is like Glide from 3dfx in the old days. Sure, cool features (and Glide still circulated more widely than DLSS and RT do), but because the majority of customers didn't have access to it, most games were still made for DirectX. Not many titles had Glide support.

Agnostic features are the way to go. Anything exclusive to one company or the other will rarely be used, I feel, simply because developers and publishers care about reaching the widest audience, and won't spend effort on something .02% of their users have access to.




Still, as it stands today, from the benchmarks I've seen, it appears the 3080 has higher lows and higher highs in FPS than the 6800XT at 1440p (that will probably get evened out in time with driver improvements from AMD). So that's what made me comfortable going 3080, forgoing the hope of a 6800XT anytime soon, and just doing a trade with someone in the future as a worst-case scenario.
 
Not if you follow the forum posts and modify the memory_pool_budgets.csv file and give it access to more than the 3GB it will actually use.

PC, Durango, and Orbis were all hard-capped at 3GB usage in there; changing those values for the PC to better match your system will usually help.
Edit:
They patched this out in 1.05. I haven't had a chance to play in the last 2 weeks, and I rarely ever fire up games on the Quadro rig because it runs them like garbage.
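
If anyone wants to try that edit on a pre-1.05 install, here's a rough sketch. The semicolon-delimited layout, the "PC" column name, and the PoolCPU/PoolGPU row names are from memory of the 1.03-era file, so verify against your own copy (and back it up) first:

# Rough sketch (pre-1.05 Cyberpunk 2077): raise the PC memory pool budgets.
# Assumed layout: a header row of platforms (PC;Durango;Orbis) and
# data rows named PoolCPU / PoolGPU -- verify against your own file.
import csv, shutil

path = "engine/config/memory_pool_budgets.csv"       # path inside the game dir
shutil.copy(path, path + ".bak")                     # keep a backup

new_budgets = {"PoolCPU": "8GB", "PoolGPU": "10GB"}  # match your RAM / VRAM

with open(path, newline="") as f:
    rows = list(csv.reader(f, delimiter=";"))

pc_col = rows[0].index("PC")                         # the PC platform column

for row in rows[1:]:
    if row and row[0].strip() in new_budgets:
        row[pc_col] = new_budgets[row[0].strip()]

with open(path, "w", newline="") as f:
    csv.writer(f, delimiter=";").writerows(rows)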
 
Still, as it stands today, from the benchmarks I've seen, it appears the 3080 has higher lows and higher highs in FPS than the 6800XT at 1440p (that will probably get evened out in time with driver improvements from AMD). So that's what made me comfortable going 3080, forgoing the hope of a 6800XT anytime soon, and just doing a trade with someone in the future as a worst-case scenario.

Well, they trade blows depending on the game. Some have the 6800xt do better (even in lows); in other games the 3080 does better.

I don't think people who buy the 6800xt would trade for a 3080, for many reasons. Worst case though, you can sell the 3080 and grab the 6800xt. I don't think 10 GB is gonna be an issue within 2 years' time, but then who knows. Imagine wanting to play a new VRAM-hungry game in a year's time and not being able to... that would suck.
 
Not if you follow the forum posts and modify the memory_pool_budgets.csv file and give it access to more than the 3GB it will actually use.

PC, Durango, and Orbis were all hard-capped at 3GB usage in there; changing those values for the PC to better match your system will usually help. Though supposedly they patched this out in 1.05. I haven't had a chance to play in the last 2 weeks, and I rarely ever fire up games on the Quadro rig because it runs them like garbage.

You mean this:
 
Simply because I can get my hands on a 3080 easily, but not the 6800xt.

The 3080ti looks very promising. If I can wait 2-3 months, I can probably grab one, but it's 2-3 months of waiting. I don't have any GPU at the moment.
Curious, how can you get ahold of a 3080 that easily?
 
Is this VRAM utilization or allocation? I have yet to run into a game that completely saturates 10GB VRAM at 4K and still has enough performance to be playable. Case in point... Cyberpunk 2077. Even on an RTX 3080, Cyberpunk 2077 feels sluggish at 4K without DLSS.

At this point, the 16GB VRAM on the RX 6800, XT, and 6900 XT is nothing more than a marketing point because by the time you break the 10GB VRAM barrier, you're into "unplayable" territory. So the question is "16GB of unused VRAM" or "10GB of used, faster VRAM"? The latter is a better fit for most enthusiasts.


Many people on this forum are, though. You're just rationalizing your own preference at this point.

You've already said that RT/DLSS is not important to you and you will be playing at 1440P. The 6800 XT is a better fit for you. Problem solved.
I'm not entirely sure; I let GPU-Z run in the background and log memory usage when I tested my 2080Ti and my new 3090.

On my 2080Ti, Cyberpunk, SOTTR and MSFS2020 would max out at 10998MB.

On my 3090, Cyberpunk used 13985MB. This was at 4K, psycho RT, and DLSS at quality settings.

I did not pick up a 3090 for the VRAM though; I picked it up because I wanted the best video card available. It's just a bonus that it has more VRAM, but the wider and faster bus is more important for gaming performance.
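
If anyone wants to repeat the experiment, here's a minimal sketch for pulling the peak out of a GPU-Z sensor log. The default log file name and the "Memory Used [MB]" column header are assumptions and may vary by GPU-Z version:

# Minimal sketch: peak VRAM reading from a GPU-Z sensor log (CSV format).
import csv

peak = 0.0
with open("GPU-Z Sensor Log.txt", newline="") as f:
    for row in csv.DictReader(f):
        # GPU-Z headers and values often carry stray spaces, so strip them
        cleaned = {k.strip(): (v or "").strip() for k, v in row.items() if k}
        value = cleaned.get("Memory Used [MB]")
        if value:
            try:
                peak = max(peak, float(value))
            except ValueError:
                pass   # skip malformed lines

print(f"Peak VRAM reading: {peak:.0f} MB")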
 
Curious, how can you get ahold of a 3080 that easily?
Not OP, but in my case, around the time I gave up on finding a 6800XT, I noticed MicroCenter was getting TONS more 3080s daily all of a sudden (I'd visit before too, for 6800XTs; they would have 3080s come in a couple of times a week, but in low numbers, then the quantity exploded for every delivery out of the blue).
There are also 1000x more pings on Discord for 3xxx cards, while it's rare to get a ding on a 6800-series card.
I don't think people who buy the 6800xt would trade for a 3080, for many reasons. Worst case though, you can sell the 3080 and grab the 6800xt.
I'm not sure that's true. Some people might just not be happy with their 6800XT for one reason or another (unhappy with the drivers, performance, or 4K gaming, or wanting to try DLSS or ray tracing) and would be happy to trade.
But yeah, selling it is an option too, though one I never like, because you're basically forced to take a price cut and then look for a good deal elsewhere on whatever it is you were looking to get.
That, instead of an easy trade (not even mentioning the fact that the 3080 is more expensive than a 6800XT, and I wouldn't even ask for any cash on their end, lol! [well, if we're talking MSRP prices from AMD. Otherwise they might ask *me* for cash on top of the 3080 :LOL:])
 
Dang, my nearest Micro Center is 6 hrs away, but I'd kill for a reference 3080 right now. Give me a month, and I'll be willing to wait for 40xx/30xxti though.
 
Well, they trade blows depending on the game. Some have the 6800xt do better (even in lows); in other games the 3080 does better.

I don't think people who buy the 6800xt would trade for a 3080, for many reasons. Worst case though, you can sell the 3080 and grab the 6800xt. I don't think 10 GB is gonna be an issue within 2 years' time, but then who knows. Imagine wanting to play a new VRAM-hungry game in a year's time and not being able to... that would suck.

2 years from now... whatever game needs more than 10gb is not gonna run on ultra settings anyway, and not because of the memory pool. Once you turn down a few settings to get into playable territory, you will also be reducing the amount of memory you need, so "poof", no bottleneck. Also, more than likely in 2 years most games, and any that would be a potential concern, will support DLSS, so it's even less likely to be an issue. I don't care what card you buy now; you will likely have to tweak or reduce a few settings in many new games 2 years from now.
 
Not to mention all those NVidia-specific goodies that most of the studios have announced support for, like Reflex.
 
2 years from now... whatever game needs more than 10gb is not gonna run on ultra settings anyway, and not because of the memory pool. Once you turn down a few settings to get into playable territory, you will also be reducing the amount of memory you need, so "poof", no bottleneck. Also, more than likely in 2 years most games, and any that would be a potential concern, will support DLSS, so it's even less likely to be an issue. I don't care what card you buy now; you will likely have to tweak or reduce a few settings in many new games 2 years from now.
True, but textures have almost zero impact on GPU performance; they're only limited by the amount of VRAM. But yeah, most people who are going to spend $700 to $800 or more on a GPU will probably also upgrade in two or three years.
 

Right. And half the time, lowering the textures a notch or two is only something you notice in screenshots, not when you are actually playing the game. Granted, this is a generalization and not true in all cases.
 
As the bulk of AAA 4K games that need "help" at 4K now support DLSS 2.0, and look like they will continue to for the foreseeable future, the 10GB 3080 continues to look like the best choice. At 1440p there's no sign at all of any serious need for 20GB. IMHO, the industry is going to start working extremely hard to keep VRAM requirements down. Consoles already are, and SSD-to-video-card direct access is already GOING to be a thing. That much is obvious, as both AMD and Nvidia are pushing it. It should come into its own within the next 2 years.

Would it be nice to have the 20GB 3080? Yes. Would I pay $1000+ for it vs $700-800 for a 10GB? Hell no. I'd rather upgrade sooner.

If you are seriously trying to buy a card to last 5-6 years... well... ok, then go ahead and wait for a 20GB 3080. But honestly, at the rate video cards are advancing right now, the 20GB refresh won't be widely available for months yet, putting us that much closer to the next-gen cards, and with the way availability of video cards has been, it could be even longer. If you are going to buy an expensive graphics card, you want it as soon as you possibly can after its generation launches, to give you as much time with the card as possible before it depreciates and goes obsolete. Buying in over 6 months at $1000+ is just going to hurt like hell when something new and 30-50% faster comes out at most 1.5 years after that. If you are a 3080 buyer, you are going to want to upgrade again in 2, or at the very most 4, years. Better to have a card with more resale value against the original purchase price.

I still maintain that the ideal card is ANY 3080 at the $700 flat MSRP. You'll never regret it at that price point. There is no way on earth the 20GB model will be less than $1000 in the current climate.

If you can snag a Founders or an ASUS TUF or a Zotac or WHATEVER at $700, you save 30% over what that 20GB card is likely to cost. Upgrade again 30% sooner for crying out loud. That's 2 years instead of 3... or 3.5 years instead of 5!

Worst case scenario you'll just be getting into that 3 year zone you are so worried about... but it won't matter because it'll be upgrade time again!
 
Doom Eternal, CyberPunk, many SIM games (MS Flight Sim), and many open-world games.
The first 2 don't use more than 10 GB. MS Flight Sim in rare scenarios jumps over 10 GB but still runs better on nVidia last I checked.
[Attached: Doom Eternal 3840x2160 benchmark charts]
Flight Simulator 2020 Radeon RX 6800 XT vs GeForce RTX 3080 with Ryzen 5900X - YouTube
 
If you want to gauge VRAM needs 5 years from now, the first place I'd look is 5 years back to see what happened.

Right now, would you rather have an AMD R9 390X with 8GB VRAM or a GTX 980 with 4GB of VRAM?
 
That is a false comparison. The 390 came out almost a year after the 980. The 980 is six years old versus five years for the 390.
 
Sooo the 3080 does not have enough VRAM for 4K but outperforms the 6800xt at 4K.

Congrats on the gold star in mental gymnastics! :)

10 gigs is enough for 4K gaming right now. It might be an issue near the end of life of the cards in 2024... though it likely still won't be, because the mainstream 5060 will probably have 12 gigs of VRAM.
No one can predict how much memory a game will need until the developer releases the requirements. They could stay under 10 GB for the next 10 years, or start using over 10GB as soon as some PS5 games start making their way to PC. It is, and will forever be, an inadequate benchmark to use hypotheticals as a rule.
I'm not entirely sure; I let GPU-Z run in the background and log memory usage when I tested my 2080Ti and my new 3090.

On my 2080Ti, Cyberpunk, SOTTR and MSFS2020 would max out at 10998MB.

On my 3090, Cyberpunk used 13985MB. This was at 4K, psycho RT, and DLSS at quality settings.

I did not pick up a 3090 for the VRAM though; I picked it up because I wanted the best video card available. It's just a bonus that it has more VRAM, but the wider and faster bus is more important for gaming performance.
Many games will allocate more memory simply because the card has more memory available. I have yet to see (unless I am incorrect) a game that loses performance due to a lack of VRAM at 10GB, up to 4K. This could change any day, but for the time being it is what it is. Honestly, I think even 8GB might be adequate up to 4K at the moment, without any notable difference in actual gameplay experience.
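
One way to watch this for yourself while a game runs (a rough sketch; note that nvidia-smi reports allocated framebuffer memory, not what the game actively touches, which is exactly the allocation-vs-utilization gap in question):

# Rough sketch: poll overall VRAM allocation once a second while a game runs.
# nvidia-smi reports *allocated* memory, not true utilization -- which is
# why these numbers tend to grow on cards with more VRAM to spare.
import subprocess, time

def vram_used_mb() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"]
    )
    return int(out.decode().splitlines()[0])

peak = 0
try:
    while True:
        peak = max(peak, vram_used_mb())
        time.sleep(1)
except KeyboardInterrupt:
    print(f"Peak allocation observed: {peak} MB")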


Thoughts unrelated to the quote:
AMD has carved out a niche in the graphics card ecosystem by offering more RAM at lower prices, which helps promote their cards when Nvidia has an overwhelming preference among gamers and DIYers in general. It's been a long time since AMD has been able to compete with Nvidia in brand and performance from the top down, and the sales data continues to demonstrate that. Could it change in the next 3 years? I'd like to see it. Unfortunately, Nvidia has something ridiculous like a 10:1 sell-through on their discrete GPUs. AMD's CPUs with built-in graphics make the gap look smaller in a lot of surveys, but Nvidia is still outselling AMD in discrete GPUs at an astounding clip that most people don't fully grasp.

AMD's black-screen issues with the 5700-series drivers were a lost opportunity to start moving sales in a strictly per-unit metric, and probably cost them 2 years of potential brand rebuilding that could have started to move them in the right direction.

It will be a while before their new mainstream Navi 2 cards get their drivers thoroughly proven, so at the moment the brand is still in a holding pattern from a competitive standpoint.
 
That is a false comparison. The 390 came out almost a year after the 980. The 980 is six years old versus five years for the 390.

Those were the cards available 5 years ago; it's not a false comparison.
 
If you want to gauge VRAM needs 5 years from now, the first place I'd look is 5 years back to see what happened.

Right now, would you rather have an AMD R9 390X with 8GB VRAM or a GTX 980 with 4GB of VRAM?
Keep in mind that 5 years ago the PS4 and XB1 had already been out for 2-3 years. Right now we're in a transitional period between the last gen and the next gen. I'd argue most PC games are console ports, so I would reckon that VRAM utilization is going to increase more significantly in the next 2-3 years than it did in the previous 5.

But once again, I'll reiterate that this is all speculation at this point. The only people who actually have insight on this are game developers already working on next-gen games.
 
First of all, the next gen of GPUs from Nvidia is gonna be 2 years from now. Waiting ±3 months for a 3080ti when it releases and becomes available won't matter, because the 4080ti will be roughly 2 years from whatever that date is. That's the cycle.

Buying a $700 GPU every 2 years is a waste of money for me. The 3080 and 6800xt are beasts at 1440p, especially if you forget about the RT bullshit (everyone who tried it said the same thing: it's useless). So that aside, I really really really don't see any game requiring more than a 3080/6800xt in 2-3 years, again, at 1440p. That said, your limiting factor is more the VRAM than the GPU power itself.

I have kept my cards easily over 3.5 years (in some cases longer), and VRAM was not an issue, because I had 8 GB VRAM in all the cards I had in the past 4+ years.


The other misconception is that more VRAM would also need more GPU power; that's not actually the case. It's almost like saying the more system memory you add, the more CPU power you need: nope. The opposite is what causes issues: too little VRAM, where the GPU ends up waiting on textures to load (since they sit on disk and in RAM once VRAM is full), which causes a huge performance hit. Having too much VRAM won't increase GPU performance, however.

Like I said, in 2022, when more and more games need more and more VRAM, and when the new gen of AMD/Nvidia cards will have lots more VRAM than today's, that will make the 3080 decrease in value. The 3070 is already outdated as of.... now. My 1080 had 8 GB 4 years ago.

Many games already max out 8GB at 1440p today, so the 3070 is not a good purchase IMO.
 
Doom Eternal, CyberPunk, many SIM games (MS Flight Sim), and many open-world games.
MSFS is heavily bound by everything regardless of what you pick right now - but that was true when FSX came out too. We used to expect games that couldn't be maxed out today, only on tomorrow's cards (Crysis, anyone? Hell, Wing Commander 3 - anyone remember 2+ minute loading times?). 10GB of VRAM is fine for right now (and before you assume bias, I own a 6800XT, a 3080, a 3070, a 2080Ti, and a 5700XT, all in the same room I'm in right now); we have no idea what will be "fine" for the future. The 16GB on a 6800 may help - or the card may be too slow for whatever could use the 16GB. Unless you're Nostradamus, all you can buy for is ~right now~, and Doom Eternal already hits 150FPS+ with settings cranked, so I really wouldn't be worried about THAT particular game.
 
I was able to get a Founders 3080 at Best Buy for $699. BEST purchase I've made in the last 10 years, I think, outside of my 1440p G-Sync 144Hz IPS monitor. I play at 1440p and it pretty much just destroys everything. I actually used the weird 3200x2300 super resolution in Doom Eternal at nightmare quality just because I could, and I am almost always above 130fps on my 5800x (usually maxing out at 165fps).

If I have to turn the details down a notch due to VRAM in a few years, so be it. I can always hand it down to my Linux box at 144Hz/1080p and upgrade to a card with more VRAM.
 