Is 10GB of VRAM really enough for 1440p?

Faster VRAM doesn't substitute for VRAM capacity. Why do you think they're adding 20 GB on the 3080 Ti if 10 is enough? Just for giggles? Wasting memory? Marketing as well?
Because folks are asking for it and it will lead to more revenue, regardless of whether it's needed or not. Revenue is king.
 
I was able to get a Founders 3080 at Best Buy for $699. BEST purchase I've made, outside of my 1440p G-Sync 144 Hz IPS monitor, in the last 10 years, I think. I play at 1440p and it pretty much just destroys everything. I actually used the weird 3200x2300 super resolution in Doom Eternal at Nightmare quality just because I could, and I am almost always above 130 fps on my 5800X (usually maxing out at 165 fps).

If I have to turn the details down a notch due to VRAM in a few years, so be it. I can always hand it down to my Linux box running 1080p/144 Hz and upgrade to a card with more VRAM.

That card is practically a unicorn, though. It's definitely one of the hardest, if not the hardest, to come by.
 
First of all, the next gen of GPUs from Nvidia is gonna be 2 years from now. Waiting ±3 months for a 3080 Ti when it releases and becomes available won't matter, because the 4080 Ti will still be about 2 years out from whatever that date is. That's the cycle.

Buying a $700 GPU every 2 years is a waste of money for me. The 3080 and 6800 XT are beasts at 1440p, especially if you forget about the RT bullshit (everyone who tried it said the same thing: it's useless). So that aside, I really, really don't see any game requiring more than a 3080/6800 XT in 2-3 years, again at 1440p. That said, your limiting factor is more the VRAM than the GPU power itself.

I have easily kept my cards over 3.5 years (in some cases longer), and VRAM was not an issue, because I've had 8 GB of VRAM in all the cards I've owned over the past 4+ years.


The other misconception is that more VRAM would also need more GPU power; that's not actually the case. It's almost like saying the more system memory you add, the more CPU power you need: nope. The opposite is what causes issues: too little VRAM, where the GPU ends up waiting on textures to load (since they sit on disk and in system RAM once VRAM is full), which causes a huge performance hit. Having too much VRAM won't increase GPU performance, however.

Like I said, in 2022, when more and more games need more and more VRAM, and when the new generation of AMD / Nvidia cards will carry a lot more VRAM than today's, that will make the 3080 decrease in value. The 3070 is already outdated as of... now. My 1080 had 8 GB 4 years ago.

Many games already max out 8 GB at 1440p today, so the 3070 is not a good purchase IMO.
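
If you want to sanity-check that on your own system, a quick way (assuming an Nvidia card and the standard nvidia-smi tool on the PATH) is to poll memory usage while a game runs. Just keep in mind the number reported is what the game has allocated, which isn't necessarily what it strictly needs:

```python
# Rough sketch: poll VRAM usage once a second while a game is running.
# Assumes an Nvidia GPU with nvidia-smi available on the PATH.
import subprocess
import time

def vram_used_mib(gpu_index=0):
    """Return (used, total) VRAM in MiB as reported by nvidia-smi."""
    out = subprocess.check_output([
        "nvidia-smi",
        f"--id={gpu_index}",
        "--query-gpu=memory.used,memory.total",
        "--format=csv,noheader,nounits",
    ], text=True)
    used, total = (int(x) for x in out.strip().split(","))
    return used, total

if __name__ == "__main__":
    while True:
        used, total = vram_used_mib()
        print(f"{used} / {total} MiB allocated")
        time.sleep(1)
```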
It may be a waste of money to you, but it may not be for others. Corporations make decisions for the market in general, not based on individual users. As for RT - I've used it, it works great, it looks amazing (especially on the 3xxx series), and DLSS (2.0+) is also great. It also performs just fine, especially at 1440p, but also at 4K (Control with RT is awesome). We honestly haven't seen VRAM limitations outside of the extremes so far (and MSFS, as limited as it is, doesn't quite count - the last two MSFS releases were both system killers for extended periods (4-5 years), just because Microsoft aims at the future there).

I keep my cards until they die, most of the time - I've still got a pair of 970s here as backups. They rotate down through family and friends' systems as needed.

As for RAM vs. power - yes, it matters, but you're only looking at one side, while others only look at the other side. BOTH sides matter. If rendering at 4K is what would require more than 10GB of RAM, you have to have the power to actually render AT that resolution at a playable frame rate (assume 30 fps, which works out to 248,832,000 pixels per second) and the RAM to store the needed calculations, etc. At the moment, we hit the GPU horsepower limit before we hit the RAM limit. That may change. It may not. By the time the RAM limit is hit, we may not have the GPU horsepower. Or we may have optimized the GPU pipelines to the point that RAM becomes the limiter.

Graphics are somewhat unique in being insanely time-sensitive - if you don't push those 248 million pixels per second, it looks like shit (and that's a bare minimum). CPU and system RAM? Store it, get to it eventually - the performance difference between NVMe and DRAM is enough to make not having enough RAM significantly impactful to many calculations, but an overabundance of RAM doesn't seem like a waste (you're removing a bottleneck). Given how GPUs work, what was usable a second ago is ancient history - there's no reason to store it; you only need the info for the next frame (simplified). That changes how much RAM you need, how it's utilized, etc.
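
To put the arithmetic behind that pixel-rate figure in one place (assuming 4K = 3840x2160 and 1440p = 2560x1440, which aren't spelled out above):

```python
# Back-of-the-envelope pixel throughput: resolution x refresh rate.
RESOLUTIONS = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

for name, (w, h) in RESOLUTIONS.items():
    for fps in (30, 60, 144):
        pixels_per_second = w * h * fps
        print(f"{name} @ {fps:3d} fps: {pixels_per_second:>13,} pixels/s")

# 4K @ 30 fps works out to 3840 * 2160 * 30 = 248,832,000 pixels/s,
# the "248 million pixels per second" bare-minimum figure above.
```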

RTOS designs (the closest thing I can think of to a GPU pipeline) are sized very specifically for a reason - there's no reason to waste money on resources you can't use, since in general you can't "store something for the future." Obviously there are textures and the like that may be needed and get stored, and I'm simplifying, but the metaphor isn't directly comparable - having extra system RAM may be useful. Having video RAM that never gets touched? A waste.
 
1. Don't buy a card today for tomorrow's requirements and expect it to be anything other than average tomorrow.

2. 10GB of VRAM is ample for 1440p today and the near future.

3. AMD has a bus problem that cripples it under demanding VRAM loads, so having more VRAM isn't a cure-all.

4. The new consoles will change things in 2-3 years' time, once titles drop last-gen support and firmly move into and take advantage of new-gen hardware.
 
That card is practically a unicorn, though. It's definitely one of the hardest, if not the hardest, to come by.
TBH I didn't get it. I was trying to get one in my cart & kept failing to check out. My wife was browsing Best Buy on her Mac for something else, and she was able to get it in her cart & check out, and couldn't understand why I couldn't just do it myself! :eek::ROFLMAO:
 
TBH I didn't get it. I was trying to get one in my cart & kept failing to check out. My wife was browsing Best Buy on her Mac for something else, and she was able to get it in her cart & check out, and couldn't understand why I couldn't just do it myself! :eek::ROFLMAO:
That's how my last 6800 XT purchase was. Click, fight with logins on Newegg, click, PayPal checkout, oh hey, a confirmation number :p
 
I was able to get a Founders 3080 at Best Buy for $699. BEST purchase I've made, outside of my 1440p G-Sync 144 Hz IPS monitor, in the last 10 years, I think. I play at 1440p and it pretty much just destroys everything. I actually used the weird 3200x2300 super resolution in Doom Eternal at Nightmare quality just because I could, and I am almost always above 130 fps on my 5800X (usually maxing out at 165 fps).

If I have to turn the details down a notch due to VRAM in a few years, so be it. I can always hand it down to my Linux box running 1080p/144 Hz and upgrade to a card with more VRAM.

Same. Got a 3080 FE from Best Buy on the first day they were available there. I got lucky... and I very much love my giant metal brick unicorn. :D


First of all, the next gen of GPUs from Nvidia is gonna be 2 years from now. Waiting ±3 months for a 3080 Ti when it releases and becomes available won't matter, because the 4080 Ti will still be about 2 years out from whatever that date is. That's the cycle.

Buying a $700 GPU every 2 years is a waste of money for me. The 3080 and 6800 XT are beasts at 1440p, especially if you forget about the RT bullshit (everyone who tried it said the same thing: it's useless). So that aside, I really, really don't see any game requiring more than a 3080/6800 XT in 2-3 years, again at 1440p. That said, your limiting factor is more the VRAM than the GPU power itself.

I have easily kept my cards over 3.5 years (in some cases longer), and VRAM was not an issue, because I've had 8 GB of VRAM in all the cards I've owned over the past 4+ years.


The other misconception is that more VRAM would also need more GPU power; that's not actually the case. It's almost like saying the more system memory you add, the more CPU power you need: nope. The opposite is what causes issues: too little VRAM, where the GPU ends up waiting on textures to load (since they sit on disk and in system RAM once VRAM is full), which causes a huge performance hit. Having too much VRAM won't increase GPU performance, however.

Like I said, in 2022, when more and more games need more and more VRAM, and when the new generation of AMD / Nvidia cards will carry a lot more VRAM than today's, that will make the 3080 decrease in value. The 3070 is already outdated as of... now. My 1080 had 8 GB 4 years ago.

Many games already max out 8 GB at 1440p today, so the 3070 is not a good purchase IMO.

Cyberpunk 2077 says that RT is not useless. In fact, it's absolutely amazing. Everyone who tried it did not say the same thing, because I've tried it and I love it. Totally not useless. It might be useless on AMD cards because it currently isn't enabled there, and even once it is, it will be torturously slow since AMD doesn't have a DLSS competitor, especially considering how much faster Ampere/Turing are at ray tracing than Big Navi. Throw in DLSS for good measure and it's game over; Big Navi doesn't stand a chance in Cyberpunk 2077.

Honestly, the way you talk, it feels like you're salty and trying to justify AMD for this generation, so I will say it again: if you're not interested in RT and play at 1440p or below, the RX 6800 XT is, for the most part, a better card. But to say that it will last longer because it has more VRAM is laughable. Turing and Ampere both have a lot more forward-looking tech than Big Navi, so I would dare say that Nvidia's solutions from the past couple of years will last longer... but no one knows. Please pass me your crystal ball so I can see into the future as well.
 
It may not be what you want to hear, but you're still getting flooded with the same advice. I'm all for more VRAM, but not if the value isn't there.

There's no value and no meaningful performance reason, especially at 1440p, in paying a heavy premium for a 20GB card that isn't even out yet.

If it was going to be only $50 more... then sure. But it's going to be $200 to $300 more.

The argument about time frames is perfectly valid. It's still just a 3080 with more VRAM. There's no indication the cards will be some kind of performance refresh that makes a big difference in speed. It's not going to be a "Super" card or something that will stretch its legs on raw performance longevity. At least we haven't heard of that yet. If it ends up being 10-15% faster or something on top of having 20GB (which it won't, or it would eviscerate the 3090), we can revisit this. But if it's essentially the same speed? That makes it simple.

Now, if you are a content creator or graphics professional? The 3080 20GB may be a real game changer for certain people.
 
Those were the cards available 5 years ago; it's not a false comparison.
Guess I should have said bad comparison. If someone purchased a 980 on release day, they sure didn't have a 390 to select instead.

However, I'll go with it, since the 980 was still on sale the following year.
 
I do not have a crystal ball, but I can look at VRAM trends over the years. Requirements have been going up: today 8GB shows limitations in some games, particularly at 4K, while 10GB seems to be OK. To me there is just not much wiggle room from 8GB to 10GB, and even today, with some fringe settings/games, 10GB shows some limitations, though nothing realistically limiting. I buy cards to play today's games and tomorrow's games. I also keep cards for a very long time, and the 3080 was too questionable for me for more than 2 years of use; I normally keep my cards 3-5 years. If in doubt, go with the card with more VRAM - I would definitely recommend the 6800 XT over a 3080. As for RT and DLSS, look at the games that are out; if you are going to mostly play those, or know of future games that use them, that could give some merit to the 3080.

As for AMD and RT, cough cough: if console games use RT, those games coming to the PC will most likely have RT, and if it can run on the console, it will most likely run better on a PC with a 67xx-or-up AMD Radeon video card. Using games designed around Nvidia RTX to judge AMD RT performance is most likely very misleading.
 
Not in the games you listed.

Tell me about a game where 10GB VRAM is the bottleneck, so much so that an RX 6800 outperforms the RTX 3080 at 4K. Go ahead. I'll wait.
Exactly. The games that have been cited as using more than 10 GB of VRAM at 4K perform better on a 3080 than they do on a 6800 XT. If they actually were running into not having enough VRAM, that would NOT be the case.

Instead, what we see is that the 6800 cards overperform at 1080p-1440p and underperform against the 3080 at 4K. If the 3080 did not have enough VRAM, the opposite would be observed.

In practice, VRAM size is more a function of the bus width needed for bandwidth (384 vs. 256 vs. 192 bit) than of the size actually needed for the frame buffer.
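
To spell out why (this is generic GDDR6/GDDR6X layout math, not something stated in this thread): each memory chip occupies a 32-bit channel, so the bus width fixes the chip count, and the chip densities currently shipping (1 GB or 2 GB per chip) then fix which capacities are even possible:

```python
# Why bus width, not frame buffer size, drives the VRAM capacities you see:
# each GDDR6/GDDR6X chip sits on a 32-bit channel, so bus width fixes the
# chip count; 1 GB and 2 GB chip densities then fix the capacity options.
# (Clamshell mode can double the chip count, but the options come out the same.)
CHANNEL_BITS = 32
CHIP_SIZES_GB = (1, 2)

for bus_bits in (192, 256, 320, 384):
    chips = bus_bits // CHANNEL_BITS
    small, large = (chips * size for size in CHIP_SIZES_GB)
    print(f"{bus_bits}-bit bus -> {chips} channels -> {small} GB or {large} GB")

# 320-bit -> 10 GB or 20 GB  (3080 / the rumored 20GB variant)
# 256-bit ->  8 GB or 16 GB  (3070 / 6800 XT)
# 384-bit -> 12 GB or 24 GB  (3090)
```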
 
If the countless articles on this subject are not enough to come to a personal decision, this is an unresolvable argument. If you are the kind of person who wants 16+ GB of VRAM and/or "8+ cores to match consoles", manufacturers will be happy to sell all this stuff to you. It's your money. Go be you.
 
It might make more sense than you think to upgrade every 2 years. With the way the resale market has been for the past few years, you can sometimes break even after an upgrade. Sometimes you even come out ahead, believe it or not.
 
If you want to gauge VRAM 5 years from now, the first place I'd look is 5 years ago and see what happened.

Right now, would you rather have an AMD R9 390X with 8GB VRAM or a GTX 980 with 4GB of VRAM?

Whichever one has better driver support for modern games.

In both cases, they are likely limited to around 1080p high where even 4 GB is plenty.
 
The "what about vram requirements 4 years from now" argument continues to fall on its face.

GPU requirements always seem to outpace vram requirements.

Not a high end source, but look at the vram usage for modern games using playable, keyword PLAYABLE, settings using the older cards. 4 GB looks to be enough!

 
Like I said, I am not planning to keep a card for only 2 years. Why buy a powerful card like the 3080 or 6800 XT just to get rid of it in 2 years? Well, if that were the case, then yes, most cards sold today would be OK to keep for 2 years, even the 8GB on the 3070, which is a joke of a card, but whatever.

16GB of VRAM is overkill, regardless of whether it's on an AMD or Nvidia card... I think 12-14GB is plenty, though I still believe 10GB is on the short side of things. It's not bad, but not great either. And from my research it seems VRAM management really depends on the game itself: some game engines manage all of this properly, some really don't, and that's where VRAM becomes an issue. CP77 is an example of a very big and famous game that does NOT manage VRAM properly, or that pushes VRAM to its limits.

RT and 4K both require more VRAM, and people want better and better graphics, not the opposite. Better graphics means bigger assets, and therefore more VRAM requirements.

For those who say that the AMD card cannot handle its own 16 GB of VRAM: that's false. The GPU never uses all 16 GB at once, only certain blocks of it at a time, and GDDR6 is plenty fast for that.

Just pause and think about it: would 2 GB of VRAM be enough even on a theoretical GDDR8X? No, because the amount of graphics data needed is much bigger, even though the memory is super fast.
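
As a rough illustration of why bigger assets eat VRAM so quickly (generic block-compression math, not numbers from any particular game):

```python
# Rough VRAM footprint of a single texture, with a full mip chain (~+33%).
# Sizes are bits per texel: uncompressed RGBA8 vs. common BC formats.
BITS_PER_TEXEL = {
    "RGBA8 (uncompressed)": 32,
    "BC1": 4,
    "BC7": 8,
}
MIP_CHAIN_FACTOR = 4 / 3   # a full mip chain adds roughly one third

def texture_mib(width, height, bits_per_texel):
    base_bytes = width * height * bits_per_texel / 8
    return base_bytes * MIP_CHAIN_FACTOR / (1024 ** 2)

for fmt, bits in BITS_PER_TEXEL.items():
    print(f"4096x4096 {fmt}: ~{texture_mib(4096, 4096, bits):.0f} MiB")

# A few hundred unique 4K textures in BC7 is already several GB before you
# count render targets, geometry, and any RT acceleration structures.
```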
 
The 6900 XT performs pretty poorly at 4K compared to the Nvidia equivalents because of its bus limitations, though. Nvidia is not stupid; if they thought 10GB was going to be a limit within the lifecycle of the card (roughly 2 years), they never would have specced it the way they did. Nvidia is out to sell more cards each generation, not to provide cards that last 4+ years at max graphics - that is simply not possible anyway. The GPU will be the limiting factor long before the VRAM will be.

As for upgrading, that's always user-dependent. I tend to build the best PC (mobo/CPU/RAM) I can at a given time, so I can then milk that PC through several GPU generations. :) Still on my 5960X rocking an RTX 3090 at 4K, and I have zero issues with being CPU-limited. Been through 770s, 1080s, a 2080 Ti, and now a 3090.
 
Like I said, I am not planning to keep a card for only 2 years. Why buy a powerful card like the 3080 or 6800 XT just to get rid of it in 2 years? Well, if that were the case, then yes, most cards sold today would be OK to keep for 2 years, even the 8GB on the 3070, which is a joke of a card, but whatever.
Because there's a better card available in 2 years? As for the 3070, it's as fast as a 2080 Ti right now, for every use case I can think of or have tried (I have one), and I've never seen it come close to a VRAM limitation in the games I play on it. It works far better than the 1060 we had in use before. The price is good for what you get, too. Buying for the future is insane; we have no idea what the future holds or will require. In 2 years, there'll be a 4070 that will (likely) be as good as the 3080/3090 of today, built for the next couple of years of games. That's why PC gaming exists - it's easier to keep up with the bleeding edge and push the boundaries of what is possible, vs. consoles that are stuck at their configuration "forever."
16GB of VRAM is overkill, regardless of whether it's on an AMD or Nvidia card... I think 12-14GB is plenty, though I still believe 10GB is on the short side of things. It's not bad, but not great either. And from my research it seems VRAM management really depends on the game itself: some game engines manage all of this properly, some really don't, and that's where VRAM becomes an issue. CP77 is an example of a very big and famous game that does NOT manage VRAM properly, or that pushes VRAM to its limits.
We simply do not know if this is true or not. You hit on the right point in your second part, though - this is all dependent on game design. I don't play CP77 yet, as I ALWAYS give CDPR games a good 3 months of patches before I play them (patches, yo!), but it's also something that can be patched/adjusted/etc. after the fact.
RT and 4K both require more VRAM, and people want better and better graphics, not the opposite. Better graphics means bigger assets, and therefore more VRAM requirements.
Maybe, maybe not. Depends on how the game is designed, the engine is designed, etc.
For those who say that the AMD card cannot handle its own 16 GB of VRAM: that's false. The GPU never uses all 16 GB at once, only certain blocks of it at a time, and GDDR6 is plenty fast for that.
Huh?
Just pause and think about it: would 2 GB of VRAM be enough even on a theoretical GDDR8X? No, because the amount of graphics data needed is much bigger, even though the memory is super fast.
It depends on how the engine is designed and how the game is designed. Could you design around that limitation? Absolutely. Would it be a pain? Absolutely. Look up 64k demos some day - there are all sorts of REALLY creative ways to use limited space and resources, if someone has enough of a need to. I can think of ways to use highly compressed textures loaded into system RAM and transferred / decompressed into video RAM as needed - you'd just have to design your levels around streaming those textures to video RAM, and procedurally generate various things, which isn't hard, just... different.
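
Something like this toy sketch is the shape of what I mean - not any real engine's API, just the idea: keep compressed textures in system RAM, upload into a fixed VRAM budget on demand, and evict the least-recently-used ones when you go over:

```python
# Toy sketch of the idea above: a fixed VRAM budget, textures kept compressed
# in system RAM, uploaded on demand, least-recently-used ones evicted first.
# Purely illustrative - not any real engine's streaming system.
from collections import OrderedDict

class TextureStreamer:
    def __init__(self, vram_budget_mib):
        self.budget = vram_budget_mib
        self.resident = OrderedDict()   # name -> size in MiB, kept in LRU order

    def request(self, name, size_mib):
        """Make a texture resident for this frame, evicting LRU ones if needed."""
        if name in self.resident:
            self.resident.move_to_end(name)        # mark as recently used
            return
        while self.resident and sum(self.resident.values()) + size_mib > self.budget:
            evicted, freed = self.resident.popitem(last=False)
            print(f"evict {evicted} ({freed} MiB)")
        # "upload": decompress from system RAM into VRAM (stubbed out here)
        self.resident[name] = size_mib
        print(f"upload {name} ({size_mib} MiB)")

streamer = TextureStreamer(vram_budget_mib=64)
for frame_textures in (["rock", "grass"], ["rock", "road"], ["building"]):
    for tex in frame_textures:
        streamer.request(tex, size_mib=24)
```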
 
Because there's a better card available in 2 years? As for the 3070, it's as fast as a 2080 Ti right now, for every use case I can think of or have tried (I have one), and I've never seen it come close to a VRAM limitation in the games I play on it. It works far better than the 1060 we had in use before. The price is good for what you get, too. Buying for the future is insane; we have no idea what the future holds or will require. In 2 years, there'll be a 4070 that will (likely) be as good as the 3080/3090 of today, built for the next couple of years of games. That's why PC gaming exists - it's easier to keep up with the bleeding edge and push the boundaries of what is possible, vs. consoles that are stuck at their configuration "forever."

We simply do not know if this is true or not. You hit on the right point in your second part, though - this is all dependent on game design. I don't play CP77 yet, as I ALWAYS give CDPR games a good 3 months of patches before I play them (patches, yo!), but it's also something that can be patched/adjusted/etc. after the fact.

Maybe, maybe not. Depends on how the game is designed, the engine is designed, etc.

Huh?

It depends on how the engine is designed and how the game is designed. Could you design around that limitation? Absolutely. Would it be a pain? Absolutely. Look up 64k demos some day - there are all sorts of REALLY creative ways to use limited space and resources, if someone has enough of a need to. I can think of ways to use highly compressed textures loaded into system RAM and transferred / decompressed into video RAM as needed - you'd just have to design your levels around streaming those textures to video RAM, and procedurally generate various things, which isn't hard, just... different.
RT and 4K are known to require more VRAM; it's not a question of maybe, it's already a fact, based on many games, cards, and game engines.

Here are some VRAM usage stats:

https://www.guru3d.com/articles_pag..._graphics_performance_benchmark_review,9.html

https://www.guru3d.com/articles_pag..._graphics_performance_benchmark_review,4.html

https://www.guru3d.com/articles_pages/cyberpunk_2077_pc_graphics_perf_benchmark_review,4.html
 
I'm not saying they don't - I'm saying that we don't know whether it's going to be a limitation or not. There are LOTS of ways to manage resources. I'm arguing that by the time those become a limit, there will be new hardware out anyway.
 
Believe it or not, I am not 100% decided, but I'm leaning toward the 3080 Ti / 6800 XT. Right now I would give the 3080 a 40% chance of being my purchase, and 80% for the 3080 Ti / 6800 XT.
Ladies and gentlemen, your first case study.

Seriously, don't buy a GPU today for tomorrow and expect anything more than average performance tomorrow. Beyond that: buy the best you can afford today.

10GB is perfectly fine for the next few years, but I went with the 6900 XT for various reasons, one being I want to nerd out with Rage Mode.
 
OK, like I said, some people want to or can afford to grab a high-end GPU every 2 years; some simply don't want to, or cannot. Even though I can afford it, I still won't do it. My GPU has to last 4 years at a minimum, up to 5 years.

I don't like the idea of swapping a GPU every couple of years, dealing with selling the old one, waiting for the new one, and so on.

When I buy a GPU I expect it to last more than 2 years; we are not leasing a car for 2 years here.
 
OK, like I said, some people want to or can afford to grab a high-end GPU every 2 years; some simply don't want to, or cannot. Even though I can afford it, I still won't do it. My GPU has to last 4 years at a minimum, up to 5 years.

I don't like the idea of swapping a GPU every couple of years, dealing with selling the old one, waiting for the new one, and so on.

When I buy a GPU I expect it to last more than 2 years; we are not leasing a car for 2 years here.

I still run a 970 in a lower-end system; the next system up has my old Titan Xp, which will take over from the 970 as I install my 2080 Ti in that system; and my main rig has a 6900 XT, so I have experience across generations. I typically couple this with decreasing monitor resolutions so I have less of a problem: 4K, then 3440x1440, then straight 1440p, and lastly 1080p for my old plasma TV.

I know what I'm saying: you can absolutely game on older hardware, but don't expect to keep top-end performance, and don't worry about VRAM usage (because you won't be pushing top-end IQ settings tomorrow on today's hardware anyway, so compromise is the name of the game). Buy the best you can today, and just live with the fact that you will see diminishing returns over the years - but I HIGHLY doubt those diminishing returns will be due to VRAM limitations, rather than the overall raster performance demand on the GPU.

RTX is right out until at least next generation, but I personally hate DLSS. However, DLSS does give your GPU extended legs by spoofing a higher resolution through upscaling, so if you're very worried about future-proofing, then Nvidia is probably the way to go.

Also, I forgot you're at 1440p, not 4K, so yes, VRAM won't be a problem at 1440p for many years.
 
Pretty much that. The cards flow down to other systems or other people as needed. Two years from now the card will be slow for new games no matter what; that's how it works. If you want something to last 5 years without changing, that's where consoles are. This last generation was the one real oddity, where the 1080s still did fine for a very long time.
 
I think that's because the 1080 came out once the PS4 and XBONE had established the baseline of performance (about mid-generation for the consoles, once games solidly came out for them without a thought for the PS3 and 360); at that point most game design is bound by console restrictions, and PCs leap ahead and stay viable for longer. If true, that would make this current generation of GPUs probably one of the worst to jump on - definitely one of the worst when you consider the current supply and other issues.
 
I was all aboard that "too little VRAM" train for a while... for 4K.

For 1440p? This thread is silly. For one, Ampere scales performance down to 1440p poorly, as its architecture is more aimed at 4K. You're likely to encounter other performance issues at 1440p in the future long before VRAM is any issue.

Secondly, if I'm being honest, the last time I actually had issues with running out of VRAM - where it actually ran out - was a 2GB GTX 670 in 2012.

That said... I still say 10GB for the 3080 was dumb, only because it regresses from the 11GB 2080 Ti. And 8GB for the 3070 was dumb because it makes three generations of 70-class cards on 8GB. At 4K I'd have concerns, but at 1440p any of these cards should be fine.
 
I was all aboard that "too little VRAM" train for a while... for 4K.

For 1440p? This thread is silly. For one, Ampere scales performance down to 1440p poorly, as its architecture is more aimed at 4K. You're likely to encounter other performance issues at 1440p in the future long before VRAM is any issue.

Secondly, if I'm being honest, the last time I actually had issues with running out of VRAM - where it actually ran out - was a 2GB GTX 670 in 2012.

That said... I still say 10GB for the 3080 was dumb, only because it regresses from the 11GB 2080 Ti. And 8GB for the 3070 was dumb because it makes three generations of 70-class cards on 8GB. At 4K I'd have concerns, but at 1440p any of these cards should be fine.
Like I said, the difference in VRAM usage between 4K and 1440p is not as big as you might think, and definitely not like the jump from 1080p to 1440p, for example. 1440p is still heavy on VRAM.

The 3070 uses GDDR6 and 8 GB, which by today's standards is DOA from day 1.

12 GB on the 3080 would have been perfectly fine IMO.
 
Nvidia looks bad with the 10GB. AMD made them look even more stupid. Hence: look over here, guys, 16GB and 20GB cards incoming!
 
That said... I still say 10GB for the 3080 was dumb, only because it regresses from the 11GB 2080 Ti.
Isn't it pretty standard for Nvidia to gimp the non-Ti models, though? The 1080 Ti was 11GB and the 2080 was 8GB. At least the gap is closer between the 2080 Ti and 3080. Still dumb though, I agree.
 
Isn't it pretty standard for Nvidia to gimp the non-Ti models, though? The 1080 Ti was 11GB and the 2080 was 8GB. At least the gap is closer between the 2080 Ti and 3080. Still dumb though, I agree.

This was all about the magical $699 price point. Less memory means more margin on a tight-margin product. If the 3080 were $999, you'd have more memory from the get-go.
 
This was all about the magical $699 price point. Less memory means more margin on a tight-margin product. If the 3080 were $999, you'd have more memory from the get-go.
LOL, NVIDIA working with tight margins???
 
On the FE card...absolutely.

Or I should say, tighter than they are used to.
Yeah, no... not even close. When AMD can produce a similar card with nearly double the VRAM for less, that doesn't look like a tight margin, even without knowing the actual margins.
 
I think this thread is moot and needs to be archived for 6 months, because the most GPU you can buy right now without paying scalpers is a 1650 4GB...
 