The Slowing Growth of VRAM in Games

How much RAM does id's MegaTexture technology chew up? I don't remember the Rage engine getting cited for memory issues.

Rage was kinda awful at the time. I remember very blurry textures all over the place, and tons of streaming updates during play which were jarring.

Later incarnations don't seem to suffer from this. Probably a combination of better management and artistry, and just a lot more RAM overall.
 
The target for medium quality is 4GB of VRAM, so the era of going back a few GPU generations is over.

That's the low end.

The problem is whether 8GB should be "high" and 11GB+ "ultra"; the installed base of top-shelf cards above 8GB isn't nearly as deep as it is at 8GB.

I think game builds should dial back.
 
I mod Bethesda (and other) games to the point - and past the point (Fallout 4) - of breaking, just to see what the limits are. Using ENB I've modded both original Skyrim and SSE to the point where each slightly passed 11GB of VRAM usage: Win7 DX9 for Oldrim, Win10 DX11 for SSE.

As far as I'm aware, Win10 DX9 is limited to 4GB VRAM, unless MS has fixed it.
 
I just started playing GTA V, and at 1080p Ultra the game eats over 4GB of VRAM on an RX 570 8GB.
 
The RX 570 is paired with a 3700X running at stock that I've been testing out. It plays so smoothly and quietly that I didn't miss my RX 580 or RX 5700 at these settings @ 60Hz.

 
So HBCC is a better allocation of dynamic ram? It looks like the GTX was able to 'save' itself with dynamic ram since system ram usage was higher, but this may be misleading since we don't know what the actual minimums are on those settings. It is still sort of a moot point since you would not run a GTX at 1440p ultra on that game.
It shares system RAM alongside the VRAM, which might improve performance in some games, given how some games cache assets/textures in memory. I would also assume it performs better than without HBCC when hitting the 8GB/16GB limit.
 
A few more games added to the vram chart. Still nothing over the top as far as vram requirements.
[attached chart: vram 2.png]


The RX 590 got less than 60 fps in Wolcen, Jedi: Fallen Order, and RDR2 at 1440p. Since that card is close in performance to the Fury X, the once "VRAM-deprived" card still looks good in today's games at playable settings.
 
I'm betting that VRAM-usage will make another large jump once the next-generation consoles are released, assuming they have more than 8GB of unified-memory. :D
 
That chart of VRAM usage in games looks like it was done on an 8GB card. If a GPU has more VRAM, some games may allocate more (not need, just allocate). A point I thought necessary to make, since many people don't know the difference between need and allocate.
 
That chart of VRAM usage in games looks like it was done on an 8GB card. If a GPU has more VRAM, some games may allocate more (not need, just allocate). A point I thought necessary to make, since many people don't know the difference between need and allocate.

Most games were tested with a 2080 Ti. COD games allocate over 9GB on the same test.
 
This factor often amuses me. Those mods are optimizing in many cases for screenshots - not gameplay. In motion it matters far less.

There are serious diminishing returns on texture size past a point, and this varies per texture and its usage. Once you get past "my god, look at these pixels", it's hard to tell, and you need to flip between before/after to see it at all.

Now yes, sometimes in games with lots of textures they clamp the budget per frame too low and mods can help. But overall, I find most mods to be extreme overkill, just to say "4k texpack!". I don't need 4k on a brick on a wall seen in passing.

This - sooooo this. Oftentimes the "4k textures" are poorly done and minimally upgraded over the low-res stock images. While moving, it's almost not even noticeable compared to vanilla, especially in almost anything I've downloaded from Nexus. Oftentimes you get a bigger noticeable visual benefit from the most basic shader mods out there at no cost to the system.

The one exception that I still think was the greatest texture overhaul ever created was Qarl's texture pack for the PC version of Oblivion. He took PC graphics way beyond what anyone else was doing at the time and created something truly breathtaking, both stationary and during gameplay. If anyone has Oblivion but never tried his graphics overhaul, do yourself a favor and see what a true graphics update can look like in the hands of a real talent. This was back when PC graphics were like crack cocaine and people were losing their minds over Crysis. Nowadays everything has more or less hit a plateau from current-generation consoles, and won't really take off again until the next console gen. Yes, I know there are some great-looking games like Star Citizen that are pushing the limits today, but not everyone is into that particular game right now, or the drama surrounding it.

god i'm gonna have to go reinstall oblivion again now. . .
 
I'm betting that VRAM-usage will make another large jump once the next-generation consoles are released, assuming they have more than 8GB of unified-memory. :D

I've heard this argument before, but I am still not sold on it. If that were the case, why not build games designed around the 2GB DDR3 Wii U? Even if the PS4 and One X are the primary targets, they have done some amazing ports over to the Wii U. It would be even easier to target 8GB VRAM GPUs and port them over to the PS4/One X.

I guess we will finally see once Doom: Eternal and Cyberpunk are released, as those look to be developed with the new consoles in mind.
 
It would be even easier to target 8GB VRAM GPUs and port them over to the PS4/One X.
The consoles do not have 8GB of VRAM; they have 8-12GB (depending on the console/model) of unified RAM, of which 2-3GB is reserved for the OS itself, and the rest is "shared" between the game data and VRAM sections.
There was a large bump in VRAM usage back in late 2013 and early 2014 when we moved from the PS3 (256MB VRAM) and 360 (512MB of shared memory), and GPUs started moving from 1GB-2GB of VRAM to 3-8GB of VRAM within 1-2 years.

This will most likely happen again, assuming the next-gen consoles have more than 8GB of unified-memory to work with.
I'm guessing it will be 12-16GB of unified-memory, but that's just my personal speculation, not a fact.

Remember, too, that DOOM: Eternal and Cyberpunk 2077 are being made with this generation's consoles in mind, and we most likely won't see the boost until after the next-gen consoles are released, which for the PS5 will be ~December 2020.
So we will probably know around 2021 whether or not this is true, officially anyway.
 
A big question mark for next gen concerns the current rumors about how much they'll leverage SSDs for more direct asset streaming.

A potential problem for the PC is that it might not really be possible to have equivalency in that area, and that will result in the need to allocate more into memory, in this case VRAM, as compensation.

I've heard this argument before, but I am still not sold on it. If that were the case, why not build games designed around the 2GB DDR3 Wii U? Even if the PS4 and One X are the primary targets, they have done some amazing ports over to the Wii U. It would be even easier to target 8GB VRAM GPUs and port them over to the PS4/One X.

I guess we will finally see once Doom: Eternal and Cyberpunk are released, as those look to be developed with the new consoles in mind.

Wii U ports don't look the same.

With PC games, when people on forums such as these discuss "requirements", they aren't really looking at minimum or even "recommended" requirements, or console-equivalent settings. Colloquially, the term for most refers to max settings, minus hardware (vendor-specific) settings and minus anti-aliasing that is expensive on deferred renderers (e.g. MSAA, but not SMAA). In practice this tends to inflate requirements over those of the console, as the settings are higher.

That chart of VRAM usage in games looks like it was done on an 8GB card. If a GPU has more VRAM, some games may allocate more (not need, just allocate). A point I thought necessary to make, since many people don't know the difference between need and allocate.

It's even more complex than that. VRAM usage, both allocation and actual performance impact, can vary scene to scene in a game and depend on factors such as how long the game has been running. The OP, for example, shows <6GB VRAM allocation for Red Dead Redemption 2 with Vulkan even at 4K, but if we look at TechSpot's recent test, the 6GB cards from both vendors (which eliminates a potential vendor-specific driver issue) show a lot more pressure on 1% lows, likely due to lack of VRAM, even at 1080p -

https://static.techspot.com/articles-info/1982/bench/1RDR2.png
Guru3D also has different allocation numbers - https://www.guru3d.com/articles-pag...-graphics-performance-benchmark-review,4.html

Which differs from TPU - https://www.techpowerup.com/review/red-dead-redemption-2-benchmark-test-performance-analysis/4.html

Which differs from gamegpu - https://gamegpu.com/action-/-fps-/-tps/red-dead-redemption-2-test-gpu-cpu

Anecdotally (since I don't have the numbers anymore), I remember an interesting issue with determining how much VRAM a game actually needed when the last console releases spiked requirements. Ryse: Son of Rome was a level-based game (open-world ones throw even more issues in), and the first roughly half of the game (levels) used significantly less VRAM than several levels in the last half. As in, what wouldn't stutter in something like the first >50% of the game's levels would stutter in at least >25% of them. So if you only tested based on a level in the first half, you'd get a very different result.

Even worse, some games, regardless of your texture settings, will silently manage it in the background depending on how much VRAM is detected.
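If you want to see that drift for yourself, the easiest way is to log allocation over a whole play session rather than a single benchmark pass. Here is a minimal sketch that polls nvidia-smi once a second (it assumes an Nvidia card with nvidia-smi on the PATH; the interval and file name are arbitrary choices of mine):

```python
# Log VRAM allocation once per second so scene-to-scene and over-time
# drift becomes visible. Stop with Ctrl+C.
import csv
import subprocess
import time

with open("vram_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "vram_used_mib"])
    start = time.time()
    while True:
        used = subprocess.run(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        writer.writerow([round(time.time() - start, 1), used])
        f.flush()
        time.sleep(1)
```

Graph the CSV afterwards and the level-to-level jumps (and the slow creep the longer the game runs) are obvious.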
 
Most games were tested with a 2080 Ti. COD games allocate over 9GB on the same test.
OK. The point was that many games dynamically allocate VRAM according to the capacity available. COD = 9GB with a 2080 Ti is fully understandable.
 
It's even more complex than that. VRAM usage, both allocation and actual performance impact, can vary scene to scene in a game and depend on factors such as how long the game has been running. The OP, for example, shows <6GB VRAM allocation for Red Dead Redemption 2 with Vulkan even at 4K, but if we look at TechSpot's recent test, the 6GB cards from both vendors (which eliminates a potential vendor-specific driver issue) show a lot more pressure on 1% lows, likely due to lack of VRAM, even at 1080p -

https://static.techspot.com/articles-info/1982/bench/1RDR2.png
Guru3D also has different allocation numbers - https://www.guru3d.com/articles-pag...-graphics-performance-benchmark-review,4.html

Which differs from TPU - https://www.techpowerup.com/review/red-dead-redemption-2-benchmark-test-performance-analysis/4.html

Which differs from gamegpu - https://gamegpu.com/action-/-fps-/-tps/red-dead-redemption-2-test-gpu-cpu

Anecdotally (since I don't have the numbers anymore), I remember an interesting issue with determining how much VRAM a game actually needed when the last console releases spiked requirements. Ryse: Son of Rome was a level-based game (open-world ones throw even more issues in), and the first roughly half of the game (levels) used significantly less VRAM than several levels in the last half. As in, what wouldn't stutter in something like the first >50% of the game's levels would stutter in at least >25% of them. So if you only tested based on a level in the first half, you'd get a very different result.

Even worse, some games, regardless of your texture settings, will silently manage it in the background depending on how much VRAM is detected.
There is also the potential mistake of assuming poor performance or stutters are due to lower VRAM when other factors may be at play. Some mid-range or weaker cards can run out of GPU power before the lesser VRAM has an impact. A good way to be sure is to test identical cards in 4GB and 8GB versions. I think good game developers have ways to mitigate the impact of lower VRAM, or at least warn you of potential problems (e.g., ROTR). The only game I've seen where lower VRAM impacted my performance is Skyrim with tons of high-res texture packs. The developers apparently did not take into account the insane texture mods that would one day be used with the game.
 
New VRAM hog, and once again it is Wolfenstein. This time it is Youngblood + DLSS + RTX.
https://www.tweaktown.com/articles/...ed-nvidia-dlss-ray-tracing-tested/index4.html
In this case, 6GB falls short even on 1080p performance, though 8GB seems to be enough all the way through 4K quality.

This makes sense, as TPU measured 5.5GB at 1080p and 6.3GB at 4K, and performance seemed to reflect this. That was with no DLSS or RTX.

RTX seems to use about 1GB across all resolutions, going by the few samples available, and DLSS seems to use only 250MB as seen in Metro, though that was DLSS 1.0.

Given similar numbers, Youngblood with all features will need about 6.8 GB at 1080p and 7.6 GB at 4k. Rather incredible!
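(Roughly: 5.5 GB + ~1 GB for RTX + ~0.25 GB for DLSS ≈ 6.8 GB at 1080p, and 6.3 + 1 + 0.25 ≈ 7.6 GB at 4K, assuming those overheads simply stack on top of the base usage.)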

RTX 2060 6GB users will most likely still be able to run with DLSS and no RTX, or RTX without uber textures. Definitely worth it, as the "DLSS 2.0" gives a free 30% performance boost in many cases with comparable visuals.
 
I've heard this argument before, but I am still not sold on it. If that were the case, why not build games designed around the 2GB DDR3 Wii U? Even if the PS4 and One X are the primary targets, they have done some amazing ports over to the Wii U. It would be even easier to target 8GB VRAM GPUs and port them over to the PS4/One X.

I guess we will finally see once Doom: Eternal and Cyberpunk are released, as those look to be developed with the new consoles in mind.
Because the market share of Xbox and PlayStation is much higher than that of a Nintendo Wii. They are also the platforms that provide the best possible performance; developers are trying to push better graphics and better performance (at higher frame rates), not wider compatibility across platforms.
 
It is my belief that VR will push the standard from 8GB to 16GB way before 8K TVs will. In my opinion, 4K TVs will have a long run like 1080p did, and 1440p will remain good enough for a great many people on monitors. That leaves VR to raise the bar earlier. Note that texture space needs to be quite large for VR, because you are often able to get much closer to textures while in VR, and thus the textures need to be more detailed.
 
All I know is Final Fantasy 15 eats and eats and eats. When I max that game and run the native FFXV resource viewer the VRAM just vanishes
 
Doom Eternal results and not as bad as I actually expected:
https://www.techpowerup.com/review/doom-eternal-benchmark-test-performance-analysis/4.html

The slope came to 188.5 MB per megapixel and the intercept to 6806 MB, meaning every resolution theoretically uses over 7GB, but it really doesn't seem to need even that. Perhaps the id Tech engine uses dynamic memory more effectively now. 6GB seems to be enough even at Ultra Nightmare.
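As a sanity check on that fit: at 1080p (about 2.07 MP) it works out to roughly 188.5 × 2.07 + 6806 ≈ 7,200 MB, and at 4K (8.29 MP) to roughly 188.5 × 8.29 + 6806 ≈ 8,370 MB.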

Nvidia continues to show a more efficient memory controller. The 6 GB 2060 stays 5% ahead of the 8 GB 1080 at all resolutions, while the 6 GB 5600XT starts to fall behind the Vega 64 at higher resolutions. The 1650 Super also demolishes the 4 GB 5500xt.

Both next-gen consoles are maxing out at 16GB of memory, which is only 2x that of last gen's hardware. Developers apparently saw a similar trend in game requirements. (Older console gens had an 8x increase over prior generations.)
 
Well, it looks like Doom Eternal does actually need the most VRAM yet.
HUB took a loss on their Doom Eternal video due to some issues, so I figured they deserve some love for their transparency in pulling the video:

The silver lining is that Steve discovered that the "P trick" doesn't actually work; the game most likely adjusts the texture pool size to compensate for lower-VRAM cards despite still reporting "Ultra Nightmare" textures. Here is a comparison between the HUB results and the TPU results. Focus on where the 5600 XT sits relative to the RTX 2060:
[attached: screenshots comparing the HUB and TPU Doom Eternal results]


Just as 3GB Nvidia = 4GB AMD, it seems that 6GB Nvidia = 8GB AMD, as the 6GB 2060 doesn't seem to have an issue even without the "P trick".

Many rightfully claim that there is not much visual difference among the highest settings, but many other sites' results are still erroneous, and Doom: Eternal is the new VRAM boss with everything cranked up.
 
Ah yes, 8.29 megapixels. The widely more recognized standard for 3840x2160. Glad you chose the most effective way to communicate resolution, well done.

But really, nice work. Still, yeah, pixel resolution is objectively the better choice.

More games above 8192MB than I thought. 3070 in trouble?
 
Ah yes, 8.29 megapixels. The widely more recognized standard for 3840x2160. Glad you chose the most effective way to communicate resolution, well done.

But really, nice work. Still, yeah, pixel resolution is objectively the better choice.

I had to use pixel count for graphing purposes, in order to calculate VRAM scaling more easily.
 
Ah yes, 8.29 megapixels. The widely more recognized standard for 3840x2160. Glad you chose the most effective way to communicate resolution, well done.
He did that not because of the "standard" resolution of 3840x2160, but because other ultrawide resolutions can be similar in megapixels without being that exact resolution.
I actually prefer this, because not everyone uses a 16:9 or 16:10 monitor, especially in this era, and megapixels are easier to calculate with than having to re-derive every different resolution.
 
Ah, so is the formula the average of the 4 resolutions? Makes sense then, as you can scale to ultrawides. Glad to see it, just didn't get why you used MP before.
 
Ah, so is the formula the average of the 4 resolutions? Makes sense then, as you can scale to ultrawides. Glad to see it, just didn't get why you used MP before.

It's a linear fit of the form VRAM = A·x + B, where x is the megapixel count, A is the per-megapixel slope, B is the base allocation, and the result is the total VRAM needed. Most games ended up with R² factors of .999 or better, so very accurate.

For example, say you want to run MS Flight Simulator on an Nvidia card using 3x 1080p monitors (6.22 MP). For VRAM, you would need 0.415 * 6.22 + 5.575 = 8.16 GB, at least in the low-density areas.
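For anyone who wants to plug in their own setup, the whole model is just that one line. A throwaway sketch (the function name is mine, and the 0.415/5.575 coefficients are only the fit for MSFS on an Nvidia card, so other games need their own numbers):

```python
# Linear VRAM estimate: total GB = slope * megapixels + base.
def estimate_vram_gb(megapixels: float, slope: float, base: float) -> float:
    return slope * megapixels + base

# Triple 1080p surround = 3 * 1920 * 1080 pixels, about 6.22 MP
mp = 3 * 1920 * 1080 / 1e6
# MSFS-on-Nvidia fit from above: 0.415 GB per MP, 5.575 GB base
print(round(estimate_vram_gb(mp, slope=0.415, base=5.575), 2))  # ~8.16
```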
 
Ah yes, 8.29 megapixels. The widely more recognized standard for 3840x2160. Glad you chose the most effective way to communicate resolution, well done.

But really, nice work. Still, yeah, pixel resolution is objectively the better choice.

More games above 8192MB than I thought. 3070 in trouble?
Keep in mind that just because a game uses more VRAM when available doesn't mean it actually requires it to be smooth. Some games, like CoD MW, have an option to use all the VRAM if you want. Normally my usage would be around ~4-4.5GB; if I check that option, I'm obviously close to 8GB.
 
Keep in mind that just because a game uses more VRAM when available doesn't mean it actually requires it to be smooth. Some games, like CoD MW, have an option to use all the VRAM if you want. Normally my usage would be around ~4-4.5GB; if I check that option, I'm obviously close to 8GB.
This has been said many times and people still fail to understand it. 10GB on the 3080 will be fine outside niche use cases. Would I like more? Sure, but for $699 it is a great deal. Go play something other than heavily modded Bethesda games; they were never good to begin with.
 
Keep in mind that just because a game uses more VRAM when available doesn't mean it actually requires it to be smooth. Some games, like CoD MW, have an option to use all the VRAM if you want. Normally my usage would be around ~4-4.5GB; if I check that option, I'm obviously close to 8GB.

CoD and other games like Lord of the Rings are ones I left off here for that reason. They seem to cache way above what they need and show no evidence of stuttering with lower VRAM.

CoD is often the title used when arguing for more VRAM, as it is "showing over 9GB of VRAM".
 
Forget what Cod does, let's see what Cyberpunk 2077 eats at 4k maxed settings.

I swear ray tracing in CoD is just a binary toggle between black-hole shadows and slightly less flat black holes. They really need to figure out how to balance the dark spots of their maps.
 
At 3440x1440, BFV seemed to have maxed my 2080 Ti out, if I remember right.

But VRAM is about more than gaming. For instance, I have done Fusion projects in DaVinci Resolve that took the entire 11GiB of VRAM, my entire 32GiB of system RAM, and another 20GB of SSD space just to run the Fusion job. So VRAM is definitely not just for games. I'd think that a game's VRAM usage should be as minimal as possible; instead of aiming to saturate VRAM, games should be maximally optimized.
 
This has been said many times and people still fail to understand it. 10GB on the 3080 will be fine outside niche use cases. Would I like more? Sure, but for $699 it is a great deal. Go play something other than heavily modded Bethesda games; they were never good to begin with.

No reason for them to skimp on the vram for what they are charging.
 
No reason for them to skimp on the vram for what they are charging.

They could have even done 12GB on a 384-bit bus with 16 Gbps GDDR6. That would have been 768 GB/s, which is comparable to the 760 GB/s it has. Or, to save a few more bucks, use 15.5 Gbps again (like the 2080 Super had); 12GB on a 384-bit bus would then give 744 GB/s. Which is cheaper, 12GB of GDDR6 or 10GB of GDDR6X? Even if the latter is cheaper, it can't be that much of a disparity, because per GB there's no way they aren't charging more for the superior GDDR6X.
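The bandwidth math is simple enough to sanity-check. A quick sketch (my own helper, nothing official):

```python
# Peak GDDR bandwidth in GB/s = bus width in bits / 8 * per-pin rate in Gbps.
def mem_bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(mem_bandwidth_gbs(384, 16.0))   # 768.0 - hypothetical 12GB GDDR6
print(mem_bandwidth_gbs(384, 15.5))   # 744.0 - same bus at 2080 Super speeds
print(mem_bandwidth_gbs(320, 19.0))   # 760.0 - the 3080's actual 10GB GDDR6X
```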

While I believe Nvidia is right that for current games and games in immediate development (2021) 10GB is enough save 1 or 2 exceptions, the future past that will not be kind to 10GB. Those looking to skip every other gen may be hurting in ~2023ish.

I honestly believe that choosing 10GB of G6X over 12GB of G6 is partly to induce faster obsolescence. A lot of Pascal owners skipped Turing, and JHH even reassured them during the Ampere announcement that now is the time to upgrade. They may want Ampere users to adopt the RTX 4000 series in higher numbers. Another benefit is that nervous users might instead go for the 3090, or pay extra for a 3080 Ti or 3080 Super if there is a 20GB version available later this year or next year. And naturally, the 16GB card's premium would more than make up for the extra cost of the VRAM.
 
While I believe Nvidia is right that for current games and games in immediate development (2021) 10GB is enough save 1 or 2 exceptions, the future past that will not be kind to 10GB. Those looking to skip every other gen may be hurting in ~2023ish.
At $699, two years or so of use seems alright.

Can't expect it to go forever, things change fast.
 
At $699, two years or so of use seems alright.

Can't expect it to go forever, things change fast.

I can expect VRAM to be appropriate for a 4 year span on a $699 card. It was for the $699 GTX 1080 8GB (I'm using it now 4.33 years after launch and 8GB doesn't fail at this performance level), and GTX 1080 Ti 11GB (certainly will be fine in 2021). 16GB will be fine for the $699 Radeon VII in 2023 without a doubt. I'm not sure how well the $699 3GB GTX 780 Ti did in 2017, but Kepler should not be our target for aging well.
 