The Slowing Growth of vRAM in Games

Nightfire

With the release of BFV, I wanted to see what the vRAM requirements of games look like in the current state. For most games, vRAM usage follows a simple linear model: v = x*m + y, where m is the resolution in megapixels, x is the amount of vRAM needed per megapixel, and y is the "vRAM overhead", the fixed amount that is always allocated no matter what the resolution.

We know that 1080p is 2.07 Megapixels, 1440p is 3.7 MP and 4k is 8.3 MP.
So if a game has a y of 3 GB and an x of 0.1 GB, vRAM usage would be:
3.207 GB at 1080p (2.07*0.1+3), 3.37 GB at 1440p (3.7*0.1+3), and 3.83 GB at 4K (8.3*0.1+3)
If a game has a high y value relative to its x value, it seems inefficient, imho.
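If you want to play with the model, here is a tiny Python sketch of that exact calculation (the function name and the 0.1 GB/MP and 3 GB figures are just the hypothetical example above):

```python
# Minimal sketch of the linear model: usage = x * megapixels + y
def vram_usage(megapixels, x=0.1, y=3.0):
    return x * megapixels + y  # GB

for name, mp in [("1080p", 2.07), ("1440p", 3.7), ("4K", 8.3)]:
    print(f"{name}: {vram_usage(mp):.2f} GB")  # 3.21, 3.37, 3.83
```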

Lets go way back to Crysis 1 in 2007 with much lower resolutions:
http://hardforum.com/showthread.php?t=1456645

defaultluser reports .31 GB @ .48 MP, .36 GB @ .79 MP, .45 GB @ 1.31 MP, and .575 GB @ 1.9 megapixels.

Using the highest and lowest res:
1.9x + y = .575 GB and 0.48x + y = .31 GB. We can substitute for y and solve for x:
ie y = .31 - .48x which means...
1.9x + (.31 - .48x) = .575 or...
1.42x = .265 GB so x = .187 GB
We can then solve for y in the equation above.
y = .575 - 1.9(.187) so y = .22 GB
Here, x and y factors were VERY close.

This can be checked using the middle resolution
.79(.187) + .22 = .368 GB vs .36 GB that was recorded.
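For anyone following along, the substitution above wraps up into a few lines of Python (fit_vram is just my name for the helper, nothing official):

```python
# Given vRAM readings at two resolutions, solve v = x*m + y for x and y
def fit_vram(m1, v1, m2, v2):
    x = (v2 - v1) / (m2 - m1)  # GB per megapixel
    y = v1 - x * m1            # fixed overhead in GB
    return x, y

# defaultluser's Crysis numbers: 0.31 GB @ 0.48 MP and 0.575 GB @ 1.9 MP
x, y = fit_vram(0.48, 0.31, 1.9, 0.575)
print(f"x = {x:.3f} GB/MP, y = {y:.3f} GB")     # x ~ 0.187, y ~ 0.220
print(f"check @ 0.79 MP: {0.79*x + y:.3f} GB")  # ~0.368 vs 0.36 recorded
```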

Moving to 2015, where the Fury X looked like it had a very bleak future with only 4 GB of vRAM.
Starting with Tomb Raider in 2015, which uses 1.5 GB at 1080p and 3.1 GB at 4K.
Now use 2.07 megapixels for 1080p and 8.3 for 4K.
Using the same formulas above...
8.3x + y = 3.1 .... and 2.07x + y = 1.5 .... using substitution,
8.3x + 1.5 - 2.07x = 3.1 ...reduce to x and
x=.257 or in other words .257 GB needed for each Megapixel
now we can solve for y:
2.07x + y = 1.5
y = 1.5 - 2.07x ....replace x with .257 and
y=.968 or about 1 GB overhead
Now let's test with 1440p: 3.7x + y = 1.92 GB calculated vs 1.94 GB actual.

But other games were far worse:
Take for example Middle Earth: SoM, as recorded by TweakTown. Using the above formulas...
x = .1 GB and y = 4.56 GB
I tested at 1440p and got 4.93 GB vs. the 4.97 GB actual recorded by TweakTown.

Metro: Last Light was also tested:
x = .115 and y = 1.06
testing with 1440p gave me 1.49 GB vs. 1.46 GB that they recorded.

And also Far Cry 4.
x = .43 GB/ megapixel (highest so far!) with y being 2.17 GB
Again testing on 1440p: 3.7x + 2.17 = 3.76 GB vs 3.77 GB actual by Tweaktown

So even in 2015, you had "y overheads" ranging from 1 GB to a whopping 4.5 GB.
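To save redoing the arithmetic by hand, here is a quick sketch that plugs the x/y values derived above into the model at 1440p (3.7 MP) and compares against the recorded numbers:

```python
# 1440p sanity checks using the x/y values fitted above
fits = {
    "Tomb Raider":       (0.257, 0.968, 1.94),
    "Middle Earth: SoM": (0.100, 4.560, 4.97),
    "Metro Last Light":  (0.115, 1.060, 1.46),
    "Far Cry 4":         (0.430, 2.170, 3.77),
}
for game, (x, y, actual) in fits.items():
    print(f"{game}: {3.7*x + y:.2f} GB predicted vs {actual} GB measured")
```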

Which brings us to 2018 with Battlefield V:
Both TPU and Guru3D had vRAM numbers.
Starting with TPU: https://www.techpowerup.com/reviews/Performance_Analysis/Battlefield_V/4.html
8.3x + 4.83 - 2.07x = 6.74
8.3x - 2.07x = 6.74 - 4.83
6.23x = 1.91 or x = .307 Now solving for y...
2.07x + y = 4.83
.635 + y = 4.83 or y = 4.195 GB
testing with 1440p... 3.7x + y = 5.33 GB vs 5.36 GB actual

And then Guru3d: https://www.guru3d.com/articles_pages/battlefield_v_pc_performance_benchmarks,7.html
8.3x + 4.94 - 2.07x = 6.99 ....
6.23x = 2.05 or x = .329 again solving for y...
2.07x + y = 4.94
.681 + y = 4.94 or y = 4.259
testing with 1440p... 3.7x + y = 5.48 GB vs 5.49 GB actual
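Here is the same two-point fit as a quick sketch, run on both reviews' 1080p/4K readings from the pages linked above:

```python
# Fit v = x*m + y from each review's 1080p (2.07 MP) and 4K (8.3 MP) readings
for site, v1080, v4k in [("TPU", 4.83, 6.74), ("Guru3D", 4.94, 6.99)]:
    x = (v4k - v1080) / (8.3 - 2.07)
    y = v1080 - x * 2.07
    print(f"{site}: x = {x:.3f} GB/MP, y = {y:.2f} GB, "
          f"1440p estimate = {3.7*x + y:.2f} GB")
# TPU: ~5.33 GB (5.36 measured); Guru3D: ~5.48 GB (5.49 measured)
```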

Both reviews had very close values.
The point here is that these numbers are no worse than what 2015 was giving us, despite the roughly 10x increase in usage going from 2007 to 2015. Also, let's not forget that vRAM requested is not the same as vRAM required, Vulkan perhaps being the exception.

** It is interesting to note that the Fury X was just behind the GTX 1070 in TPU's test and WAY behind in Guru3D's.
Perhaps the slightly higher settings and vRAM requirements of Guru3D were JUST enough to put the Fury X on its face.

So, would I buy a mid to high end card with 4GB in 2018 to play the newest games? Hell no, but I am willing to wager that 8 GB will be sufficient for FAR longer than most anticipate.
 
And people frequently complain about low texture resolution. 8GB is enough if you're satisfied with mediocrity, I guess. Frankly, with the push for 8K by the display industry I'm surprised we haven't seen 16GB cards in the consumer space, yet.
 
And people frequently complain about low texture resolution. 8GB is enough if you're satisfied with mediocrity, I guess. Frankly, with the push for 8K by the display industry I'm surprised we haven't seen 16GB cards in the consumer space, yet.

No one is running out to buy overpriced 8K screens. Almost all high-end cards have over 8 GB and the mid-range has around 8 GB, which is fine; why pay for more RAM you will never use? 4K is barely seeing any traction even now that prices are low enough for most, and the broadcast world is stuck on 1080p. 16 GB consumer cards would be a waste and add zero benefit.
 
And people frequently complain about low texture resolution. 8GB is enough if you're satisfied with mediocrity, I guess. Frankly, with the push for 8K by the display industry I'm surprised we haven't seen 16GB cards in the consumer space, yet.

I am not sure what you mean by that, as I have never seen "only" 8 GB ever being an issue. Even if a game requests more than 8 GB, the dynamic RAM is usually enough to fill the gap. Hell, most of the time, 4 GB sees no penalty over a similar 8 GB card when this much RAM is requested, Vulkan notwithstanding.

One of the advantages of having the extra vram is that you can get away with less system ram as some games have a combined ram requirement of 12 GB or more.

TechSpot showed this when the 6 GB GTX 1060 ran fine in all games with only 8 GB of system RAM, while the 3 GB GTX 1060 sometimes took a penalty with less than 16 GB of RAM.
 
No one is running out to buy overpriced 8K screens. Almost all high-end cards have over 8 GB and the mid-range has around 8 GB, which is fine; why pay for more RAM you will never use? 4K is barely seeing any traction even now that prices are low enough for most, and the broadcast world is stuck on 1080p. 16 GB consumer cards would be a waste and add zero benefit.
What about that Pimax 8KX VR headset? Two 4K images at 80+ fps will need plenty of RAM. :p
 
How long is far longer? The 8GB consoles ushered in the current wave of higher texture resolutions. If next gen goes to 12 or 16GB we'll see another bump.

Render resolution isn’t that important going forward. It’ll be texture quality and variety that drives memory consumption. Geometric complexity will bring another bump when raytracing takes off in the hopefully not distant future. Character and object detail is still a joke in 2018.
 
You should probably have led with this.

haha sorry.

How long is far longer? The 8GB consoles ushered in the current wave of higher texture resolutions. If next gen goes to 12 or 16GB we'll see another bump.

Render resolution isn’t that important going forward. It’ll be texture quality and variety that drives memory consumption. Geometric complexity will bring another bump when raytracing takes off in the hopefully not distant future. Character and object detail is still a joke in 2018.

Consoles are now 8 GB and 12 GB of vRAM, but that is the total combined. When you have an 8 GB video card with 16 GB of system RAM, you effectively have 24 GB of combined RAM, minus overhead from Windows. This is assuming the game can leverage dynamic RAM, which many DX12 games do a great job of.

An article from techspot illustrates this.
https://www.google.com/amp/s/www.techspot.com/amp/article/1535-how-much-ram-do-you-need-for-gaming/
 
I was curious to see how other 2018 games compared to older versions from 2015.
We already see that BFV uses about double that of BF1.

Now let's look at Shadow of the Tomb Raider. The 2015 version was one of the most vRAM-efficient games back then.
https://www.guru3d.com/articles_pag..._graphics_performance_benchmark_review,8.html
For 1080p it uses 6.4 GB and 4k uses 7.25 GB so...
8.3x + 6.4 - 2.07x = 7.25
6.23x = .85 or x = .136 GB. Strangely, this is about half of what the 2015 version was.
2.07x + y = 6.4
y = 6.4 - .282 or y = 6.12 GB
Damn, that is over 6x that of the 2015 version. Lara got herself a fat ass!
Confirming with 1440p ... 3.7x + y = 6.62 GB vs 6.60 GB actual.

Next, let's look at Far Cry 5 compared to FC4. The 980 Ti graph was used.
https://www.guru3d.com/articles_pages/far_cry_5_pc_graphics_performance_benchmark_review,9.html
8.3x + 2.99 - 2.07x = 3.99
6.23x = 1 or x = .161 This is about 1/3 of that of FC4!
2.07x + y = 2.99
.333 + y = 2.99 or y = 2.66 GB - slightly more than FC4
1440p calculated is 3.7x + y = 3.26 GB vs 3.36 GB actual

Finally, we have Middle Earth: SoW (2017). We know that ME: SoM was already a pig, so let's see how this one did.
https://www.guru3d.com/articles_pag..._graphics_performance_benchmark_review,8.html
8.3x + 8.75 - 2.07x = 9.31
6.23x = .56 or x = .090 GB/MP. Once again, this is lower than the previous version and the lowest of any game on this list.
2.07x + y = 8.75
.186 + y = 8.75 or y = 8.564 GB. This is almost twice as much as its fat older brother.
1440p check gives 3.7x + y = 8.90 GB vs 8.88 GB actual.
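Since this is the same algebra three times over, here is a short loop fitting all three 2018 data sets from the Guru3D pages above; the results match the hand calculations within rounding:

```python
# Two-point fits for the 2018 games: (1080p GB, 4K GB) per Guru3D
games = {
    "Shadow of the Tomb Raider": (6.40, 7.25),
    "Far Cry 5":                 (2.99, 3.99),
    "Middle Earth: SoW":         (8.75, 9.31),
}
for name, (v1080, v4k) in games.items():
    x = (v4k - v1080) / (8.3 - 2.07)
    y = v1080 - x * 2.07
    print(f"{name}: x = {x:.3f} GB/MP, y = {y:.2f} GB, "
          f"1440p estimate = {3.7*x + y:.2f} GB")
```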

Somebody mentioned 8K, so we can estimate that value using 33.2 megapixels for 8K.
Let's start with the above game, which uses a whopping 8.75 GB at 1080p.
33.2x + y = 11.55 GB at 8k.

Now compare this to FC4, which used only 3.06 GB at 1080p with only a 2.17 GB y value, but had a huge x value of .43 GB/MP.
33.2x + y = 16.45 GB at 8k. So at 8k, that game would actually use more than ME: SoW.
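A quick sketch of that extrapolation, using the fitted x/y values from above:

```python
# Extrapolate the fitted lines to 8K (33.2 MP)
for name, x, y in [("ME: SoW", 0.090, 8.564), ("Far Cry 4", 0.43, 2.17)]:
    print(f"{name} @ 8K: {33.2*x + y:.2f} GB")
# ME: SoW ~ 11.55 GB; Far Cry 4 ~ 16.45 GB (the big-x game overtakes the big-y one)
```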
 
I realize I am beating a dead horse at this point, but now we get a first look at how Ray Tracing will affect vram usage:
https://www.techpowerup.com/reviews/Performance_Analysis/Battlefield_V_RTX_DXR_Raytracing/4.html

First without:
8.3x + 4.45 - 2.07x = 6.50
x = .329 GB
2.07x + y = 4.45
y = 3.77
1440p check: 3.7x + y = 4.99 GB vs 4.95 actual

And now with Ray Tracing
8.3x + 5.52 - 2.07x = 7.56
x = .327 GB
2.07x + y = 5.52
y = 4.84 GB
1440p check: 3.7x + y = 6.05 GB vs 6.14 actual

So yep, once again the x value did not increase. Ray tracing just added about 1.1 GB across all resolutions.
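And a sketch fitting both TPU data sets, showing the slope staying put while the overhead grows:

```python
# DXR off vs on: (1080p GB, 4K GB) per TPU
for label, v1080, v4k in [("DXR off", 4.45, 6.50), ("DXR on", 5.52, 7.56)]:
    x = (v4k - v1080) / (8.3 - 2.07)
    y = v1080 - x * 2.07
    print(f"{label}: x = {x:.3f} GB/MP, y = {y:.2f} GB")
# DXR off: x ~ 0.329, y ~ 3.77; DXR on: x ~ 0.327, y ~ 4.84
```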

Since you are forced to run 1080p with the best video cards with DXR, you will actually need LESS vram :p
 
I’m going to need to see a spreadsheet

Goddam it, I need to find something better to do with my free time.
[attached: Excel spreadsheet screenshot]


I left out the two Middle Earth games, as their vRAM usage seemed to be a cache fill, since cards with less vRAM did not suffer.

Metro Exodus is very efficient given the world size and visual quality.
 
How long is far longer? The 8GB consoles ushered in the current wave of higher texture resolutions. If next gen goes to 12 or 16GB we'll see another bump.

Render resolution isn’t that important going forward. It’ll be texture quality and variety that drives memory consumption. Geometric complexity will bring another bump when raytracing takes off in the hopefully not distant future. Character and object detail is still a joke in 2018.
The consoles have unified memory. Originally, the Xbox One had 3 GB of that 8 GB reserved for the system, leaving 5 GB TOTAL for games. That meant developers had to actively plan how that memory was going to be allocated at any given time. You were basically looking at 2-3 GB available for video output. Today, the Xbox One and PS4 make 7 GB of the 8 GB total available for games, while the One X makes 10 GB out of 12 GB available. Developers still have to carefully budget available memory between gameplay systems and video needs. This is the primary reason why overall visual fidelity hasn't improved as much in recent years as it did in the past, and why developers look for other visual tricks, like photogrammetry, to improve it.
 
All of that math makes me wonder if 6 GB is too little for a new gaming card in 2019.

8 GB Vegas are becoming more tempting.
I started noticing my GTX 980 Ti GPUs were becoming VRAM-limited with 6GB in quite a few games, and I don't even game at 1440p.
Never thought I would see the day where 6GB VRAM doesn't seem to be enough at 2K resolutions.
 
I started noticing my GTX 980 Ti GPUs were becoming VRAM-limited with 6GB in quite a few games, and I don't even game at 1440p.
Never thought I would see the day where 6GB VRAM doesn't seem to be enough at 2K resolutions.

Curious as to which games run out and what happens. From what I gather, a game will simply pull from system ram, slow down, or drop some textures.
 
So, would I buy a mid to high end card with 4GB in 2018 to play the newest games? Hell no, but I am willing to wager that 8 GB will be sufficient for FAR longer than most anticipate.

even 6GB seems like it'll be enough for the foreseeable future
 
Curious as to which games run out and what happens. From what I gather, a game will simply pull from system ram, slow down, or drop some textures.
Massive slowdowns are what I see when it runs out (system RAM is way slower in data transfer rates than VRAM).
The big one that comes to mind is Fallout 4 with the HD texture pack installed with everything at high @ 2K.
 
4K still needs another 5 years to get to mainstream mid-tier at 60+ fps, also depending on how much RAM the next-gen consoles have. Remember, everything is based on that, now that the Xbox One X gets ultra PC textures most of the time.

8K is a good 10-15 years away. I update my video card every 3 years, so...
 
even 6GB seems like it'll be enough for the foreseeable future

From all of the tests we have seen and how little ACTUAL vram consumption has increased, the RTX 2060 will almost always run out of GPU power before hitting VRAM limitations.

It is the exact same scenario that the Fury X was in almost 4 years ago. So many made an issue of the frame buffer size, worried about "future titles."

Hardware Unboxed just did a test of Metro: Exodus, which is one of the most demanding titles to date, ignoring a handful of texture-modded games.

The Fury X did not take a vram performance hit, even at 4k ultra, despite it only having the gpu power to play 1080p ultra.

Even the 3 GB GTX 1060 was not affected until 1440p ultra. It only managed 43 fps at 1080p ultra, so a higher setting/resolution is pointless anyhow.

Going further back and using 1080p medium, the GTX 780 3 GB only managed 41 fps while the 2 GB GTX 1050 managed 43 fps with good minimums. So you can be almost certain that the RTX 2060 will be fine 7-8 years from now with 6 GB, as you will most likely need to play at medium settings by then anyhow.
 
How long is far longer? The 8GB consoles ushered in the current wave of higher texture resolutions. If next gen goes to 12 or 16GB we'll see another bump.

Render resolution isn’t that important going forward. It’ll be texture quality and variety that drives memory consumption. Geometric complexity will bring another bump when raytracing takes off in the hopefully not distant future. Character and object detail is still a joke in 2018.
Consoles are not even using 8 GB for games. 3 GB is reserved for the OS and background tasks.
 
When it comes to VRAM use, raytracing hasn't really made a splash yet (more like a puddle... a muddy one at best... the kind that's already half dry). :D Tensor cores / AI notwithstanding, I imagine the only other real need for extra-large VRAM capacity would be CUDA cores for video processing / Adobe software use.
 
Massive slowdowns are what I see when it runs out (system RAM is way slower in data transfer rates than VRAM).
The big one that comes to mind is Fallout 4 with the HD texture pack installed with everything at high @ 2K.
This. I was running an R9 Fury until last year. Great card, but the newer games at 1440p+ chew up 4 GB of VRAM fast. I was experiencing massive performance hits, similar to the feeling years ago when you ran out of system RAM and Windows started using virtual memory on whatever old spinner HDD you had.
 
Like I said, vRAM scales much faster with settings than with resolution. 1080p, 1440p, and 4K all need somewhere between 4 and 6 GB of vRAM, at least up to 2060 levels of performance, yet 1080p medium needs less than 2 GB:

[attached: benchmark screenshot]


So it is a good bet that if a game needs more than 8 GB years from now, it will be fine with 4 GB with balanced settings.
 
Like I said, vRAM scales much faster with settings than with resolution. 1080p, 1440p, and 4K all need somewhere between 4 and 6 GB of vRAM, at least up to 2060 levels of performance, yet 1080p medium needs less than 2 GB:

[attached: benchmark screenshot]

So it is a good bet that if a game needs more than 8 GB years from now, it will be fine with 4 GB with balanced settings.
"So it is a good bet that if a game needs more than 256MB years from now, it will be fine with 128MB with balanced settings."
 
The things that impact RD 2 are resolution scaling and texture size. The 2 GB and 3 GB texture options, I would think, mean what they say: an extra 2 GB of VRAM needed on top of what the game otherwise needs.
 
I never get these kinds of threads, with people hand-wringing over some new 'development or requirement' that might be coming over the hill in a couple of years' time and will mean possible investment in new gear.

Like it was such a pain the previous 33 times this has happened in their computing history. :D

Do like I do, just don't worry about it, sit back let the early adopters soak up the R&D cost/bugs/pain/competing standards and then jump in when it's standardised and costs $250.

You know, like how much did those folks pay for their 4K non HDR screens just three years ago?
 
Interesting how Guru3D's charts show the optimization working well for AMD.

[attached: Guru3D benchmark chart]
 
So I decided to check how vram worked in Rage 2:
https://www.techpowerup.com/reviews/Performance_Analysis/RAGE_2/4.html

Right off the bat, I could tell the standard x*m + y vRAM model would not work in this game. However, I played with the numbers and quickly found that there was still a pattern. I noticed that the square root of the vRAM consumption for 4K was exactly twice the square root for 1080p.

After a bit of struggling to get a solid formula, I came up with the following: (2*sqrt(m))/2 = sqrt(v), where m is megapixels and v is vRAM consumption. This formula is not only close to what TPU got, it is EXACT, down to 3 significant figures, for all 4 resolutions! The formula is not linear, but to me it makes a lot more sense than the x*m + y model of every other game here, as those games would all theoretically use 2, 4, or even 8 GB of vRAM on a 1-pixel screen. This game would use basically nothing.

Still, this seems WAY too neat. Heck, there is not even a constant being used to get from megapixels to GBs. Either the game spits out a precalculated value based on the user's resolution using the above formula, or TPU is messing with the numbers. Either way, it is VERY UNLIKELY that these were real measured values.

TPU megapixel and vRAM numbers below (MP -> GB):
1.44 -> 2.89
2.07 -> 2.92
3.69 -> 3.22
8.29 -> 5.59
Check for yourself!
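Here is a quick script that prints TPU's numbers alongside their square roots, so you can judge the pattern for yourself:

```python
import math

# TPU's Rage 2 readings as (megapixels, GB of vRAM)
data = [(1.44, 2.89), (2.07, 2.92), (3.69, 3.22), (8.29, 5.59)]
for m, v in data:
    print(f"{m:5.2f} MP -> {v:4.2f} GB | "
          f"sqrt(MP) = {math.sqrt(m):.3f}, sqrt(GB) = {math.sqrt(v):.3f}")
```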
 
Kingdom Come: Deliverance nearly maxed out my old 980 Ti at 3840x1600. CryEngine. I had 500 MB free if I was in town.
 