The Slowing Growth of VRAM in Games

Recent games have been brutal for 8 GB cards: there's been enough GPU power, but not enough VRAM.

This does not bode well for the upcoming 4060 and 4060 Ti cards.

Yeah, I kind of wish Nvidia would stop scraping by with the bare minimum of VRAM. Can't have people keeping a card for 3+ gens like the 1080 Ti, I guess. Apparently nobody learned any lessons from the Fury X.
 
I made a point of not upgrading my 1080 Ti until I could get a video card with more VRAM. The 3080 12GB barely qualified, but I probably should have held out for a 16 GB model.

😑
 
Recent games have been brutal for 8 GB cards: there's been enough GPU power, but not enough VRAM.

This does not bode well for the upcoming 4060 and 4060 Ti cards.

That's a game engine/optimization problem, not a VRAM problem.

The devs are working on it; a patch might already be out.

The video is disingenuous, but it makes for good click-through. Hold your judgement until there is more data.
 
That's a game engine/optimization problem, not a VRAM problem.

The devs are working on it; a patch might already be out.

The video is disingenuous, but it makes for good click-through. Hold your judgement until there is more data.
Agreed. This port was poorly done.
 

RTX 3070 vs RX 6700 XT in 2023

We'll see at the next GPU release, when the latest patched titles get retested, how much of this is a real VRAM limit rather than memory-leak-sized bugs. Maybe there are starting to be games with setting/resolution combinations that do 70 fps on a 6700 XT but struggle on a 3070 purely for lack of VRAM; for people who don't mind gaming at 45-50 fps it almost certainly will start to happen, but a couple of patch/driver update cycles will be needed to know whether we're getting close to the limit or already over it. The VRAM issues at launch do seem to show that once it becomes a significant problem there will be no guessing game needed: the 6700 XT and 2080 Ti will eat the 3070 alive in benchmark results.
 
Agreed. This port was poorly done.
While the port does have issues, it runs fine on a lot of higher-end systems. Usually the people who can't run it are on lower-end PCs.

That's why I ignore Steam reviews. A lot of those people bitching are running 1060/2060/3050 cards expecting it to run at high settings.
 
While the port does have issues, it runs fine on a lot of higher-end systems. Usually the people who can't run it are on lower-end PCs.

That's why I ignore Steam reviews. A lot of those people bitching are running 1060/2060/3050 cards expecting it to run at high settings.
Most Steam reviews that I read on release day were about crashing issues. Not to mention that ridiculous shader compile time.
 
Yeah, the xx70 series should have doubled VRAM every other gen while upgrading speed in between, keeping a 256-bit bus: 1070 8 GB GDDR5, 2070 8 GB GDDR6, 3070 16 GB GDDR6, 4070 16 GB GDDR6X. Instead, we are at 128-bit GDDR6X with 8 GB.

8 GB on the 3070 Ti is far worse than 4 GB was on the Fury X.

**edited for mistake
 
Bandwidth matters too and can affect minimum frame times regardless of the size of the VRAM pool. The fact that a 4070 Ti has no more bandwidth than a 2015 Fury X is absurd. Nvidia does have a big L2 cache on Ada (their answer to AMD's Infinity Cache), but I doubt it's much use in legacy engines. This is part of the reason the Fury X continued to do well against the 980 Ti, outside of unrealistic demands and classic AMD driver issues.
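For anyone who wants the back-of-the-envelope math behind that comparison, here's a rough sketch using the published bus widths and effective memory rates (treat the exact figures as approximate):

```python
# Rough memory bandwidth: (bus width in bits / 8) * effective transfer rate in GT/s = GB/s
def bandwidth_gbs(bus_width_bits: int, transfer_rate_gtps: float) -> float:
    return bus_width_bits / 8 * transfer_rate_gtps

# Fury X (2015): 4096-bit HBM at ~1 GT/s effective
fury_x = bandwidth_gbs(4096, 1.0)        # ~512 GB/s
# RTX 4070 Ti: 192-bit GDDR6X at ~21 GT/s effective
rtx_4070_ti = bandwidth_gbs(192, 21.0)   # ~504 GB/s

print(f"Fury X:  {fury_x:.0f} GB/s")
print(f"4070 Ti: {rtx_4070_ti:.0f} GB/s")
```

Raw transfer numbers only; the big L2 on Ada is what's supposed to close that gap in practice, which is exactly the question mark in older engines.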
 
I moved back to AMD from my RTX 3070; the extra 2 GB of VRAM on the RX 6700 lets me play Fortnite in 4K.

 
When I bought the 3080 10GB at launch, I saw it get pretty close to maxing out its VRAM at 4K in games. I realized the 3090 was the minimum for 4K ultra, so I upgraded and have stuck with the xx90 tier since.
 
There's a lazy-dev problem here. Hasn't this developer notoriously been bad with ports (Arkham Knight)? I'm sure Naughty Dog will fix the game; it's a shame they didn't use Nixxes (both of the new Spider-Man games, the Tomb Raider reboot trilogy, DE: Mankind Divided). I'm guessing they are most likely already working on something. Anything for views and clicks, integrity be damned. I'm sure 4K will increase VRAM requirements, but it shouldn't be 16 GB…
 
Yes, ideally lazy ports shouldn't happen. But they do, and they always will, and quite a few of them will be very sought-after titles. The point is, I'm sorry, but for the prices Nvidia charges, there should really be enough VRAM even for those outlier titles.
 
Agreed. This port was poorly done.
Yeah. It's a console port; both the Xbox Series X and PS5 have 16 GB of memory, in a single pool shared between OS operations and graphics. The game on consoles likely uses 1 GB for the engine and 14 GB or so for graphics. Port your game poorly and it runs okay on PCs with 14 GB or more of graphics memory, but has issues with 8 GB or 12 GB VRAM cards. Meanwhile, properly ported games play just fine on those same cards.
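A minimal sketch of that budget argument (the 1 GB / 14 GB split above is an estimate, not a published figure, so the numbers here are illustrative):

```python
# Illustrative console-memory-budget check; the split is an estimate, not official.
console_total_gb = 16        # PS5 / Xbox Series X unified memory
assumed_engine_gb = 1        # estimated engine/CPU-side share
assumed_graphics_gb = 14     # what a straight port may expect to keep resident in VRAM

for pc_vram_gb in (8, 10, 12, 16, 24):
    ok = pc_vram_gb >= assumed_graphics_gb
    print(f"{pc_vram_gb:>2} GB card: {'fine' if ok else 'expect VRAM pressure in a lazy port'}")
```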
 
Hardware Unboxed talked more about it in their recent Q&A...timestamped video below...

 
The frustrating thing here, too, is that nVidia specifically prevents board partners from making hardware modifications.
EVGA used to create models with more VRAM as a way to differentiate their products. But if you want to remain an AIB partner, you can't do that.

So nVidia is screwing consumers and making it so that AIBs that actually care can't do anything to fix the problem. It's a solid reason to buy a 7900 XT... (though I think the real solution is to simply wait until the inevitable price drops start coming).
 
The frustrating thing here, too, is that nVidia specifically prevents board partners from making hardware modifications.
EVGA used to create models with more VRAM as a way to differentiate their products. But if you want to remain an AIB partner, you can't do that.

So nVidia is screwing consumers and making it so that AIBs that actually care can't do anything to fix the problem. It's a solid reason to buy a 7900 XT... (though I think the real solution is to simply wait until the inevitable price drops start coming).
Props to them for getting out while they could and not putting up with Nvidia's bullshit.
 
The frustrating thing here, too, is that nVidia specifically prevents board partners from making hardware modifications.
EVGA used to create models with more VRAM as a way to differentiate their products.
What EVGA video card ever had more VRAM than competitors? I don't remember any.

That is determined by the GDDR family and the bus width on the GPU. You can't just decide the card will have 'more' if there is no way to interface the extra memory with the GPU.
 
That is determined by the GDDR family and the bus width on the GPU. You can't just decide the card will have 'more' if there is no way to interface the extra memory with the GPU.
Couldn't you, if you're ready to double it by keeping the same bus and the same number of RAM chips, but with each chip having a larger capacity?

A bit like:
https://www.techpowerup.com/gpu-specs/geforce-rtx-3050-4-gb.c3744
https://www.techpowerup.com/gpu-specs/geforce-rtx-3050-8-gb.c3858

Both are 128-bit with a 224 GB/s bus.

Could we have seen 16-20 GB 3070s/3080s? And a 24 GB 4070 Ti?

It would have been a lot (and shows why they went with what they went with), but still possible?
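For what it's worth, here's a rough sketch of how the capacity options fall out of the bus width, assuming 32-bit GDDR6/6X devices available in 1 GB and 2 GB densities, with clamshell mode (two chips sharing a 32-bit channel) as the other doubling trick:

```python
# Possible VRAM capacities for a given bus width, assuming 32-bit GDDR6/6X chips
# available in 1 GB and 2 GB densities, optionally mounted in clamshell mode.
def vram_options_gb(bus_width_bits: int) -> list[int]:
    channels = bus_width_bits // 32            # one 32-bit channel per chip (non-clamshell)
    options = set()
    for density_gb in (1, 2):                  # per-chip density
        for chips_per_channel in (1, 2):       # 2 = clamshell, doubles capacity again
            options.add(channels * density_gb * chips_per_channel)
    return sorted(options)

print(vram_options_gb(128))  # 3050-class bus:     [4, 8, 16]
print(vram_options_gb(256))  # 3070-class bus:     [8, 16, 32]
print(vram_options_gb(320))  # 3080-class bus:     [10, 20, 40]
print(vram_options_gb(192))  # 4070 Ti-class bus:  [6, 12, 24]
```

So a 16 GB 3070, a 20 GB 3080, and a 24 GB 4070 Ti were all electrically possible; whether the right chip densities actually existed (and at what cost) when each card launched is a separate question.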
 
What EVGA video card ever had more VRAM than competitors? I don't remember any.

That is determined by the GDDR family and the bus width on the GPU. You can't just decide the card will have 'more' if there is no way to interface the extra memory with the GPU.
Back in the day, EVGA used to have video cards with more memory than the Nvidia reference design. They stopped doing it when Nvidia put their foot down, but it used to happen quite a bit.
 
There's a lazy-dev problem here. Hasn't this developer notoriously been bad with ports (Arkham Knight)? I'm sure Naughty Dog will fix the game; it's a shame they didn't use Nixxes (both of the new Spider-Man games, the Tomb Raider reboot trilogy, DE: Mankind Divided). I'm guessing they are most likely already working on something. Anything for views and clicks, integrity be damned. I'm sure 4K will increase VRAM requirements, but it shouldn't be 16 GB…
There are too many games like this now to blame an unoptimized game. Too many new titles are showing that you need more than 8 GB.
 
It looks like for ports of PS5-exclusive games, you may need at a minimum 12 GB of VRAM plus an 8-core high-performance CPU or a 12-core+ CPU (ideally 16/32 GB VRAM + a 16-core CPU + 32/64 GB RAM).

I think it's worth keeping the larger picture in mind here.

TLOU Part 1 is a PS5 exclusive, unlike other ports. There are comments saying the ultra-high textures only make sense at 4K, and that you are better off turning them down to high at 1440p/1080p.

Also, the texture decompression is CPU-based (not dedicated hardware like on the PS5), so if you don't have a 16-core CPU then you might run into occasional CPU bottlenecks.

Games made for consoles will push the hardware over time, meaning, for the PS5 and Xbox Series X, the VRAM and CPU threads: roughly 12 GB of VRAM and 16 threads.

If you have lesser hardware, with less VRAM, expect issues on ports beyond the normal less-than-stellar ones. Plus, the overhead of Windows just adds to the requirements.

To have a better-than-console experience you will need more than the consoles: something like 16 GB of VRAM and a better CPU with at least 16 threads, for better textures, higher resolution, higher level of detail on objects, plus any unique added PC features.

The Last of Us is pushing the CPU: at least 12 threads to 100%, and 16-thread CPUs may be pushed toward 100% as well with high-end GPUs.

So what is going on with The Last of Us PC version? It was built around the console, pushing graphics to the next level. The modern API is being leveraged to use multiple threads to push the draw calls needed for the very complex environments: added objects, textures, compute shaders, high-resolution lightmaps, shadow maps, and level of detail tuned to minimize pop-in (meaning less culling), all of which hammers the CPU. You could not do this with DX11's minimal threading capability for draw calls.

Not to say the engine is as efficient as it could be, but there are reasons for the issues on GPUs with less than 12 GB of VRAM, and even 12 GB would be limited to console-level graphics in this game. Cards with 16 GB and above should do well, that is, if more of the other bugs get worked out.

As a minimum, you always want a graphics card with more VRAM than the console can use, that is, if you want to play games designed around those consoles later on with better graphics than the consoles.

I would like to see PCIe testing with this game: if the CPU is being hammered, what is going on with PCIe transfer rates? I normally wait for games to get the wrinkles ironed out and better optimized, except for a few from great developers like id Software. I may have to get this one early for some testing. I do want to play this series anyway.
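To illustrate the CPU-side decompression point above, here's a purely conceptual sketch: zlib stands in for whatever codec the game actually uses, and the chunk sizes are made up. The point is just that without a dedicated decompression block (like the PS5 has), streaming assets becomes work that competes with the game for CPU cores.

```python
# Conceptual sketch: streaming texture decompression done on a CPU thread pool.
import os
import zlib
from concurrent.futures import ThreadPoolExecutor

def decompress_chunk(chunk: bytes) -> bytes:
    # Stand-in for the real texture codec; on PS5 this work goes to fixed-function hardware.
    return zlib.decompress(chunk)

def stream_textures(compressed_chunks: list[bytes], workers: int) -> int:
    # Decompress all chunks across 'workers' threads; returns total decompressed bytes.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(len(block) for block in pool.map(decompress_chunk, compressed_chunks))

if __name__ == "__main__":
    # Fake "texture" data: 128 chunks of roughly 1 MiB each.
    chunks = [zlib.compress(os.urandom(4096) * 256) for _ in range(128)]
    total = stream_textures(chunks, workers=os.cpu_count() or 8)
    print(f"decompressed {total / 2**20:.0f} MiB on {os.cpu_count()} threads")
```

More worker threads finish the streaming sooner, but each of them is a core that rendering and draw-call submission don't get, which lines up with 12-16 threads getting pegged.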

 
How old is the 3070 now? The last video I saw said that for TLOU you could just change from ultra to high and it was fine. It's not like having to lower a few settings three years after a card's release is that unusual historically....
 
How old is the 3070 now? The last video I saw said that for TLOU you could just change from ultra to high and it was fine. It's not like having to lower a few settings three years after a card's release is that unusual historically....
Yeah, but I don't have to lower settings on my 6800 XT that I got for less :) The 8 GB is more of a hindrance than it should be. It's not so much that the 3070 is bad; it's that people played up $600+ ones as such an amazing buy.
 
Yeah, but I don't have to lower settings on my 6800 XT that I got for less :) The 8 GB is more of a hindrance than it should be. It's not so much that the 3070 is bad; it's that people played up $600+ ones as such an amazing buy.

Well... it was at the time, if you could get it for $600. Don't get me wrong, I think keeping the VRAM low is planned obsolescence by NVIDIA, and they look to be continuing that trend. I personally own 9 x 3070, but I don't use them for gaming, so I guess that's someone else's problem XD.
 
The problem with many of the cards that came with extra VRAM in the past was that they used slower memory. For instance, there was at least one model of 9800 GT that had 1 GB of VRAM (instead of 512 MB), but it was DDR instead of GDDR, which made it perform worse than a regular card, and about the only way to get a game to even use some of that extra memory at the time was with texture mods.
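Rough numbers on why that hurt, with the caveat that the exact clocks and bus widths varied by board, so these are illustrative figures rather than a specific model's spec:

```python
# Bandwidth = (bus width in bits / 8) * effective transfer rate (GT/s); figures are illustrative.
def bandwidth_gbs(bus_width_bits, transfer_rate_gtps):
    return bus_width_bits / 8 * transfer_rate_gtps

# Typical 512 MB 9800 GT: 256-bit GDDR3 around 1.8 GT/s effective
print(bandwidth_gbs(256, 1.8))  # ~57.6 GB/s
# A 1 GB slow-DDR variant on the same bus at around 0.8 GT/s effective
print(bandwidth_gbs(256, 0.8))  # ~25.6 GB/s -- more memory, far less bandwidth
```

Double the VRAM, roughly half the bandwidth; not a great trade for actual gaming.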

I do agree with the comments that Nvidia has often skimped on the quantity of memory in recent generations except at the top end. I'm not sure if it's a cost-cutting thing or planned obsolescence; it might be both. To be fair, I think AMD skimped on the bandwidth with the 6000 series, which is just as bad, just in a slightly different way.

Reading through the last couple of pages, there were some comments that aged very poorly, though most were only humorous if you ignored the age. As far as the original question in the title goes, I think VRAM requirements tend to go in waves now that development is tied to the consoles. There is a ramp-up period after new consoles release while developers adjust to using the larger resource budget, and after that things tend to stagnate until commonly used PC hardware gets far enough ahead of the consoles that some devs bother to start finding ways to use the extra resources, with ultra textures or something similar.
 
I made a point of not upgrading my 1080 Ti until I could get a video card with more VRAM. The 3080 12GB barely qualified, but I probably should have held out for a 16 GB model.

😑
Yeah, I upgraded from a 4 GB GTX 980, but since I also upgraded to 4K, I've already hit some VRAM issues (in The Witcher 3) with 12 GB. At this stage, I think my next card is 24 GB or more, or bust. I am just not willing to spend $1600+ for it, though.

I bought this one secondhand; it looks like I might do that again in the future.
 
What EVGA video card ever had more VRAM than competitors? I don't remember any.
It's all 900 series and before. Here is one example of a midrange card doubling up on VRAM.

https://www.gamersnexus.net/guides/1888-evga-supersc-4gb-960-benchmark-vs-2gb?showall=1

After this, nVidia disallowed board partners from making hardware alterations like this if they still wanted to sell nVidia GPUs.
That is determined by the GDDR family and the bus width on the GPU. You can't just decide the card will have 'more' if there is no way to interface the extra memory with the GPU.
It's not that complicated. As has been explained to you, simply double the capacity per chip and keep the same number of chips.

A 24 GB 4070 Ti, I'm sure, would've been welcomed by probably everyone willing to spend money at nVidia's dumb GPU pricing. Well, you know, if board partners could correct nVidia's stupidity.

There are rumors that the 4060 will only come with 6 GB. If true, that card is DOA.
 
It looks like for ports of PS5-exclusive games, you may need at a minimum 12 GB of VRAM plus an 8-core high-performance CPU or a 12-core+ CPU (ideally 16/32 GB VRAM + a 16-core CPU + 32/64 GB RAM).

Stop it...does that really make any sense to you?...you're taking an unoptimized game and speculating that every PS port going forward will need 64 GB of system RAM and 24 GB of VRAM???...so only 3% of gamers will be able to play these games with no issues?...the issue is with the port itself

Nvidia has already released a hotfix driver for LoU to mitigate the crashing issues on 30-series cards, and Naughty Dog is releasing multiple patches this week...this won't be fixed with just a few patches, but in a few months it should be in much better shape, the same way Horizon Zero Dawn took a few months to 'fix' the major issues in regards to CPU utilization
 
Stop it...does that really make any sense to you?...you're taking an unoptimized game and speculating that every PS port going forward will need 64 GB of system RAM and 24 GB of VRAM???...so only 3% of gamers will be able to play these games with no issues?...the issue is with the port itself

Nvidia has already released a hotfix driver for LoU to mitigate the crashing issues on 30-series cards, and Naughty Dog is releasing multiple patches this week...this won't be fixed with just a few patches, but in a few months it should be in much better shape, the same way Horizon Zero Dawn took a few months to 'fix' the major issues in regards to CPU utilization
It seems to be optimized for PS5 hardware, much less so for the low-VRAM GPUs Nvidia seems to like to sell. Of course it can be fixed by dumbing it down: cull more, cut LOD so more objects are not rendered with full textures, shaders, compute shaders, and so on. Will it then look better than the console version, or play better? Now, releasing a PC version that fails to support lower-end hardware is definitely a problem, so yes, the game is unoptimized for PC hardware. Will developers pushing console graphics at this point then spend just as much time reworking it to run on GPUs that can hold fewer assets? This game would have played great on a 3070 if it had 16 GB, and so on. Publishers may just hold off on releasing games that push console graphics on the PC, where the majority of PCs used for gaming will just run them poorly or make them look like crap; why even bother making another game for the lowest common denominator?
 
It seems to be optimized for PS5 hardware, much less so for the low-VRAM GPUs Nvidia seems to like to sell. Of course it can be fixed by dumbing it down: cull more, cut LOD so more objects are not rendered with full textures, shaders, compute shaders, and so on. Will it then look better than the console version, or play better? Now, releasing a PC version that fails to support lower-end hardware is definitely a problem, so yes, the game is unoptimized for PC hardware. Will developers pushing console graphics at this point then spend just as much time reworking it to run on GPUs that can hold fewer assets? This game would have played great on a 3070 if it had 16 GB, and so on. Publishers may just hold off on releasing games that push console graphics on the PC, where the majority of PCs used for gaming will just run them poorly or make them look like crap; why even bother making another game for the lowest common denominator?

In the Digital Foundry LoU PC port video, the PS5 looks better than a PC using Medium textures and is almost as good as a PC with a 4090 and a high-end Intel CPU...a PS5 is roughly equivalent to a 2070 Super and an AMD 3600 CPU, but when you compare them head-to-head the PS5 version looks much better
 
In the Digital Foundry LoU PC port video, the PS5 looks better than a PC using Medium textures and is almost as good as a PC with a 4090 and a high-end Intel CPU...a PS5 is roughly equivalent to a 2070 Super and an AMD 3600 CPU, but when you compare them head-to-head the PS5 version looks much better
We will have to see how the developers patch this game over time and get better at PC-specific paths, etc. Maybe they'll get some help from Nvidia as well.
 
In the Digital Foundry LoU PC port video, the PS5 looks better than a PC using Medium textures and is almost as good as a PC with a 4090 and a high-end Intel CPU...a PS5 is roughly equivalent to a 2070 Super and an AMD 3600 CPU, but when you compare them head-to-head the PS5 version looks much better
As someone who has played it on both PC and PS5: the PC version is hands down so much better looking that it's not even close. I have never trusted Digital Foundry after a couple of shady things they did in regards to reviews, etc. But the PS5 is nowhere near as good as the PC version.
 
As someone who has played it on both PC and PS5: the PC version is hands down so much better looking that it's not even close. I have never trusted Digital Foundry after a couple of shady things they did in regards to reviews, etc. But the PS5 is nowhere near as good as the PC version.

The PC with a 4090 is the best version, that is not in dispute...but a PC with a 2070 Super and an AMD 3600, which is roughly equivalent to a PS5, is noticeably inferior...the high-end PC also has its own issues, such as Ultra shadows not working properly, etc....you can brute force the game with a 4090, but it's still not an ideal port
 
There are too many games like this now to blame an unoptimized game. Too many new titles are showing that you need more than 8 GB.
I would argue there are more games out showing that you absolutely don't need 12-16 GB of VRAM to have a stellar-looking game that also performs great (God of War, Spider-Man, A Plague Tale: Requiem, etc.). You also don't need to max all settings all the time, especially at 4K or 1440p UW; this is part of the problem and part of the reason why they'll continue to sell more expensive hardware (people thinking they need 100+ FPS and maximum settings to have a great experience, just another form of FOMO). TLoU is a bad port in its current form and shouldn't be used as any type of indicator, just as Arkham Knight was when it first launched. Ultra is for screenshots, High is for good-looking gameplay, and High/Medium is also good looking with more of an emphasis on performance, imo.
 
I would argue there are more games out showing that you absolutely don't need 12-16 GB of VRAM to have a stellar-looking game that also performs great (God of War, Spider-Man, A Plague Tale: Requiem, etc.). You also don't need to max all settings all the time, especially at 4K or 1440p UW; this is part of the problem and part of the reason why they'll continue to sell more expensive hardware (people thinking they need 100+ FPS and maximum settings to have a great experience, just another form of FOMO). TLoU is a bad port in its current form and shouldn't be used as any type of indicator, just as Arkham Knight was when it first launched. Ultra is for screenshots, High is for good-looking gameplay, and High/Medium is also good looking with more of an emphasis on performance, imo.
But but... my max settings, which you have to zoom in 300 percent on a screenshot to tell apart from mixed high and ultra!
 