The "You need 8GB vRAM for 4K" myth


First, go watch Linus Tech Tips' video testing a 10320x1440 surround rig using 3x R9 290X.

Watched it?

Now, 10320x1440 is 14,860,800 pixels. 4K (UHD, really) - 3840x2160 - is 8,294,400 pixels. So that surround load is nearly 2x the raw pixel output load of 4K.
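(If you want to double-check the arithmetic, it's a few lines of Python:)

```python
# Raw pixel counts for the two resolutions discussed above.
surround = 10320 * 1440   # triple-1440p surround
uhd = 3840 * 2160         # 4K / UHD

print(f"{surround:,} vs {uhd:,}")       # 14,860,800 vs 8,294,400
print(f"ratio: {surround / uhd:.2f}x")  # ~1.79x - nearly double
```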
Linus specifically chose the R9 290X due to the prevailing wisdom that 4K requires a lot of RAM, and they were the fastest cards available with 8GB per card. In use, only Shadow of Mordor with the Ultra texture pack exceeded 4GB of GPU RAM.
With only one case of high RAM usage (and that using abnormally large textures, and even then only by a small amount) at an output resolution nearly double that of 4K, it can safely be said that 4K does not "require" more than 4GB of GPU RAM. Certainly not for current games, and likely not for future games unless very large texture resolutions become both common and worthwhile (which would require further testing with Shadow of Mordor at very high resolutions under different RAM constraints).
Further testing will be needed even to evaluate what performance penalty, if any, that single case of Shadow of Mordor exceeding 4GB incurs, given the game was already constrained by raw GPU power.

But, I hear you cry, consoles use 8GB!
Yes, but consoles also use that RAM pool for everything, and need to share not only RAM capacity between CPU and GPU but also RAM bandwidth. A PC not only has at least another 8GB of system RAM for non-GPU-specific tasks; its PCI-E bus can also pass data from system RAM to GPU RAM far faster than a console can load data into its RAM from HDD or disc, alleviating that potential bottleneck.
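To put rough numbers on that (ballpark figures I'm assuming for illustration, not measurements):

```python
# Back-of-the-envelope: time to move a hypothetical 2GB texture set.
pcie3_x16 = 15.75   # GB/s, theoretical PCI-E 3.0 x16 throughput
hdd = 0.1           # GB/s, rough sequential read of a console 2.5" HDD
texture_set = 2.0   # GB of texture data to (re)load

print(f"PCI-E: {texture_set / pcie3_x16:.2f} s")  # ~0.13 s
print(f"HDD:   {texture_set / hdd:.0f} s")        # ~20 s
```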



tl;dr 4GB of RAM per GPU is sufficient for 4K in current games, and it is unlikely that more than 4GB will be necessary for at least the near future.

If somebody has a test that shows a noticeable improvement going from 4GB to 8GB at 4K (or any other resolution), I'd love to see it. A direct comparison of an R9 290X 4GB and an R9 290X 8GB would be perfect to eliminate other variables.
 
You've miscalculated that - "4K" is 3840x2160, or 8,294,400 pixels.

I don't think anybody is saying 8GB is an absolute necessity for 4K; most games don't need it, but some need more than 4GB - perhaps 5, maybe 6 at a push - and the nearest available quantity above that is 8GB in most cases, as 6GB cards aren't all that common.
 
You've miscalculated that - "4K" is 3840x2160, or 8,294,400 pixels.
Whoops, amended.
I don't think anybody is saying 8GB is an absolute necessity for 4K; most games don't need it, but some need more than 4GB - perhaps 5, maybe 6 at a push - and the nearest available quantity above that is 8GB in most cases, as 6GB cards aren't all that common.
It's not even 'some'. At the moment, ONE game actually uses more than 4GB, at a higher resolution than 4K, and even then only with one specific setting enabled.
And there very much is an undercurrent of "high-end cards need 6/8/12GB because 4K!", which does not seem to be backed up by evidence.
 
http://www.gamegpu.ru/images/stories/Test_GPU/Action/Far_Cry_4/nv/test/FarCry4_vram.jpg
- Only just over, but if the GeForce in that test had 6GB, based on the results below I'd expect it to be calling ~4.2GB. That might not affect performance, but you can't be certain about that.

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Assassins_Creed_Unity/test/ac_vram.jpg
- Laughable title, but that's definitely an 8GB for 4K title, even if it's a terrible port.

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Call_of_Duty_Advanced_Warfare/test/cod_vram.jpg
- Things are looking a bit tight on this one too.

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Evolve_Alpha/cach/evolve_vram.jpg
- Definitely at least 6GB needed for 4K here.

http://www.gamegpu.ru/images/stories/Test_GPU/Action/Lords_Of_The_Fallen/test/vram_2.jpg
- and here.

Is it every major title? Of course not - the new Borderlands doesn't even need 1GB at 4K - but the above results show that, even if only just, several games are tipping past 4GB already, and would benefit from 6GB: arguably in some cases, undeniably in a small number. That's now - what about next year?
 
I'd be more interested in an apples-to-apples test. I'd love to see a comparison between the following (SLI'ed and XFire'd, as that will be when the combined GPU is powerful enough to determine when/if a bottleneck occurs at given resolutions):

290X 4GB
290X 8GB
780 3GB
780 6GB
980 4GB
(980 8GB if it comes out)

That should be enough data to at least ascertain how much of a gimmick extra VRAM is.
 
I'd be more interested in an apples-to-apples test. I'd love to see a comparison between the following (SLI'ed and XFire'd, as that will be when the combined GPU is powerful enough to determine when/if a bottleneck occurs at given resolutions):

290X 4GB
290X 8GB
780 3GB
780 6GB
980 4GB
(980 8GB if it comes out)

That should be enough data to at least ascertain how much of a gimmick extra VRAM is.

This.
 
Posting memory usage graphs reminds me of people complaining when Vista came out that the OS dared use their precious memory. Never mind the fact that memory consumed by SuperFetch is released instantly in the face of a request that requires more memory than is "available".

I don't think it should matter to anyone what the video memory consumption number is, as there's no way to know what caching algorithm, if any, is at play. Bring on the apples-to-apples tests that the poster above is calling for. If insufficient memory introduces hitching or lowers framerates, then it's time to care. Stressing over the memory consumption is pointless until a performance impact presents itself.
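To illustrate why the raw number can look scary while costing nothing, here's a toy sketch (purely hypothetical, not any real driver's algorithm) of cache-style VRAM management - it happily fills every byte available, and evicts least-recently-used data the instant something else needs room:

```python
from collections import OrderedDict

class ToyTextureCache:
    """Toy LRU cache: fills VRAM opportunistically, evicts only on demand."""

    def __init__(self, capacity_mb):
        self.capacity_mb = capacity_mb
        self.entries = OrderedDict()  # name -> size in MB, oldest first

    def used_mb(self):
        # This total is what a monitoring tool would report as "usage".
        return sum(self.entries.values())

    def load(self, name, size_mb):
        # Evict least-recently-used textures only when space is needed.
        while self.entries and self.used_mb() + size_mb > self.capacity_mb:
            self.entries.popitem(last=False)
        self.entries[name] = size_mb

cache = ToyTextureCache(capacity_mb=4096)
for i in range(100):
    cache.load(f"texture_{i}", 64)   # 6.4GB of loads onto a "4GB card"

print(f"{cache.used_mb()} MB reported in use")  # 4096 - maxed out, harmlessly
```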
 
Agreed - I have tried looking for one but haven't really come across anything useful (i.e. with proper frame-rate-over-time graphs rather than just min/avg), so for now those are all we have to go on.
 
It's not the resolution that makes much difference for VRAM usage. It's areas such as textures. Sure, you can use 4GB of VRAM for 4K, just like you can use 128MB of RAM for 1080p. But some of us would prefer higher-quality textures along with more stuff on the screen too (less pop-in, longer draw distances, etc.).
 
I never cared to watch anything "Linus" does. He is just annoying to me to begin with. But it is what it is.

Moving on now. I would rather have more GPU RAM and not need it than need it and not have it. Plus, games are changing and becoming more demanding now.
 
It's not the resolution that makes much difference for VRAM usage. It's areas such as textures.

But for the same texture settings, VRAM usage goes up pretty linearly with an increase in resolution.
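The render targets certainly do. A rough estimate, assuming 4 bytes per pixel and three full-screen targets (ignoring MSAA and engine-specific buffers):

```python
def render_targets_mb(width, height, targets=3, bytes_per_pixel=4):
    # e.g. colour + depth + one auxiliary buffer, 32 bits per pixel each
    return width * height * targets * bytes_per_pixel / 2**20

print(f"{render_targets_mb(1920, 1080):.0f} MB")  # ~24 MB at 1080p
print(f"{render_targets_mb(3840, 2160):.0f} MB")  # ~95 MB at 4K, exactly 4x
```

(Note the render targets alone are a small slice of a 4GB card even at 4K - the bulk of usage is textures, as noted above.)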

My personal conclusion is that 4K just isn't feasible for gaming yet...at least not without an insane budget.
 
The concern people are expressing is that the newest batch of games is showing very high VRAM usage even at 1080p, and that this trend is likely to continue in the future.

Four games were used for this test -

Beyond Earth
Dirt Showdown
Portal 2
Shadow of Mordor

So 3 of the 4 games tested were not the new VRAM pushing games and actually have benchmark data showing them playable with a single GPU at 4K.

Then we have Shadow of Mordor, where he mentions in the test that there may have been a certain section that eclipsed 4GB of VRAM, but he was not able to verify due to issues with the menu.

Expanding on Shadow of Mordor: from what I understand, the heavy VRAM usage is not throughout the game, but certain sections using the highest textures do have issues, even at 1080p.

Regarding that last point, this also brings up an issue related to testing. What counts as a VRAM issue? Do performance issues that occur in only a certain segment count? What if it only manifests every 15 minutes of gameplay on average? Every 5 minutes? Every minute?

Also, PCPer did benchmark Mordor at 4K - http://www.pcper.com/reviews/Graphi...rmance-Testing/4K-Testing-and-Closing-Thought
 
People are still wasting time arguing over this topic? :rolleyes:

And as if Linus's video is even close to being a case/argument/point anyway.
 
Moving on now. I would rather have more GPU RAM and not need it than need it and not have it. Plus, games are changing and becoming more demanding now.

Sigh. I just don't understand this.
 
Future-proofing is silly and impossible. Most of the posters in this thread who have 8GB of RAM today, when barely 4GB is needed, will be the first to have 16GB cards in five years, when barely 8GB is required.

Seems so silly to me.
 
Future-proofing is silly and impossible. Most of the posters in this thread who have 8GB of RAM today, when barely 4GB is needed, will be the first to have 16GB cards in five years, when barely 8GB is required.

Seems so silly to me.

Gotta have the best, to say you have the best. Get it? :p

On fire with these questions! :D

After my Titan fascination, I promised myself I wouldn't jump on "upgrade bandwagons" anymore.
 
Future-proofing is silly and impossible. Most of the posters in this thread who have 8GB of RAM today, when barely 4GB is needed, will be the first to have 16GB cards in five years, when barely 8GB is required.

Seems so silly to me.

There will always be a market (though likely a very limited one) for best-of-the-best tech gear. Sites like [H] attract these people, so it is plausible that they represent a larger percentage of the community here than of the broader tech community or tech-consumer communities at large. Therefore, more people here (as a percentage of the total population) will demand more VRAM than is arguably necessary or coveted by the PC gaming community in general.
 
You really don't need more than 4GB now, and if you drop AA - which is much more feasible at the better DPI and resolution 4K affords - you can't really come close to burning up 4GB to the point where it becomes a performance hindrance. In Assassin's Creed Unity, for example, many reviews remark that the top texture quality level is barely distinguishable in screenshots from the one below it, which drastically reduces VRAM usage as well. Shadow of Mordor gains essentially nothing from the separately downloadable Ultra pack, since all it contains is uncompressed versions of the same texture resolutions.

Having gamed at 4K since May of 2014, I can say that, as always, VRAM capacity is WAY overblown in importance by people who don't know better and maybe want, for some perverse reason, to "justify" to themselves why 4K gaming isn't "feasible" at this time, just as they used to for 2560 resolutions, and before that 1680/1920 resolutions many years ago when they had less. It's a very strange and freaky psychological dementia, in my opinion.
 
There are a lot of things people don't get in here, though. We had this same argument about PC RAM, and now people are getting 32GB as the norm. Does that make it right? It is for the user to decide.

The games I play very easily take up the 4GB of GPU RAM I have. I will gladly take more GPU RAM; if you don't need it, then do not get it, if that is how you see it. Some of us do more than just surf the internet and post in forums with our PCs, so more RAM would help us out. Everyone will have a different opinion on the matter here.

Sigh. I just don't understand this.
 
IMO this sounds a little like Nvidia damage control.

Most people I know consider future-proofing (as far as possible) when buying computer hardware, especially video cards, and the R9's 8GB is a nice selling point for AMD. IMO it's not any kind of stretch to claim this much memory will eventually be required as screen resolutions and texture sizes get bigger; it's just a matter of time before it happens. I'm old enough to remember when 512MB of vRAM was considered infinitely cavernous.
 
IMO this sounds a little like Nvidia damage control.

Most people I know consider future-proofing (as far as possible) when buying computer hardware, especially video cards, and the R9's 8GB is a nice selling point for AMD. IMO it's not any kind of stretch to claim this much memory will eventually be required as screen resolutions and texture sizes get bigger; it's just a matter of time before it happens. I'm old enough to remember when 512MB of vRAM was considered infinitely cavernous.

I remember when 40MB hard drives were absolutely enormous ;), and my 1MB video card was slick stuff.

The point is, though, that for today and tomorrow, 4GB is plenty for all available consumer-level display resolutions in gaming. Maybe in 2016 that will finally change by a meaningful amount, but by then most people with currently top-end GPU solutions will want to upgrade for more GPU horsepower anyway. I've had this same silly argument time and again and have always been proven right over the years... GPU ability always runs out far quicker than VRAM capacity in terms of what becomes the bottleneck for settings and performance levels.

Sounds like AMD FUD-spreading from their "Advocacy Program" site (which doesn't require participants to disclose their affiliation, unlike Nvidia's "focus group", which does require them to state they are being rewarded for their postings), or genuinely ignorant people extolling the virtues of 8GB for today's games :rolleyes:. The allocation numbers Afterburner and other programs display are almost meaningless except as a rough idea of relative memory usage between settings; the absolute number includes cached or allocated-but-never-freed memory that inflates it by huge amounts.

Posting memory usage graphs reminds me of people complaining when Vista came out that the OS dared use their precious memory. Never mind the fact that memory consumed by SuperFetch is released instantly in the face of a request that requires more memory than is "available".

I don't think it should matter to anyone what the video memory consumption number is, as there's no way to know what caching algorithm, if any, is at play. Bring on the apples-to-apples tests that the poster above is calling for. If insufficient memory introduces hitching or lowers framerates, then it's time to care. Stressing over the memory consumption is pointless until a performance impact presents itself.

Finally, someone who gets it! This is spot-on. In past comparisons, every single time over the years, the extra VRAM that the typical forum-dwelling "gamer" who doesn't really understand the tech in depth demands and crows "ERMAHGERD NEEDDDDDD!!!11111" about has turned out to be of no performance benefit (heck, in the past it's sometimes been a slight performance loss because of higher latencies from denser memory chips :p).
 
The games I play very easily take up the 4GB of GPU RAM I have.

What games? The evidence suggests otherwise; that is what I don't understand about you saying you're using up all of your VRAM right now. I have 3GB of VRAM and game at 5760x1080, and I'm not seeing any issues.

IMO it's not any kind of stretch to claim this much memory will eventually be required as screen resolutions and texture sizes get bigger; it's just a matter of time before it happens. I'm old enough to remember when 512MB of vRAM was considered infinitely cavernous.

You'll be using the same video card for that entire time period?
 
Shadow of Mordor easily does it, and Far Cry 4 averages 3.5GB and hits 4GB in some locations. This is at just 1080p. So now show me why I don't need it?

What games? The evidence suggests otherwise; that is what I don't understand about you saying you're using up all of your VRAM right now. I have 3GB of VRAM and game at 5760x1080, and I'm not seeing any issues.



You'll be using the same video card for that entire time period?
 
The concern people are expressing is that the newest batch of games is showing very high VRAM usage even at 1080p, and that this trend is likely to continue in the future.
This is good and desired behaviour, the exact opposite of concerning. Any RAM not being utilised is simply RAM being wasted doing nothing. It's better to cache even rarely used data in otherwise-idle RAM than to simply leave it empty. LstOfTheBrunnenG is entirely correct here.
 
There are a lot of things people don't get in here, though. We had this same argument about PC RAM, and now people are getting 32GB as the norm. Does that make it right? It is for the user to decide.

You're wrong about 32GB being common. People are still buying 4GB for light/no gaming, 8GB for the mainstream gamer, and 16GB for the enthusiast gamer. Only the ultra-high-end buyers are paying $300+ for 32GB of RAM... if you think otherwise, you should pop into the General Hardware forum sometime.

You can still play most of today's games smoothly with 4GB of RAM, and we probably won't see games topping 8GB for 2-3 years. By the time games top 16GB of system RAM, most people on the forums will have given in to the upgrade itch :D

Also, people tend to buy more RAM because they tend to keep the rest of the system longer (because we are inevitably GPU-limited). Why bother with extra GPU RAM when most of the enthusiasts discussing 4K gaming are just going to upgrade to more GPU horsepower every year anyway? 4K gaming is cutting-edge in terms of graphics hardware, but requires exactly the same amount of CPU as lower resolutions (or LESS, if you insist on 120/144Hz gaming at lower resolution).
 
Shadow of Mordor easily does it, and Far Cry 4 averages 3.5GB and hits 4GB in some locations. This is at just 1080p.

So Linus barely eclipsed 4GB running 10,320x1440 or whatever, and you're using more than 4GB in the same game at a mere 1920x1080? Doesn't sound feasible, imo.
 
That is why laptops normally come with 4GB-8GB these days. I wouldn't even dare use a PC with 4GB of RAM in this day and age. Glad you think 4GB is normal, though. Like I said, people do use their computers for more than posting in forums and surfing the web. This has nothing to do with what people sell in the forums.

* 7 people that posted in here have anywhere from 8GB up to 32/64GB, in just this thread. Those are just the ones that have their specs listed.

You're wrong about 32GB being common. People are still buying 4GB for light/no gaming, 8GB for the mainstream gamer, and 16GB for the enthusiast gamer. Only the ultra-high-end buyers are paying $300+ for 32GB of RAM... if you think otherwise, you should pop into the General Hardware forum sometime.



Stop walking around the subject and tell me why I do not need it. Your broken-record rinse-and-repeat is getting old. No one cares what Linus does; I wouldn't flip a light switch based on his information. We all use different settings in a game. I'm still waiting on you to show me why I don't need it.

So Linus barely eclipsed 4GB running 10,320x1440 or whatever, and you're using more than 4GB in the same game at a mere 1920x1080? Doesn't sound feasible, imo.
 
That is why laptops normally come with 4GB-8GB these days. I wouldn't even dare use a PC with 4GB of RAM in this day and age. Glad you think 4GB is normal, though. Like I said, people do use their computers for more than posting in forums and surfing the web. This has nothing to do with what people sell in the forums.

* 7 people that posted in here have anywhere from 8GB up to 32/64GB, in just this thread.





Stop walking around the subject and tell me why I do not need it. Your broken-record rinse-and-repeat is getting old. No one cares what Linus does; I wouldn't flip a light switch based on his information. We all use different settings in a game. I'm still waiting on you to show me why I don't need it.

Honestly, you can buy whatever you want, but Linus ran SoM with Ultra textures at probably a far higher resolution than you, and he said it was the only game that needed more than 4GB of VRAM. You have to realize there still isn't enough GPU processing power to get a steady 60+ FPS (let alone 120+) in the most demanding games, even at 1080p, so I don't know why you're so hung up on the VRAM, man. I stepped down to an R9 290 and I'll be waiting for GPUs with more grunt, not more memory, as I haven't hit 4GB at 1440p with ultra settings in any game I've got, and I'm sure as heck not getting a steady 60FPS on ultra in those demanding games (although I have run some Dishonored, ME3 and some other UE-engine games at 5K to make up for the lack of AA and had a decent time playing).
 
* 7 people that posted in here have anywhere from 8GB up to 32/64GB, in just this thread. Those are just the ones that have their specs listed.

See? You're trying to justify massive spec inflation based only upon the sigs of some of the highest-end users on these forums. Or would you expect anything less of people posting in a 4K gaming thread who actually care enough to populate their sigs for bragging rights?

OF COURSE those will be top-end, either to fulfill some use beyond gaming or just to have the best. I guarantee that nobody here will still be using their current system when 16GB isn't enough to play a game, so unless you do other things, 32/64GB is overkill.

Believe me, 4GB is more than enough for an office/web machine built today, and 8GB is plenty for a midrange gamer (Core i3 or AMD FX). I recommend 16GB for anyone building an enthusiast-level machine (Core i5 or i7 1151) that will last more than 5 years.

And obviously, pure gamers don't need Socket 2011 except for bragging rights, since tri-SLI and 4x CFX are such a crapshoot. So those with 64GB are either just dreaming, or they are running virtual machines.
 
No one cares what Linus does; I wouldn't flip a light switch based on his information. We all use different settings in a game. I'm still waiting on you to show me why I don't need it.

If you don't take his information as factual, then any evidence I provide will just be dismissed by you as "provided by Linus". So there's not much I can do about it, is there?
 
This is how I feel also. It doesn't make much sense to buy 4GB of RAM when you can get 8GB for $25-$30 more. GPU processing power is another issue: when you buy a top-of-the-line GPU and then have to turn down settings, it is rather annoying after you've spent all that money. You honestly shouldn't have to turn something down like that; you should enjoy the products you bought. Other than the 4K people - they need all the extra help they can get to enjoy it at that resolution.

Honestly, you can buy whatever you want,

You based your information off people selling RAM in the forums, but when I based mine off the people that posted in here, you turned it against me again. One-sided, are we?

See? You're trying to justify massive spec inflation based only upon the sigs of some of the highest-end users on these forums. Or would you expect anything less of people posting in a 4K gaming thread who actually care enough to populate their sigs for bragging rights?


Wait, you put it all on me, and then when I answered, you told me about Linus. I asked you to prove to me why I didn't need it, but see, you are back at square one with Linus again. You're right, there isn't much you can say or do at this point.

If you don't take his information as factual, then any evidence I provide will just be dismissed by you as "provided by Linus". So there's not much I can do about it, is there?

* I've got to get back to work, so there will be delays in answering and posting. Because if I get fired, I will have to be selling RAM and other hardware in the forums.
 
Linus tested the EXACT game you're claiming uses more than 4GB of RAM at 1920x1080 - at a resolution of 10,320x1440 - and found that it only exceeded 4GB in some situations when using the Ultra texture pack. Yet you're claiming Shadow of Mordor as a title that proves 4GB of VRAM isn't enough right now, at the low resolution of 1080p. I game on a surround setup with 3x as many pixels with less than 4GB of VRAM without issues, but according to you I should be experiencing them.
 
I think the current-gen consoles (and the games designed for them) are going to make an impact on how much VRAM is required on the PC side. I think some game developers are going to use engine tricks that spend more VRAM instead of GPU power to implement nicer looks on the consoles, and some of those ideas are going to stick around for the PC ports.
 
Consoles don't have more VRAM in the first place, so I don't see how designing games to use more would come from that facet of games development.
 
You based your information off people selling RAM in the forums, but when I based mine off the people that posted in here, you turned it against me again. One-sided, are we?

No, I based it upon NEW build threads in the General Hardware forum, NOT THE FOR SALE FORUM.

Just check the threads there, and you will see a preponderance of 8-16GB in new builds TODAY.
 
Consoles don't have more VRAM in the first place, so I don't see how designing games to use more would come from that facet of games development.

This. They have about 5GB, or a bit more, of combined system and video memory available for games, so much less for graphics alone (probably 2GB on average for top-end AAA titles). The rest is reserved for the OS, playback features, etc.
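Rough budget math, using commonly cited (approximate, unofficial) figures:

```python
total = 8.0         # GB, unified memory pool on current-gen consoles
os_reserved = 3.0   # GB, commonly reported OS/feature reservation
game_total = total - os_reserved    # ~5 GB for the entire game

cpu_side = 3.0      # GB, assumed game code, logic, audio, streaming buffers
graphics = game_total - cpu_side    # ~2 GB left for "VRAM"-like data

print(game_total, graphics)  # 5.0 2.0
```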

Don't worry, though, the uninformed masses will be back soon. It's funny how the most ignorant people are always the most conceited and self-assured, while the smarter people stop, ask questions, and research.
 
They also run at sub-1080p resolutions most of the time, with no AA, low-quality textures, etc.

So they don't need any extra memory, as they couldn't use it if they wanted to.

What's your point? This changes nothing.

PCs have higher resolutions, higher texture density, higher AA, all of which require more VRAM.
 
Consoles don't have more VRAM in the first place, so I don't see how designing games to use more would come from that facet of games development.

I believe that the ratio of "VRAM"-esque RAM to system-RAM-esque RAM is higher in consoles than PCs' VRAM-to-RAM ratio. I also believe that the GPUs in consoles are usually weaker than in PCs now. As such, I fully expect that some console games will try to obtain the most aesthetic value out of the "VRAM"-esque RAM instead of GPU usage, but only for a year or two.
 