The "You need 8GB vRAM for 4K" myth

You'll be using the same video card for that time period?
The honest truth is, when I first got it virtually nothing used 512MB; by the time I got rid of it, that same amount of vRAM was a bottleneck.
 
They also have sub-1080p resolutions most of the time, no AA, low-quality textures, etc.

So they don't need any extra memory, as they couldn't use it if they wanted to.

What's your point? This changes nothing.

PCs have higher resolutions, higher texture density, higher AA, all of which require more VRAM.


The consoles are the main argument you guys use as far as supposedly "needing" more VRAM, and it's outright false.
 
The consoles are the main argument you guys use as far as supposedly "needing" more VRAM, and it's outright false.

No, it isn't. It's exactly because of those consoles and their unified memory that game developers have gotten lazy with texture compression and generally making good use of VRAM. Since they have so much more memory available (compared to previous-gen consoles), there's no need to try to make good use of it, so the PC versions of those games also consume more VRAM. And since we're very commonly also running higher resolutions than the console versions, it's even more of an issue.

Regardless, the number of games capable of fully using 4GB+ of VRAM at 1080p or 1440p, much less 2160p, is already significant and growing with each new batch of game releases. And it's not like the VRAM usage is going to go down over time.
 
Does anyone have links to solid data on how the PS4/Xbone use their 8GB of unified memory? I saw a couple of articles with some rough numbers along the lines of 2-3GB reserved for the system/OS, 3-4GB for game data and 2GB for video memory.

Provided those numbers are close to reality, wouldn't 4GB of VRAM theoretically give a fair bit of headroom for PC gamers even at higher resolutions?
 
I don't think output resolution and AA are that big a memory hit.

Let's assume Deferred Rendering, which uses 4 buffers (G, Z, Normal, Final). Let's also assume triple-buffering for the output, that single-precision floats are used for everything, and that everything gets an alpha channel:
1920 x 1080 x 4 (3 colours + alpha) x 4 (bytes in an SP float) x 6 (buffers) = 199,065,600 bytes, or 190MB of buffers.
For UHD/4K, that's 760MB of buffers. Big, but still under 1GB.

Now, MSAA is incompatible with Deferred Shading. Here you have the depth buffer, which ends up getting scaled by the MSAA sample count, but you don't have the G and Normal buffers; just the final output buffers and the scaled Z buffer. Let's assume 4x MSAA:
1920 x 1080 x 4 (channels) x 4 (bytes) x (3 final buffers + 1 Z buffer x 4 samples) = 232,243,200 bytes, or 222MB.
For UHD/4K, that's 886MB. Still under 1GB. 4x MSAA is probably overkill for UHD anyway.

So at a worst-case, with 4GB vRAM and 4xMSAA at UHD, you still have over 3GB for texture storage.

::EDIT:: Technically the Z-buffer is only one channel, not 4, and most buffers don't need an Alpha channel. I didn't want to make the calculations even more fiddly.
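
For anyone who wants to plug in their own numbers, here's the same arithmetic as a quick Python sketch (same simplifying assumptions as above, the odd MB of rounding difference aside; real engines mix buffer formats, so treat it as a rough upper bound):

# Back-of-the-envelope render-target sizes, mirroring the arithmetic above.
# Assumptions: RGBA (4 channels), single-precision floats (4 bytes per channel),
# triple-buffered final output. Real engines mix formats, so this is an upper bound.

BYTES_PER_PIXEL = 4 * 4  # 4 channels x 4 bytes

def deferred_buffers_mb(width, height):
    buffers = 3 + 3  # G + Z + Normal, plus 3 triple-buffered final buffers
    return width * height * BYTES_PER_PIXEL * buffers / 2**20

def msaa_buffers_mb(width, height, samples=4):
    buffers = 3 + samples  # 3 final buffers, plus a Z buffer scaled by the sample count
    return width * height * BYTES_PER_PIXEL * buffers / 2**20

for w, h in [(1920, 1080), (3840, 2160)]:
    print(f"{w}x{h}: deferred ~{deferred_buffers_mb(w, h):.0f} MB, "
          f"4xMSAA ~{msaa_buffers_mb(w, h):.0f} MB")
# 1920x1080: deferred ~190 MB, 4xMSAA ~221 MB
# 3840x2160: deferred ~759 MB, 4xMSAA ~886 MB

Either way, the buffers are a few hundred MB at most; it's textures that actually fill a 4GB card.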
 
lol and here I am, waiting for the 8GB 980s for 2560x1440, overkill, but that's how I like it, I expect GTA V to eat it
 
lol and here I am, waiting for the 8GB 980s for 2560x1440, overkill, but that's how I like it, I expect GTA V to eat it

I wouldn't be surprised if GTA V has some astronomical VRAM requirements for maximum texture detail + all extra bells n whistles. I'm holding out for the min/max system requirements to be released before pulling the trigger on the next GPU upgrade.
 
I don't think output resolution and AA are that big a memory hit.

Let's assume Deferred Rendering, which uses 4 buffers (G, Z, Normal, Final). Let's also assume triple-buffering for the output, that single-precision floats are used for everything, and that everything gets an alpha channel:
1920 x 1080 x 4 (3 colours + alpha) x 4 (bytes in an SP float) x 6 (buffers) = 199,065,600 bytes, or 190MB of buffers.
For UHD/4K, that's 760MB of buffers. Big, but still under 1GB.

Now, MSAA is incompatible with Deferred Shading. Here you have the depth buffer, which ends up getting scaled by the MSAA sample count, but you don't have the G and Normal buffers; just the final output buffers and the scaled Z buffer. Let's assume 4x MSAA:
1920 x 1080 x 4 (channels) x 4 (bytes) x (3 final buffers + 1 Z buffer x 4 samples) = 232,243,200 bytes, or 222MB.
For UHD/4K, that's 886MB. Still under 1GB. 4x MSAA is probably overkill for UHD anyway.

So at a worst-case, with 4GB vRAM and 4xMSAA at UHD, you still have over 3GB for texture storage.

::EDIT:: Technically the Z-buffer is only one channel, not 4, and most buffers don't need an Alpha channel. I didn't want to make the calculations even more fiddly.

They really aren't a very big hit, but that's exactly what I meant about the uninformed barging back in earlier to proclaim how badly eleventy billion gigabytes of VRAM are needed for gaming. VRAM requirements are the gigahertz myth of recent years :rolleyes:. Nice explanation for newer members, by the way :).
 
lol and here I am, waiting for the 8GB 980s for 2560x1440, overkill, but that's how I like it, I expect GTA V to eat it


Just remember that if it weren't for people going for overkill now, we wouldn't have more VRAM at a much cheaper price in the future, with games then taking advantage of it. It's a never-ending cycle that is often useless at the time, but it keeps things moving forward. :p
 
Moving on now. I would rather have more GPU RAM and not need it than need it and not have it.
Nothing I said earlier should be misconstrued as to disagree with this statement. Had there been 8 GB 980s around when I was in the market for a video card, and if they weren't some ridiculous amount more expensive than the 4 GB version, I very likely would have purchased them. More RAM is always better so long as you're still within a reasonable price range.

That's not the same thing as asking for stronger evidence to back up a statement like "x GB is absolutely necessary for 4K."
 
Can't run max settings at 4K because of VRAM? Card(s) suck ass.

Reasonable people will just turn down the AA and save a few bucks.
 
Just remember that if it weren't for people going for overkill now, we wouldn't have more VRAM at a much cheaper price in the future, with games then taking advantage of it. It's a never-ending cycle that is often useless at the time, but it keeps things moving forward. :p

Yeah, I agree, but I get flamed when I mention I want the 8GB versions for 2560x1440, both here on [H] and on the GTA forums, oh, and Facebook as well.

But I don't care, I want the buffer regardless, and I intend to go SLI, so it makes even more sense to me.
 
That raises the question: is 4GB currently enough for most games because you don't need more at current resolutions, even 4K, or because the texture quality is lower than it otherwise might be?
 
Or perhaps the GPU bottlenecks on performance before it bottlenecks on VRAM; I don't know for sure, because there aren't enough apples-to-apples or multi-GPU tests to even suggest which it might be.

Yes, there was one for the 290X 4GB vs 8GB, but that more likely shows the 290X was GPU-bottlenecked before it was VRAM-bottlenecked, since the 970 was able to outperform the 290X 8GB in some cases despite having only 4GB of VRAM.
 
I managed to run out of 4GB of VRAM testing at 1080p on my 3x 290s in BF4 at Ultra with 200% resolution scale and 4xAA; on my 2560x1600, not a chance.
Luckily I don't like all the features of Ultra settings in BF4 in multiplayer; while it looks nice, I think it's a hindrance: darker shadows, more suppression blur, more smoke. If I did like all that, I would be selling up and getting the 8GB versions; I don't like compromising if I don't have to.
 
These threads are always amusing; it's the same tired arguments and short-term thinking every time. The difference this time is that we're already seeing games push the limits of most current high-end cards, mainly because the increased memory in the new consoles raises the bar for PC ports. I've been meaning to upgrade for about a year, but prices were wacky, and now I'm holding out for a mainstream card with 6GB+ because I don't think 4GB will be enough by the end of 2015.

If you replace your GPU every 6 months it's probably not a major concern, but if you like to skip a generation or two before upgrading then it is something you should be concerned about. My last several GPUs have had what many people considered at the time to be more than necessary, but by the end of their life the lack of VRAM was the biggest problem.

I have also never seen a game use prefetch-like memory management. In every game where I've run out of VRAM, it starts using system memory and/or the page file, which causes performance to plummet; for instance, a solid 60 fps might drop into the teens or twenties.
 
Besides my very first setup, all my other setups have run out of VRAM over their life before GPU grunt.
 
Shadow of Mordor easily does it, and Far Cry 4 averages 3.5GB with 4GB in some locations. This is just at 1080p. So now you show me why I don't need it?

Do we know why? FC3 ran fine at Ultra settings with my lowly 2GB of VRAM.
 
All I know is, playing some of these newer games at 1080p with my 660 Ti, even at max settings with my VRAM capped, my frames aren't stuttering like they did in the old days. In Dragon Age I'm getting high teens to low 20s and it's smooth. It's not a true VRAM cap. Back in the day, when we had 256MB cards, trying to run games like Oblivion would stutter like a b!@tch!! In those days you couldn't run AA and all the high-end bells and whistles. I think it's more of a gimmick these days than anything: programmers and studios in bed with card manufacturers to get you to buy the latest and create more hype.
 
All I know is, playing some of these newer games at 1080p with my 660 Ti, even at max settings with my VRAM capped, my frames aren't stuttering like they did in the old days. In Dragon Age I'm getting high teens to low 20s and it's smooth. It's not a true VRAM cap. Back in the day, when we had 256MB cards, trying to run games like Oblivion would stutter like a b!@tch!! In those days you couldn't run AA and all the high-end bells and whistles. I think it's more of a gimmick these days than anything: programmers and studios in bed with card manufacturers to get you to buy the latest and create more hype.

Teens to 20s is the opposite of smooth
 
These threads are always amusing; it's the same tired arguments and short-term thinking every time. The difference this time is that we're already seeing games push the limits of most current high-end cards, mainly because the increased memory in the new consoles raises the bar for PC ports. I've been meaning to upgrade for about a year, but prices were wacky, and now I'm holding out for a mainstream card with 6GB+ because I don't think 4GB will be enough by the end of 2015.

If you replace your GPU every 6 months it's probably not a major concern, but if you like to skip a generation or two before upgrading then it is something you should be concerned about. My last several GPUs have had what many people considered at the time to be more than necessary, but by the end of their life the lack of VRAM was the biggest problem.

I have also never seen a game use prefetch-like memory management. In every game where I've run out of VRAM, it starts using system memory and/or the page file, which causes performance to plummet; for instance, a solid 60 fps might drop into the teens or twenties.

The thing I don't get is: let's say you game at XYZ resolution. You have frame rate issues with everything turned to max; you don't have the GPU processing power to get 60+ FPS. You're going to have to turn something down at any rate. The difference between textures on Ultra/Uber/SuperDuper and High is minimal at best. Is there any point to keeping the textures on Ultra?

Why would I buy a card now with more memory for the sake of turning on Ultra textures if I don't even get 60 FPS at a given resolution? I'll just sell it later on and buy a new one.
 
Teens to 20s is the opposite of smooth


OK, it's not smooth, but when you really run out of VRAM, it's a stuttering mess where your character moves, then nothing, then moves, then nothing, etc. I can play Dragon Age in the teens and 20s. It's not smooth like 30+ fps, but it's not stuttery like it's running out of VRAM either.
 
All they need to do is compare an R9 290X 4GB to an R9 290X 8GB at the same resolutions and there you go.

Instead of comparing different GPUs at different resolutions in different games where many other things factor in.

Just test higher-VRAM versions of the same cards to verify the effect of VRAM alone:

R9 290X 4GB to R9 290X 8GB @ same clocks
GTX 780 3GB to GTX 780 6GB @ same clocks

Maybe GTX 780 Ti 3GB to GTX Titan Black, since the Titan is sort of like a 6GB version of the 780 Ti?
 
The thing I don't get is: let's say you game at XYZ resolution. You have frame rate issues with everything turned to max; you don't have the GPU processing power to get 60+ FPS. You're going to have to turn something down at any rate. The difference between textures on Ultra/Uber/SuperDuper and High is minimal at best. Is there any point to keeping the textures on Ultra?

Why would I buy a card now with more memory for the sake of turning on Ultra textures if I don't even get 60 FPS at a given resolution? I'll just sell it later on and buy a new one.

IME I've had to turn things down below an acceptable level due to VRAM limitations much more often than for lack of GPU power. As I mentioned in my other post, I've found that the performance drops from running out of VRAM are much more severe than when I run out of GPU power. This makes a big difference because I can usually deal with an occasional drop from 60 fps to 50 fps, but an occasional drop to 20 fps makes many games unplayable.
 
IME I've had to turn things down below an acceptable level due to VRAM limitations much more often than for lack of GPU power. As I mentioned in my other post, I've found that the performance drops from running out of VRAM are much more severe than when I run out of GPU power. This makes a big difference because I can usually deal with an occasional drop from 60 fps to 50 fps, but an occasional drop to 20 fps makes many games unplayable.



That's the exact opposite of my experience over the years, and I have often run SLI setups, even. I have always had a high resolution for any given time, too. GPU horsepower has always run dry for me long before VRAM.
 
Now that we're past the 512MB vs 1GB VRAM debate of the Radeon 4870 era and the 2GB vs 4GB debate of the GTX 680 era, we are currently debating 4GB vs 8GB, and both camps are right in their own context of usage.

People who say you "need" the extra VRAM never compromise on their 4xSSAA at Ultra, even at 4K, and they are in fact correct. If you need the extra VRAM, you will damn well need it.

People who say you "don't need" the extra VRAM never play beyond 1080p, nor have they loaded a heavily modded Skyrim at 4K, nor have they maxed out Shadow of Mordor, and they are in fact correct. If you don't need the extra VRAM, you will not need it.

What it boils down to in threads like these is a bunch of people with different hardware configurations and gaming criteria doing the timeless VRAM Mexican hat dance. Carry on.
 
This is good and desired behaviour, the exact opposite of concerning. Any RAM not being utilised is simply RAM being wasted doing nothing. It's better to cache even rarely used data in RAM not being used than to simply leave it empty. LstOfTheBrunnenG is entirely correct here.

Yes, I'm aware of that, and I've made numerous comments in the past about how simply monitoring VRAM usage (along with using Task Manager for core activity with regard to threading) is not a valid indicator of whether or not extra VRAM is beneficial.

Perhaps I was not clear about this (typed in a hurry), but the concern is centered on how games will trend going forward in terms of actual VRAM requirements, given the recent increases in real requirements (and not just monitored usage) in games.
 
I think busybeaver is spot on myself.

Guys actually running 4k: "4gb is fine now and will be for a while yet."

Random 1080p users: "ermahgerd, you need 200000gb to run 4k, even 1080p takes 1955732gb, don't listen to those guys who are running it now, I base this on nothing but fantasy but I am right!"

And you agree with the latter.... :p?

I've run high resolutions for decades, and even with SLI setups I always run out of GPU grunt on my high-end GPUs long before VRAM. Almost 1080p in 2004. 2560x1600 in 2008. 4K in May 2014. Always higher than average res by a long shot ;) and it's been without fail.

I guess people can sit around insisting they "need" x GB all they want without ever having used it or knowing, though; it's a free country. I just wish they wouldn't scare people who could afford and would love high-res gaming away from it with their FUD.
 
That raises the question: is 4GB currently enough for most games because you don't need more at current resolutions, even 4K, or because the texture quality is lower than it otherwise might be?
The latter. Using larger textures will very quickly raise RAM usage (because textures are powers-of-two in dimensions, every size increase comes in 4x steps). Raising the texture resolution at ANY output resolution will result in a very large increase in vRAM needed to store the textures in use. Raising the output resolution without changing texture resolution will only result in a moderate increase in vRAM use (calculated earlier in the thread).

Then there's texture caching. This is using the vRAM to store textures that AREN'T needed to render the current scene, but might be needed soon. This means they're available right away if they then do become needed without needing to be requested from the cache in main memory (or from storage on an SSD or HDD). There is no reason not to fill all available vRAM with cached textures, because there is no penalty for dumping that memory and filling it with in-use textures on-the-fly. This is why measuring vRAM usage for modern games will always show near to full utilisation of vRAM (unless the game itself does not have enough textures to fill vRAM).

In the XB1/PS4, caching textures to RAM is really important. If a texture is not in RAM when needed, it has to be loaded either off a pretty slow HDD or a REALLY slow optical drive, which ends up causing texture popping if not handled correctly. There are also other considerations for consoles, because that same RAM bank is also being used for everything else (OS tasks, physics, etc.).
On a PC, you have system RAM. Textures that are not cached in vRAM can be streamed in the background to system RAM, and passed very quickly over the PCI-E bus when needed. The might-need-it texture cache capacity is the spare vRAM capacity PLUS the system RAM.

For a 4GB GPU with 4GB of system RAM and a spinning-rust HDD, there's a possibility that a game ported from a console, one assuming a contiguous 8GB of RAM and making very heavy, highly optimised use of dynamic texture caching, might in certain situations become bottlenecked by the PCI-E bus between the two RAM pools, but that seems pretty unlikely (and would be a case of poor porting). With 8GB of system RAM, you have more than sufficient space to cache textures of at least equal quality to those used on consoles.


In all likelihood you'll be limited by ROP count at high resolutions before you'll be limited by available vRAM for buffers. At high texture resolutions you DO need more vRAM, but the presence of next-gen consoles with heterogeneous architectures does not by itself create pressure to ship those higher-resolution textures.
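
To put rough numbers on the 4x-per-step point, here's a quick Python sketch (the texture sizes and the 8:1 block-compression ratio are illustrative assumptions, not figures from any particular game):

# Why texture resolution, not output resolution, dominates vRAM use:
# doubling a texture's dimensions quadruples its footprint, and a full
# mip chain adds roughly another 1/3. Sizes below are illustrative only.

def texture_mb(size, bytes_per_texel=4, mipmaps=True):
    texels = size * size
    if mipmaps:
        texels += texels // 3  # approximate cost of the full mip chain
    return texels * bytes_per_texel / 2**20

for size in (1024, 2048, 4096):
    uncompressed = texture_mb(size)
    print(f"{size}x{size} RGBA8 + mips: ~{uncompressed:.0f} MB uncompressed, "
          f"~{uncompressed / 8:.1f} MB block-compressed")
# 1024x1024: ~5 MB uncompressed,  ~0.7 MB block-compressed
# 2048x2048: ~21 MB uncompressed, ~2.7 MB block-compressed
# 4096x4096: ~85 MB uncompressed, ~10.7 MB block-compressed

A few hundred textures at the larger sizes dwarf the few hundred MB of render targets calculated earlier in the thread, which is why texture quality is the setting that actually moves the vRAM needle.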
 
IMO the argument that 4GB is or could be preferable to 8 is ridiculous. The fact Nvidia puts 12GB on their Titan cards indicates the larger vRAM has some use, or at least useful potential. Again, imo, for what the company charges for these top-end cards, the least they can do is throw another 4GB of memory on them to provide a bit of extra future-proofing.
 
IMO the argument that 4GB is or could be preferable to 8 is ridiculous.
Which is why nobody is making it. The argument is that 4GB is sufficient, and that neither rendering at 4K nor the use of 8GB of combined RAM by modern consoles means that 8GB on a PC is necessary.
8GB will not incur a performance penalty, but for the vast majority of games will also not confer a benefit.

The fact Nvidia puts 12GB on their Titan cards indicates the larger vRAM has some use, or at least useful potential.
For GPU compute datasets.
The whole differentiating feature of the Titan line over the regular 780s/780 Tis is the unrestricted double-precision floating point performance. Currently the only time DP comes into use in gaming is in a handful of space sims using DP depth buffers to handle massively distant objects and very close objects in the same scene without Z-precision issues (and most simply use SP buffer tricks or multiple overlaid cameras for performance reasons).
 
On a PC, you have system RAM. Textures that are not cached in vRAM can be streamed in the background to system RAM, and passed very quickly over the PCI-E bus when needed. The might-need-it texture cache capacity is the spare vRAM capacity PLUS the system RAM.

Do any games actually do this, though? I was of the belief that textures are only cached in VRAM and otherwise need to be loaded from a drive.

The fact Nvidia puts 12GB on their Titan cards indicates the larger vRAM has some use, or at least useful potential.

Except they don't put 12GB on the Titan cards. The single GPU Titans have 6GB. Only the Titan Z has 12GB, but it's a dual GPU card, so it only has a 6GB frame buffer.
 
Do any games actually do this, though? I was of the belief that textures are only cached in VRAM and otherwise need to be loaded from a drive.



Except they don't put 12GB on the Titan cards. The single GPU Titans have 6GB. Only the Titan Z has 12GB, but it's a dual GPU card, so it only has a 6GB frame buffer.



Every game does it automatically as part of DirectX functionality.
 
COD: Advanced Warfare uses 3.9GB of VRAM on my 1080p setup with all the highest settings, 8x supersampling and 16x AF.
 
Do any games actually do this, though? I was of the belief that textures are only cached in VRAM and otherwise need to be loaded from a drive.
The graphics API runtime actually manages these resources. Developers are provided handles to resources that may reside in video memory, in system RAM or, in the worst case, on disk.

Resource creation itself is pretty costly, but shuffling resources between video memory and system memory is actually relatively quick; in some cases quick enough not to matter for a given frame.
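
As a toy model of that shuffling (not any real API, just a made-up LRU sketch in Python to show why over-committing video memory degrades gradually rather than failing outright):

# Toy residency manager (hypothetical, not a real API): when video memory is
# over-committed, the least-recently-used resources are demoted to system RAM
# and re-uploaded over PCI-E the next time they are needed.
from collections import OrderedDict

class ToyResidency:
    def __init__(self, vram_mb):
        self.vram_mb = vram_mb
        self.resident = OrderedDict()  # resource id -> size in MB, in LRU order
        self.demoted = {}              # resources pushed out to system RAM

    def use(self, res_id, size_mb):
        """Called when a draw needs a resource; returns which path it took."""
        if res_id in self.resident:
            self.resident.move_to_end(res_id)  # refresh LRU position
            return "hit: already in video memory"
        # Evict least-recently-used resources until the newcomer fits.
        while self.resident and sum(self.resident.values()) + size_mb > self.vram_mb:
            victim, victim_size = self.resident.popitem(last=False)
            self.demoted[victim] = victim_size
        self.resident[res_id] = size_mb
        if self.demoted.pop(res_id, None) is not None:
            return "miss: re-uploaded from system RAM over PCI-E (a slow frame)"
        return "miss: first upload to video memory"

mgr = ToyResidency(vram_mb=4096)
print(mgr.use("rock_diffuse_2k", 21))  # "miss: first upload to video memory"

The real runtime and driver logic is far more involved, but the gist matches the description above: blowing past the video memory budget means more trips across the bus and slower frames, not an outright failure.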
 
People are so hung up on having max settings that they use them even when there is no real difference in visual quality. At 1080p I'd say those uncompressed "ultra" textures are lost on many. Even at 1440p the difference in most games is minimal, and nonexistent when moving. At 4K, GPU horsepower usually becomes an issue before VRAM.

You also have to remember that game developers have to develop with multiple PC configurations in mind, most of which don't have over 2GB today. Despite consoles having more memory on tap, there is not much point cramming really high-res textures into those games, as that detail is lost at 1080p, even more so in games that upscale to 1080p from a lower res for performance reasons. As seen in most new multiplatform titles, the consoles seem to run a combination of medium and high settings compared to the PC.
 
Wrong, beaver. People arguing you're fine with 4GB right now, including myself, are actually running 4K. The guys claiming you need 8GB are running lower resolutions like 1080p or 2560. They are, as always, mistaken. Every time this same argument comes up over the years, it is ultimately proven that you run out of GPU horsepower long before VRAM, even in SLI generally. Your GPU simply ends up too slow to make use of the extra features the VRAM would let you enable. Back in 2004 I argued about VRAM when I ran a high-res 1680x1050 monitor. Then again at the end of 2008 when I got a 2560x1600 monitor. Now the same behind-the-times crowd advises wasting heaps of cash or avoiding 4K entirely because of VRAM, as usual, and they are completely wrong (I've run 4K since May 2014).
 
Wrong, beaver. People arguing you're fine with 4GB right now, including myself, are actually running 4K. The guys claiming you need 8GB are running lower resolutions like 1080p or 2560. They are, as always, mistaken. Every time this same argument comes up over the years, it is ultimately proven that you run out of GPU horsepower long before VRAM
You've already been given multiple cases where it wasn't true, and with existing games already knocking on the 4GB door, your claims are genuinely ridiculous, imo. Save this post and come back and read it again in three years, maybe two.

See we told you so.
 