The honest truth is, when I first got it virtually nothing used 512MB; by the time I got rid of it, the same amount of vRAM was a bottleneck.

You'll be using the same video card for that whole time period?
They also have sub-1080p resolutions most of the time, no AA, low-quality textures, etc.
So they don't need any extra memory, as they couldn't use it even if they wanted to.
What's your point? This changes nothing.
PCs have higher resolutions, higher texture density, and higher AA, all of which require more vRAM.
The consoles are the main argument you guys use for "needing" more vRAM, and it's outright false.
lol and here I am, waiting for the 8GB 980s for 2560x1440. Overkill, but that's how I like it. I expect GTA V to eat it.
I don't think output resolution and AA are that big a memory hit.
Let's assume deferred rendering, which uses 4 buffers (G, Z, Normal, Final). Let's also assume triple buffering for the output, that single-precision floats are used for everything, and that everything gets an alpha channel.
1920 x 1080 x 4 (3 colours + alpha) x 4 (bytes in an SP float) x 6 (buffers) = 199,065,600 bytes, or 190MB of buffers.
For UHD/4k, that's 760MB of buffers. Big, but still under 1GB.
Now, MSAA is incompatible with Deferred Shading. Here, you have the depth buffer that ends up getting scaled by the MSAA scaler, but you don't have the G and Normal buffers. Just the final output buffers and the scaled Z buffer. Let's assume 4x MSAA:
1920 x 1080 x 4 (3 colours + alpha) x 4 (bytes in an SP float) x (3 + 1x4) (3 output buffers, plus the Z buffer at 4x the sample count) = 232,243,200 bytes, or 222MB
For UHD/4k, that's 886MB. Still under 1GB. 4xMSAA is probably overkill for UHD anyway.
So even in the worst case, with 4GB of vRAM and 4xMSAA at UHD, you still have over 3GB left for texture storage.
::EDIT:: Technically the Z-buffer is only one channel, not 4, and most buffers don't need an Alpha channel. I didn't want to make the calculations even more fiddly.
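For anyone who wants to check the arithmetic above, here's a quick Python sketch of the same calculation (buffer_mb is just an illustrative name; it follows the thread's assumptions of 4 channels at 4 bytes each):

```python
# Back-of-the-envelope check of the buffer math above: a sketch, not a
# real renderer model. Assumes 4 channels (RGB + alpha) at 4 bytes
# (single-precision float) per channel, as in the posts above.

def buffer_mb(width, height, channels=4, bytes_per_channel=4, buffers=1):
    """Memory for `buffers` full-screen buffers, in megabytes."""
    return width * height * channels * bytes_per_channel * buffers / 1024**2

# Deferred rendering: G, Z, Normal + 3 final buffers (triple-buffered) = 6.
print(f"1080p deferred: {buffer_mb(1920, 1080, buffers=6):.0f} MB")  # ~190 MB
print(f"UHD deferred:   {buffer_mb(3840, 2160, buffers=6):.0f} MB")  # ~760 MB

# 4x MSAA without deferred shading: 3 final buffers plus the Z buffer at
# 4x the sample count, i.e. (3 + 1*4) = 7 buffer-equivalents.
print(f"1080p 4xMSAA:   {buffer_mb(1920, 1080, buffers=3 + 1*4):.0f} MB")  # ~222 MB
print(f"UHD 4xMSAA:     {buffer_mb(3840, 2160, buffers=3 + 1*4):.0f} MB")  # ~886 MB
```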
lol and here I am, waiting for the 8GB 980s for 2560x1440. Overkill, but that's how I like it. I expect GTA V to eat it.
Nothing I said earlier should be misconstrued as disagreeing with this statement. Had there been 8GB 980s around when I was in the market for a video card, and if they weren't some ridiculous amount more expensive than the 4GB version, I very likely would have purchased one. More RAM is always better so long as you're still within a reasonable price range.

Moving on now: I would rather have more GPU RAM and not need it than need it and not have it.
Just remember that if it wasn't for people going for overkill now, we wouldn't have more VRAM at a much cheaper price in the future, with games then taking advantage of it. It's a never-ending circle that is often useless at the time, but it keeps things moving forward.
Shadow of Mordor easily does it, and Far Cry 4 averages 3.5GB and hits 4GB in some locations. And this is just at 1080p. So now you show me why I don't need it?
All I know is that playing some of these newer games at 1080p on my 660 Ti, even at max settings with my vRAM capped, my frames aren't stuttering like they did in the old days. In Dragon Age I'm getting high teens to low 20s and it's smooth; that's not a true vRAM cap. Back in the day, when we had 256MB cards, trying to run games like Oblivion would stutter like a b!@tch!! In those days you couldn't run AA and all the high-end bells and whistles. I think it's more of a gimmick these days than anything: programmers and studios in bed with card manufacturers to get you to buy the latest and drum up hype.
These threads are always amusing; it's the same tired arguments and short-term thinking every time. The difference this time is that we're already seeing games push the limit of most current high-end cards, mainly because the increased memory in the new consoles raises the bar for PC ports. I've been meaning to upgrade for about a year, but prices were wacky, and now I'm holding out for a mainstream card with 6GB+ because I don't think 4GB will be enough by the end of 2015.
If you replace your GPU every 6 months it's probably not a major concern, but if you like to skip a generation or two before upgrading, then it is something you should be concerned about. My last several GPUs have had what many people considered at the time to be more VRAM than necessary, but by the end of their lives, lack of vRAM was their biggest problem.
I have also never seen a game use prefetch-like memory management. In every game where I've run out of vRAM, it starts using system memory and/or the page file, which causes performance to plummet; for instance, a solid 60fps might drop into the teens or twenties.
Teens to 20s is the opposite of smooth.
The thing I don't get is this: let's say you game at XYZ resolution. You have frame-rate issues with everything turned to max; you don't have the GPU processing power to get 60+ FPS. You're going to have to turn something down in any case. The difference between the textures on Ultra/Uber/SuperDuper and High is minimal at best. Is there any point to keeping the textures on Ultra?
Why would I buy a card now with more memory for the sake of turning on Ultra textures if I don't even get 60FPS at a given resolution? I'll just sell it later on and buy a new one.
IME I've had to turn things down to below an acceptable level due to vram limitations much more often than I've had to for lack of gpu power. As I mentioned in my other post I've found that the performance drops from running out of vram are much more severe than when I run out of gpu power. This makes a big difference because I can usually deal with an occasional drop from 60fps to 50fps but an occasional drop to 20fps makes many games unplayable.
This is good and desired behaviour, the exact opposite of concerning. Any RAM not being utilised is simply RAM being wasted doing nothing. It's better to cache even rarely used data in otherwise idle RAM than to simply leave it empty. LstOfTheBrunnenG is entirely correct here.
I think busybeaver is spot on myself.
That begs the question: is the reason 4GB is currently enough for most games that you don't need more at current resolutions and 4K, or that the texture quality is lower than it otherwise might be?

The latter. Using larger textures will very quickly raise vRAM usage (because textures are powers-of-two in their dimensions, every size increase comes in 4x steps). Raising the texture resolution at ANY output resolution will result in a very large increase in the vRAM needed to store the textures in use. Raising the output resolution without changing texture resolution will only result in a moderate increase in vRAM use (calculated earlier in the thread).
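To put rough numbers on those 4x steps, here's a small Python sketch. It assumes uncompressed RGBA8 at 4 bytes per pixel with a full mip chain adding about a third; real games typically use compressed formats (DXT/BC) that shrink these figures by 4-8x, so treat this as an upper bound:

```python
# A sketch of why texture-size steps dominate vRAM use: texture dimensions
# are powers of two, so each quality step quadruples the pixel count.
# Assumes uncompressed RGBA8 (4 bytes/pixel); a full mip chain adds ~1/3.

def texture_mb(size, bytes_per_pixel=4, mips=True):
    base = size * size * bytes_per_pixel
    return base * (4 / 3 if mips else 1) / 1024**2

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mb(size):.0f} MB each")
# 1024x1024: ~5 MB, 2048x2048: ~21 MB, 4096x4096: ~85 MB.
# Even a few dozen unique 4096x4096 textures add up to multiple GB,
# while the output-resolution buffers calculated above stay under 1GB.
```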
IMO the argument that 4GB is or could be preferable to 8 is ridiculous.

Which is why nobody is making it. The argument is that 4GB is sufficient, and that neither rendering at 4K nor modern consoles' use of 8GB of combined RAM means that 8GB for PC is necessary.
The fact Nvidia puts 12GB on their Titan cards indicates the larger vRAM has some use, or at least useful potential.

For GPU compute datasets.
On a PC, you have system RAM. Textures that are not cached in vRAM can be streamed in the background to system RAM and passed very quickly over the PCI-E bus when needed. The might-need-it texture cache capacity is the spare vRAM capacity PLUS the system RAM.
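As a rough sanity check on "very quickly over the PCI-E bus", here's a back-of-the-envelope Python sketch. The ~15.75 GB/s figure is the theoretical bandwidth of PCIe 3.0 x16; real-world throughput is lower, so this is an optimistic estimate, not a benchmark:

```python
# Rough estimate of streaming a texture from system RAM over PCI-E.
PCIE3_X16_GBPS = 15.75  # GB/s, theoretical peak for PCIe 3.0 x16

def transfer_ms(megabytes, bandwidth_gbps=PCIE3_X16_GBPS):
    """Best-case transfer time in milliseconds."""
    return megabytes / 1024 / bandwidth_gbps * 1000

# A 4096x4096 RGBA8 texture with mips (~85 MB, from the sketch above):
print(f"{transfer_ms(85):.1f} ms")  # ~5.3 ms -- within a 16.7 ms 60fps frame
```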
The fact Nvidia puts 12GB on their Titan cards indicates the larger vRAM has some use, or at least useful potential.
Do any games actually do this, though? I was of the belief that textures are only cached in VRAM and otherwise need to be loaded from a drive.
Except they don't put 12GB on the Titan cards. The single-GPU Titans have 6GB. Only the Titan Z has 12GB, but it's a dual-GPU card, so it effectively has a 6GB frame buffer.
Do any games actually do this, though? I was of the belief that textures are only cached in VRAM and otherwise need to be loaded from a drive.

The graphics API runtime actually manages these resources. Developers are provided handles to resources that may reside in video memory, in RAM or, in the worst case, on the hard disk.
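As a toy illustration of that handle-based residency idea (all names here are hypothetical, not any real API; actual drivers are far more sophisticated):

```python
# A minimal sketch of the residency hierarchy described above: the runtime
# hands out a handle, and the data behind it may live in vRAM, system RAM,
# or on disk in the worst case. Hypothetical names, not a real graphics API.

class TextureHandle:
    def __init__(self, name):
        self.name = name
        self.location = "disk"  # worst case until promoted

class ResourceManager:
    def __init__(self, vram_budget_mb):
        self.vram_budget_mb = vram_budget_mb
        self.vram_used_mb = 0

    def request(self, handle, size_mb):
        """Promote a texture toward vRAM; the caller only sees the handle."""
        if handle.location != "vram":
            if self.vram_used_mb + size_mb <= self.vram_budget_mb:
                handle.location = "vram"  # fast path: resident on the GPU
                self.vram_used_mb += size_mb
            else:
                handle.location = "ram"   # streamed over PCI-E when sampled
        return handle

mgr = ResourceManager(vram_budget_mb=4096)
tex = mgr.request(TextureHandle("rock_diffuse"), size_mb=85)
print(tex.location)  # "vram" while within budget, "ram" once exhausted
```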
Wrong, beaver. People arguing you're fine with 4GB right now, including myself, are actually running 4K. The guys claiming you need 8GB are running lower resolutions like 1080p or 2560x1440. They are, as always, mistaken. Every time this same argument comes up over the years, it is ultimately proven that you run out of GPU horsepower long before vRAM.

You've already been given multiple cases where that wasn't true, and with existing games already knocking on the 4GB door, your claims are genuinely ridiculous IMO. Save this post and come back and read it again in three years, maybe two.