Now you do need 8 GB VRAM, folks

How many of these releases do we need to see to understand this eye candy crap isn't worth it? This is the very definition of "lazy programming". 16GB and a Titan Z for smoothness with those textures? Come on fanboys, really!

Quite honestly I don't care why I need a certain amount of VRAM. It doesn't change the fact you need it to play at your desired settings.

Star Citizen for example would crash if I tried to use DSR due to VRAM. With a Titan X I have no such issue.

Regardless, the circumstances where you run out of VRAM capacity before raw GPU power are slim and hard to find. Not really a consideration for me unless I was going 3x+ multi-GPU.
 
The only thing the next-gen consoles are going to do for PC users is make 8GB of VRAM the norm, which honestly thanks to HBM was going to happen regardless.

All you're seeing from these consoles is developers wasting VRAM because they have the means to just be wasteful. They can't even do 1080p most of the time, and before these consoles I never had a problem with 2GB @ 1080p. Wasn't that long ago the 780's 3GB was excessive; then came the console ports where, rather than optimizing the frick'n game, they'd just see users grab 8GB+ GPUs as fast as possible.

Just shows the industry hasn't learned a damn thing from last generation. No reason any COD game should run like crap on a PC. They've never been superior graphically whatsoever.

the consoles don't have 8 GB of RAM for use with games. the operating systems on both the ps4 and the xbox one use ~3 GB of RAM, leaving 5 GB to games, and that's spread across traditional DRAM use as well as graphics stuff like textures. this whole thing just screams shitty programming.

anyway, who gives a fuck about singleplayer resource usage in a call of duty game? i played the multiplayer beta with a 970 and had no issues at all.
 

What is this nonsense? This isn't indicative of anything. The only time that happens to me is when my CPU gets choked by the advanced options. My statement meant that 4GB of GDDR5 will be maxed out long before 4GB of HBM. In terms of capacity, yes, 4GB is 4GB, but it takes much, much more to max it out versus GDDR5.
 
My statement meant that 4GB of GDDR5 will be maxed out long before 4GB of HBM. In terms of capacity, yes, 4GB is 4GB, but it takes much, much more to max it out versus GDDR5.

To be honest I'm not following. Capacity is capacity. That's like saying a 20 oz cup made of glass needs much more water to be filled than a 20 oz cup made of plastic.

What am I missing here?
 
To be honest I'm not following. Capacity is capacity. That's like saying a 20 oz cup made of glass needs much more water to be filled than a 20 oz cup made of plastic.

What am I missing here?

It's easy to overlook something you don't care about.

Perfect example: Shadow of Mordor at ultra 4K will use around 6GB.

GTX 980

http://youtu.be/Px3I9cP6WUw

R9 Fury X

http://youtu.be/k_wNEOviaeE

Watch the VRAM. The GTX 980 is maxed the fuck out; the HBM card doesn't even reach 3.9 and repeatedly drops down to 3.2, etc. This is right in line with my experiences (I also have a 295X2 and 970 SLI), which is that it just takes a lot more to max 4GB of HBM than it does to max 4GB of GDDR5. HBM really blows GDDR5 out. There will be many occasions where 4GB of GDDR5 is a bottleneck where 4GB of HBM is not.
 
It's two different reviewers, running two different setups, in two different parts of the game.
I have yet to see any benchmarks that show 980 SLi thrashing and Fury/X CrossFire running fine on equal benchmarks. I can, however, see at least one situation where the Fury X hits a VRAM ceiling and the 980 Ti doesn't.

I see a lot of people claiming "HBM has more effective capacity than GDDR5" but haven't seen any tangible proof. AMD never made any claims like that in their marketing, no reviewers have made those claims. It's more like an urban myth at this point. Somebody on a forum somewhere said it, other people started repeating it, and now it's truth even though there's no supporting evidence. Just a handful of AMD fanboys in denial.

Fury X has 4 GB not because HBM can utilize it more efficiently but because it's all their current interposer can support w/ HBM1. It's a limitation, not a feature.
I think we should just acknowledge that 4GB isn't really causing any problems in today's games, including at 4K resolution. And that's that.
 
So now I'm an AMD fanboy in denial? I've been using Nvidia's best cards straight for the last 4 years. I just bought AMD this year because Nvidia won't make a 4K IPS G-Sync screen 40"+. I'm not here looking for an argument; I'm just posting my experiences, and I posted supporting evidence. Feel free to post up some numbers and not just your own personal doubt. I don't go only by what reviewers say; the majority of sites are biased. [H] said VRAM was a limiting factor in its GTA V review but posted no numbers? I also see you bashing AMD on the regular, so I wouldn't expect you to believe anything good about team red. Your sig used to say world's #1 Fury critic/hater or something.

Here's another one for ya:

http://youtu.be/xKtvDGySoAI
 
Some of the VRAM problems lie in the caching algorithms; as soon as you have a smart way of doing this, it can be really easy on the PC. But the way it works on consoles is probably the main reason it doesn't function well, since both consoles can address memory from the CPU or GPU without copying it to "VRAM".

That the people doing these console ports aren't doing something smart is evident, and that it should change is obvious. If you have 8GB of main memory and 4GB of VRAM (or even 2GB of VRAM), it should not produce such bad ports.
 
Actually yes. Because the problems disappear when you use a Titan X and have sufficient RAM.

That there are issues with the game itself is outside the scope of this thread.

Nobody slammed Crysis because it was so graphically demanding in terms of GPU grunt, so why slam a game which is so graphically demanding because of VRAM grunt?

I would think the answer is obvious: because this game isn't graphically special in any way and isn't going to be talked about in those types of circles, ever. There is no reason this game should require anything close to the resources that it does. Period. Crysis was special in that regard; this game is junk by comparison.
 
It's two different reviewers, running two different setups, in two different parts of the game.
I have yet to see any benchmarks that show 980 SLi thrashing and Fury/X CrossFire running fine on equal benchmarks. I can, however, see at least one situation where the Fury X hits a VRAM ceiling and the 980 Ti doesn't.

I see a lot of people claiming "HBM has more effective capacity than GDDR5" but haven't seen any tangible proof. AMD never made any claims like that in their marketing, no reviewers have made those claims. It's more like an urban myth at this point. Somebody on a forum somewhere said it, other people started repeating it, and now it's truth even though there's no supporting evidence. Just a handful of AMD fanboys in denial.

Fury X has 4 GB not because HBM can utilize it more efficiently but because it's all their current interposer can support w/ HBM1. It's a limitation, not a feature.
I think we should just acknowledge that 4GB isn't really causing any problems in today's games, including at 4K resolution. And that's that.

Actually, it was stated by someone at AMD that the issue was not with VRAM but with coding, and that a great deal of what is in VRAM was more a result of laziness than necessity. This is likely why you will see one game max VRAM at about 2.9-3.2GB and others go so far as to run at >6GB. Now, I am not sure if the solution they were alluding to was driver related or needed per game. Actually, I think I remember him saying it was driver fixable and would not require dev involvement. I think he stated that they were putting people on that specifically. (Of course, he could have just been saying that because he knew about the hard 4GB limit with initial HBM while Nvidia is using 6GB.)
 
Well, we know the game is not DX12, and we don't know which 290X they're testing, as the 390X runs a 1050MHz GPU and 1500MHz RAM while a reference 290X is up to 1000MHz with 1250MHz RAM.

My 290X is at 1020MHz GPU and 1350MHz RAM. 30MHz on the GPU is a lot to Hawaii; I have tested at 1050MHz (stock voltage, RAM left at 1350MHz) and it scaled up in Firestrike close to the 390X.
 
So we need 8GB of VRAM for a crappy port and mediocre graphics? Sounds like we have a real winner on our hands :rolleyes:
 
Nobody slammed Crysis because it was so graphically demanding in terms of GPU grunt

You've got to be kidding. For years there were people complaining that Crysis must be "poorly optimized" because their shitbox value PCs couldn't cope with it.
 
Nobody slammed Crysis because it was so graphically demanding in terms of GPU grunt, so why slam a game which is so graphically demanding because of VRAM grunt?

That's not true, but yeah, the current whining from the AMD crowd seems like sour grapes because they've been stuck with outdated rebrands.
 
That's not true, but yeah, the current whining from the AMD crowd seems like sour grapes because they've been stuck with outdated rebrands.

Not sure in what universe people weren't complaining that their mid-range cards couldn't run Crysis at high IQ -- it was the pariah of unrealistic PC gamers everywhere for a while. In retrospect, however, of course you can't run a game with that rendering and effects on an 8600 GT.

I really wouldn't, however, equate an envelope-pusher like the original Crysis with a console port like CODBO that's likely badly optimised.
 
Well, VRAM usage is going to go up all the time.
The idea that it won't is just stupid. It takes one or two good games with high VRAM usage before people start realizing that they might need more VRAM. Fury is shit due to its VRAM limitation, just like the 970 is shit due to its VRAM limitation. I honestly prefer the 980 Ti and 390s due to the VRAM (and depending on budget!).
I just find it odd that even though history proves requirements will keep going up, we have people trying to make that sound like laziness. Sure, some of it is, but the reality is that it will go up and you should know that by now.
 
Actually, they set aside around 3-3.5 GB for the OS, so the game can play around with 4.5 to 5 GB of RAM total.

The consoles allocate resources dynamically. Remember HSA? The CPU and GPU literally use blocks of memory that are shared between both, so you can't think of workloads in terms of black and white. The AMD chips have a CPU section and a GPU section, but according to need and load those absolutes can change on the fly.

So you should think of your PC running console ports in the worst possible scenario, where the code from the developer may allocate most of the resources to one extreme or the other. There is the potential for the CPU to have 8GB of RAM. There is also the potential for the GPU to have 8GB of VRAM. Sure, these extremes will most likely never happen, but all you need is for whatever the developer allocated to exceed your PC's resources to induce stuttering.
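You can picture that shared pool something like this (just a toy model I'm sketching, not actual console code; the ~5GB figure is the one mentioned above and the split is made up):

[code]
// Toy model of a console-style unified memory pool (not real PS4/XB1 code):
// CPU-side and GPU-side allocations draw from the same budget, so a game is
// free to skew the split heavily toward "VRAM"-style data such as textures.
#include <cstdint>
#include <iostream>

struct UnifiedPool {
    std::uint64_t total;    // bytes left for the game after the OS reservation
    std::uint64_t cpuUsed;
    std::uint64_t gpuUsed;

    bool alloc(bool forGpu, std::uint64_t bytes) {
        if (cpuUsed + gpuUsed + bytes > total) return false;  // pool exhausted
        (forGpu ? gpuUsed : cpuUsed) += bytes;
        return true;
    }
};

int main() {
    const std::uint64_t GiB = 1ull << 30;
    UnifiedPool pool{5 * GiB, 0, 0};        // ~5 GiB usable by games (figure from the thread)

    pool.alloc(true,  4 * GiB);             // dev leans hard on texture data...
    pool.alloc(false, 1 * GiB);             // ...leaving little for CPU-side data

    std::cout << "GPU-style use: " << pool.gpuUsed / double(GiB) << " GiB, "
              << "CPU-style use: " << pool.cpuUsed / double(GiB) << " GiB\n";
    // A straight port of that split onto a PC with a 2-4 GB card has to spill
    // the difference into system RAM, which is where the stuttering comes from.
}
[/code]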

Now, is CoD the beginning of an upgrade cycle for PC gamers? I don't know, because it's just one game and it's Activision, which has a less-than-stellar PC track record. We need to wait and see how more titles like Hitman and Deus Ex run. Battlefront's recommended spec wants 16GB of RAM. I wonder if that is for texture swapping due to most graphics cards being limited to 4GB of VRAM or less?
 
Well, VRAM usage is going to go up all the time.
The idea that it won't is just stupid. It takes one or two good games with high VRAM usage before people start realizing that they might need more VRAM. Fury is shit due to its VRAM limitation, just like the 970 is shit due to its VRAM limitation. I honestly prefer the 980 Ti and 390s due to the VRAM.
I just find it odd that even though history proves requirements will keep going up, we have people trying to make that sound like laziness. Sure, some of it is, but the reality is that it will go up and you should know that by now.

Reasonably speaking, yes, but to go from 2-3GB a year or two ago to 5-8GB now is not reasonable, nor an expectation of normal software growth. This game alone demonstrates nothing more than crappy dev/software/code work, not some limitation of hardware's ability to handle some new level of software. This game is not pushing any envelopes other than the patience of the customers.

And keep in mind, you have to look at the industry as a whole to see that as of 2 years ago we had just gotten mainline 4GB GPUs (GPUs that could actually handle 4GB or make efficient use of it). Just before that it was 3GB. If you consider that it takes years, not months, to make a game, at least for AAA titles, then you see that the ones coming out now would have originally been aimed at a max of 3-4GB.

So no, this game is not a reasonable expectation of growth.
 
On this game I agree entirely. But the reality is we are going to have a AAA game dev come along within a year or two and push out a game that actually uses that level of resources for a meaningful purpose.
 
I'd say 8GB of RAM and 4GB of VRAM have been the de facto standard for the past year.

I wouldn't have doubled my RAM to 16 but prices have finally become reasonable again.
 
I think 4GB has been the want for most enthusiasts since the 680, honestly. I remember everyone craving the 4GB versions back in the day.
In my experience with this 7970, I will say that I have not wanted to upgrade to a new card until recently. The 4GB cards do not appeal to me, as gaining 1GB of VRAM after this many years seems like a giant mistake. I've been patiently waiting for 6GB and 8GB, and now we finally have that.
 
I would just drop the resolution to reduce VRAM consumption. If I can't do 1600x1200 @ 180, then 1280x960 @ 240 or 1024x768 @ 300 would be OK. The higher refresh rate is a bonus too...
 
Now, is CoD the beginning of an upgrade cycle for PC gamers? I don't know, because it's just one game and it's Activision, which has a less-than-stellar PC track record. We need to wait and see how more titles like Hitman and Deus Ex run. Battlefront's recommended spec wants 16GB of RAM. I wonder if that is for texture swapping due to most graphics cards being limited to 4GB of VRAM or less?

Hell no. Activision + COD + console port = perfect shit storm for a shitastic game.

Well, at least this game delivered on its expected shittiness.
 
I agree, it's just a poorly ported, under-optimised console cr*p game.

That is why it needs 8 gigs of VRAM: the software house cannot be bothered to convert it to PC properly!
 
the consoles don't have 8 GB of RAM for use with games. the operating systems on both the ps4 and the xbox one use ~3 GB of RAM, leaving 5 GB to games, and that's spread across traditional DRAM use as well as graphics stuff like textures. this whole thing just screams shitty programming.

anyway, who gives a fuck about singleplayer resource usage in a call of duty game? i played the multiplayer beta with a 970 and had no issues at all.



Yeah, I'm familiar with it actually being much less, but I was giving a worst-case scenario and even that doesn't make any sense, let alone when the actual amount is much less on the consoles, which, by the way, aren't pushing anything truly to the limits other than wasting VRAM, likely on uncompressed textures.
 
What is this nonsense? This isn't indicative of anything. The only time that happens to me is when my CPU gets choked by the advanced options. My statement meant that 4GB of GDDR5 will be maxed out long before 4GB of HBM. In terms of capacity, yes, 4GB is 4GB, but it takes much, much more to max it out versus GDDR5.
It's nothing to do with CPU usage - it's an overclocked 5960X!
Watch the video again. Every time the game freezes like that, it's dumping out the VRAM.

With HBM AMD are dynamically paging out VRAM into RAM, treating it as one big pool of memory they can use, instead of treating that 4GB as a hard limit.
And clearly that doesn't work in GTA V. I would be surprised if that's the only game where it's a problem.

The only thing the next-gen consoles are going to do for PC users is make 8GB of VRAM the norm, which honestly thanks to HBM was going to happen regardless.
Consoles have a unified pool of 8GB RAM and about 5GB is actually available to games.
I would not expect them to be using even 3GB of that pool as "VRAM."
So far the only games using more than 4GB are games which have an option to use uncompressed textures, rather than higher resolution textures, which should look almost identical 95% of the time.
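For a sense of scale, here's the back-of-the-envelope math on why uncompressed textures balloon VRAM so fast (assuming plain RGBA8 versus a typical 8-bits-per-texel block-compressed format like BC7, mip levels ignored):

[code]
// Back-of-the-envelope texture sizes: uncompressed RGBA8 vs. a block-compressed
// format such as BC7 (8 bits per texel). Illustrative numbers only, no mipmaps.
#include <cstdint>
#include <iostream>

int main() {
    const std::uint64_t w = 4096, h = 4096;
    const double MiB = 1024.0 * 1024.0;

    std::uint64_t rgba8 = w * h * 4;   // 4 bytes per texel, uncompressed
    std::uint64_t bc7   = w * h * 1;   // 1 byte per texel, block compressed

    std::cout << "4K texture, RGBA8 uncompressed: " << rgba8 / MiB << " MiB\n"; // 64 MiB
    std::cout << "4K texture, BC7 compressed:     " << bc7   / MiB << " MiB\n"; // 16 MiB
    // A few dozen uncompressed 4K textures per scene eat VRAM four times
    // faster than the compressed versions would, with little visual gain.
}
[/code]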

There's certainly no reason for a game like Call of Duty to be using so much VRAM.
That said, I do think that 8GB or more will become the norm, since PC gamers expect to be using better than console settings, and not console-equivalent settings.

I look forward to seeing what HBM2 brings. Hopefully that will make 16GB cards standard and 32GB high-end, since it will be 4GB/8GB per stack, rather than 1GB with HBM1.

Wasn't that long ago the 780's 3GB was excessive; then came the console ports where, rather than optimizing the frick'n game, they'd just see users grab 8GB+ GPUs as fast as possible.
Well the last generation of consoles had significantly less VRAM to work with. The PS3 had 256MB and the Xbox 360 had a unified pool of 512MB RAM. So it's understandable that games are now using a lot more VRAM - and they do look better for it.
It's also worth mentioning that many game engines will now pre-allocate as much VRAM as your card has available, which was not typical before. So newer games may "max out" your card but would perform just as well on a card with less VRAM, rather than requiring that amount.
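That pre-allocation behaviour looks roughly like this (a sketch of how an engine might size its streaming pool from free VRAM; the function and numbers are made up, not any specific engine's code):

[code]
// Sketch: an engine that sizes its texture-streaming pool from whatever VRAM is
// reported free, so monitoring tools show the card "maxed out" even though the
// game would run fine with a smaller pool. Hypothetical numbers and logic.
#include <algorithm>
#include <cstdint>
#include <iostream>

std::uint64_t pickStreamingBudget(std::uint64_t freeVram, std::uint64_t minimumNeeded) {
    const std::uint64_t reserve = 512ull << 20;            // keep ~512 MiB headroom
    std::uint64_t budget = freeVram > reserve ? freeVram - reserve : 0;
    return std::max(budget, minimumNeeded);                // never below the floor
}

int main() {
    const std::uint64_t GiB = 1ull << 30;
    const std::uint64_t needed = 2 * GiB;                  // what the scene actually requires

    std::cout << "4 GB card caches:  "
              << pickStreamingBudget(4 * GiB, needed) / double(GiB) << " GiB\n";
    std::cout << "12 GB card caches: "
              << pickStreamingBudget(12 * GiB, needed) / double(GiB) << " GiB\n";
    // Both cards render the same scene; the bigger one just caches more, which
    // is why raw "VRAM used" numbers overstate what a game really requires.
}
[/code]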

I would just drop the resolution to reduce VRAM consumption. If I can't do 1600x1200 @ 180, then 1280x960 @ 240 or 1024x768 @ 300 would be OK. The higher refresh rate is a bonus too...
Drop the texture resolution, not the game resolution. And few people are using a CRT where you'll actually get higher refresh rates at lower resolutions.
 
^^ Like I said, regardless of what's going on, those stutters never happen to me unless my CPU is getting choked. A 970 would be using over 12 gigs in that situation.
 
^^ Like I said, regardless of what's going on, those stutters never happen to me unless my CPU is getting choked
Right, you have a magic 4930K that's faster than an overclocked 5960X.
The CPU usage is visible there, no core is above 50% load. (though the compression makes it difficult to see)

It's very clearly caused by the driver paging out VRAM into RAM.
You can see that the VRAM usage drops and RAM usage increases every time it freezes.

AMD are trying to use HBM's bandwidth as a way to "cheat" having more than 4GB VRAM and it clearly doesn't work with GTA V.
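Roughly, that kind of over-commit works something like this (my own sketch of the idea, not AMD's actual driver code):

[code]
// Sketch of VRAM over-commit: when the working set exceeds the 4 GB of local
// memory, the least-recently-used resources get evicted to system RAM and paged
// back in on demand. My own illustration of the idea, not AMD's driver code.
#include <cstdint>
#include <iostream>
#include <list>

struct Resource { int id; std::uint64_t bytes; };

struct VramManager {
    std::uint64_t capacity = 0, used = 0, evictedBytes = 0;
    std::list<Resource> resident;                  // front = most recently used

    void touch(const Resource& r) {
        resident.remove_if([&](const Resource& x) {
            if (x.id != r.id) return false;
            used -= x.bytes;                       // already resident: re-insert at front
            return true;
        });
        while (used + r.bytes > capacity && !resident.empty()) {
            used -= resident.back().bytes;         // evict LRU into system RAM;
            evictedBytes += resident.back().bytes; // this copy is the visible hitch
            resident.pop_back();
        }
        resident.push_front(r);
        used += r.bytes;
    }
};

int main() {
    const std::uint64_t GiB = 1ull << 30;
    VramManager vram;
    vram.capacity = 4 * GiB;

    for (int i = 0; i < 6; ++i) vram.touch({i, 1 * GiB});  // 6 GiB of demand, 4 GiB of VRAM

    std::cout << "Resident: " << vram.used / double(GiB) << " GiB, "
              << "paged out so far: " << vram.evictedBytes / double(GiB) << " GiB\n";
}
[/code]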
 
Right, you have a magic 4930K that's faster than an overclocked 5960X.
The CPU usage is visible there, no core is above 50% load. (though the compression makes it difficult to see)

It's very clearly caused by the driver paging out VRAM into RAM.
You can see that the VRAM usage drops and RAM usage increases every time it freezes.

AMD are trying to use HBM's bandwidth as a way to "cheat" having more than 4GB VRAM and it clearly doesn't work with GTA V.

You really don't have a clue what you're talking about. You also seem to not understand that the advanced options in GTA V will choke an overclocked 5960X ON AMD DX11. Are you that dense, or do you need proof? Every time, the VRAM drops almost 1GB but RAM stays at 7.6-7.9. Regardless of whether that's being done or not, it doesn't affect my performance. I bet you didn't know the GTX 970 does exactly what you are describing to cheat having more than 3.5GB. Members on this forum had to upgrade from 8GB to 16GB just to alleviate stuttering.
 
My personal experience with BO3. At 2560x1440 max settings my system uses around 9 GB of ram and my 390X maxed out at 4.9 GB of VRAM.

Probably a poorly coded game with a memory leak. Hoping they patch it.
 
Why does it need that much GPU VRAM and system resources? I doubt BO3 is the best-looking game ever, so this is just lazy, inefficient coding from their PC dev team when porting the game to PC.
 
My personal experience with BO3. At 2560x1440 max settings my system uses around 9 GB of ram and my 390X maxed out at 4.9 GB of VRAM.

Probably a poorly coded game with a memory leak. Hoping they patch it.

Do you have it set to actually render at 2560x1440? By default it renders at a lower resolution and then upscales (at least on the 2 systems I have run it on).

At 1920x1200 with everything maxed, it uses 5.6GB of VRAM and around 8.5GB of system RAM. This is on Windows 10 with a R9 390.
 
Ok, so looking at different web pages about the console versions of BO3, both the PS4 and XB1 use what they are calling adaptive v-sync... which really isn't that. They are dynamically enabling/disabling v-sync when the game drops below 60fps.

The XB1 also doesn't run at 1080p some/most of the time.

And textures and other detail of course are not going to be near as good as what the max settings on PC are.

I also tested on a family member's machine which we put one of my old 7970s in. It defaults to medium settings for most stuff. We changed the real render resolution to 1080p.

It maintains 60fps at those settings.

Raising it up a lot made it tank. Textures set to max made it drop to 5fps.

So, yeah, the PC version is of course going to be more demanding than the console versions if you want to turn everything up.
 
That's what adaptive v-sync is.

It's doing what it's supposed to.
 
That's what adaptive v-sync is.

It's doing what it's supposed to.

Adaptive v-sync dynamically adjusts the refresh rate in order to keep screen tearing from happening, like when v-sync is disabled.

This not only requires video card/driver support but also requires that the screen it is hooked up to supports it.

What they are describing here is disabling v-sync and getting screen tearing.
 
Do you have it set to actually render at 2560x1440? By default it renders at a lower resolution and then upscales (at least on the 2 systems I have run it on).

At 1920x1200 with everything maxed, it uses 5.6GB of VRAM and around 8.5GB of system RAM. This is on Windows 10 with a R9 390.

Rendering set to 100%. So, no upscaling for me.
 
Adaptive v-sync dynamically adjusts the refresh rate in order to keep screen tearing from happening, like when v-sync is disabled.

This not only requires video card/driver support but also requires that the screen it is hooked up to supports it.

What they are describing here is disabling v-sync and getting screen tearing.

No, that is not what adaptive v-sync is. It simply turns off v-sync when under the target rate to avoid 1/2-refresh-rate drops.

You are thinking of G-Sync/FreeSync/Adaptive-Sync. Notice there's no "V" on the "Sync".
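In other words, adaptive v-sync boils down to a per-frame check along these lines (a simplified sketch of the idea, not any vendor's actual implementation):

[code]
// Simplified sketch of adaptive v-sync: keep v-sync on while the game can hit
// the refresh rate, and turn it off (accepting tearing) when it can't, instead
// of letting classic v-sync lock the frame rate to half the refresh rate.
#include <initializer_list>
#include <iostream>

bool vsyncEnabledThisFrame(double frameTimeMs, double refreshHz) {
    const double budgetMs = 1000.0 / refreshHz;   // e.g. 16.67 ms at 60 Hz
    return frameTimeMs <= budgetMs;               // under budget: sync; over: tear
}

int main() {
    for (double ft : {14.0, 16.0, 20.0, 33.0}) {
        std::cout << ft << " ms frame -> v-sync "
                  << (vsyncEnabledThisFrame(ft, 60.0) ? "on" : "off") << "\n";
    }
    // With plain v-sync a 20 ms frame gets quantised down to 30 fps;
    // adaptive v-sync just tears for that frame instead of halving the rate.
}
[/code]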
 
No, that is not what adaptive v-sync is. It simply turns off v-sync when under the target rate to avoid 1/2-refresh-rate drops.

You are thinking of G-Sync/FreeSync/Adaptive-Sync. Notice there's no "V" on the "Sync".

Ok, I wasn't aware of the difference. I think the choice of name is confusing at best and purposefully deceptive at worst.

In that case, why not just use triple buffering? And don't say it adds input lag. If done properly it doesn't add input lag... and you don't have to worry about screen tearing.
 
Drop the texture resolution, not the game resolution. And few people are using a CRT where you'll actually get higher refresh rates at lower resolutions.

If you have a CRT, you might as well drop the game res.
 
Ok, I wasn't aware of the difference. I think the choice of name is confusing at best and purposefully deceptive at worst.

In that case, why not just use triple buffering? And don't say it adds input lag. If done properly it doesn't add input lag... and you don't have to worry about screen tearing.

True TB can't be done in DX9/DX10/DX11; it will always have some input lag. It has something to do with how the back buffers sync to the presentation buffer.

That having been said, I'm not a twitch gamer, so I don't care.

I always enable v-sync and TB whenever possible.
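For what it's worth, the distinction usually drawn between a DX-style render-ahead queue and "true" triple buffering looks something like this (a toy model of the two presentation schemes, not a real swap chain):

[code]
// Toy model of the distinction: a 3-deep render-ahead queue (the DX-style flip
// queue) presents frames in FIFO order and therefore adds latency, while
// "true" triple buffering presents the newest completed frame and drops the
// stale one. Just the scheduling idea, not real swap-chain code.
#include <deque>
#include <iostream>

int presentFifo(std::deque<int>& queued) {         // render-ahead queue
    int frame = queued.front();                    // oldest frame goes out first
    queued.pop_front();
    return frame;
}

int presentNewest(std::deque<int>& queued) {       // classic triple buffering
    int frame = queued.back();                     // newest finished frame wins
    queued.clear();                                // stale back buffers get reused
    return frame;
}

int main() {
    std::deque<int> finished = {101, 102, 103};    // frames completed since the last vblank

    std::deque<int> a = finished, b = finished;
    std::cout << "FIFO queue shows frame    " << presentFifo(a)   << " (oldest => more lag)\n";
    std::cout << "Triple buffer shows frame " << presentNewest(b) << " (newest => less lag)\n";
}
[/code]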
 