Now you do need 8 GB VRAM, folks

Quartz-1 (Supreme [H]ardness, joined May 20, 2011, 4,257 messages)
Techpowerup reports that Black Ops III uses 8 GB of VRAM and stutters if you have less.

It also needs lots of RAM.

Regardless of the value of the game itself, this can only point the way forward for future GPUs.
 
Techpowerup reports that Black Ops III uses 8 GB of VRAM and stutters if you have less.

It also needs lots of RAM.

Regardless of the value of the game itself, this can only point the way forward for future GPUs.

Or the game is terribly optimized :p which, in this case, is exactly the problem.
 
No. It far more likely points to very poor coding. I was reading last night how the VRAM fills up on Nvidia cards until it is all in use, and even system RAM was seeing 8 GB of use. The graphs, though, showed AMD cards not doing the same on VRAM. Then again, the graphs showed the 390X with a 10% lead over the 290X, which isn't really feasible unless, as speculated, the 290X was a reference card throttling.

At any rate, it is a poorly coded game: a great scenario for a resource hog, but not so much for efficiency.
 
Yeeeeeeeah...NO!

I think Hilbert over at Guru3D said it best :

So the results you see today are INDICATIVE and not precise. This game is a mess to measure. We found a sweet-spot setting that works pretty well and measure in a level (In Darkness Quarantine Zone, Singapore) that allows fairly consistent frame rates. However, the numbers you see WILL change; even your GTX 980 Ti or Fury X will at one point drop to 30 FPS. This means that the results shown today are indicative, not a precise measurement by any standard.

http://www.guru3d.com/articles-pages/call-of-duty-black-ops-iii-vga-graphics-performance-benchmark-review,1.html
 
Yeeeeeeeah...NO!

Actually yes. Because the problems disappear when you use a Titan X and have sufficient RAM.

That there are issues with the game itself is outside the scope of this thread.

Nobody slammed Crysis because it was so graphically demanding in terms of GPU grunt, so why slam a game which is so graphically demanding because of VRAM grunt?
 
Personally I'd love to see 8 GB become the mainstream VRAM amount. Bigger textures, more detail, and less dependence on tricks to make the environment look better. It also points to the forward thinking of certain GPU designs.

Can't wait for HBM2!
 
So you need 8 GB of VRAM if you are 15 years old and want to play Black Ops 9.

Cool.
 
Actually yes. Because the problems disappear when you use a Titan X and have sufficient RAM.

That there are issues with the game itself is outside the scope of this thread.

Nobody slammed Crysis because it was so graphically demanding in terms of GPU grunt, so why slam a game which is so graphically demanding because of VRAM grunt?

Crysis was an incredible-looking game that scaled well as settings were increased. Performance was fairly consistent for a given collection of settings. This is just a CoD game that looks like another CoD game, isn't pushing next-level graphics, and still runs somewhat poorly.
 
"We noticed that with a variety of cards the behaviour, in terms of graphics memory utilization, per card is rather different. We have seen some scenes use a little more and others a little less, so please take a 20% deviation into account. Once you go to WHQD at 2560x1440 we'll pass 2GB EASILY and since graphics cards in that resolution often have 3 to 4 GB graphics memory, you should be fairly safe. In Ultra HD you'll need a lot more memory as we jump towards close to 4 GB of VRAM usage. COD simply caches as much as it can towards graphics memory & yes .. system memory now as well. But the 2GB cards up-to 1080P performed pretty OK. We recommend 4GB or better for 2560x1440 though."
From the article. Other games will cache everything too; that doesn't mean the game actually has to use it all.
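The distinction that quote is drawing (caching as much VRAM as is available versus actually needing it) can be sketched with a toy LRU cache. Everything below is a hypothetical illustration of the general technique, not the game's actual engine code:

```python
from collections import OrderedDict

class TextureCache:
    """Toy sketch of opportunistic VRAM caching: fill whatever budget
    exists, but evict least-recently-used entries when the budget
    shrinks. All names here are hypothetical."""

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.cache = OrderedDict()  # texture_id -> size_mb

    def load(self, texture_id, size_mb):
        # Reuse a cached copy if present: this is the "free" win
        # that makes caching everything attractive on big cards.
        if texture_id in self.cache:
            self.cache.move_to_end(texture_id)
            return "cache hit"
        self.cache[texture_id] = size_mb
        self._evict_to_budget()
        return "loaded"

    def resize_budget(self, budget_mb):
        # Smaller card: same game, smaller cache. Nothing breaks;
        # textures just get reloaded more often.
        self.budget_mb = budget_mb
        self._evict_to_budget()

    def used_mb(self):
        return sum(self.cache.values())

    def _evict_to_budget(self):
        while self.used_mb() > self.budget_mb and self.cache:
            self.cache.popitem(last=False)  # drop least recently used
```

On an "8 GB" budget the cache happily holds everything; shrink the budget to "4 GB" and the same code still runs fine with fewer resident textures, which is why high reported VRAM usage on a big card doesn't by itself prove the game needs that much.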
 
BLOPS3 also apparently has a system memory leak, so I'm erring on the side that it's a buggy mess.

I'm sure we'll need more VRAM and system RAM eventually (640k anyone?), but today isn't that day.
 
Seems like just about every AAA PC game that comes out nowadays is a complete trainwreck.

I thought the argument was that since the consoles are so similar to PC these games would translate over easily?
 
Crysis was an incredible-looking game that scaled well as settings were increased. Performance was fairly consistent for a given collection of settings. This is just a CoD game that looks like another CoD game, isn't pushing next-level graphics, and still runs somewhat poorly.

Exactly! If this were the next level in graphics over anything we have now, then I would understand the 8 GB need, but it's ridiculous that sloppy programming now has people thinking any card without 8 GB is useless.
But hey, maybe he is practicing to apply for a tabloid position. :D
 
this thread earns the coveted Dumbest Thread of The Day™ award!

Congrats
 
Seems like just about every AAA PC game that comes out nowadays is a complete trainwreck.

I thought the argument was that since the consoles are so similar to PC these games would translate over easily?

Notice that most of them are running GameWrecks code. ;) Also, the consoles can dynamically access a potential 8 GB of VRAM. I doubt the CoD developers cared enough to recode their game for mere mortals who game without a Titan X or AMD 300 series.
 
He means Titan X. And 4 GB HBM does not run like 4 GB GDDR5.

In terms of capacity, it does.

Notice that most of them are running GameWrecks code. ;) Also, the consoles can dynamically access a potential 8 GB of VRAM. I doubt the CoD developers cared enough to recode their game for mere mortals who game without a Titan X or AMD 300 series.

That's one of the reasons I don't really get a hard-on for "AAA games". They're all console games with barely an afterthought for PC gaming.
 
Yeah, I also put my money on bad coding. For example, The Witcher 3, despite its looks and how much it takes to max out, is surprisingly light when it comes to VRAM use. GTA V is not an insane VRAM hog either if you can sacrifice a bit on texture quality, and it still provides excellent visuals. 2 GB cards *sigh* are hopelessly outdated (and became so almost overnight, I might add) no matter how fast the GPU is, but no way in hell is 4 GB already too small for a relatively simple scripted railroad shooter like CoD.
 
Wow, funny how high-end mobile cards (the ones with 8 GB) may actually play this better than some desktops. Well, at least my Titan X will be good for something!
 
And these are the guys people are trusting to handle the "close to the metal" coding required for D3D12 and Vulkan...
 
Not this shit again.

What's even more interesting is its video memory behavior. The GTX 980 Ti, with its 6 GB of video memory, was developing a noticeable stutter. This stutter disappeared on the GTX TITAN X, with its 12 GB of video memory, on which memory load shot up from a maxed-out 6 GB on the GTX 980 Ti to 8.4 GB of video memory. What's more, system memory usage dropped with the GTX TITAN X, down to 8.3 GB.

On Steam Forums, users report performance issues that don't necessarily point at low FPS (frames per second), but stuttering, especially at high settings. Perhaps the game needs better memory management. Once we installed 16 GB RAM in the system, the game ran buttery-smooth with our GTX 980 Ti.

It's not purely a vram issue, and if anything looks more like a serious memory leak issue, and just "terribad shitty console port which happens to be worse than usual because it's a COD game".
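The leak-versus-cache distinction behind that read is easy to state: cache-like usage climbs and then plateaus at whatever budget exists, while a leak just keeps climbing. A toy heuristic over sampled usage readings (my own sketch, not anything from the article or the game):

```python
def classify_growth(samples_mb, tolerance_mb=64):
    """Classify a series of memory-usage readings taken at regular
    intervals. Hypothetical heuristic for illustration only:
    - keeps climbing between every sample -> smells like a leak
    - climbs then flattens out -> smells like opportunistic caching
    """
    if len(samples_mb) < 3:
        return "inconclusive"
    # Per-interval growth between consecutive readings.
    deltas = [b - a for a, b in zip(samples_mb, samples_mb[1:])]
    if all(d > tolerance_mb for d in deltas):
        return "possible leak"
    if abs(deltas[-1]) <= tolerance_mb:
        return "plateaued (cache-like)"
    return "inconclusive"
```

By this crude measure, the reported behavior (stutter that vanishes once 16 GB of system RAM is installed, usage that never settles on smaller cards) fits the "keeps climbing" pattern better than the "fills a budget and plateaus" one.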
 
Notice that most of them are running GameWrecks code. ;) Also, the consoles can dynamically access a potential 8 GB of VRAM. I doubt the CoD developers cared enough to recode their game for mere mortals who game without a Titan X or AMD 300 series.

Actually, they set aside around 3-3.5 GB for the OS, so the game can play around with 4.5 to 5 GB of RAM total.
 
It's not purely a vram issue, and if anything looks more like a serious memory leak issue, and just "terribad shitty console port which happens to be worse than usual because it's a COD game".

Shitty console ports?

When did that start?
 
Shitty console ports?

When did that start?

As soon as the Xbone and PS4 were introduced to the market, sadly. Consoles are always a cancer, but today it's just worse than ever.
 
I read over on wtftech that Blizzard's newest game (Overwatch) was "made with consoles in mind". You know what that means................
 
And these are the guys people are trusting to handle the "close to the metal" coding required for D3D12 and Vulkan...

Console Master Race of Developers using l33t bare-metal programming tricks to get up to 30fps out of 720p no AA no AF low textures on last year's hardware. Bow down, DX / OpenGL peasant.
 
Arkham Knight and Black Ops III are poorly optimized games. If The Witcher 3, Phantom Pain, etc. can run fine and look great with much less VRAM, then I don't see 8 GB becoming a new trend.
 
I read over on wtftech that Blizzard's newest game (Overwatch) was "made with consoles in mind". You know what that means................

I kinda feel like the tables have turned, and 5 years down the road PC might mean Personal Console instead. The good news is PCMR can continue to use their old acronym, I guess.
 
I read over on wtftech that Blizzard's newest game (Overwatch) was "made with consoles in mind". You know what that means................

Overwatch was almost certainly originally built for PC, but its playability lends itself very well to console, and the conversion isn't that hard. Diablo is the same: definitely made for PC, but it plays very well on console.
 
So at 1920x1200 with BLOPS3 on my system (just switched to my R9 390), it was using over 5 GB VRAM and a little over 8 GB system RAM according to MSI Afterburner.

I didn't have all the graphics settings maxed out in the game, but it never once dropped below 60 fps, except in one cutscene which is apparently locked at 30 fps.

Edit: So I turned all settings to max, and VRAM usage went up to 6741 MB while system RAM usage went up to 8782 MB.

The crazy thing is that Afterburner is reporting 17364 MB for the page file. There's absolutely no reason for this.

The fps is still a constant 60.
 
4gb HBM does not run like 4gb ddr5
Check the Advanced Options test:

http://www.hardocp.com/article/2015/10/06/amd_radeon_r9_fury_x_crossfire_at_4k_review/5#.Vj6VTr-hfCN

Turning the distance sliders to maximum just kills performance on the AMD Radeon R9 Fury X CrossFire at 4K. Without these options enabled we were averaging above 60 FPS. Just turning them on to the full distance cuts performance in half and makes Grand Theft Auto V very unplayable. This does not happen on the TITAN X SLI and GTX 980 Ti SLI configuration.
 
How many of these releases do we need to see to understand this eye candy crap isn't worth it? This is the very definition of "lazy programming". 16 GB and a Titan Z for smoothness with those textures? Come on, fanboys, really!
 
How many of these NV gameworst releases do we need to see to understand this proprietary eye candy crap isn't worth it? This is the very definition of "lazy programming". 16GB and a Titan Z for smoothness with those textures? Come on fan boys, really!

It's not a GameWorks game.
 
The only thing the next-gen consoles are going to do for PC users is make 8 GB of VRAM the norm, which honestly, thanks to HBM, was going to happen regardless.

All you're seeing from these consoles is developers wasting VRAM because they have the means to be wasteful. They can't even do 1080p most of the time, and before these consoles I never had a problem with 2 GB at 1080p. It wasn't that long ago that the 780's 3 GB was considered excessive; then came the console ports, where rather than optimizing the frick'n game they'd rather see users grab 8 GB+ GPUs as fast as possible.

Just shows the industry hasn't learned a damn thing from last generation. There's no reason any CoD game should run like crap on a PC. They've never been graphically superior whatsoever.
 