TechPowerUp reports that Black Ops III uses 8 GB of VRAM and stutters if you have less.
It also needs a lot of system RAM.
Regardless of the value of the game itself, this can only point the way forward for future GPUs.
So the results you see today are INDICATIVE and not precise. This game is a mess to measure. We found a sweet-spot setting that works pretty well and measured in a level (In Darkness Quarantine Zone, Singapore) that allows fairly consistent frame rates. However, the numbers you see WILL change; even your GTX 980 Ti or Fury X will at some point drop to 30 FPS. This means that the results shown today are indicative, not a precise measurement by any standard.
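To see why a single average is only "indicative" for a game that stutters like this, here is a minimal sketch that summarises a frametime log as average FPS and 1% lows. The file name and one-value-per-line format are assumptions for illustration; tools such as FRAPS or PresentMon export per-frame times you would adapt this to.

```python
# Minimal sketch: summarise a frametime log (one frametime in milliseconds per line).
# The file name/format is an assumption; adapt it to whatever your capture tool exports.
def summarise(path="frametimes_ms.txt"):
    with open(path) as f:
        frametimes = [float(line) for line in f if line.strip()]
    frametimes.sort(reverse=True)                      # slowest frames first
    avg_fps = 1000.0 * len(frametimes) / sum(frametimes)
    worst_1pct = frametimes[: max(1, len(frametimes) // 100)]
    low_1pct_fps = 1000.0 * len(worst_1pct) / sum(worst_1pct)
    print(f"average FPS: {avg_fps:.1f}")
    print(f"1% low FPS:  {low_1pct_fps:.1f}  (this is where the stutter shows up)")

if __name__ == "__main__":
    summarise()
```

Two runs with the same average can feel completely different if one has much worse 1% lows, which is exactly the "drops to 30 FPS at one point" behaviour described above.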
Yeeeeeeeah...NO!
Actually yes. Because the problems disappear when you use a Titan X and have sufficient RAM.
That there are issues with the game itself is outside the scope of this thread.
Nobody slammed Crysis because it was so graphically demanding in terms of GPU grunt, so why slam a game which is so graphically demanding because of VRAM grunt?
Crysis was an incredible-looking game that scaled well as settings were increased, and performance was fairly consistent for a given collection of settings. This is just a CoD game that looks like another CoD game; it isn't pushing next-level graphics, yet it runs somewhat poorly.
But isn't their top card, the Fury X, only 4 GB?
Seems like just about every AAA PC game that comes out nowadays is a complete trainwreck.
I thought the argument was that since the consoles are so similar to PC these games would translate over easily?
He means the Titan X. And 4 GB of HBM does not run like 4 GB of GDDR5.
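For context on that claim, a back-of-the-envelope bandwidth comparison (the bus widths and per-pin rates below are the cards' published launch specs, quoted only for illustration; capacity is the same 4 GB either way):

```python
# Rough memory bandwidth: bus width in bytes * effective data rate per pin.
def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    return bus_width_bits / 8 * gbps_per_pin

fury_x_hbm    = bandwidth_gb_s(4096, 1.0)  # Fury X: 4096-bit HBM1 at ~1 Gbps/pin  -> 512 GB/s
gtx_980_gddr5 = bandwidth_gb_s(256, 7.0)   # GTX 980: 256-bit GDDR5 at 7 Gbps/pin  -> 224 GB/s

print(f"Fury X  (4 GB HBM):   {fury_x_hbm:.0f} GB/s")
print(f"GTX 980 (4 GB GDDR5): {gtx_980_gddr5:.0f} GB/s")
# Same capacity, very different bandwidth -- which is the point being made above.
```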
Notice that most of them are running GameWrecks code. Also, the consoles can dynamically use up to a potential 8 GB of their memory as VRAM. I doubt that CoD developers cared enough to recode their game for mere mortals who game without a Titan X or an AMD 300 series card.
That's one of the reasons I don't really get a hard-on for "AAA games". They're all console games with barely an afterthought for PC gaming.
What's even more interesting is its video memory behavior. The GTX 980 Ti, with its 6 GB of video memory, was developing a noticeable stutter. This stutter disappeared on the GTX TITAN X, with its 12 GB of video memory, on which memory load shot up from the maxed-out 6 GB of the GTX 980 Ti to 8.4 GB of video memory. What's more, system memory usage dropped with the GTX TITAN X, down to 8.3 GB.
On Steam Forums, users report performance issues that don't necessarily point at low FPS (frames per second), but stuttering, especially at high settings. Perhaps the game needs better memory management. Once we installed 16 GB RAM in the system, the game ran buttery-smooth with our GTX 980 Ti.
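If you want to watch that overflow behaviour on your own machine, a minimal monitoring sketch along these lines would log VRAM and system RAM once per second while the game runs. It assumes the nvidia-ml-py (pynvml) and psutil packages and an NVIDIA GPU at index 0; it is not part of the game or the reviews quoted here.

```python
import time
import psutil                                   # system RAM usage
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

# Minimal sketch: print VRAM and system RAM use once per second until interrupted.
nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        vram = nvmlDeviceGetMemoryInfo(gpu)
        ram = psutil.virtual_memory()
        print(f"VRAM used: {vram.used / 2**30:5.1f} / {vram.total / 2**30:.1f} GiB   "
              f"RAM used: {ram.used / 2**30:5.1f} / {ram.total / 2**30:.1f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    nvmlShutdown()
```

When the GPU column sits pinned at its maximum while the system RAM column keeps climbing, you are looking at the kind of spill-over that shows up in-game as stutter rather than low average FPS.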
It's not purely a VRAM issue; if anything it looks more like a serious memory leak, on top of the usual "terribad shitty console port, which happens to be worse than usual because it's a CoD game".
Shitty console ports?
When did that start?
And these are the guys people are trusting to handle the "close to the metal" coding required for D3D12 and Vulkan....
Umm, yeah, but the 390X is also basically tying the 980, which also only has 4 GB. Look at the difference between the 290X and the 390X at 1080p:
http://www.guru3d.com/articles_page..._graphics_performance_benchmark_review,7.html
I read over on wtftech that Blizzard's newest game (Overwatch) was "made with consoles in mind". You know what that means................
Sure. 4 GB of HBM does not run like 4 GB of GDDR5.
Check the Advanced Options test:
Turning the distance sliders to maximum just kills performance on the AMD Radeon R9 Fury X CrossFire at 4K. Without these options enabled we were averaging above 60 FPS. Just turning them up to full distance cuts performance in half and makes Grand Theft Auto V virtually unplayable. This does not happen on the TITAN X SLI and GTX 980 Ti SLI configurations.
How many of these NV GameWorst releases do we need to see before we understand this proprietary eye-candy crap isn't worth it? This is the very definition of "lazy programming". 16 GB of RAM and a Titan X for smoothness with those textures? Come on, fanboys, really!