Wolfenstein II shows 8GB vRam is a must

Nightfire

2[H]4U
Joined
Sep 7, 2017
Messages
3,279
Guru3d did a review on Wolfenstein and I thought the results were pretty amazing:
http://www.guru3d.com/articles-page...-pc-graphics-analysis-benchmark-review,1.html

I noticed that the GTX 980 Ti was up to 30% slower than the 1070 - two cards that usually perform within a few percent of each other. I figured it was the typical nVidia rot we have seen with Fermi and Kepler.

Then I noticed the FuryX results.

Man that card got destroyed by an 8 gb 480. It seems that 8 gb might finally be necessary.

We have seen prior games "use" way more than 4 gb before, but that always seemed to be the game just filling up the frame buffer. 4 gb cards didn't seem to take a hit, even at 4k.
Case in point:
https://www.techpowerup.com/reviews/Performance_Analysis/Middle_Earth_Shadow_of_War/5.html

This game really NEEDS more than 4gb vRam. No doubt 8 gb will be the norm for even mid tier cards going forward.
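To put rough numbers on how maxed-out texture settings blow past 4 gb, here is a back-of-envelope sketch. The texture count, resolution, and compression format below are my own illustrative assumptions, not measurements from the game:

```python
# Back-of-envelope texture memory estimate. The texture count, size, and
# compression format are illustrative assumptions, not actual game data.

def texture_bytes(width, height, bytes_per_texel, mipmaps=True):
    """Bytes for one texture; a full mip chain adds roughly 1/3 on top."""
    base = width * height * bytes_per_texel
    return base * 4 // 3 if mipmaps else base

# Hypothetical "ultra" pool: 200 BC7-compressed 4096x4096 textures (1 byte/texel)
pool = 200 * texture_bytes(4096, 4096, 1)
print(f"{pool / 2**30:.1f} GiB")  # ~4.2 GiB - already past a 4 gb frame buffer
```

Double the texture count or step up to uncompressed formats and the total leaves 8 gb behind too, which is why only the top presets ever expose the limit.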
 
Too bad they didn't drop it one quality preset and see how the Fury X does.

Looks like Vega got some legs in this game (based on Rapid Packed Math???) - passing the 1080 with flying colors - but still lagging pretty seriously behind the 1080 Ti.
 
The following quote from Guru3D does warrant mentioning: "Up-to Full HD (1920x1080) in fact any ~4 GB graphics card of decent caliber should do the job well."

For anyone gaming it higher resolutions and at maximum settings, the title of this thread certainly rings true though.
 
The following quote from Guru3D does warrant mentioning: "Up-to Full HD (1920x1080) in fact any ~4 GB graphics card of decent caliber should do the job well."

For anyone gaming it higher resolutions and at maximum settings, the title of this thread certainly rings true though.
It's not a "must" anytime you can drop one setting one tick and be perfectly smooth.

This same sort of claim was made with Doom and nightmare settings.

The difference between nightmare setting and ultra setting was almost entirely unnoticeable -- even in still screenshots. One required 6+ GB of VRAM. The other worked perfectly fine with 4GB.

For that matter - the difference between Low and Ultra was inconsequential with Doom.
http://www.game-debate.com/news/20177/doom-low-vs-ultra-graphics-comparison-video
 
When I'm playing, the most memory usage I see is around 5gb on my 1070 Ti. When I had my 980, it was using 3.5gb.
 
8 GB of memory is a "must" if you select settings which use 8 GB of texture memory - right?
Snarkiness noted, but that never seems to be the case as was pointed out in the TPU link.

The following quote from Guru3D does warrant mentioning: "Up-to Full HD (1920x1080) in fact any ~4 GB graphics card of decent caliber should do the job well."

For anyone gaming it higher resolutions and at maximum settings, the title of this thread certainly rings true though.
Despite this game's "Mein Leben" settings, vRam requirements seem to have stagnated over the last couple of years, just like system ram - a good thing given the current price of ram. I think it will be a LONG time until a game ever NEEDS (not uses) 16 gb of vRam. For a while, vRam requirements seemed to double every couple of years. Looking at all the 2016 and 2017 games, this is the only one to really suffer on ANY settings with a 4 gb frame buffer.
 
Snarkiness noted, but that never seems to be the case as was pointed out in the TPU link.

Sorry, I am just surprised by your use of the word "must" when that's clearly just the highest possible setting which may or may not have any value (as I've often seen with "ultra" level settings).

If you really meant "8gb is a must for highest possible settings", you'll get no snark or disagreement. :) But as indicated by others, you can drop the setting a notch and this never happens - and is really not noticeable, making the word "must" a little questionable.

I guess what I'm saying is I'm really being more pedantic than snarky, at least by intent.
 
My apologies. Really, my point is that it is surprising that this is the first time we have seen a 4 gb card of any kind actually take a hit.
One of the biggest "shortcomings" of the Fury cards was its "inadequate 4gb of vRam which won't be enough for 2016 games... we are already seeing games use up more than 6 gb!!"
Well, that was all BS, and here we are at the end of 2017 and this is maybe the first case where it holds true.
This is a good thing imho. 8 gb will probably suffice well into 2020.
 
My apologies. Really, my point is that it is surprising that this is the first time we have seen a 4 gb card of any kind actually take a hit.
One of the biggest "shortcomings" of the Fury cards was its "inadequate 4gb of vRam which won't be enough for 2016 games... we are already seeing games use up more than 6 gb!!"
Well, that was all BS, and here we are at the end of 2017 and this is maybe the first case where it holds true.
This is a good thing imho. 8 gb will probably suffice well into 2020.

This isn't the first time a 4GB card has gotten killed by an 8GB card, just the first one you noticed.
 
Ok... well can you back up this claim with a link or something?

Jesus, what is this amateur hour?

Not saying you are wrong, but people expect a source.
 
How much did you actually search? If you went back in time via search you'd have seen that Nightmare mode required at least 5 GB VRAM to run. Sure, there were hacks to get around the limitation and apparently it now doesn't require it, but it sure did at one point.
 
The interesting thing here is that the game uses more than 4GB if it is available, but still runs fine in many cases where the game is limited to 4GB. So simply using it because it's there is not equal to it being necessary to run.

But I'd agree it's an early sign that we are seeing a use-case for 4GB+ of vram.
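That "uses it because it's there" behavior can be sketched as a streaming cache with an eviction budget - a minimal toy illustration of the idea, not id Tech's actual texture streamer:

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU texture streamer: fills whatever budget it is given."""
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # texture id -> size in MB

    def request(self, tex_id, size_mb):
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)  # mark as recently used
            return "hit"
        # Evict least-recently-used textures until the new one fits.
        # On a smaller card this means re-streaming, not a crash.
        while self.resident and sum(self.resident.values()) + size_mb > self.budget_mb:
            self.resident.popitem(last=False)
        self.resident[tex_id] = size_mb
        return "miss"

big = TextureCache(budget_mb=8192)    # 8 gb card: most things stay resident
small = TextureCache(budget_mb=4096)  # 4 gb card: same game, more evictions
```

With the same access pattern, the 8 gb cache simply holds more resident and shows higher "usage," while the 4 gb cache still serves every request with occasional re-streams - which is why usage numbers alone don't prove a requirement.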
 
How much did you actually search? If you went back in time via search you'd have seen that Nightmare mode required at least 5 GB VRAM to run. Sure, there were hacks to get around the limitation and apparently it now doesn't require it, but it sure did at one point.

I searched the entire internet. All 12 billion pages.

If that is the case, simply show me an actual test where 4 gb took a hit instead of just being lazy and running your mouth.
 
Important point about buying a card with enough vRAM: It could actually save you money if you are doing a budget build. With enough vRam, you can get by with 8 GB of system Ram instead of 16 GB which would save you upwards of $100 with ddr4 prices.
https://www.techspot.com/article/1535-how-much-ram-do-you-need-for-gaming/page2.html

Key points:
1.) A 6 GB 1060 with 8 GB ram always does better than a 3 GB 1060 with 16 GB of Ram, and it would be cheaper.
[most likely, 8 GB RX580 with 8 GB ram > 4 GB RX470 with 16 GB ram, and again cheaper]
2.) 16 GB Ram is always enough for all of the cards, but is a MUST for the 3 GB 1060
3.) CoD WW2 was the only game that showed the 3 GB card taking a hit vs. the 6 GB (ignoring the 10-15% loss from fewer shaders), no matter how much RAM was used.
4.) WW2 was also the only game where having only 8 GB of Ram put a real hit on the 6 GB 1060.
[It would be interesting to see how an 8 GB Rx 580 does with 8 GB of Ram in this game]

Obviously, having 6-8 GB of vRam AND 16 GB of Ram is the best scenario, but if you are building on a tight budget, the $100 in RAM can go a long way on a better GPU.
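The trade-off is easy to sketch in numbers. Only the ~$100 ddr4 delta comes from the article; the GPU prices below are invented placeholders:

```python
# Hypothetical budget split. Only the ~$100 RAM delta (8 GB -> 16 GB DDR4)
# is taken from the discussion; GPU prices are made-up placeholders.
GPU_PRICES = {"1060 3GB": 200, "1060 6GB": 300}
RAM_PRICES = {8: 80, 16: 180}

def build_cost(gpu, ram_gb):
    return GPU_PRICES[gpu] + RAM_PRICES[ram_gb]

# Same total spend, different split:
vram_build = build_cost("1060 6GB", 8)   # bigger frame buffer
ram_build = build_cost("1060 3GB", 16)   # more system ram
print(vram_build, ram_build)  # 380 380 - per TechSpot, the vRam-heavy split wins
```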
 
FWIW, Wolfenstein New Colossus at 4k is only using around 2.5-3 GB system memory on my rig. I didn't check, but I suppose the majority is being consumed on my 11 GB 1080 Ti.
 
Yes i have tried the game in DSR 4k resoultion ultra it consumed 8gb+ system ram of 3-4gb
This is a confusing sentence.

FWIW, Wolfenstein New Colossus at 4k is only using around 2.5-3 GB system memory on my rig. I didn't check, but I suppose the majority is being consumed on my 11 GB 1080 Ti.

Interesting, so around 14 GB of "combined ram" total. Battlefield 1 and BF2 had similar totals in Steve's test, but basically the opposite split: they were using around 4 GB of vRam and closer to 9 GB of system ram. Curious how much system ram other very vRam-hungry games such as Shadow of Mordor and GTA V use.
- Using lots of vRam could just mean your system consumes less system Ram, as is the case with Wolf 2.

Some games suffer more when the ratio of vRam to system ram consumption is low, as is the case with Wolf 2 - probably a bandwidth thing.

CoD WW2 is definitely the mack daddy of total ram usage right now. It uses 15-20 GB combined depending on the GPU power. However, it looks to do ok with a lower vRam-to-Ram ratio, seeing as the Fury X holds up ok in this game (assuming the 1% lows are doing ok).
 
Important point about buying a card with enough vRAM: It could actually save you money if you are doing a budget build. With enough vRam, you can get by with 8 GB of system Ram instead of 16 GB which would save you upwards of $100 with ddr4 prices.
https://www.techspot.com/article/1535-how-much-ram-do-you-need-for-gaming/page2.html

Key points:
1.) A 6 GB 1060 with 8 GB ram always does better than a 3 GB 1060 with 16 GB of Ram, and it would be cheaper.

That's because the card is cut. They call both the "1060," but the 3GB version has ~10% of its functional units cut.

TechSpot misses little details like this in most of their in-depth articles. Otherwise, they could have overclocked the 3GB card to make up the difference.

Nvidia really brought this shit on themselves, by labeling both "1060." At least AMD has the decency to call their cut part the 570.
 
How in the hell does a cod game need that much horsepower?

You mean "how does it need so much VRAM?" The 1060 6GB is perfectly capable of playing the game at a solid 80fps min, just look at those 32GB ram scores!

THE ANSWER IS, IT'S A BULLSHIT TEST DESIGNED SPECIFICALLY TO BREAK THAT 3GB GRAPHICS CARD.

TechSpot lists the test as "Extra" quality, which I'm sure loads gargantuan textures into VRAM, and other VRAM-hungry effects. Stuff your average user would have trouble telling the difference with, but is there for people with monster rigs to justify spending thousands of dollars.

FROM THE ARTICLE, THE AUTHOR ADMITS THE TEST IS BULLSHIT:

The 1060 3GB on the other hand just doesn't have enough VRAM to play Call of Duty WWII at 1080p using the extra quality settings. That's not the end of the world and while some will make a big deal out of this, lowering a few select quality settings will improve things without taking much away from the game.

But it sure is great for benchmarking games and kicking-off MASSIVE freak-out sessions with readers like yourself :D

You want those "extra quality" settings, then you buy a 6GB graphics card and 16GB ram. But if you couldn't give a shit, the 3GB card still delivers 70fps if you drop a few settings, and have 16GB system ram.

I think it's already been understood that this year's crop of games is pushing above the 8GB system ram barrier. Anything less than 16GB is useless for a midrange gaming build. But it's going to be a while before we top 16, so you can avoid spending several hundred dollars on ram.
 
That's because the card is cut. They call both the "1060," but the 3GB version has ~10% of its functional units cut.

TechSpot misses little details like this in most of their in-depth articles. Otherwise, they could have overclocked the 3GB card to make up the difference.

Nvidia really brought this shit on themselves, by labeling both "1060." At least AMD has the decency to call their cut part the 570.

It really should have been a 1060 and a 1060 Ti. But I think they purposely left the $150 price range between the 1060 6GB and the 1070 open for a 1060 Ti at a later date, which never materialized.
 