So much for NVIDIA claiming 10 gigs of vram would not be a limitation

Wolfenstein Youngblood absolutely cannot run properly on 8 gigs of VRAM when maxed out with ray tracing, even at only 1440p. Every time somebody tries to argue with me about that, they don't realize they don't actually have the game fully maxed out, because this is one of many games that don't run the highest possible settings when you choose the highest preset. If you set image streaming to Uber with ray tracing and every other setting maxed out at 1440p and DLSS on Quality, the game will lock up. At a minimum you have to turn image streaming down to Ultra, and even then you'll still get some hitching until you lower DLSS from Quality to Balanced. Other than that, Rise of the Tomb Raider is the only other game I know of that has any issue with 8 gigs of VRAM on playable settings at 4K. It hitches in a few areas of the game, which isn't too bad, but those same areas were perfectly smooth on the 1080 Ti compared to the 2080 Super. Both of those games were also flagged by Digital Foundry, so no, I'm not making this up; I have tested it firsthand.
Been waiting for that game to go on steep sale (the previous Wolfenstein wasn't all that great for me).

Stupid question though - you can turn on the aforementioned max settings WITH raytracing (the ones that give the 8GB 2080 Super problems) on a 1080 Ti, a card that doesn't have RT cores? (Same for Shadow of the Tomb Raider, which I've benchmarked; it doesn't hit 60 FPS at max settings at 4K either - well, the 1080 Ti SLI setup that I ran for a few years actually did better than 60 FPS - but no raytracing.)
 
Been waiting for that game to go on steep sale (the previous Wolfenstein wasn't all that great for me).

Stupid question though - you can turn on the aforementioned max settings WITH raytracing (the ones that give the 8GB 2080 Super problems) on a 1080 Ti, a card that doesn't have RT cores?
The comparison with the 1080 Ti was only for Rise of the Tomb Raider. In Wolfenstein Youngblood, turning ray tracing on with the 1080 Ti would probably be a slideshow, if it even worked at all.
 
The comparison with the 1080 Ti was only for Rise of the Tomb Raider. In Wolfenstein Youngblood, turning ray tracing on with the 1080 Ti would probably be a slideshow, if it even worked at all.
Since the 2080 Super at true max settings for SoTTR (RoTTR, while an enjoyable game, didn't have raytracing) doesn't get a smooth 60+ FPS, I can't help you there - not because of potential VRAM usage, but because raytracing plus everything else set to max is too much for the 2080 Super to handle at 4K, whether it has 8GB of VRAM or 48GB.

Or, put another way - not enough raytracing computing power. Turn raytracing off, and everything else at max, and it's smooth like butter.
 
Ok, but we still have to deal with the reality of the meal that's on the menu. Even if the mythical 20GB 3080 were out right now, which it isn't, I doubt it would be less than $200 more. GDDR6X is expensive.

So, at $900 for a Founders type, or more like $1,000 for an AIB... at what point do I say just take the damn $750 AIB 10GB card and upgrade in two years? I'm more likely to recover my money on the cheaper card no matter what happens with VRAM usage.

That's not really an excuse, just the reality of it. If there were a 3080 20GB available now for only $75-100 more, would I just do it? Yes. Will it only be $75 to $100 more? Dream on. The cards are probably going to need completely modified coolers, or it'll have to be the 3090 PCB and cooler minus some RAM... all of which means a lot more cost.
 
Yeah, it's a good point that if it were $150 to $200 more to double the VRAM, most people probably wouldn't fool with it anyway, since you'd be looking at $1,000 for some of these AIB cards.
 
And yet Hunt: Showdown, arguably the best-looking game to date, uses barely any VRAM.

(attached screenshot)
 
What a stupid thing to say. This is a public forum where we discuss things, and discussion can turn into a debate or an argument when weighing the pros and cons. That's how it works when you discuss the specifications of anything, whether it's computers, cars, televisions or whatever else. So again, if you don't want to see or hear any arguments, stay off a public forum.

There's a difference between discussing something and constantly bitching about it.
This topic gets brought up in different threads and there's NO PROOF that any game ACTUALLY uses 10+ GB of VRAM.
NO PROOF.

Adobe Lightroom on my computer uses all available RAM out of my 32GB when it's exporting photos.
OMG, 32GB OF RAM IS NOT ENOUGH! WHY ARE THERE STICKS WITH ONLY 8GB? THAT IS NOT ENOUGH, WE NEED 32GB RAM STICKS!

Also, it's really funny how people think there's a big difference between 10GB and 11GB, lol.
Ridiculous.
 
There's a difference between discussing something and constantly bitching about it.
This topic gets brought up in different threads and there's NO PROOF that any game ACTUALLY uses 10+ GB of VRAM.
NO PROOF.

Adobe Lightroom on my computer uses all available RAM out of my 32GB when it's exporting photos.
OMG, 32GB OF RAM IS NOT ENOUGH! WHY ARE THERE STICKS WITH ONLY 8GB? THAT IS NOT ENOUGH, WE NEED 32GB RAM STICKS!

Also, it's really funny how people think there's a big difference between 10GB and 11GB, lol.
Ridiculous.
Oh, the irony. I've never seen somebody complain as much about other people's supposed complaining. Seriously, look at these threads: it's just people discussing things back and forth, like they do with any other spec on a video card. It's mostly calm and mostly about upcoming games and the potential for this to become an issue. It's clear you don't pay close attention to what anybody is actually saying anyway, based on your ridiculous remarks in this thread alone. Bottom line: you seem to be the only one coming unhinged about it, so you might want to consider that you're the problem, not the topic itself.
 
In MY day 256MB was an upgrade! I remember my first card with a whopping 128MB, a Radeon 9600SE AGP. It could barely play Counter-Strike and I was happy.
Back in my day we had an NES with 2KB of memory.
 
And I wish people would stop complaining about those who are complaining about 10 gigs not being enough.

If it doesn't bother you, then great, but it's going to be a talking point for quite a while now, so deal with it if you're going to be on a public forum.

I think the general strategy is going to be DLSS and data streaming. If it's playing back at 4K and looks good, I'm not sure people are going to care how you got there.
 
There's a difference between discussing something and constantly bitching about it.
This topic gets brought up in different threads and there's NO PROOF that any game ACTUALLY uses 10+ GB of VRAM.
NO PROOF.

Adobe Lightroom on my computer uses all available RAM out of my 32GB when it's exporting photos.
OMG, 32GB OF RAM IS NOT ENOUGH! WHY ARE THERE STICKS WITH ONLY 8GB? THAT IS NOT ENOUGH, WE NEED 32GB RAM STICKS!

Also, it's really funny how people think there's a big difference between 10GB and 11GB, lol.
Ridiculous.

Uh, there's tons of proof.

https://www.dsogaming.com/pc-performance-analyses/marvels-avengers-pc-performance-analysis/

And this is last-generation crapola. You really think next-gen games aren't going to need more? Stop BSing. Honestly, the internet's so compromised I don't trust anything people write on forums anymore. Most of the people here could be astroturfing accounts. How much is Nvidia paying you?
 
Back in my day we had a new with 2kB of memory.

Which 1970s machine was that? Every real personal computer I know of from the '80s tended to have at least 64KB of system RAM, and most had more like 512KB plus.
 
Which 1970s machine was that? Every real personal computer I know of from the '80s tended to have at least 64KB of system RAM, and most had more like 512KB plus.
Was supposed to say NES lol. It had 2KB I believe, with each game cart adding whatever expandable RAM it needed.
 
I've been posting on 3D video forums since the dawn of 3D accelerators. I'm no shill for any company; I've owned them all, from Matrox to Diamond to 3dfx to ATI to Nvidia. I'm flat out saying my opinion is that while the 10GB limit MAY become an issue over the next few years, based on the technologies being pushed on PC -AND- on consoles, by both AMD -AND- NVIDIA, it appears to me that the strategy for rendering at high resolutions is changing rapidly and permanently, and that methods of holding down VRAM requirements are going to QUICKLY become standardized. These companies don't ever want to have to make 30GB or 50GB VRAM cards for average consumers... it's just not practical. The goal in my opinion is clearly to get texture streaming and other techs in place ASAP, specifically to cap VRAM needs where they are now, or at least slow the needed growth by an order of magnitude and hold it to what is needed for a given resolution's frame buffer.

There might be one more bump in average VRAM over the next few years, but my crystal ball guess says we are already at or near the max VRAM hardware developers are willing to tolerate designing for with current architectures. Hence the huge push for other ways to improve performance, texture quality and lighting.
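To put a rough shape on what "texture streaming to cap VRAM" means in practice: the core of it is a fixed residency budget with least-recently-used eviction, where only what the current view needs stays resident and whatever hasn't been touched lately gets kicked out when the budget is hit. This is just my own toy sketch of the bookkeeping, not any engine's actual code; the class name and the budget are made up:

Code:
#include <cstdint>
#include <list>
#include <string>
#include <unordered_map>

// Toy model of a streamed-texture residency cache: fixed VRAM budget,
// most-recently-used textures stay resident, least-recently-used get evicted.
class TextureResidencyCache {
public:
    explicit TextureResidencyCache(uint64_t budgetBytes) : budget_(budgetBytes) {}

    // Renderer asks for a texture this frame.
    void Request(const std::string& id, uint64_t sizeBytes) {
        auto it = lookup_.find(id);
        if (it != lookup_.end()) {                  // already resident: mark most-recently-used
            lru_.splice(lru_.begin(), lru_, it->second);
            return;
        }
        while (used_ + sizeBytes > budget_ && !lru_.empty())
            Evict();                                // free VRAM until the new texture fits
        lru_.push_front({id, sizeBytes});           // stand-in for the actual GPU upload
        lookup_[id] = lru_.begin();
        used_ += sizeBytes;
    }

    uint64_t UsedBytes() const { return used_; }

private:
    struct Entry { std::string id; uint64_t size; };

    void Evict() {                                  // drop the least-recently-used texture
        const Entry& victim = lru_.back();
        used_ -= victim.size;
        lookup_.erase(victim.id);
        lru_.pop_back();
    }

    uint64_t budget_ = 0;
    uint64_t used_ = 0;
    std::list<Entry> lru_;                          // front = most recently used
    std::unordered_map<std::string, std::list<Entry>::iterator> lookup_;
};

The point being, the working set is whatever the current view needs, not the whole asset library, which is why streaming can hold VRAM roughly flat even as total asset sizes keep growing.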
 
Gamers Nexus pointed out in their 3080 review that a lot of people concerned about 10GB VRAM are confusing memory allocation vs memory use. People see 10GB or 11GB in software and think the game is using all of that, but it's just allocation.
 
Uh, there's tons of proof.

https://www.dsogaming.com/pc-performance-analyses/marvels-avengers-pc-performance-analysis/

And this is last-generation crapola. You really think next-gen games aren't going to need more? Stop BSing. Honestly, the internet's so compromised I don't trust anything people write on forums anymore. Most of the people here could be astroturfing accounts. How much is Nvidia paying you?

Great data point! Now I'm leaning toward the 3090 again; at least with the 3090 I'm getting a VRAM insurance policy for a long while.
 
Gamers Nexus pointed out in their 3080 review that a lot of people concerned about 10GB VRAM are confusing memory allocation vs memory use. People see 10GB or 11GB in software and think the game is using all of that, but it's just allocation.

That's actually horseshit because it ignores the fact that we're talking about next-generation games, not just old shit. No one is buying a 3080 to play games that belong in a museum. Also, don't forget that your OS uses VRAM. It's not like you get 100% of that 10GB. Windows uses your graphics card to render the desktop, too.

The bottom line is that 10GB is NOT enough for next generation, and shills are parroting Nvidia talking points so they keep getting review samples and don't have to get jobs at Walmart, where they'd be working otherwise.

id Software has already started warning people that 8GB of VRAM is going to be a MINIMUM requirement for next-generation games. Hi. MINIMUM. What do you think the maximum's going to be? 10GB isn't going to cut it.

https://twitter.com/billykhan/status/1301129891035914240
 
Actually, no, it's memory usage. That crappy Avengers game chokes and dies with everything maxed at 4K with 11GB of VRAM.

How do you know? Best I can tell they don't even say in that article how they're getting those numbers. Their OSD looks like GPU-Z or something. GPU-Z will report memory allocation, not usage.
 
That's actually horseshit because it ignores the fact that we're talking about next-generation games, not just old shit. No one is buying a 3080 to play games that belong in a museum.
Horseshit based on what? It doesn't ignore anything. Nobody has claimed that we're only talking about games that "belong in a museum". You're just making horseshit up.
 
How do you know? Best I can tell they don't even say in that article how they're getting those numbers. Their OSD looks like GPU-Z or something. GPU-Z will report memory allocation, not usage.

Read the fucking Avengers performance review, dude. It tanks at 4K with max textures because it doesn't have enough VRAM. And that's 11GB. Not 10GB. It'd be even worse on a 3080.
 
http://www.extremetech.com/gaming/2...y-x-faces-off-with-nvidias-gtx-980-ti-titan-x

From Nvidia’s Brandon Bell on this topic, who told them the following: “None of the GPU tools on the market report memory usage correctly, whether it’s GPU-Z, Afterburner, Precision, etc. They all report the amount of memory requested by the GPU, not the actual memory usage. Cards with larger memory will request more memory, but that doesn’t mean that they actually use it. They simply request it because the memory is available.”
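For anyone who wants to see the OS-side numbers on their own machine, DXGI 1.4 exposes them directly. A rough sketch (assumes Windows 10 and the first enumerated adapter; error handling mostly skipped), and note that even CurrentUsage here is memory committed by your process, which is still closer to "requested" than to the working set the GPU actually touches every frame:

Code:
#include <windows.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;       // adapter 0 = primary GPU

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    // LOCAL = the card's dedicated VRAM segment (NON_LOCAL would be system RAM over the bus).
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) return 1;

    // Budget       = how much the OS will currently let this process commit.
    // CurrentUsage = how much this process has committed (allocation, not per-frame use).
    printf("Budget:       %llu MB\n", static_cast<unsigned long long>(info.Budget) / (1024 * 1024));
    printf("CurrentUsage: %llu MB\n", static_cast<unsigned long long>(info.CurrentUsage) / (1024 * 1024));
    return 0;
}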
 
Read the fucking Avengers performance review, dude. It tanks at 4K with max textures because it doesn't have enough VRAM. And that's 11GB. Not 10GB. It'd be even worse on a 3080.

So you expect the 3080 to perform worse in this game than the 2080 Ti?
 
Read my fucking posts, dude. They're measuring memory allocation, NOT MEMORY USAGE.

THE FRAMERATE LITERALLY GOES TO SHIT IN AVENGERS WHEN YOU RUN IT AT 4K WITH MAX TEXTURES, BECAUSE IT HAS TO COPY FROM SYSTEM RAM WHEN IT RUNS OUT OF VRAM. IT RUNS FINE WITH LOWER TEXTURE SETTINGS THAT DON'T USE AS MUCH RAM.

HI.

SPEAK ENGLISH?

CAPABLE OF READING?

TRY IT.
 
So you expect the 3080 to perform worse in this game than the 2080 Ti?

It runs like shit with both. You can't copy from system RAM to VRAM while a game is running without completely tanking the framerate. 11GB of VRAM isn't enough for it, and 10GB certainly isn't enough for it.

If you go with a 3080, you will be playing games at medium texture settings OR LOWER. Buyer beware.
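And the reason spilling over the bus tanks the framerate is just bandwidth. A back-of-envelope sketch, using rounded figures I'm assuming here (roughly 760 GB/s for the 3080's GDDR6X, roughly 32 GB/s for PCIe 4.0 x16, and a made-up 1GB spill):

Code:
#include <cstdio>

int main() {
    // Rough, rounded bandwidth figures in GB/s (assumptions, not measurements).
    const double vramGBps = 760.0;   // RTX 3080 GDDR6X, approximate peak
    const double pcieGBps = 32.0;    // PCIe 4.0 x16, theoretical peak

    // Hypothetical: 1 GB of texture data that no longer fits in VRAM and
    // has to be read over the bus instead.
    const double spillGB = 1.0;

    const double msFromVram = spillGB / vramGBps * 1000.0;   // ~1.3 ms
    const double msFromPcie = spillGB / pcieGBps * 1000.0;   // ~31 ms

    printf("Reading %.1f GB from VRAM: ~%.1f ms\n", spillGB, msFromVram);
    printf("Reading %.1f GB over PCIe: ~%.1f ms\n", spillGB, msFromPcie);
    printf("A 60 fps frame budget is ~16.7 ms, so the spill alone blows it.\n");
    return 0;
}

Even at theoretical PCIe peak, one spilled gigabyte costs roughly two whole frames at 60 fps, which is exactly the hitching people describe when a game outgrows its VRAM.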
 
THE FRAMERATE LITERALLY GOES TO SHIT IN AVENGERS WHEN YOU RUN IT AT 4K WITH MAX TEXTURES, BECAUSE IT HAS TO COPY FROM SYSTEM RAM WHEN IT RUNS OUT OF VRAM. IT RUNS FINE WITH LOWER TEXTURE SETTINGS THAT DON'T USE AS MUCH RAM.

HI.

SPEAK ENGLISH?

CAPABLE OF READING?

TRY IT.

So VRAM is the only factor that could possibly limit performance on a GPU? Got it.
 
Read the fucking Avengers performance review, dude. It tanks at 4K with max textures because it doesn't have enough VRAM. And that's 11GB. Not 10GB. It'd be even worse on a 3080.
Well, to be fair, that game's performance is ass for how shitty it looks. There will be outliers, and Avengers is just a very shitty console port.
 
THE FRAMERATE LITERALLY GOES TO SHIT IN AVENGERS WHEN YOU RUN IT AT 4K WITH MAX TEXTURES, BECAUSE IT HAS TO COPY FROM SYSTEM RAM WHEN IT RUNS OUT OF VRAM. IT RUNS FINE WITH LOWER TEXTURE SETTINGS THAT DON'T USE AS MUCH RAM.

HI.

SPEAK ENGLISH?

CAPABLE OF READING?

TRY IT.

Calm your tits, child. You're likely going to get yourself in trouble with the mods if you keep up that attitude.
 
How do you know? Best I can tell they don't even say in that article how they're getting those numbers. Their OSD looks like GPU-Z or something. GPU-Z will report memory allocation, not usage.

They're also interested to see how the VRAM limitation affects the 3080:


" However, when we used Ultra Settings/High Textures, our performance skyrocketed to 42fps. This appears to be a VRAM limitation. As we can see, the game’s Ultra textures used 10.5GB of VRAM, whereas High textures used 8GB of VRAM. Thus, it will be interesting to see whether the NVIDIA RTX3080 will be able to handle this game in 4K with its 10GB VRAM. "
 
Well, to be fair, that game's performance is ass for how shitty it looks. There will be outliers, and Avengers is just a very shitty console port.

No one's arguing that. I wouldn't wipe my ass with that game. The bottom line, though, is that we have several companies, including id Software, saying 8GB of VRAM is going to be the MINIMUM required spec for next-generation games, so you're sniffing glue if you think you're going to be running games at ultra settings at 4K with 10GB of VRAM.
 
They're also interested to see how the VRAM allocation affects the 3080:


" However, when we used Ultra Settings/High Textures, our performance skyrocketed to 42fps. This appears to be a VRAM limitation. As we can see, the game’s Ultra textures used 10.5GB of VRAM, whereas High textures used 8GB of VRAM. Thus, it will be interesting to see whether the NVIDIA RTX3080 will be able to handle this game in 4K with its 10GB VRAM. "

Again, what do they mean by “used”? We need to know how they're getting their numbers. 95% chance they're reporting allocation, not usage. The game probably just requests more VRAM when you set it to higher texture settings.
 