So much for NVIDIA claiming 10 gigs of vram would not be a limitation

Again, what do they mean by "used"? We need to know how they're getting their numbers. 95% chance they're reporting allocation not usage. The game probably just requests more VRAM when you set it to high textures.

Don't know anyone here, and I was given a warning around thirty minutes ago.

https://www.dsogaming.com/pc-performance-analyses/marvels-avengers-pc-performance-analysis/

[Attached screenshot: unknown.png]
 
Again, what do they mean by "used"? We need to know how they're getting their numbers. 95% chance they're reporting allocation not usage. The game probably just requests more VRAM when you set it to high textures.

That's one of the big things with games. Very few actually use the amount of VRAM they allocate. It will be interesting to see tests on supposedly high-VRAM "using" titles to see how much or how little the lower VRAM amounts on the 3080 and 3070 actually affect things.
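For anyone curious what that "used" number actually is: the figure most overlays and benchmark write-ups report is the driver's allocated/resident VRAM, which you can read yourself via NVML. A minimal sketch, assuming the nvidia-ml-py (pynvml) package is installed:

Code:
# Reads the same "VRAM used" figure most overlays report.
# Note: this is memory the driver has allocated on the card,
# not a measure of how much the GPU actually touches each frame.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"total: {mem.total / 2**30:.1f} GiB")
print(f"used (allocated): {mem.used / 2**30:.1f} GiB")
print(f"free: {mem.free / 2**30:.1f} GiB")
pynvml.nvmlShutdown()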
 
Right, so in my experience the frame rate doesn't drop to 17 fps due to allocation; it drops by over 50% due to usage. At least this is my understanding. If it were just an allocation issue, we wouldn't see the frame rate go down to 17 fps and then back up to 42 fps once the texture size is reduced. Or am I wrong? Do games tank based on allocation alone?
 
It could be any number of factors. VRAM isn't the only metric that determines GPU performance. We don't know how much memory the game is actually using; all we know is how much is requested. For example, maybe it's not the amount of memory that's the limiting factor. Maybe it's the memory bandwidth.
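For rough context on the bandwidth angle (spec-sheet numbers, not measurements): the 3080's GDDR6X is rated around 760 GB/s, while a PCIe 3.0/4.0 x16 link tops out around 16/32 GB/s, so anything that has to live outside VRAM gets served far more slowly. A quick back-of-envelope sketch:

Code:
# Back-of-envelope bandwidth comparison (spec-sheet numbers, not measurements).
vram_bw = 760.0    # GB/s, RTX 3080 GDDR6X (approximate spec)
pcie3_bw = 16.0    # GB/s, PCIe 3.0 x16 theoretical
pcie4_bw = 32.0    # GB/s, PCIe 4.0 x16 theoretical

print(f"VRAM vs PCIe 3.0: ~{vram_bw / pcie3_bw:.0f}x faster")
print(f"VRAM vs PCIe 4.0: ~{vram_bw / pcie4_bw:.0f}x faster")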
 
I looked at those screenshots with the RivaTuner info. I wouldn't expect to see the GPU pegged at 100% if it's having to swap to disk. I haven't actually tested that before, but I do know that you can't be busy building a wall if you're waiting for the building materials to be delivered.
 
Right, so in my experience the frame rate doesn't drop to 17 fps due to allocation; it drops by over 50% due to usage. At least this is my understanding. If it were just an allocation issue, we wouldn't see the frame rate go down to 17 fps and then back up to 42 fps once the texture size is reduced. Or am I wrong? Do games tank based on allocation alone?

Once you have to start copying from main RAM, the framerate tanks, and it makes sense with the numbers they're putting out there. With the OS using some VRAM and the game wanting over 10.5GB, it follows that 10GB isn't going to be enough.
 
Ranting and raving about one poorly coded console-port outlier, and accusing everyone who disagrees of being a paid Nvidia shill.

Here's an idea: don't buy the GPU if it won't suit your perceived needs. And FFS, switch to decaf.
 
Outlier my ass. I'm just making sure that people have all the information and aren't just getting pumped full of Nvidia PR.

Here's an idea, don't tell people to stop sharing information just because it gets your panties in a bunch.
 
Uh, there's tons of proof.

https://www.dsogaming.com/pc-performance-analyses/marvels-avengers-pc-performance-analysis/

And this is last generation crapola. You really think that next gen games aren't going to need more? Stop bsing. Honestly, the internet's so compromised I don't trust anything people write on forums anymore. Most of the people here could be astroturfing accounts. How much is Nvidia paying you?

I don't get it... you're probably the same guy arguing that 16GB of system memory isn't enough for gaming.
Gamers Nexus pointed out in their 3080 review that a lot of people concerned about 10GB of VRAM are confusing memory allocation with memory use. People see 10GB or 11GB in software and think the game is using all of that, but it's just allocation.

There are some old COD games that will allocate 100% of your GPU memory and only use 5GB of it at 4K. I own an 11GB 1080 Ti and I couldn't give two shits about dropping down to 10GB. I've literally never used that extra memory in the last 5 years, including while playing 4K games on my OLED.
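If anyone wants to see the per-process number rather than the card-wide total, NVML exposes that too. A rough sketch with pynvml (assumed installed); note that even this per-process figure is allocated memory, not "actively used", and on some Windows driver modes it simply comes back as None:

Code:
# Lists VRAM allocated per running graphics process via NVML.
# Caveat: this is still *allocated* memory, not actively-touched memory,
# and some driver modes report it as None/unsupported.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used = proc.usedGpuMemory
    label = f"{used / 2**20:.0f} MiB" if used is not None else "n/a"
    print(f"pid {proc.pid}: {label}")
pynvml.nvmlShutdown()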
 
Outlier my ass. I'm just making sure that people have all the information and aren't just getting pumped full of Nvidia PR.

Here's an idea, don't tell people to stop sharing information just because it gets your panties in a bunch.

No, you're acting like an overly aggressive child mad that the adults won't listen to your inane rambling. If you don't think it'll be enough, that's fine, you can share your opinion in a calm, reasonable manner. Instead you act like Nvidia kicked your dog, fucked your mom, and shot your dad.
 
Well they have made me put up with a phantom HDMI monitor for over a decade. Almost as bad.
 
I don't get it... you're probably the same guy arguing that 16GB of system memory isn't enough for gaming.
Gamers Nexus pointed out in their 3080 review that a lot of people concerned about 10GB of VRAM are confusing memory allocation with memory use. People see 10GB or 11GB in software and think the game is using all of that, but it's just allocation.

There are some old COD games that will allocate 100% of your GPU memory and only use 5GB of it at 4K. I own an 11GB 1080 Ti and I couldn't give two shits about dropping down to 10GB. I've literally never used that extra memory in the last 5 years, including while playing 4K games on my OLED.

I know reading's difficult, but if you actually tried it, you would have seen that the framerate tanks with ultra textures. It isn't some academic allocation vs. usage scenario. It literally doesn't have enough VRAM.
 
I'd suggest that you get them to test with GPU usage shown. Or you can do it yourself, if that game is out. (Games like that don't interest me, so I won't be.)

As previously mentioned, if the GPU usage dives way below 100%, then yes - the crappy console port that looks like crap does indeed appear to be running out of VRAM and has to hit the disk/system page file for more data. (A shocker that it doesn't just crash with an out-of-memory error like most games are wont to do.)

If the GPU usage stays pegged at 100%, then no - the GPU itself is the likely limitation, not the VRAM. Why? Maybe they use insanely high-resolution textures that well and truly tank rendering performance.
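If someone does run that test, here's a minimal logging sketch (pynvml again, assumed installed) that prints GPU utilization and reported VRAM side by side once a second. A utilization dip that lines up with the "used" number hitting the card's capacity would point at VRAM; a steady 100% would point at the GPU itself:

Code:
# Logs GPU utilization and reported VRAM once per second.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
try:
    while True:
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"gpu {util.gpu:3d}%  vram {mem.used / 2**30:5.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()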
 
And I wish people would stop complaining about those that are complaining about 10 gigs not being enough.

If it doesn't bother you, then great, but it's going to be a talking point for quite a while now, so deal with it if you're going to be on a public forum.
Dude, you have to understand: they are trying to feel better about their 2080s right now, so they will grab on to anything to deal with this. :D
 
It could be any number of factors. VRAM isn't the only metric that determines GPU performance. We don't know how much memory the game is actually using; all we know is how much is requested. For example, maybe it's not the amount of memory that's the limiting factor. Maybe it's the memory bandwidth.
Lol, memory bandwidth... that's funny. That doesn't cause frame rates to drop and come back like that. It happens when the card runs out of VRAM and has to shuffle data over the PCIe bus, i.e. there isn't enough room to fit everything in VRAM. Games don't suddenly tank due to bandwidth. Streaming 10% more textures doesn't give you a 60-70% hit in performance; it'd be around 10%.

That said, this is a complete outlier and a crappy implementation. Most games today are safe. Games of tomorrow we can only speculate about, but I don't think we'll see them using less RAM going forward. I'm sure most people will get plenty of use out of their card even if it "only" has 10GB; it's just that a lot of people feel kind of ripped off coming from an 11GB card and getting less memory.
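To put rough numbers on why a spill hurts so much more than its size suggests, here's an illustrative calculation. The spill size and baseline frame rate below are assumptions picked for the example, not measurements from this game:

Code:
# Illustrative only: assumed numbers, not measurements.
base_frame_ms = 1000 / 42     # pretend the game runs ~42 fps when everything fits in VRAM
spill_gb = 0.5                # pretend 0.5 GB must cross the PCIe bus every frame
pcie3_bw = 16.0               # GB/s, PCIe 3.0 x16 theoretical

extra_ms = spill_gb / pcie3_bw * 1000
new_frame_ms = base_frame_ms + extra_ms
print(f"frame time: {base_frame_ms:.1f} ms -> {new_frame_ms:.1f} ms")
print(f"fps: 42 -> {1000 / new_frame_ms:.0f}")

Even a modest per-frame transfer adds tens of milliseconds of frame time, which is why the hit looks nothing like a proportional 10%.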
 
Ok, but we are talking about 4K again. What about 1440p with DLSS on (we are talking about next-gen titles after all, which should support it going forward)? That should work within 10GB for quite a number of years.

If my choices are between $700 for a 3080 and $1500 for a 3090, I'm going to run the 3080 for 2 years until next gen comes out, then sell the dang thing and get the next $700 card. I'll come out $100 ahead and not have eaten the depreciation on a $1500 card AND ... I'll have the latest next-gen card that will almost certainly have (some number) more than 10GB. Even if you don't sell your card... you can buy another $700 card in two years and STILL come out ahead or even. And have TWO good cards in your house.

What do I have to "pay" in penalty in the meantime by buying now? A few games where I can't turn on uber textures at 4K. Or in my case at 1440p: so far ONE game mentioned here where I can't turn on uber textures.

I see the argument. I get the issue. But if you don't like 10GB of VRAM right now, you have exactly THREE options. The first is buy a 3090. The second is WAIT for a 3080 20GB or Ti/Super. The third is WAIT for big Navi and hope AMD doesn't fall on its face on yet another video card generation. I hope they succeed, but my 2080 is sold and I need a card. AMD has a terrible track record, and they aren't out first anyway, so we have no idea what they can do.

So I'll be playing through 3-4 games in my high-end backlog over the next month with excellent results on a 3080 while others... wait for Navi and pray, or wait even longer toward the end of the year (maybe) for a higher-VRAM 3080 that may or may not materialize.

If you are seriously looking to buy the card for a 5+ year stretch and you are actually playing the VRAM future-proofing game, then you really don't have a choice but to wait this launch out or buy a 3090. Which has got to be frustrating, but you are doing it to yourself; you could make different decisions.

I'd rather be playing games on the new tech and get as much time on the new cards as I can before next gen. So I'm buying a 3080 now. I learned that lesson with the 2080 and didn't regret it.
 
DLSS is a game changer once it gets into most mainstream engines. No doubt about it.
 
Should have a lot more data on this soon, as the 3070 and 3080 will have multiple memory configs for apples-to-apples comparisons.
 
several companies, including id Software, saying 8GB of VRAM is going to be the MINIMUM required spec for next generation games

Source for id and several companies stating this? Edit: saw your identical post on the prior page with the id source, at least. Buying the most VRAM a budget allows for is pretty obvious advice, but still, we have a 10GB card, so there's obviously a little mental justification going on, given we don't have official word on upcoming alternate models and availability is shit right now anyway.
 
Getting really bored of obtuse bullshit being spouted like this. He said 8GB MINIMUM. Hi. MINIMUM. He probably wasn't even talking about 4k, which makes it even worse. Even TODAY, there's a 3GB+ difference between usage with the lowest graphics settings versus the highest settings. This is only going to increase with next generation games.

Everyone with a brain is waving the warning signs and telling everyone that it's not going to be enough for next generation games at max settings, and you weirdos are burying your heads in the sand.

Fine. Get fucked over. Buy a paperweight. No skin off my back.
 
I currently game at 4K without issue on a 2080 Super, which only has 8GB.

Why do people freak out over this?
 
Buy a paperweight.

A paperweight because it kind of, maybe, can't play a handful of games at absolute max settings in three years. Goddamn, some of you losers need to get some perspective. And that's coming from someone who's waiting for a 16GB Navi or a 20GB 3080.
 
Fair warning... If you can't have a constructive debate without name-calling, then it's best if you just stay out of the conversation.
 
Getting really bored of obtuse bullshit being spouted like this. He said 8GB MINIMUM. Hi. MINIMUM. He probably wasn't even talking about 4k, which makes it even worse. Even TODAY, there's a 3GB+ difference between usage with the lowest graphics settings versus the highest settings. This is only going to increase with next generation games.

Everyone with a brain is waving the warning signs and telling everyone that it's not going to be enough for next generation games at max settings, and you weirdos are burying your heads in the sand.

Fine. Get fucked over. Buy a paperweight. No skin off my back.

Hi yourself. If it's no skin off your back, can you calm down with the aggressive and condescending attitude? Throwing a tantrum isn't making your point any more valid.
 