5070 will reportedly only have 12 GB of VRAM

It always starts with "one game" and the point was that we have that game NOW, so we won't be waiting 2 or 3 years. It was at the settings I said, not DLAA, and I saw it for myself on the 4090.
So you're running a 24gb card which means it's an invalid test. Games will use more vram than they need to run smoothly if it's available. Try on a 4080 and come back. I've heard of no issues with it on one.
 
So you're running a 24gb card which means it's an invalid test. Games will use more vram than they need to run smoothly if it's available. Try on a 4080 and come back. I've heard of no issues with it on one.
I am not talking about allocating, and I thought I made that clear. And if it is working on the 4080 then you would be spilling over into system RAM.
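The claim above, that engines opportunistically grab more vram than they strictly need when it's available, can be sketched as a toy texture cache. This is purely illustrative (hypothetical numbers, not measured from any game): the engine keeps textures resident until budget pressure forces eviction, so a tool reading allocation sees a much bigger number than the frame's actual working set.

```python
class TextureCache:
    """Toy sketch of opportunistic VRAM caching (hypothetical numbers).

    The engine keeps every texture it has loaded resident until the budget
    is exceeded, then evicts least-recently-used entries. An external tool
    sees allocated_mb; only the engine knows the frame's working set.
    """

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = {}  # texture id -> size in MB, insertion-ordered (LRU)

    def touch(self, tex_id, size_mb):
        # Bring a texture into VRAM, or mark it as recently used.
        if tex_id in self.resident:
            self.resident[tex_id] = self.resident.pop(tex_id)  # move to MRU end
            return
        self.resident[tex_id] = size_mb
        # Evict least-recently-used textures only when over budget.
        while self.allocated_mb() > self.budget_mb:
            oldest = next(iter(self.resident))
            del self.resident[oldest]

    def allocated_mb(self):
        return sum(self.resident.values())


cache = TextureCache(budget_mb=16_000)  # pretend 16 GB card
# A level streams in 50 textures of 100 MB each...
for i in range(50):
    cache.touch(i, 100)
# ...but the current frame only draws with the last 10 of them.
working_set_mb = 10 * 100

print(cache.allocated_mb())  # 5000 MB resident/allocated, well under budget
print(working_set_mb)        # 1000 MB actually referenced this frame
```

With a bigger card the cache simply never evicts, so the allocation number climbs even though the working set is unchanged, which is why an identical scene can "use" very different amounts of vram on a 24GB card versus a 16GB one.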
 
I am not talking about allocating and I thought I made that clear.
You can't tell anything with software tools unless you're the developer of the game. All external tools report allocations, not usage.

Again:

So you're running a 24gb card which means it's an invalid test. Games will use more vram than they need to run smoothly if it's available. Try on a 4080 and come back. I've heard of no issues with it on one.
 
You can't tell anything with software tools unless you're the developer of the game. All external tools report allocations, not usage.

Again:
Then you are not aware of how to use Afterburner and RTSS to monitor VRAM usage, as there is a setting for BOTH allocation and how much the game actually needs.
 
Then you are not aware of how to use Afterburner and RTSS to monitor VRAM usage, as there is a setting for BOTH allocation and how much the game actually needs.
No, there's a setting for full system allocation and one that shows game allocation separately from OS+game allocation. Under no circumstances can it display actual usage.

As I said, too, you need a 16gb card to see how it behaves due to how game vram usage and allocations work. So your test is 100% inaccurate.
 
No, there's a setting for full system allocation and one that shows game allocation separately from OS+game allocation. Under no circumstances can it display actual usage.
Well I disagree, and that was the point of adding the other vram setting in the first place: allocation does not tell you what the game actually needed. Are you familiar with the YouTube channel "zWORMz gaming"? He has run the 4090 and the 4080 in that game with very long, in-depth reviews of various settings. He could NOT run the settings I said on the 4080 but could on the 4090. For the 4080 he had to run LOW textures to also run full path tracing and DLSS Quality at 4K, even without frame gen.
 
Are you familiar with the YouTube channel "zWORMz gaming"? He has run the 4090 and the 4080 in that game with very long, in-depth reviews of various settings. He could NOT run the settings I said on the 4080 but could on the 4090. For the 4080 he had to run LOW textures to also run full path tracing and DLSS Quality at 4K, even without frame gen.
Yes the two cards have different gpus with vastly different performance. That isn't testing the 16gb card in settings it's capable of running smoothly regardless of vram. That has nothing to do with the availability of vram. It's the path tracing chewing up performance.

P.S. The vram reading in Afterburner is allocation, not usage. There is no way for external tools to determine usage outside the game engine. Again, I think you're a little out of it here.
 
Yes the two cards have different gpus with vastly different performance. That isn't testing the 16gb card in settings it's capable of running smoothly regardless of vram. That has nothing to do with the availability of vram.
Sigh, textures have almost no impact on performance if you have enough vram.
 
Sigh, textures have no impact on performance if you have enough vram.
Wrong. They still chew up memory bandwidth and fill rate. However, that is not why the 4080 can't max it with path tracing. I've already explained this multiple times... It doesn't seem like you want to learn, just to confirm existing biases.
 
Wrong. They still chew up memory bandwidth and fill rate. However, that is not why the 4080 can't max it with path tracing. I've already explained this multiple times... It doesn't seem like you want to learn, just to confirm existing biases.
Bullshit. If you have enough vram then texture settings have very little to no impact, and that has been known for years. And you clearly keep missing the point. He tests cards at the settings he can just to see what the performance is. He could NOT run those settings due to vram. All your arguing in the world does not change that fact. So again, the game needs over 16GB to run those settings I mentioned, whether you like it or not.
 
Bullshit. If you have enough vram then texture settings have very little impact. And you clearly keep missing the damn point. He tests cards at the settings he can just to see what the performance is. He could NOT run those settings due to vram. All your arguing in the world does not change that fact.
You're incorrect about the vram consumption and the overall framerate being unplayable due to the textures. It's the computational demands and memory usage of full maxed path tracing doing it. And yes, textures do affect overall performance. It isn't a huge impact, but it's there. This isn't arcane knowledge.
 
You're incorrect about the vram consumption and the overall framerate being unplayable due to the textures. It's the computational demands and memory usage of full maxed path tracing doing it. And yes, textures do affect overall performance. It isn't a huge impact, but it's there. This isn't arcane knowledge.
Maybe slow down and read what I'm about to say TWICE before replying. He runs games at different settings to see how the cards are impacted, so the fact that the 4080 is lower in overall performance than the 4090 does not mean jack in this context. The POINT was that he could NOT run those settings at all due to vram. To repeat yet again: he could not even test at the settings I mentioned BECAUSE of vram. So AGAIN, Indiana Jones with max pathtraced settings at 4k, DLSS quality, and frame gen requires more than 16gb. NOTHING you can type or argue about here changes that simple fact.
 
Maybe slow down and read what I'm about to say TWICE before replying. He runs games at different settings to see how the cards are impacted, so the fact that the 4080 is lower in overall performance than the 4090 does not mean jack in this context. The POINT was that he could NOT run those settings at all due to vram. To repeat yet again: he could not even test at the settings I mentioned BECAUSE of vram. So AGAIN, Indiana Jones with max pathtraced settings at 4k, DLSS quality, and frame gen requires more than 16gb. NOTHING you can type or argue about here changes that simple fact.
I read your responses. You're simply wrong as to the cause of running out of vram and the performance lacking. Anyway, you're clearly decided, facts be damned. Have a good night.
 
I read your responses. You're simply wrong as to the cause of running out of vram and the performance lacking. Anyway, you're clearly decided, facts be damned. Have a good night.
If the game was a non-playable single-digit slideshow or refused to run on higher texture settings, and simply lowering the textures allowed it to play normally for its expected performance level, then it does not take a genius to figure out that the VRAM was the limitation. The 4090 in the same situation, using the same 4k maxed path tracing, quality DLSS, and frame generation settings, could run max textures and was dedicating (not allocating) well over 17GB of VRAM, so that again backs up that the issue was VRAM.
 
If the game was a non-playable single-digit slideshow or refused to run on higher texture settings, and simply lowering the textures allowed it to play normally for its expected performance level, then it does not take a genius to figure out that the VRAM was the limitation. The 4090 in the same situation, using the same 4k maxed path tracing, quality DLSS, and frame generation settings, could run max textures and was dedicating (not allocating) well over 17GB of VRAM, so that again backs up that the issue was VRAM.
That was the issue I ran into trying to play Witcher 3 RT in 4k on a 12GB card, even with DLSS. Slideshows / crashes. Sometimes you just need more VRAM (or in my case I turned down the settings, but was a grump about it :sour:). That's why I am hesitant to pick up a 16GB card for something I might hold onto for 4-5 years or so, and am going to wait and see if we get a 24GB Super refresh of the 5080 next year.
 
I know they won't this time, but at some point, like they did with CPUs, I hope they come back at all levels to drive some competition. They don't have to be the best. They never really pushed on Titan in the past, but they had solid cards that performed well. I loved the handful I had over the years, but I need the productivity side to have something, which leaves me with NVIDIA :(
Don't think AMD will even be competitive for at least another 2 years. Their only chance to leapfrog Nvidia is to go to multi-gpu chiplet solutions with RDNA5. In terms of monolithic GPU, Nvidia has them easily beaten.
 
Yeah. Took them a while to get back on track with CPUs. Just hoping they can do the same for GPUs, along with Intel.
 
jobert, if like you say they had to reduce textures to low to get rid of 1gb of vram allocation, I'd argue the real culprit is the vram-hungry max path tracing settings, not the textures ;).

Either way, as I said at the start, it's a whopping single game, and there likely won't be many where you need to *gasp* lower a setting a notch inside of a few years on 16gb. Unless you have severe FOMO about having every slider maxed no matter how demanding, most people can do it.
 
jobert, if like you say they had to reduce textures to low to get rid of 1gb of vram allocation, I'd argue the real culprit is the vram-hungry max path tracing settings, not the textures ;).
Again it is NOT allocation that I was referring to so please stop saying that. I have already told you there are two different vram usage stats and one of them is the actual usage for vram needed directly by the game and the other is just the allocation. And who cares what setting was causing the vram usage? That again shows you have not paid any attention to anything as the point was that 16gb was not enough to run those settings. You simply just enjoy arguing for some reason.
 
Again it is NOT allocation that I was referring to so please stop saying that. I have already told you there are two different vram usage stats and one of them is the actual usage for vram needed directly by the game and the other is just the allocation. And who cares what setting was causing the vram usage? That again shows you have not paid any attention to anything as the point was that 16gb was not enough to run those settings. You simply just enjoy arguing for some reason.
Here is Unwinder, the author of Afterburner, discussing the then-new per-process feature and how it can only read what is committed/allocated at any given time: https://forums.guru3d.com/threads/msi-ab-rtss-development-news-thread.412822/page-127#post-5832540

Now can we put this misinformation to rest?
 
What do you guys think, will there be stock issues at the release of 5070?
Is it going to be hard again to catch a FE?
 
Again it is NOT allocation that I was referring to so please stop saying that. I have already told you there are two different vram usage stats and one of them is the actual usage for vram needed directly by the game and the other is just the allocation. And who cares what setting was causing the vram usage? That again shows you have not paid any attention to anything as the point was that 16gb was not enough to run those settings. You simply just enjoy arguing for some reason.
Yep, Indiana Jones in its current state pushes VRAM usage beyond 16gb when using max settings with path tracing. The RT BVH uses a large amount of VRAM on top of the game assets. If the game gets updated to use Nvidia Neural Textures, this would be resolved with even higher quality. We'll just have to see if developers go even more into proprietary Nvidia methods.
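As a back-of-the-envelope illustration of how a max-settings path-traced budget can spill past a 16GB card, here is a sketch where every component size is a hypothetical, purely illustrative figure (none of these are measurements from Indiana Jones); the point is only how BVH and asset costs stack on top of each other:

```python
# Hypothetical VRAM budget for a 4K max-settings path-traced scene.
# Every figure below is an illustrative assumption, not a measurement.
budget_mb = {
    "texture pool (max textures)": 8_000,
    "RT BVH + ray tracing scratch": 4_000,
    "G-buffers / render targets at 4K": 1_500,
    "DLSS + frame generation buffers": 1_000,
    "geometry, shaders, misc engine data": 1_500,
    "OS / compositor reserve": 1_000,
}

total_mb = sum(budget_mb.values())
for name, mb in budget_mb.items():
    print(f"{name:38s} {mb:6d} MB")
print(f"{'total':38s} {total_mb:6d} MB")  # 17000 MB, past a 16384 MB card
```

Under these made-up numbers, dropping only the texture pool (say to a hypothetical 3GB low preset) brings the total back under 16GB, which is consistent with both sides here: the texture slider is what frees the memory, but path tracing's BVH and scratch buffers are what inflated the budget in the first place.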
 
What do you guys think, will there be stock issues at the release of 5070?
Is it going to be hard again to catch a FE?
Scalpers will buy up early stock of FE parts. You need to get into the line on nvidia website EARLY on launch day and hope enough parts last till you get to talk to someone to buy it.

My guess is that only 5090, 5080, 5070 Ti will be hard to get. 5070 won't be nearly as coveted. But again FE gets bought up by scalpers for the prestige factor these parts now carry.
 
Scalpers will buy up early stock of FE parts. You need to get into the line on nvidia website EARLY on launch day and hope enough parts last till you get to talk to someone to buy it.

My guess is that only 5090, 5080, 5070 Ti will be hard to get. 5070 won't be nearly as coveted. But again FE gets bought up by scalpers for the prestige factor these parts now carry.

Best Buy kept stock of FE 40** series for a good while. I think they had some exclusive distributor status.
 