HardOCP looking into the 970 3.5GB issue?

Status
Not open for further replies.
Seems like the GTX 970 just went from golden child to a POS. I'd rather have a 780 Ti with consistent frame times than something that stutters whenever memory usage exceeds 3.5GB. I guess they figured that most people are stuck at 1080p, so the 3.5GB+ VRAM stuttering scenarios wouldn't be noticed. This is the sleaziest thing Nvidia has done lately, besides the GameWorks code obfuscation for AMD video cards. After typing that statement, I guess it's par for the course for them. And that's damn sad commentary.

Wonder how the GTX 960 4GB variants will handle 3.5GB+ of VRAM usage?
 
Wait, so those of us with 970 sli and 4k monitors are screwed?
If you're playing a game at resolutions and texture load rates where that extra 500MB at reduced bandwidth actually has a noticeable effect, it's unlikely you're getting playable framerates anyway.
Obligatory car analogy: There is a bug in the Pinto's roof-rack where loading it with more than 3500kg causes the rack to bend. But by that time the suspension has already collapsed.


Remember, all the testing of the GTX 970 as an excellently performing card hasn't suddenly gone away. It hasn't magically lost any performance.
 
Seems like the GTX 970 just went from golden child to a POS. I'd rather have a 780 Ti with consistent frame times than something that stutters whenever memory usage exceeds 3.5GB. I guess they figured that most people are stuck at 1080p, so the 3.5GB+ VRAM stuttering scenarios wouldn't be noticed. This is the sleaziest thing Nvidia has done lately, besides the GameWorks code obfuscation for AMD video cards. After typing that statement, I guess it's par for the course for them. And that's damn sad commentary.

Wonder how the GTX 960 4GB variants will handle 3.5GB+ of VRAM usage?
Lol, there is no scenario where a single 970 will get good framerates when more than 3.5GB is "needed". And if it's not needed, then there will be no stutter. If you are running at settings that stay above 60 fps, you will never encounter an issue.

And why would this impact the 960? The way the 970 is cut down is the issue here.
 
If you're playing a game at resolutions and texture load rates where that extra 500MB at reduced bandwidth actually has a noticeable effect, it's unlikely you're getting playable framerates anyway.
Obligatory car analogy: There is a bug in the Pinto's roof-rack where loading it with more than 3500kg causes the rack to bend. But by that time the suspension has already collapsed.


Remember, all the testing of the GTX 970 as an excellently performing card hasn't suddenly gone away. It hasn't magically lost any performance.

So what you're saying is that the GTX 970 isn't designed for 4K gaming, or for games like Shadow of Mordor that can easily access 4GB @ 1080p. You know, like many of the upcoming console ports.
 
Like I've said in the other thread, I've seen Watch Dogs eat up 3.8GB of vram at 1.78x DSR (1440p downsampled to 1080p) on my 970 SLI. Yet I never encountered any additional stuttering outside of what was already present in the shittily optimized game. I guess this is where people argue over allocated vs actually being used...
 
Like I've said in the other thread, I've seen Watch Dogs eat up 3.8GB of vram at 1.78x DSR (1440p downsampled to 1080p) on my 970 SLI. Yet I never encountered any additional stuttering outside of what was already present in the shittily optimized game. I guess this is where people argue over allocated vs actually being used...
And as already said over and over in every discussion on VRAM usage, the amount of VRAM being used is not necessarily the amount that's NEEDED. If you actually NEEDED that much VRAM, there would be more of a performance drop-off or hitching when exceeding 3.5GB on the 970.
 
Copied from another forum: http://www.overclock.net/t/1537725/...-gtx-970-3-5gb-memory-issue/140#post_23453148

It's not a 3.5GB GPU. If it were, your performance would be worse, because for any single command buffer that references an amount over 3.5GB, you'd be texturing out of system memory, which is way slower than the last 512MB.

The driver has smarts to avoid using the last 512MB unless it has to, and it is intelligent about putting the least important resources in there.

So anyone seeing just 3.5GB available, that's just a reporting issue. In truth, if a single command buffer of a normal game workload (mixed resources) references more than 3.5GB of resources, then there is no choice but to make some of that data resident in the last 512MB. If a game creates more than 3.5GB of resources over time but doesn't reference them all in a single command buffer, the OS would rather evict something to system memory to make room for newly created resources in the first 3.5GB than place a potentially important resource into the last 512MB.

In short, when necessary, the memory will be used, and it's much better than coming from system memory.

And in any real world game, when you have enough resources referenced in a single command buffer to start to want to use the last 512MB, there is always some subset of those resources that you can put there that will ensure a fairly insignificant overall performance impact by having them there.

The performance seen in all reviews is valid, so that's what you should buy your GPU by. Going forward it might be useful to ensure you look at review settings for new games that place the game between 3.5GB and 4GB though to ensure good performance.
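The placement policy described in the quoted post can be sketched as a toy model. This is purely illustrative: the class, the eviction rule, and the capacity split into a fast 3.5GB segment and a slow 512MB segment are assumptions based on the post, not how the actual NVIDIA driver is implemented.

```python
# Toy model of the placement policy described in the quoted post.
# All names, numbers, and the policy itself are illustrative assumptions,
# NOT the real driver's implementation.

FAST_CAPACITY = 3584  # MB in the fast segment (3.5 GB)
SLOW_CAPACITY = 512   # MB in the slow segment

class ToyVram:
    def __init__(self):
        self.fast = []    # (name, size_mb) resources in the fast segment
        self.slow = []    # resources in the slow local segment
        self.system = []  # resources evicted to system RAM

    def _used(self, seg):
        return sum(size for _, size in seg)

    def allocate(self, name, size_mb, referenced_this_frame):
        """Prefer the fast segment. Only spill into the slow segment when
        a single frame's working set cannot fit in 3.5 GB; otherwise park
        idle resources in system RAM instead."""
        if self._used(self.fast) + size_mb <= FAST_CAPACITY:
            self.fast.append((name, size_mb))
            return "fast"
        if referenced_this_frame:
            # The frame genuinely needs >3.5 GB resident: the slow-but-local
            # 512 MB is still much better than system memory over PCIe.
            if self._used(self.slow) + size_mb <= SLOW_CAPACITY:
                self.slow.append((name, size_mb))
                return "slow"
            self.system.append((name, size_mb))
            return "system"
        # Not referenced this frame: evict to system RAM rather than
        # placing a potentially important resource in the slow segment.
        self.system.append((name, size_mb))
        return "system"

vram = ToyVram()
print(vram.allocate("textures_a", 3000, True))   # fast
print(vram.allocate("textures_b", 500, True))    # fast (3500 MB total)
print(vram.allocate("shadow_maps", 400, False))  # system: idle, so evicted
print(vram.allocate("particles", 300, True))     # slow: this frame needs it
```

The point of the sketch is the last two calls: the same over-3.5GB allocation lands in different places depending on whether the current frame actually references it, which matches the post's allocated-vs-needed distinction.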
 
Wow... I actually set up an RMA for my Acer B326HK 4K monitor, for a refund, because I thought something was just wrong with it due to motion issues and apparent frame-skipping. I couldn't figure out why it was happening and assumed it was a defect. I hadn't gotten a ton of time to play games on it between getting it back at the end of November and around a week ago (maybe ~15-20 hours total, and I was playing much less demanding games that weren't pushing past ~2-2.5GB of VRAM allocation, let alone usage).

I am really wondering if this whole 3.5GB issue is why I was seeing problems when I tried dialing up settings to high VRAM levels on my overclocked GTX 970 SLI setup. While the average framerate isn't necessarily impacted an extreme amount, it seems to be having a VERY big impact on frametimes, and it's very irritating to play with. It was driving me nuts trying to figure out why I was seeing these problems, but it seems consistent with what we're hearing now. I actually tossed my X-Star DP2710 back on the desk the other day because I was chalking it up to the monitor... it seems it may actually be this issue instead :eek: :(. My RMA window is only going to allow me about a week from today to get it out in the mail if I'm not keeping the B326HK, which I'd really like to keep otherwise.

Watching news on this like a hawk now...

EDIT: Honestly though, it would be risky to keep it, because it's unclear whether they intend to fix this, whether it can even be fixed in software at all, or whether they would replace the cards if a hardware fix is needed. Even at 1440p 100Hz using DSR, I'm going to want the extra memory as advertised and to be able to use it without major hitching/stuttering. (I actually just tried pumping settings up to consume 3.7GB in BF4 on the 1440p panel and got the exact same problem: judders, obviously not looking smooth at all, despite the average framerate still reading 65-70!) Nvidia needs to make this right, and it's already going to be a huge stain on their reputation.

I'm ticked. As more games come out over the next year+, I'm going to want to use more than the ~3.2GB of VRAM people are quoting as the point where the issue starts, and I bought cards advertised as providing a full 4GB of usable VRAM and a proper 256-bit interface.
 
Wow... I actually set up an RMA for my Acer B326HK 4K monitor, for a refund, because I thought something was just wrong with it due to motion issues and apparent frame-skipping. I couldn't figure out why it was happening and assumed it was a defect. I hadn't gotten a ton of time to play games on it between getting it back at the end of November and around a week ago (maybe ~15-20 hours total, and I was playing much less demanding games that weren't pushing past ~2-2.5GB of VRAM allocation, let alone usage).

I am really wondering if this whole 3.5GB issue is why I was seeing problems when I tried dialing up settings to high VRAM levels on my overclocked GTX 970 SLI setup. While the average framerate isn't necessarily impacted an extreme amount, it seems to be having a VERY big impact on frametimes, and it's very irritating to play with. It was driving me nuts trying to figure out why I was seeing these problems, but it seems consistent with what we're hearing now. I actually tossed my X-Star DP2710 back on the desk the other day because I was chalking it up to the monitor... it seems it may actually be this issue instead :eek: :(. My RMA window is only going to allow me about a week from today to get it out in the mail if I'm not keeping the B326HK, which I'd really like to keep otherwise.

Watching news on this like a hawk now...

EDIT: Honestly though, it would be risky to keep it, because it's unclear whether they intend to fix this, whether it can even be fixed in software at all, or whether they would replace the cards if a hardware fix is needed. Even at 1440p 100Hz using DSR, I'm going to want the extra memory as advertised and to be able to use it without major hitching/stuttering. (I actually just tried pumping settings up to consume 3.7GB in BF4 on the 1440p panel and got the exact same problem: judders, obviously not looking smooth at all, despite the average framerate still reading 65-70!) Nvidia needs to make this right, and it's already going to be a huge stain on their reputation.

I'm ticked. As more games come out over the next year+, I'm going to want to use more than the ~3.2GB of VRAM people are quoting as the point where the issue starts, and I bought cards advertised as providing a full 4GB of usable VRAM and a proper 256-bit interface.

The CUDA benchmark seen on all the forums showing just a few GB/s is not reflective of the performance of the 512MB section. The test was flawed, given the memory architecture of 970. Above 3.5GB, it was measuring bandwidth to system RAM, which is bottlenecked by PCIE. Real games use the local 512MB of extra memory for less important resources, which is much faster than the system memory result shown in the CUDA benchmark.

I.e., game performance isn't really affected by this, and it's very likely that CUDA will be updated to allocate out of that other memory section before spilling to system memory.

The alternative could've been a $300 3.5GB GPU, so it could be said that you got an extra 512MB of GPU memory for free, and it's still way better than spilling out to system memory.

Another way to see it is that structuring the memory this way got you a cheaper GPU. So it's not like you paid for something but didn't get it... The price was determined by its cost to produce and its capability.

You got the performance that you see in reviews.
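A quick back-of-envelope check of the claim above about the CUDA benchmark. The bandwidth figures here are rough public estimates (7 Gb/s GDDR5 per pin, a 224-bit fast segment, a 32-bit slow segment, PCIe 3.0 x16 at roughly 15.75 GB/s), not measurements:

```python
# Back-of-envelope check: a probe that spills past 3.5 GB into system
# RAM measures PCIe bandwidth, not the local 512 MB segment.
# All figures are rough public estimates, not measurements.

GDDR5_GBPS_PER_PIN = 7.0  # effective Gb/s per data pin, assumed

def segment_bandwidth_gbs(bus_width_bits):
    """Peak bandwidth of a GDDR5 memory segment in GB/s."""
    return bus_width_bits * GDDR5_GBPS_PER_PIN / 8

fast = segment_bandwidth_gbs(224)   # 7 of 8 memory controllers
slow = segment_bandwidth_gbs(32)    # the remaining controller
pcie = 15.75                        # PCIe 3.0 x16 peak, roughly

print(f"fast 3.5 GB segment: {fast:.0f} GB/s")   # 196 GB/s
print(f"slow 0.5 GB segment: {slow:.0f} GB/s")   # 28 GB/s
print(f"PCIe 3.0 x16:        {pcie:.2f} GB/s")

# A benchmark reporting single-digit GB/s above the 3.5 GB mark is far
# below even the slow segment's ~28 GB/s, which is consistent with the
# transfers actually going over PCIe to system RAM.
```

So even the slow segment should be roughly twice as fast as PCIe peak; numbers near a few GB/s point at system memory, as the post argues.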
 
I don't think segmentation is something that naturally occurs in a setup like this. I think they've swept something under the rug here. Unless the 12GB Titan X is going to have more than 3x the resources of 970 to support that 12GB, the story doesn't add up. Or maybe the 12GB Titan X is going to be segmented to hell and back.
 
As an owner of a Gigabyte GTX 970 G1 Gaming card, I'm hoping this is all much ado about nothing... I don't want to go through any returns or hassles... I just want to enjoy my card.
 
I returned my EVGA GTX 970 SSCs (3975 models) because they were stuttering badly in COD:AW at 1080p. No enjoyment spending $700 on a pair of cards to have that kind of BS happen. The newer games are just going to make it worse, because they WILL try to utilize the full amount of VRAM... as they should. Back to AMD is where I went... I seem to have no luck with Nvidia's high-mid range cards... Seems like you have to pay the premium for their top-tier stuff to get anything worthwhile.

On a side note... Can't wait for the R9 300 series.
 
I returned my EVGA GTX 970 SSCs (3975 models) because they were stuttering badly in COD:AW at 1080p. No enjoyment spending $700 on a pair of cards to have that kind of BS happen. The newer games are just going to make it worse, because they WILL try to utilize the full amount of VRAM... as they should. Back to AMD is where I went... I seem to have no luck with Nvidia's high-mid range cards... Seems like you have to pay the premium for their top-tier stuff to get anything worthwhile.

On a side note... Can't wait for the R9 300 series.
AGAIN, it will only stutter if it NEEDS that much VRAM, not just because a game allocates that much VRAM.

I am sure now every game that hitches or stutters is going to be blamed on this issue...
 
CoD:AW is a terrible game to judge this on anyway. That game is not very well optimized, and it will often stutter for no apparent reason.
 
AGAIN, it will only stutter if it NEEDS that much VRAM, not just because a game allocates that much VRAM.

I am sure now every game that hitches or stutters is going to be blamed on this issue...

No, it won't stutter!!!

It will access less critical resources from RAM that is slower, in the same way every frame. So your frame time will slow down every frame by maybe 1%. And that's already baked into the performance seen in reviews.

This memory configuration does not affect stutter whatsoever.
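The "maybe 1% per frame" claim above can be sanity-checked with a rough model. The key variable is what fraction of a frame's memory traffic hits the slow segment, not what fraction of capacity it holds. The bandwidth figures and traffic fractions here are illustrative assumptions, and the model assumes frame time scales with memory transfer time:

```python
# Rough model of the per-frame cost of the slow segment.
# Bandwidths and traffic fractions are assumed, illustrative numbers.

FAST_GBS = 196.0  # ~224-bit segment, assumed peak
SLOW_GBS = 28.0   # ~32-bit segment, assumed peak

def relative_frame_time(traffic_fraction_slow):
    """Memory transfer time per byte, normalized to all-fast traffic.
    traffic_fraction_slow is the share of this frame's memory traffic
    served by the slow segment."""
    t = ((1 - traffic_fraction_slow) / FAST_GBS
         + traffic_fraction_slow / SLOW_GBS)
    return t / (1 / FAST_GBS)

# If the driver keeps only rarely-touched resources in the slow segment,
# the traffic fraction is tiny and so is the cost:
print(f"{relative_frame_time(0.002):.3f}x")  # 1.012x, ~1% slower

# But if a full eighth of a hot working set's traffic (512 MB / 4 GB)
# went through it, the hit would be severe:
print(f"{relative_frame_time(0.125):.3f}x")  # 1.750x, 75% slower
```

Under these assumptions both sides of the argument can be right: negligible cost if the driver's placement keeps slow-segment traffic rare, severe hitching if a frame genuinely streams from it.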
 
AGAIN, it will only stutter if it NEEDS that much VRAM, not just because a game allocates that much VRAM.

I am sure now every game that hitches or stutters is going to be blamed on this issue...

It should be able to access all memory and not stutter at all. I am sure you like paying for things you can't fully use.
 
It should be able to access all memory and not stutter at all. I am sure you like paying for things you can't fully use.

In what way can you not fully use it? You are spreading information that is just plain wrong.

The only difference is accesses are somewhat slower for every frame on the resources in that section of memory... It doesn't vary frame to frame and cause stutter. But given that the least important stuff for performance is kept there you won't notice it.
 
I returned my EVGA GTX 970 SSCs (3975 models) because they were stuttering badly in COD:AW at 1080p. No enjoyment spending $700 on a pair of cards to have that kind of BS happen. The newer games are just going to make it worse, because they WILL try to utilize the full amount of VRAM... as they should. Back to AMD is where I went... I seem to have no luck with Nvidia's high-mid range cards... Seems like you have to pay the premium for their top-tier stuff to get anything worthwhile.

On a side note... Can't wait for the R9 300 series.

I don't believe you understand what the issue is. AW doesn't use 4GB of GPU resources at 1080p.
 
Yeah... it does. I watched it use up that much VRAM through EVGA Precision on the in game monitor.

It also happens in other games... Just search YouTube for the 3.5GB issue on the GTX 970. I've seen it happen in Dragon Age and Far Cry 4... all of the more recent titles that fully utilize the latest GPUs. Some games are even worse than what I experienced in COD:AW.

Like this:
https://www.youtube.com/watch?v=ZQE6p5r1tYE
 
In what way can you not fully use it? You are spreading information that is just plain wrong.

The only difference is accesses are somewhat slower for every frame on the resources in that section of memory... It doesn't vary frame to frame and cause stutter. But given that the least important stuff for performance is kept there you won't notice it.

So people with stuttering are lying? Blindly following isn't good.
 

So people with stuttering are lying? Blindly following isn't good.

For real... I see it happen a lot. I really don't understand the undying loyalty to one brand. I've bought about one card from each generation of both Nvidia and AMD/ATI since 2004. I've had tons of video cards and never stick to one team over another. Nvidia's new cards looked great, so I figured I'd give them a try. I'm just glad I got a full refund.
 

So people with stuttering are lying? Blindly following isn't good.

No, people with stuttering likely have an unrelated issue. You really don't see that as a just-as-likely if not more probable possibility?

It sounds like you are blindly following the assumption that the memory configuration is the cause. Two way street.

The only way for someone to know more definitively (before spouting misinformation) is to compare such stutters to the same test at the same settings on a 980.
 
Ok so I wanted to do a test with Dragon Age since I played the fuck outta it (usually with 2xmsaa @ 1440p).

So I decided to run 4xMSAA @ 1440p with everything maxed... and wow, yeah, I can notice the stuttering when getting above 3.5GB. I honestly thought this could only happen on certain cards/systems, but damn, I can actually reproduce it... Crazy indeed.

So the reason this isn't a problem @ 4K is because the GPU is already pegged at 99%? But if the GPU is around 70-80% usage and over 3.5GB, I get horrible stuttering... wow, this is crazy. I was going to say it only happens in certain situations, but with more and more games needing more VRAM, this is going to be a serious issue in the near future.
 
No, people with stuttering likely have an unrelated issue. You really don't see that as a just-as-likely if not more probable possibility?

It sounds like you are blindly following the assumption that the memory configuration is the cause. Two way street.

The only way for someone to know more definitively (before spouting misinformation) is to compare such stutters to the same test at the same settings on a 980.

RainStryke just posted this in another thread. https://www.youtube.com/watch?v=ZQE6p5r1tYE I am sure going over 3.5 isn't the problem here.
 
Ok so I wanted to do a test with Dragon Age since I played the fuck outta it (usually with 2xmsaa @ 1440p).

So I decided to run 4xMSAA @ 1440p with everything maxed... and wow, yeah, I can notice the stuttering when getting above 3.5GB. I honestly thought this could only happen on certain cards/systems, but damn, I can actually reproduce it... Crazy indeed.

So the reason this isn't a problem @ 4K is because the GPU is already pegged at 99%? But if the GPU is around 70-80% usage and over 3.5GB, I get horrible stuttering... wow, this is crazy. I was going to say it only happens in certain situations, but with more and more games needing more VRAM, this is going to be a serious issue in the near future.

Well according to bluestorm something else is your issue. lol
 
Maybe Nvidia could save face by marketing it as a 3.5GB card that happens to have 4GB of ram on it for no reason...
LOL!
 
Well according to bluestorm something else is your issue. lol

Ahhh, I've had him on ignore for years.

It's hard to find people who aren't biased anymore. I would think most people in this thread know who is bullshitting and who isn't. Once you get a rep on here for being a certain way, it never goes away. At least to those of us who have been here for years.
 
[From Another Forum] Nvidia could have simply said "UP TO 4GB" like boost clocks. Problem solved. LOL.

Laughable that some are claiming this is anything other than what it is: shady, fraudulent BS of the highest order.
 
Ahhh, I've had him on ignore for years.

It's hard to find people who aren't biased anymore. I would think most people in this thread know who is bullshitting and who isn't. Once you get a rep on here for being a certain way, it never goes away. At least to those of us who have been here for years.

My apologies. Some people have engineering backgrounds and can speak intelligently and accurately to the issues discussed. If facts get in the way of your worldview, it's completely understandable why you would put me on ignore.

I, on the other hand, would prefer to discuss on the merits and impact of the issue.
I can't help that misinformation and people floating bogus data on every forum bother me. Take the time to learn the facts and how these systems work, rather than immediately rushing to grab your torch and pitchfork.
 

That guy says: "All I did was go from SMAA to MSAA x4 to go from <3.5 to >3.5GB."

How can we seriously know that the different AA isn't the cause of his stuttering? (Which was hard to really discern anyway because the whole recording was messed up.)

So people with stuttering are lying? Blindly following isn't good.

Some people are reporting no performance issues or stuttering at all. How could that be possible if this is a hardware issue and all hardware is the same?
 
That guy says: "All I did was go from SMAA to MSAA x4 to go from <3.5 to >3.5GB."

How can we seriously know that the different AA isn't the cause of his stuttering? (Which was hard to really discern anyway because the whole recording was messed up.)



Some people are reporting no performance issues or stuttering at all. How could that be possible if this is a hardware issue and all hardware is the same?

I used that as a reference point to show you one... of the MANY that have been having these issues. It's not just one game or just some setting in a controlled environment. This issue is happening with a lot of users in many different forums. Just look at how many hits that guy has gotten in such a short amount of time. A lot of people are looking this up, because they are having the issue as well.

It's funny how people think you can't utilize 4GB of RAM with these newer titles. Hell.. Titan users have seen 6GB of usage @ 1080p on multiple 2014 games. If the VRAM is there, these newer games are going to utilize all of it.

BTW, I had my issues with the optimized settings in COD:AW... I didn't push for extras. My system was completely stock, nothing overclocked. I did a lot of troubleshooting. I had just switched from an R9 290X to a pair of EVGA GTX 970 SSC (3975) cards. I changed damn near everything on my system while troubleshooting. I switched out three different power supplies, going from a Corsair HX750 to an EVGA 850 G2 to a Seasonic X-1250. I changed motherboards, from an Asus P8Z77-M PRO to an MSI Z87 MPOWER MAX, and of course changed processors, from an i7 3770K to an i7 4790K. Nothing changed; the problem got worse the longer I used the cards. The coil whine never "burned in" like they claimed it would... I ran 3DMark on those things like 5 times. I also had SLI issues: I couldn't use the rear SLI plugs because they would create black lines across the screen when running benchmarks. I could use the front ones, though... I even went as far as purchasing Windows 8.1, upgrading from Windows 7, to see if it was some kind of operating system issue.
 
I used that as a reference point to show you one... of the MANY that have been having these issues. It's not just one game or just some setting in a controlled environment. This issue is happening with a lot of users in many different forums. Just look at how many hits that guy has gotten in such a short amount of time. A lot of people are looking this up, because they are having the issue as well.

It's funny how people think you can't utilize 4GB of RAM with these newer titles. Hell.. Titan users have seen 6GB of usage @ 1080p on multiple 2014 games. If the VRAM is there, these newer games are going to utilize all of it.

BTW, I had my issues with the optimized settings in COD:AW... I didn't push for extras. My system was completely stock, nothing overclocked. I did a lot of troubleshooting. I had just switched from an R9 290X to a pair of EVGA GTX 970 SSC (3975) cards. I changed damn near everything on my system while troubleshooting. I switched out three different power supplies, going from a Corsair HX750 to an EVGA 850 G2 to a Seasonic X-1250. I changed motherboards, from an Asus P8Z77-M PRO to an MSI Z87 MPOWER MAX, and of course changed processors, from an i7 3770K to an i7 4790K. Nothing changed; the problem got worse the longer I used the cards. The coil whine never "burned in" like they claimed it would... I ran 3DMark on those things like 5 times. I also had SLI issues: I couldn't use the rear SLI plugs because they would create black lines across the screen when running benchmarks. I could use the front ones, though... I even went as far as purchasing Windows 8.1, upgrading from Windows 7, to see if it was some kind of operating system issue.
And yet again, what you can't seem to comprehend is that a game allocating VRAM does not mean it NEEDS it. :rolleyes:

The issue with the 970 is when it actually NEEDS VRAM past 3.5GB.
 
Is this dude reading what he is saying?

Allocating is the utilization of that hardware. In other words, your word "NEED" should be replaced by "Utilize", or possibly "Utilized to its full potential." Either way... this dude has just gone full wetawd.
 
And yet again, what you can't seem to comprehend is that a game allocating VRAM does not mean it NEEDS it. :rolleyes:

The issue with the 970 is when it actually NEEDS VRAM past 3.5GB.

So what you are saying is, it isn't an issue until we need that extra 500MB? Come on, man, when you buy a 4GB video card, you expect to use all 4GB without some special allocation on part of it.

Besides, A LOT of the newer games are using more VRAM.

I can tell you this: if GTA V on the PC uses a lot of VRAM and this stuttering issue is still going on, you are going to have A LOT of pissed-off 970 users.

GTA V on the PC is a game A LOT of people are looking forward to. Let's not even bring up The Witcher 3.

If I can easily pass 3.5GB in Dragon Age @ 1440p maxed out... well, there are going to be a lot of issues in the near future.
 
Is this dude reading what he is saying?

Allocating is the utilization of that hardware. In other words, your word "NEED" should be replaced by "Utilize", or possibly "Utilized to its full potential." Either way... this dude has just gone full wetawd.
Please stop making a fool of yourself.
 
So what you are saying is, it isn't an issue until we need that extra 500MB? Come on, man, when you buy a 4GB video card, you expect to use all 4GB without some special allocation on part of it.

Besides, A LOT of the newer games are using more VRAM.

I can tell you this: if GTA V on the PC uses a lot of VRAM and this stuttering issue is still going on, you are going to have A LOT of pissed-off 970 users.

GTA V on the PC is a game A LOT of people are looking forward to. Let's not even bring up The Witcher 3.

If I can easily pass 3.5GB in Dragon Age @ 1440p maxed out... well, there are going to be a lot of issues in the near future.
Pay attention to the context. I am not defending Nvidia at all. All I was saying to him is that just because a game allocates all 4GB does not mean he will have an issue. AGAIN, the issue is when more than 3.5GB is really needed, and that is not going to happen at playable settings on a single 970.
 
Solution found!
 
Context? Then you toss around the word allocation like it has a specific meaning... like dynamic or static. Your context isn't specific enough for people to even understand what you're trying to explain. The problem isn't unused allocated memory. The problem is that the game fully utilizes the allocated RAM it's been allowed to use. Stop with the semantics and understand the issue here. It's a fact: people are getting micro-stuttering issues, even at 1080p settings on a GTX 970.

Here is my source for the memory usage claim:
http://www.guru3d.com/articles_page..._graphics_performance_benchmark_review,8.html
 