HardOCP looking into the 970 3.5GB issue?

Status
Not open for further replies.
Right, because why would someone register an account over a year and a half ago, not make a single post, and then finally, when shit hits the fan, the FIRST EVER post he makes on this forum is an offer to help.

Sounds more like a PR damage control stunt to me. Hell, I could say the exact same things, have people PM me, only to tell them a week later that I "tried my best but the vendors would not see it our way". Weasel words like "offer to help" and "can't guarantee anything" allow plausible deniability, which is exactly what nVidia is after at this point.
 
everyone seems to be in panic mode or over-reacting about this...Nvidia lied about the specs but they didn't lie about the performance or the price/performance ratio...from AnandTech to Guru3D, everyone is saying the same thing...r-e-l-a-x

"Guru3D The Bottom line

Utilizing graphics memory after 3.5 GB can result into performance issues as the card needs to manage some really weird stuff in memory, it's nearly load-balancing...But fact remains it seems to be handling that well, it’s hard to detect and replicate oddities...If you unequivocally refuse to accept the situation at hand, you really should return your card and pick a Radeon R9 290X or GeForce GTX 980...However, if you decide to upgrade to a GTX 980, you will be spending more money and thus rewarding Nvidia for it...Until further notice our recommendation on the GeForce GTX 970 stands as it was, for the money it is an excellent performer...But it should have been called a 3.5 GB card with a 512MB L3 GDDR5 cache buffer

http://www.guru3d.com/news-story/middle-earth-shadow-of-mordor-geforce-gtx-970-vram-stress-test.html
 
There is not much point getting a refund IMO... The GTX 980 is a lot more money for an extra 0.45 GB and 10% performance, and the 290/290X offer similar performance to a 970 at a lot more power/heat... I would keep the 970 until something actually worth swapping for comes out. It's not ideal that nVidia did this, but I think people are being a bit hysterical about it... The GTX 970 still has 3.5 GB of full-speed memory, which is more than a 780 Ti and every other graphics card except the Titan/290X. I would be happy with a free game or a 15-20% refund, but I can't see that happening either..
 
So far, no one has reported successfully returning the card, only rejections. There are some reports of Amazon issuing a refund, but that is all, as far as I've seen.

I can definitely see Amazon issuing a refund, exchange or credit...Amazon is the easiest, most consumer friendly company ever...luckily I bought my GTX 970 from Amazon but I'm not sure it's worth it to return it and wait for the next-gen of cards or even worse, go back to my older GTX 580
 
There is not much point getting a refund IMO... The GTX 980 is a lot more money for an extra 0.45 GB and 10% performance, and the 290/290X offer similar performance to a 970 at a lot more power/heat... I would keep the 970 until something actually worth swapping for comes out. It's not ideal that nVidia did this, but I think people are being a bit hysterical about it... The GTX 970 still has 3.5 GB of full-speed memory, which is more than a 780 Ti and every other graphics card except the Titan/290X. I would be happy with a free game or a 15-20% refund, but I can't see that happening either..

Right, but see I'm pissed because I originally was debating between 2x 980s or 2x 970s, with the caveat being that if I went 2x 970s, I'd upgrade again this year for something better.

Had I known about the 970 issues back then, I would've undoubtedly gone for 2x 980s and held out until 2016 when we finally transition off 28nm, but because of nVidia's deception I was robbed of an informed decision, and that's what I'm most pissed about.
 
everyone seems to be in panic mode or over-reacting about this...Nvidia lied about the specs but they didn't lie about the performance or the price/performance ratio...from AnandTech to Guru3D, everyone is saying the same thing...r-e-l-a-x

"Guru3D The Bottom line

Utilizing graphics memory after 3.5 GB can result into performance issues as the card needs to manage some really weird stuff in memory, it's nearly load-balancing...But fact remains it seems to be handling that well, it’s hard to detect and replicate oddities...If you unequivocally refuse to accept the situation at hand, you really should return your card and pick a Radeon R9 290X or GeForce GTX 980...However, if you decide to upgrade to a GTX 980, you will be spending more money and thus rewarding Nvidia for it...Until further notice our recommendation on the GeForce GTX 970 stands as it was, for the money it is an excellent performer...But it should have been called a 3.5 GB card with a 512MB L3 GDDR5 cache buffer

http://www.guru3d.com/news-story/middle-earth-shadow-of-mordor-geforce-gtx-970-vram-stress-test.html

Flawed test, covered here http://hardforum.com/showthread.php?t=1849838&page=16

They lied about specs which have a serious effect on performance, something that wasn't tested for at launch and is more and more likely to become evident as newer, more memory-demanding titles come out.
 
Right, but see I'm pissed because I originally was debating between 2x 980s or 2x 970s, with the caveat being that if I went 2x 970s, I'd upgrade again this year for something better.

Had I known about the 970 issues back then, I would've undoubtedly gone for 2x 980s and held out until 2016 when we finally transition off 28nm, but because of nVidia's deception I was robbed of an informed decision, and that's what I'm most pissed about.

Yes, your situation is more annoying than mine, but I am a bit pissed off as well, because if it had been advertised as 3.5 GB when I was deciding between a 970 or a 290 for £100 less, I would have bought the 290... But oh well, the 970 is still good for 1080p, though I will need a new GPU when I want to go to 1440p... Which I would have probably got anyway (AMD FreeSync), but still, it is annoying and may reduce the resale value of the 970... Especially annoying for people who bought GTX 970 SLI.
 
Any thoughts of if the GTX 960 would be adequate for current games at 1920x1200?

If the GTX 970 isn't going to do what I bought it for, namely Surround high resolution gaming, then it might go back to nVidia. 960 is about $150 cheaper but would it do well?

I haven't gotten three large monitors yet, so I'm still doing single monitor gaming.
 
No major site has called out the fact that the card runs in 224-bit mode 99% of the time, and isn't actually (224+32)-bit until stutter mode kicks in?

Edit: or maybe the 7th port is cleverly split across half of vram chips #7 and #8 mitigating the effect, but that strikes me as unlikely.
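To put numbers on that 224-bit point: a rough back-of-envelope sketch, assuming the commonly reported GTX 970 figures (7 GHz effective GDDR5, eight 32-bit memory controllers, seven of them serving the 3.5 GB segment):

```python
# Back-of-envelope GTX 970 per-segment bandwidth.
# Assumed figures (widely reported, not official nVidia numbers):
# 7 Gbit/s effective GDDR5 data rate, eight 32-bit controllers,
# seven controllers behind the 3.5 GB segment, one behind the 0.5 GB segment.
GDDR5_GBPS_PER_PIN = 7.0
CONTROLLER_WIDTH_BITS = 32

per_controller = GDDR5_GBPS_PER_PIN * CONTROLLER_WIDTH_BITS / 8  # GB/s
fast_segment = 7 * per_controller   # bandwidth of the 3.5 GB segment
slow_segment = 1 * per_controller   # bandwidth of the 0.5 GB segment
advertised = 8 * per_controller     # the "256-bit, 224 GB/s" spec-sheet number

print(per_controller)  # 28.0
print(fast_segment)    # 196.0
print(slow_segment)    # 28.0
print(advertised)      # 224.0
```

So even if the spec-sheet 224 GB/s is technically reachable, the card spends most of its time on the 196 GB/s segment, and anything spilling into the last 0.5 GB crawls along at 28 GB/s.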
 
The silence from some publications is troubling, and the other sites are just giving a little slap on the wrist when they say
"But it performs the same, so why should you care?"

Because it does not perform well at 3.7GB of VRAM usage, because they didn't buy 3.7GB cards, because stutter at 4K is not what anyone wants. Because the design is inherently less future-proof than some were led to believe.
I hate to be that guy, but I truly believe that AMD would catch more flak for this than Nvidia, and that's just journalism bullshit.
 
No major site has called out the fact that the card runs in 224-bit mode 99% of the time, and isn't actually (224+32)-bit until stutter mode kicks in?

Edit: or maybe the 7th port is cleverly split across half of vram chips #7 and #8 mitigating the effect, but that strikes me as unlikely.

Some have covered it, but didn't give Nvidia much flak over it, which is a shame. Mainstream sites have been pretty poor at covering this. I'm hoping they are doing more thorough testing and will publish some results soon. We could've expected as much from the likes of AnandTech, TomsHardware and similar, unfortunately. I expected better testing from Hardwarecanucks, though.

I'm hoping that the silence from HardOCP means they will come up with some in-depth coverage on this.


The memes are here, of course :)
 
NVIDIA lied. They deserve everything they're getting right now. THAT kind of company doesn't make THIS kind of mistake. That being said...
It's true that the cards perform just like the reviews said they would. So what's the problem?
The reason everyone is mad is...
They thought they were getting a 64-ROP card with a fully accessible 4 GB that they could overclock to get 980-like performance for cheap. This faux outrage at NVIDIA's prevarication is really nerd rage at not being able to cheat the system by getting flagship performance for midrange money.
 
NVIDIA lied. They deserve everything they're getting right now. THAT kind of company doesn't make THIS kind of mistake. That being said...
It's true that the cards perform just like the reviews said they would. So what's the problem?
The reason everyone is mad is...
They thought they were getting a 64-ROP card with a fully accessible 4 GB that they could overclock to get 980-like performance for cheap. This faux outrage at NVIDIA's prevarication is really nerd rage at not being able to cheat the system by getting flagship performance for midrange money.

Yeah, stupid customers for believing they were getting what the specification stated. They should have known it was too good to be true. Suckers.
 
The reason everyone is mad is...
They thought they were getting a 64-ROP card with a fully accessible 4 GB that they could overclock to get 980-like performance for cheap. This faux outrage at NVIDIA's prevarication is really nerd rage at not being able to cheat the system by getting flagship performance for midrange money.
You can't be serious. People paid for a product that was advertised as having X features at Y price. The company didn't actually deliver said features. The outrage is entirely justified, as people made purchasing decisions based on incorrect information.
 
NVIDIA lied. They deserve everything they're getting right now. THAT kind of company doesn't make THIS kind of mistake. That being said...
It's true that the cards perform just like the reviews said they would. So what's the problem?
The reason everyone is mad is...
They thought they were getting a 64-ROP card with a fully accessible 4 GB that they could overclock to get 980-like performance for cheap. This faux outrage at NVIDIA's prevarication is really nerd rage at not being able to cheat the system by getting flagship performance for midrange money.

Nerd rage is not wanting to be lied to about the specs when you purchase a component?
 
The reason everyone is mad is...
They thought they were getting a 64-ROP card with a fully accessible 4 GB that they could overclock to get 980-like performance for cheap.

Yeah...seemed like a great deal at the time. Not so much now if you play at high res/high VRAM games (like most current-gen console ports).

Are you really surprised people are upset?

As far as "performing how it performed in benchmarks"...I guarantee that few (if any) reviews specifically looked at high VRAM use cases coupled with frametime measurements, which is what's required to see this issue.
 
Yeah...seemed like a great deal at the time. Not so much now if you play at high res/high VRAM games (like most current-gen console ports).

Are you really surprised people are upset?

As far as "performing how it performed in benchmarks"...I guarantee that few (if any) reviews specifically looked at high VRAM use cases coupled with frametime measurements, which is what's required to see this issue.

THANK YOU

Hopefully those "but you knew how these performed and still bought them anyway" shills will keep quiet now
 
They were reviewed ad nauseam at multiple resolutions.
Anyone who bought a 970 is going to get that performance.

What you're NOT going to be able to do is get 980-like performance simply by overclocking.
You are NOT going to drive your rich man's monitor with a poor man's card.
I wish NVIDIA had been honest. It's actually a genius approach.
 
They were reviewed ad nauseam at multiple resolutions.
Anyone who bought a 970 is going to get that performance.

What you're NOT going to be able to do is get 980-like performance simply by overclocking.
You are NOT going to drive your rich man's monitor with a poor man's card.
I wish NVIDIA had been honest. It's actually a genius approach.

You say that now, but all the reviews that were available prior to nVidia coming clean about their deception indicated it was indeed possible. So yes, all 970 buyers were deceived and are rightfully outraged.
 
They were reviewed ad nauseam at multiple resolutions.
Anyone who bought a 970 is going to get that performance.

What you're NOT going to be able to do is get 980-like performance simply by overclocking.
You are NOT going to drive your rich man's monitor with a poor man's card.
I wish NVIDIA had been honest. It's actually a genius approach.

- they were reviewed without paying attention to memory usage
and/or
- they were reviewed without paying attention to memory usage and frametimes
and/or
- they were reviewed with games that didn't have today's and future memory requirements
and/or
...

People want to get full 4 GB 256-bit 224 GB/s 64 ROPs performance as advertised, that is all.
 
Keep crying.

It's a midrange card.
Sold for midrange money.
Intended for midrange resolutions at midrange settings. LOL

Just get the refund and buy AMD next time.
Oh, and get the right level of gpu for the money.
Overclocking in order to overutilize is always a crapshoot, and NVIDIA trolled all y'all real good this time.
 
LOL, so your only response is "you should've known better". I see a bright future at nVidia for you; you should check if they're hiring right now.

And I guess you also run K and X CPUs at stock too right?

I would certainly press for a refund had I not spent $350 on watercooling these cards. But of course, I should've known better and it's my fault. :rolleyes:
 
Keep crying.

It's a midrange card.
Sold for midrange money.
Intended for midrange resolutions at midrange settings. LOL

Just get the refund and buy AMD next time.
Oh, and get the right level of gpu for the money.
Overclocking in order to overutilize is always a crapshoot, and NVIDIA trolled all y'all real good this time.

Nobody here is talking about overclocking...

If you bothered to actually read about the issue, you would understand instead of trying to stir shit up.
 
The biggest jab is a high-end x70 card shipping with an unbalanced vram setup, whether it was disclosed or not, that's just a joke. The take-away is that x70 is no longer high-end ... like wrangler said above, it's a mid-range card, the new 960Ti if you will.
 
The biggest jab is a high-end x70 card shipping with an unbalanced vram setup, whether it was disclosed or not, that's just a joke. The take-away is that x70 is no longer high-end ... like wrangler said above, it's a mid-range card, the new 960Ti if you will.

The take-away is, nV wasn't forthcoming and didn't bother to correct any of the reviews or the data they sent to review sites.

Yes, the performance of the card is fantastic currently. But, what happens when you actually need more than 3.5GB of VRAM? All I see are benchmarks with crazy amounts of DSR and AA while expecting things to remain playable.

For what I play at 1440p, I haven't had any issues. Then again, I'm not using DSR or most levels of AA unless it's a relatively old title.
 
Nobody here is talking about overclocking...

If you bothered to actually read about the issue, you would understand instead of trying to stir shit up.

This.

Forget overclocking for a second: the 970 is on average about 15% behind the 980, so if the 970 is only getting 30 FPS at a certain resolution, the 980 isn't going to magically get 60 FPS. No, it's going to choke too.

The issue here, particularly if you run these cards in SLI, is that in situations where both the 970 and 980 have enough GPU power to push 60 frames, the 970 may give terrible stuttering due to hitting the 3.5GB VRAM wall, whereas the 980 will simply truck on.

This 3.5GB invisible wall was never disclosed until now, and is pretty much a killer, especially at 4K. Or even at 1080p for certain games like Watch Dogs and Shadow of Mordor.
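The 15% point above is just arithmetic, and it's worth spelling out. A quick sketch, assuming a flat 15% average advantage for the 980 (an assumption for illustration, not a measured number for any specific game):

```python
# If the 970 is GPU-limited at some frame rate, a card that is ~15% faster
# does not get anywhere near double the FPS. Assumed uplift: 15%.
UPLIFT = 1.15

def gtx980_estimate(gtx970_fps: float) -> float:
    """Naive GPU-limited estimate: scale 970 FPS by the assumed 980 uplift."""
    return round(gtx970_fps * UPLIFT, 1)

print(gtx980_estimate(30.0))  # 34.5 -- nowhere near 60
print(gtx980_estimate(50.0))  # 57.5
```

In other words, when raw GPU power is the bottleneck, both cards choke together; the cases where they diverge sharply are the VRAM-bound ones.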
 
This card was not intended for 4K.
Mentioning the 970 and 4K in the same sentence is moronic.

The situations at 1080 where it runs into the buffer will require that you lower the settings.
To midrange. LOL.
 
This card was not intended for 4K.
Mentioning the 970 and 4K in the same sentence is moronic.

The situations at 1080 where it runs into the buffer will require that you lower the settings.
To midrange. LOL.

What is your objective in this thread? I'd like to know.
 
Keep crying.

It's a midrange card.
Sold for midrange money.
Intended for midrange resolutions at midrange settings. LOL

Just get the refund and buy AMD next time.
Oh, and get the right level of gpu for the money.
Overclocking in order to overutilize is always a crapshoot, and NVIDIA trolled all y'all real good this time.

Yeah, I'm not sure if you're serious.

No one here is complaining about the card's performance in most situations. That was, and still remains, very good. If you'd take 5 minutes and actually read through one of the articles floating around the web, you'd see that the real issue here is Nvidia sold huge amounts of these cards over the past 4 months while falsely claiming the card had higher specs than it does. THAT IS A PROBLEM. PERIOD.

I knew I was buying mid-range cards, but that is really irrelevant. We were given false specs for the mid-range cards. Based on the specs that were given, I thought SLI 970s would be the perfect solution for what I'd be playing for quite some time (1440p, high-max settings, some AA). It is now proven that this is simply not going to be the case. Had I known the real specs a few months ago, I would have likely gone with a single 980 with the option to SLI when the price drops a bit.
 
This card was not intended for 4K.
Mentioning the 970 and 4K in the same sentence is moronic.

The situations at 1080 where it runs into the buffer will require that you lower the settings.
To midrange. LOL.

Nice backfire lol.
It would be a great card in SLI for 4K had the specs been correct.
 
This card was not intended for 4K.
Mentioning the 970 and 4K in the same sentence is moronic.

The situations at 1080 where it runs into the buffer will require that you lower the settings.
To midrange. LOL.

If the 970 isn't intended for 4K, then neither is the 980, as both are "midrange" by your own definition. So what's your point?
 
Totally serious.

Look. NVIDIA lied. I do not dispute that. I've read every word of the AT article, glanced at the Tech Report stuff and have been reading this thread since it started.

But. Was there a more reviewed card than the 970? Maybe. But not by much.
The performance was there for everyone to see.
People got what they paid for. The constant talk about 4K and special situations at 1080 don't justify the response that I see. IMHO... ok... IMHO

People want to get 780 ti performance from a 780. They want to get 290x performance from a 290. We've been spoiled to expect that for many years now.
I think I see this for what it is.
NVIDIA found a way to end that and tried to slip it through. People found out and now they're mad.
That's my thoughts on the subject.
 
The silence from some publications is troubling, and the other sites are just giving a little slap on the wrist when they say
"But it performs the same, so why should you care?"

Because it does not perform well at 3.7GB of VRAM usage, because they didn't buy 3.7GB cards, because stutter at 4K is not what anyone wants. Because the design is inherently less future-proof than some were led to believe.
I hate to be that guy, but I truly believe that AMD would catch more flak for this than Nvidia, and that's just journalism bullshit.

Maybe - just maybe - some publications are noticing that their readers have the pitchforks and torches out, and know that if they don't get their facts and tests all perfectly done, they'll be burned in effigy for it. So either they're staying out of the fray entirely, or they are taking the time to do proper analysis, which takes longer than a couple of hours on Monday afternoon.
 
Totally serious.

Look. NVIDIA lied. I do not dispute that. I've read every word of the AT article, glanced at the Tech Report stuff and have been reading this thread since it started.

But. Was there a more reviewed card than the 970? Maybe. But not by much.
The performance was there for everyone to see.
People got what they paid for. The constant talk about 4K and special situations at 1080 don't justify the response that I see. IMHO... ok... IMHO

People want to get 780 ti performance from a 780. They want to get 290x performance from a 290. We've been spoiled to expect that for many years now.
I think I see this for what it is.
NVIDIA found a way to end that and tried to slip it through. People found out and now they're mad.
That's my thoughts on the subject.

And the key difference is: aside from GPU core config, the memory subsystem was completely untouched in the examples you provided.

Once again, it's not a matter of GPU power; I don't know how many times I have to say this. Look, I know the 970 is 15% behind the 980, and I'm totally fine with that because I always go SLI anyway. The issue here is that in situations where both the 970 and 980 have enough GPU power to push 60 frames at whatever resolution, the 970 will still crap out prematurely because of the 3.5GB VRAM wall, whereas the 980 won't.

In terms of GPU power, we're talking a difference of 15% here. If the 970 is getting 50 FPS, the 980 might get 60. If the 970 is getting 30 FPS, the 980 sure as hell won't be anywhere near 60. It's not a day and night difference.
 
Totally serious.

Look. NVIDIA lied. I do not dispute that. I've read every word of the AT article, glanced at the Tech Report stuff and have been reading this thread since it started.

But. Was there a more reviewed card than the 970? Maybe. But not by much.
The performance was there for everyone to see.
People got what they paid for. The constant talk about 4K and special situations at 1080 don't justify the response that I see. IMHO... ok... IMHO

People want to get 780 ti performance from a 780. They want to get 290x performance from a 290. We've been spoiled to expect that for many years now.
I think I see this for what it is.
NVIDIA found a way to end that and tried to slip it through. People found out and now they're mad.
That's my thoughts on the subject.

I would actually prefer you were trolling than believe someone could write this seriously.
 