HardOCP looking into the 970 3.5GB issue?

Status
Not open for further replies.
People will just ignore it and scream conspiracy.

It is a simple thing, and it also shows how they cut some corners on cheaper binned parts while still keeping performance up.

"Those are also average framerates, which don’t address the problem some commenters have pointed out: dramatic framerate stutter at the moment the GTX 970 starts utilizing its final 500MB of VRAM."


Even if we ignore all that, they've lied about the specs for months. There is no excuse for that.
 
It blows my mind that people are actually defending nVidia about this.

Even if we assume that there is very little performance impact when using >3.5GB of VRAM, there is really no denying that they published the wrong specs (whether intentionally or not, doesn't really matter) and then those specs stuck around for MONTHS afterwards with no one being like "Hey, isn't that incorrect?". You can bet if those particular specs were something easily measurable by the community they would have been exposed within days. However, since it's an architecture issue there's really no confirmation other than this issue that we are seeing indirectly with >3.5GB situations.

Is this a big deal in terms of performance? Maybe, maybe not. Did nVidia purposefully lie about the specs? Maybe, maybe not. Doesn't really matter...what matters is that nVidia undeniably fucked up in one or more ways and it sounds like it's unfixable.

If there's an issue with stuttering, regardless of FPS, then that IS a performance impact and it absolutely IS a problem. Buying a 4 GB card and then getting horrible stutter if you have the audacity to use all 4 GB is bullshit.

I have been a pretty big fan of nVidia in recent years, but they fucked up here, plain and simple. What remains is to see what, if anything, they do about it.
 
Blah, blah, blah: they lied. Who cares, the GTX 970 was cheap as dirt for the kind of performance it offers. Now the question is where are the real world benchmarks using VRAM intensive applications like X-Plane 10. If my job were to report on stuff like this I could have it done in a couple of hours. It's been days.
 
It blows my mind that people are actually defending nVidia about this.

Even if we assume that there is very little performance impact when using >3.5GB of VRAM, there is really no denying that they published the wrong specs (whether intentionally or not, doesn't really matter) and then those specs stuck around for MONTHS afterwards with no one being like "Hey, isn't that incorrect?". You can bet if those particular specs were something easily measurable by the community they would have been exposed within days. However, since it's an architecture issue there's really no confirmation other than this issue that we are seeing indirectly with >3.5GB situations.

Is this a big deal in terms of performance? Maybe, maybe not. Did nVidia purposefully lie about the specs? Maybe, maybe not. Doesn't really matter...what matters is that nVidia undeniably fucked up in one or more ways and it sounds like it's unfixable.

If there's an issue with stuttering, regardless of FPS, then that IS a performance impact and it absolutely IS a problem. Buying a 4 GB card and then getting horrible stutter if you have the audacity to use all 4 GB is bullshit.

I have been a pretty big fan of nVidia in recent years, but they fucked up here, plain and simple. What remains is to see what, if anything, they do about it.

Thank you for this. Really - thank you. Your post is spot on. I'll also add that Nvidia wasn't exactly forthcoming about their memory segmentation either. I can't believe people are defending nVidia on this. It also blows my mind that the part about the 500MB being dramatically slower only just came to light. I knew there was a reason the driver only allowed using the last 500MB if it absolutely HAD to.
 
While I don't own a GTX 970, it is my firm belief that if a manufacturer is going to market a 4GB card, all 4GB must be accessible and run at the same speed, partitioned or not. My guess (not fact, as I haven't researched it) is that failing to fully disclose a product's specifications to the public probably goes against each state's consumer-protection laws.
 
If there's an issue with stuttering, regardless of FPS, then that IS a performance impact and it absolutely IS a problem. Buying a 4 GB card and then getting horrible stutter if you have the audacity to use all 4 GB is bullshit.

Has there been any proof yet that this causes stuttering?
 
Has there been any proof yet that this causes stuttering?

A lot of anecdotal evidence from what I have seen, but I would for sure like to see some actual frametime measurements from legitimate sites ([H], wink wink nudge nudge?).

However, I don't personally know for sure which is why I prefaced the statement with "If". :D

Just so far I have been seeing various users / nVidia saying the performance impact is "minimal" (1-6%-ish?), but in relation to framerates and not frametimes.
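The framerate-vs-frametime distinction is easy to show with a minimal sketch (Python; the timestamp log below is hypothetical, FRAPS-style per-frame present times in milliseconds):

```python
# Minimal frametime analysis: the average FPS can look fine while a
# single long frame produces visible stutter. The timestamp log below
# is hypothetical (milliseconds per frame-present, FRAPS-style).
timestamps_ms = [0.0, 16.7, 33.4, 50.1, 120.0, 136.7, 153.4]  # one ~70 ms spike

# Per-frame times are the deltas between consecutive timestamps.
frametimes = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

avg_fps = 1000.0 * len(frametimes) / (timestamps_ms[-1] - timestamps_ms[0])
worst = max(frametimes)

print(f"average FPS:     {avg_fps:.1f}")   # ~39 FPS, looks playable
print(f"worst frametime: {worst:.1f} ms")  # the spike the average hides
```

A 1-6% hit to the average can coexist with 70 ms spikes, which is exactly why frametime plots and not FPS bars are what's needed here.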
 
While I don't own a GTX 970, it is my firm belief that if a manufacturer is going to market a 4GB card, all 4GB must be accessible and run at the same speed, partitioned or not. My guess (not fact, as I haven't researched it) is that failing to fully disclose a product's specifications to the public probably goes against each state's consumer-protection laws.

Even the speed is wrong. It's not a 4GB 224 GB/s card, but a 3.5GB 196 GB/s one.
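The correction above is back-of-the-envelope arithmetic. A sketch (Python; the 7 Gbps GDDR5 data rate and 32-bit-per-controller width come from the published specs, so treat the exact figures as assumptions):

```python
# GTX 970 bandwidth arithmetic: the advertised figure assumes all eight
# 32-bit memory controllers are active; the fast 3.5GB partition only
# spans seven of them. Figures from public reporting; illustrative only.
MEM_DATA_RATE_GBPS = 7.0        # effective GDDR5 data rate, 7 Gbps
BYTES_PER_CONTROLLER = 32 // 8  # each controller is 32 bits = 4 bytes wide

def bandwidth_gbs(active_controllers):
    """Peak bandwidth in GB/s for the given number of active controllers."""
    return active_controllers * BYTES_PER_CONTROLLER * MEM_DATA_RATE_GBPS

print(bandwidth_gbs(8))  # 224.0 -> the advertised "4GB @ 224 GB/s"
print(bandwidth_gbs(7))  # 196.0 -> the fast 3.5GB partition
```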
 
I personally noticed stuttering with my 970s earlier this month while using them to run my heavily textured Skyrim. The stuttering was very noticeable as soon as 3506MB of VRAM was filled.
 
I personally noticed stuttering with my 970s earlier this month while using them to run my heavily textured Skyrim. The stuttering was very noticeable as soon as 3506MB of VRAM was filled.

The seven 512MB sections of memory in the 3.5GB partition add up to 3584MB of VRAM. I could not tell you why you have experienced stuttering on the 3.5GB portion of the VRAM.
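The segment arithmetic, as a trivial sketch (Python):

```python
# Seven 512MB segments make up the fast partition; the reported stutter
# onset (3506MB) is still below that boundary, which is the puzzle here.
sections, section_mb = 7, 512
fast_partition_mb = sections * section_mb
print(fast_partition_mb)         # 3584
print(3506 < fast_partition_mb)  # True: onset inside the fast pool
```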
 
Yeah, that is kinda like being sold a V8 and getting a V6.

More like thinking you bought a V8, you look under the hood and it looks like a V8, Chevy told you it was a V8, but when you're doing 90 miles an hour 2 cylinders stop firing.
 
Why are the Nvidia execs selling their shares?

Could it be that they knew this would be revealed?

Last year, between April and most of November, each transaction of NVDA typically involved less than 3,000 shares.

Within the past two months, Nvidia insiders have sold at least $4 million worth of shares.

David Shannon
01/23/2015 - Sold 10,000 Shares
01/21/2015 - Sold 21,400 Shares
01/02/2015 - Sold 8,600 Shares
Approximately $800,000

JONES HARVEY C
12/22/2014 - Sold 88,000 Shares
Approximately 2 Million Dollars

PURI AJAY K
12/03/2014 - Sold 24,250 Shares
11/28/2014 - Sold 34,174 Shares
Approximately 1.2 Million Dollars

You cannot deny that insider trading has occurred.

Source: http://www.nasdaq.com/symbol/nvda/insider-trades
 
While I don't own a GTX 970, it is my firm belief that if a manufacturer is going to market a 4GB card, all 4GB must be accessed and run at the same speed; whether partitioned or not. My guess - not fact as I have not researched - is that failure to provide the public full disclosure of a product probably goes against each state's Consumer Protection laws.

This. People expect 256 bit 4GB to act like 4GB, not 224 bit 3.5 + an unwanted half hanging off like side-port memory. Despite my serious stack of nVidia cards, and my disdain for this move by nVidia, by the letter of the wording I don't think they've technically lied -- the card does have 256 bits and 4GB ... just not arranged the way everyone is used to ... it's a crap move.

Well, hopefully all future reviews check for (lack of) segmentation and confirm uniform memory access!!
 
Why are the Nvidia execs selling their shares?

Could it be that they knew this would be revealed?

Last year, between April and most of November, each transaction of NVDA typically involved less than 3,000 shares.

Within the past two months, Nvidia insiders have sold at least $4 million worth of shares.

David Shannon
01/23/2015 - Sold 10,000 Shares
01/21/2015 - Sold 21,400 Shares
01/02/2015 - Sold 8,600 Shares
Approximately $800,000

JONES HARVEY C
12/22/2014 - Sold 88,000 Shares
Approximately 2 Million Dollars

PURI AJAY K
12/03/2014 - Sold 24,250 Shares
11/28/2014 - Sold 34,174 Shares
Approximately 1.2 Million Dollars

You cannot deny that insider trading has occurred.

Source: http://www.nasdaq.com/symbol/nvda/insider-trades
You've just crossed over into tin-foil territory, my friend. Take the helmet off.
 
This. People expect 256 bit 4GB to act like 4GB, not 224 bit 3.5 + an unwanted half hanging off like side-port memory. Despite my serious stack of nVidia cards, and my disdain for this move by nVidia, by the letter of the wording I don't think they've technically lied -- the card does have 256 bits and 4GB ... just not arranged the way everyone is used to ... it's a crap move.

Well, hopefully all future reviews check for (lack of) segmentation and confirm uniform memory access!!

That's exactly how I feel! People make these kinds of purchases with the future in mind, so while we don't know exactly how future scenarios will affect these cards, it is vital to know the facts upfront.
 
1. No. Performance is performance is performance.
LOL, that is why people have taken this news so well ;)

2. Because comparing the wrong specs against bad source data would not send up any red flags! The team responsible for feeding the incorrect data to review sites was also the one responsible for checking it on review sites against their provided data. They are the technical marketing team that works with reviewers.

It's clearly possible.

You are convinced everyone at Nvidia is innocent, OK, I got it, further discussion is pointless.
 
performance impact or not...Nvidia lied about the specs...they should offer customers the choice of a refund or replacement card
 
AMD's fuckup resulted in consumers getting GPUs that under-performed compared to review samples. They got hardware that throttled more often than review samples did.

There you are spreading that LIE again. As I told you, some cards went out with the wrong fan profile and throttled back. A driver update corrected it.
Here is my proof; now let us see yours.
Link to full article here at [H]
http://www.hardocp.com/article/2014/01/05/amd_radeon_r9_290x_retail_performance_variance_review/1

AMD's official response
Hello, We've identified that there's variability in fan speeds across AMD R9 290 series boards. This variability in fan speed translates into variability of the cooling capacity of the fan-sink. The flexibility of AMD PowerTune technology enables us to correct this variability in a driver update. This update will normalize the fan RPMs to the correct values.

The correct target RPM values are 2200RPM for the AMD Radeon R9 290X "Quiet mode", and 2650RPM for the R9 290. You can verify these in GPU-Z. If you're working on stories relating to R9 290 series products, please use this driver as it will reduce any variability in fan speeds. This driver will be posted publicly tonight


From [H]'s conclusion
We have also learned that the issue is not truly about retail cards versus press sample cards, as some sites would have you believe. The issue is simply the nature of each video card, its manufacturing differences, and tolerances, and the fact of each video card being just a little bit different in so many variables compared to another one. These are things AMD cannot control apparently, and it means there is variance in clock speed since the clock speed is wholly dynamic now.

The Bottom Line
At the end of the day, it is always disappointing to find that something that you spend your hard earned cash on isn't performing "up to" the advertised level of performance due to power or heat factors. For some people, it might be a deal breaker that the Radeon R9 290X doesn't run at 1000MHz on a full time basis regardless of all other parameters. However, after taking a few steps back, we realized that the actual game play experience and best playable settings are not at all impacted by the variations between cards in Uber mode. While a couple of frames per second might be crucial in borderline cases where it could force a game to the next lower settings, in 4 of the most demanding games available on the market today, that’s simply not a problem with the Radeon R9 290X.

For the question around press samples performing differently from retail video cards we do not believe that AMD is sending out non-representative review samples. The performance from GIGABYTE #1 is so close to the press sample cards’ performance that it could easily be in the expected margin of error. However, GIGABYTE #2 does raise a bit of suspicion with its lower performance and observed GPU clocks across the board, but it could also be related to the power delivery mechanisms not being as efficient as designed causing the higher power draw to the GPU. Either way, the same graphics settings and game play experience was enjoyed with GIGABYTE #2 as was with the other three cards.
 
More like thinking you bought a V8, you look under the hood and it looks like a V8, Chevy told you it was a V8, but when you're doing 90 miles an hour 2 cylinders stop firing.

This post wins the thre err I mean car analogy.
 
More like thinking you bought a V8, you look under the hood and it looks like a V8, Chevy told you it was a V8, but when you're doing 90 miles an hour 2 cylinders stop firing.

Pretty much...except you can't technically even look under the hood, you just know some shit is fucked up based on the performance issues. :p
 
Pretty much...except you can't technically even look under the hood, you just know some shit is fucked up based on the performance issues. :p

Having performance issues when going full throttle... oye :eek:
 
More like thinking you bought a V8, you look under the hood and it looks like a V8, Chevy told you it was a V8, but when you're doing 90 miles an hour 2 cylinders stop firing.

Pretty much...except you can't technically even look under the hood, you just know some shit is fucked up based on the performance issues. :p

How about...

More like thinking you bought a V8, Chevy told you it was a V8, and it indeed drives like a V8. But when you take it to the track 2 cylinders stop firing.
 
Even if we assume that there is very little performance impact when using >3.5GB of VRAM, there is really no denying that they published the wrong specs (whether intentionally or not, doesn't really matter) and then those specs stuck around for MONTHS afterwards with no one being like "Hey, isn't that incorrect?". You can bet if those particular specs were something easily measurable by the community they would have been exposed within days. However, since it's an architecture issue there's really no confirmation other than this issue that we are seeing indirectly with >3.5GB situations.

The L2 cache discrepancy is really the only one that would have any impact, since the ROP difference doesn't mean jack in this case. You can get the L2 cache value by querying the card's information, so that info was available to reviewers and consumers from day one and wasn't being hidden.
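For what it's worth, that query is easy to do. A hedged sketch (Python with pycuda, assuming a CUDA-capable system; guarded so it degrades without one -- the 1792KB-vs-2048KB figure for the 970 is from public reporting, not something measured here):

```python
# The L2 cache size is exposed through CUDA's device attributes, so the
# 970's 1792KB (vs the originally advertised 2048KB) was queryable from
# day one. Requires an NVIDIA GPU and pycuda; returns None otherwise.
def l2_cache_kb():
    try:
        import pycuda.driver as cuda
        cuda.init()
        dev = cuda.Device(0)
        size_bytes = dev.get_attribute(cuda.device_attribute.L2_CACHE_SIZE)
        return size_bytes // 1024
    except Exception:
        return None  # no CUDA device, or pycuda not installed

kb = l2_cache_kb()
print(f"L2 cache: {kb} KB" if kb is not None else "no CUDA device available")
```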
 
So the two main defenses for Nvidia lying about the memory bus on a $350 video card:
  • It probably won't be a big deal, since:
    • most games don't use more than 3.5gb
    • you should have known what you were getting into by reading reviews
    • and it's probably irrelevant that all of the review sites were misled with bogus spec sheets and tested old games that require less VRAM.

  • AMD made a driver mistake in one of their cards that caused it to throttle and was fixed with a driver update,
    • (that excuses months of lies from Nvidia)
There are enough anecdotes of people encountering severe stuttering when going over 3.5GB of utilization, in this thread and the other one posted in the main video card forum, that it leads me to seriously question the motivations of the people going to such great lengths to defend Nvidia here. Nvidia lied about their card's memory bus, they lied about its basic specifications, and they allowed sites to publish false information for months without correcting it. That's inexcusable.

Finally: average frame rates, as measured in most reviews, don't adequately represent the loss of immersion caused by stuttering. A game can have high FPS, heavy stuttering, and still be unplayable (Far Cry 4).
 
performance impact or not...Nvidia lied about the specs...they should offer customers the choice of a refund or replacement card

The launch price of the GTX 970 is $330, as opposed to $500 for the GTX 980.

Oh, let me digress for a moment. I distinctly recall that everyone was pleasantly surprised when the GTX 970 didn't launch at the expected $400, because previously, the GTX 780 was $650 while the GTX 770 was $400, the GTX 680 was $500 while the GTX 670 was $400, the GTX 580 was $500 while the GTX 570 was $350, and the GTX 480 was $500 while the GTX 470 was $350.

Looking at the price history of the x70 lines, I could almost suggest that it's like everyone here who has bought a GTX 970 has already received a refund of at least $20 (if not $50 or $70) since there's never been a cheaper x70 card.

Back to my original point. Nvidia's current claim is that the 970 performance drop when using more than 3.5GB is 4%-6% more than a GTX 980's drop in performance at that VRAM usage range. If you wish to argue about the degree of performance penalty, then do please post evidence and I shall adjust this post accordingly.

If a class action lawsuit were to proceed and be awarded against Nvidia, then I would expect that the reward offered to claimants would then be equal to 4%-6% of the price of the GTX 970 purchased. 5% of $330 is $16.50.

Or, should I say that 5% of $400 is $20, or that 6% of $350 is $21? Funny how that works out. You bought the GTX 970 because the price was appealing to you, because it was anywhere from $20 to $70 cheaper than you thought it should be. Now you know why it was so cheap.
 
If a class action lawsuit were to proceed and be awarded against Nvidia, then I would expect that the reward offered to claimants would then be equal to 4%-6% of the price of the GTX 970 purchased. 5% of $330 is $16.50.

Or, should I say that 5% of $400 is $20, or that 6% of $350 is $21? Funny how that works out. You bought the GTX 970 because the price was appealing to you, because it was anywhere from $20 to $70 cheaper than you thought it should be. Now you know why it was so cheap.

Nvidia lied about the specifications of a product...that alone is enough to warrant a full refund...if you lie about the specs of a TV, car, stove etc you can be sued regardless of how much real world impact it may or may not have...it's called false advertising or bait and switch...people go by the specs when buying computer hardware probably more so than with any other product

it may not affect real-world performance now, but I plan on keeping my Gigabyte 970 for at least 2+ years...what happens when games do start to utilize more than 3.5GB VRAM (1920 x 1200)?
 
it may not affect real-world performance now, but I plan on keeping my Gigabyte 970 for at least 2+ years...what happens when games do start to utilize more than 3.5GB VRAM?

Games already utilize over 3.5GB of VRAM and the GTX 970 stutters noticeably when that happens.
 
I bet nearly every single person that bought a GTX 970 still would have bought it if it was listed as having 3.5GB. Yes, I understand that it was shady of Nvidia, but I don't think knowing then what you know now would have made any difference, as the card's actual performance and price were what mattered.
 
I bet nearly every single person that bought a GTX 970 still would have bought it if it was listed as having 3.5GB. Yes, I understand that it was shady of Nvidia, but I don't think knowing then what you know now would have made any difference, as the card's actual performance and price were what mattered.

You're wrong. I was just about to pull the trigger on a GTX 970 SLI setup precisely because the 3gb on my GTX 780s are getting maxed out in current games and I wanted something with more memory headroom. I have actually been happy with the GPU performance. I just wanted more memory, due to Far Cry 4 and Titanfall stuttering with textures maxed, and GTA V coming up.

It would have cost me ~$240 to upgrade after selling my GTX 780s. If I had spent all that money and found I was only getting an extra 512MB of memory, I would have been furious. As it stands, I'm furious that Nvidia has falsely advertised the bus speed and memory bandwidth in the first place. For months, they turned a blind eye to bogus information on review sites. Shameful.
 
You're wrong. I was just about to pull the trigger on a GTX 970 precisely because the 3gb on my GTX 780s are getting maxed out in current games and I wanted something with more memory headroom (have actually been happy with the GPU performance. I just wanted more memory).

It would have cost me ~$240 to upgrade after selling my GTX 780s. If I had spent all that money and found I was only getting an extra 512MB of memory, I would have been furious. As it stands, I'm furious that Nvidia has falsely advertised the bus speed and memory bandwidth in the first place. For months, they turned a blind eye to bogus information on review sites. Shameful.
Lol I am wrong because YOU said so? Thanks and I will be sure to check in with you before posting my opinion next time. :p

People still bought the 570/470 with 1.25GB of VRAM over the 580/480 because of the price and how it performed. So again, if the 970 had launched listed as having 3.5GB, then MOST people still would have bought it because it was/is a good card for the money...
 
It also shows how you can put 4GB on a GPU, advertise it, and yet not allow the consumer to truly utilize 4GB. It's amazing how they did that! Great job, Nvidia!
 
There are enough anecdotes of people encountering severe stuttering when going over 3.5gbs of utilization in this thread, and the other one posted in the main video card forum, it leads me to seriously question the motivations of the people who are going such great lengths to defend Nvidia in this thread.

The trouble with anecdotal evidence is that it's worthless, especially in regards to stuttering, which is a problem with a billion possible causes. Video cards have suffered severe stuttering long before the GTX 970 was ever released. I'm asking for reproducible evidence that this issue causes stuttering; point me towards some.

This isn't a defense of Nvidia; it's a very significant point that needs to be proven one way or another. If this issue does cause stuttering, it's a very big fucking deal and a serious crippling of the 970. Nvidia is not going to offer proof that it does; we all know they have too big an interest to go looking for it.

You guys have to prove it does cause stuttering, if indeed it does. Nobody else is going to do it for you. You guys have to show non-anecdotal proof that is solid and reproducible or it's not going to stand in class-action court or even in a gaming news article.
 
will MSI Afterburner (or similar programs) only report a max of 3.5GB VRAM usage?...or will that separate allocation of 512MB show up in monitoring programs?
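On the monitoring question, a hedged answer: the allocation counter exposed by the NVIDIA driver covers the whole 4GB pool, so monitoring tools should show usage past 3584MB; they just can't tell you which segment an allocation landed in. A sketch of reading the same counter through NVML (Python with the pynvml package, guarded for systems without an NVIDIA GPU):

```python
# Tools like Afterburner read total VRAM allocation via the driver; NVML
# exposes the same whole-pool counter, with no fast/slow-segment split.
# Requires an NVIDIA driver and the pynvml package; returns None otherwise.
def vram_used_mb():
    try:
        import pynvml
        pynvml.nvmlInit()
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)
        info = pynvml.nvmlDeviceGetMemoryInfo(handle)
        pynvml.nvmlShutdown()
        return info.used // (1024 * 1024)
    except Exception:
        return None  # no NVIDIA GPU, or pynvml not installed

mb = vram_used_mb()
print(f"VRAM used: {mb} MB" if mb is not None else "no NVIDIA GPU detected")
```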
 
Lol I am wrong because YOU said so? Thanks and I will be sure to check in with you before posting my opinion next time. :p
You're wrong in saying that consumers wouldn't care and would have still bought the video card anyway. I care, so your blanket statement that nobody would care is wrong.
 
You're wrong in saying that consumers wouldn't care and would have still bought the video card anyway. I care, so your blanket statement that nobody would care is wrong.
Perhaps you should actually read a little more carefully as I said "nearly every single person".

IMO, if Nvidia and reviews had disclosed all of this at launch, then I bet hardly anyone would have cared or passed on it. It was a damn nice card for 330 bucks at launch and that is what most people looked at. Hell, if it had been advertised as 3.5GB from the beginning, then people would be saying how stupid it is to worry about having the full 4GB.
 
IMO, if Nvidia and reviews had disclosed all of this at launch, then I bet hardly anyone would have cared or passed on it. It was a damn nice card for 330 bucks and that is what people looked at.
But they didn't. They instead lied about the specifications for months. And what you are saying is that most people who bought it wouldn't mind knowing they were lied to. I find that difficult to believe, but clearly you've established that I'm an outlier.
 
If I'd known about the 3.5GB VRAM on the 970 back then, I might've ponied up some extra dough and sprung for the 980 instead, and then held out until 16nm dropped in 2016.
 