HardOCP looking into the 970 3.5GB issue?

I wanted to buy a card that could handle games like Titanfall and Far Cry 4 without memory-related stuttering. I'm glad I didn't pull the trigger on the GTX 970; I was just about to when I heard about this bullshit.

I've been an Nvidia fanboy for a decade. I've bought all of their cards in SLI (up until this generation), bought into GSYNC, 3D Vision, Tegra, even their shitty 680i SLI motherboard. But falsely advertising the specs of a card, then not correcting it for months, then making excuses when people start noticing performance problems (problems that will only become more common as games use more VRAM) is completely unacceptable.

I can't imagine ever giving Nvidia another dime unless they make amends for this problem in a big way.
 
And it does have a real-world impact, in terms of stuttering, in games that use over 3.5 GB. Everyone apologizing for Nvidia is basing their argument on the assumption that games won't continue to require increasing amounts of video memory.

You are stating this as fact, yet no one has conclusive evidence showing this.

Why are you stating facts before any evidence has come in to support them?

How would stuttering happen exactly? Could you hypothesize how slower access to some memory for some resources would cause one frame to take longer than the next, when both frames are accessing the same data the same way and the same number of times? You'd get a small general slowdown for *all* frames, not a couple.
 
Which was never misrepresented in benchmarks. There are frame-time percentile scores from review samples on launch day that tell you EXACTLY how a GTX 970 will perform.

That performance curve has not changed. You still got the performance you paid for. The only thing that's changed is you know WHY the performance curve looks like it does.

So yeah... why is this a big deal?

How many benchmarks including Titanfall and Far Cry 4 were released when the GTX 970 was made available? Oh right, none. One is a game that is nearly impossible to use in reviews, and the other is brand new.
 
If the GTX 970's performance is a problem for you now, it should have been a problem for you on launch day.
That's bullshit.
How is that bullshit? The performance of the card HAS NOT changed.

If the 970 isn't enough for you AFTER learning why its performance HAS ALWAYS looked the way it does, then it shouldn't have been enough for you BEFORE learning why its performance has always looked like it does... because it's the same damn performance curve either way.
 
People look at spec sheets like this one, seen on PCPer, and use them in their buying decisions.

                    GeForce GTX 980    GeForce GTX 970 (Corrected)
GPU Code name       GM204              GM204
GPU Cores           2048               1664
Rated Base Clock    1126 MHz           1050 MHz
Texture Units       128                104
ROP Units           64                 52
L2 Cache            2048 KB            1792 KB
It's actually 56 ROPs, and PCPer has edited the specs to reflect that.
 
Just because last year's games worked fine in benchmarks doesn't mean jack.

So please educate us. What's the difference between:

today's games at max settings and high res (to get >3.5GB but <4.0GB used)

And

tomorrow's games at lower settings (to get >3.5GB but <4.0GB used)?

Somehow you think that because a game comes out in the future, it will magically uncover something about the memory behavior that people haven't already tested today? You can use and test that memory *today*.
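
Something along these lines is all it takes (a minimal sketch, not the exact community benchmark that surfaced this; the chunk size, kernel, and file name are illustrative, and it assumes a CUDA-capable setup):

```
// probe_segments.cu -- a minimal sketch that grabs VRAM in 128 MiB chunks and
// times a simple write kernel on each. On a GTX 970, chunks that land in the
// upper ~0.5 GB segment should show far lower bandwidth than chunks in the
// lower ~3.5 GB segment. Best run on a card that isn't driving a display.
// Compile: nvcc probe_segments.cu
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void touch(float *p, size_t n) {
    size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
    if (i < n) p[i] = (float)i;  // streaming write across the whole chunk
}

int main() {
    const size_t CHUNK = 128ull << 20;  // 128 MiB per allocation
    const size_t N = CHUNK / sizeof(float);
    std::vector<float*> chunks;

    // Allocate until the driver refuses, so the later chunks are forced
    // into the upper segment (the fast segment fills up first).
    for (;;) {
        float *p = nullptr;
        if (cudaMalloc((void**)&p, CHUNK) != cudaSuccess) break;
        chunks.push_back(p);
    }
    printf("allocated %zu MiB in %zu chunks\n", chunks.size() * 128, chunks.size());

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    // Time the write kernel on each chunk and report effective bandwidth.
    for (size_t c = 0; c < chunks.size(); ++c) {
        cudaEventRecord(start);
        touch<<<(unsigned)((N + 255) / 256), 256>>>(chunks[c], N);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);
        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        printf("chunk %3zu: %7.1f GB/s\n", c, (CHUNK / 1e9) / (ms / 1e3));
    }

    for (float *p : chunks) cudaFree(p);
    return 0;
}
```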
 

You just won this thread.
And I knew this was not some BS issue.
Goodbye to the idea of ever getting a GTX 970.
And on top of that, they lied about the specs and never corrected them?
How shady can you get, Nvidia?
 
It's bullshit for the reasons I stated, which you didn't quote.
Still waiting for a reason I should care about something that has NO EFFECT ON PERFORMANCE. The GTX 970 performed the same before and after this information came to light.

Performance was not misrepresented. You knew exactly what performance level you were paying for. Not seeing why this is a big deal.

Surely AMD made a bigger fuckup by releasing cards that actually performed slower than what they handed to review sites. This seems benign by comparison.
 
It does have an effect on performance, and Nvidia cherry-picked benchmarks in their response that minimized this performance impact. Others are saying the performance difference between GTX 970 and GTX 980 is twice as high as what Nvidia claims, and given the fact that Nvidia lied about their card's specifications for the last 4 months, I'm disinclined to believe Nvidia's benchmarks.
 
How would stuttering happen exactly? Could you hypothesize how slower access to some memory for some resources would cause one frame to take longer than the next, when both frames are accessing the same data the same way and the same number of times? You'd get a small general slowdown for *all* frames, not a couple.

I guess it could happen if different frames were rendered from different textures from different segments. E.g. all of frame 1's textures are in the first segment, but frame 2 requires memory access to both segments. But considering how texture change is probably spread out over a large volume of frames (maybe 1-2 seconds?) it seems like the effect would be smoothed out.

Or if, heaven forbid, the data keeps getting moved around between segments. But that would seem like an issue at the software (driver / application) level.
 
It does have an effect on performance, and Nvidia cherry-picked benchmarks in their response that minimized this performance impact. Others are saying the performance difference between GTX 970 and GTX 980 is twice as high as what Nvidia claims, and given the fact that Nvidia lied about their card's specifications for the last 4 months, I'm disinclined to believe Nvidia's benchmarks.

You're clearly emotionally affected by this on a deep level. I can't debate the facts with you on a rational level, so I'm not going to waste my time.

Let me know when the grieving period is over and maybe we can find some common ground.
 
Just because they got away with it until now doesn't validate the deception. Yikes. Obviously *somebody* noticed.
 
It does have an effect on performance
Nope, the configuration of the card has not changed, so performance is exactly the same as it's always been.

Performance was never misrepresented. Again, no-care from me.

Nvidia cherry-picked benchmarks in their response that minimized this performance impact.
Nvidia has themselves said that the difference only seems to be around 5%... how do you cherry-pick between results that are only 5% apart? That's tiny and pretty much not worth doing.
 
Surely AMD made a bigger fuckup by releasing cards that actually performed slower than what they handed to review sites. This seems benign by comparison.
Stop repeating that lie, or link to PROOF that review sites got cherry-picked cards.
 
Just because they got away with it until now doesn't validate the deception. Yikes. Obviously *somebody* noticed.

Let's game this out, rationally:

If it were deception, they would try to hide it at all costs, in every avenue possible, right? Well wouldn't such a deception include hiding it in the CUDA device enumeration data that Anandtech posted that has been around for months, showing reduced cache on 970?

http://www.anandtech.com/show/8935/geforce-gtx-970-correcting-the-specs-exploring-memory-allocation

Why wouldn't they alter the CUDA reported cache to hide it? Oh, forgetting to make this number match their "deception" was a mistake perhaps?

So let me get this straight... according to people who think NVIDIA is lying, NVIDIA is capable of making a mistake in their attempt to hide this "deception", but they are not capable of making a mistake in their reviewer guide?

It was a mistake, and has no effect on the benchmark results anyone published.
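
For what it's worth, anyone can read that CUDA-reported number themselves. A minimal sketch (the L2 size is an ordinary property the runtime reports for every device; a GTX 970 returns 1835008 bytes, i.e. the corrected 1792 KB, not the 2048 KB originally listed):

```
// l2query.cu -- query the CUDA-reported L2 cache size for each device.
// Compile: nvcc l2query.cu
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int d = 0; d < count; ++d) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, d);
        // On a GTX 970 this prints 1792 KB of L2, matching the corrected spec.
        printf("%s: %d KB of L2 cache, %d SMs\n",
               prop.name, prop.l2CacheSize / 1024, prop.multiProcessorCount);
    }
    return 0;
}
```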
 
I guess it could happen if different frames were rendered from different textures from different segments. E.g. all of frame 1's textures are in the first segment, but frame 2 requires memory access to both segments. But considering how texture change is probably spread out over a large volume of frames (maybe 1-2 seconds?) it seems like the effect would be smoothed out.

Or if, heaven forbid, the data keeps getting moved around between segments. But that would seem like an issue at the software (driver / application) level.

But that's not how textures work. They are either in use or they are not. Their use does not oscillate frame to frame. That's why I'm pointing out that whatever tax there is would be consistent frame to frame, and not stutter.

Even if they did oscillate frame to frame (they don't, but for the sake of argument), that stutter would also exist on regular memory configurations where all memory access is full speed... it would just be even less measurable. But this is all theoretical; there is no such frame-to-frame texture oscillation in games. Textures are either in use or they are not.

You're right that if data kept getting moved around between segments, it could cause some issues. Fortunately, that's not how these work... The OS does not aggressively move data between the segments; the two segments are treated almost equally when it comes to deciding whether to move anything, and it mostly just leaves them be. And when data does move, doing so is super fast.
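
To put that "consistent tax" in rough, purely illustrative numbers (assuming the commonly cited segment bandwidths of about 196 GB/s and 28 GB/s, and a hypothetical frame that touches 200 MB of resources with 20 MB landing in the slow segment):

    all fast:  200 MB / 196 GB/s                    ≈ 1.02 ms of memory time per frame
    split:     180 MB / 196 GB/s + 20 MB / 28 GB/s  ≈ 0.92 ms + 0.71 ms ≈ 1.63 ms per frame

Every frame pays the same ~0.6 ms tax, so you get a uniformly lower framerate, not frame-to-frame stutter.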
 
Let's game this out, rationally:

If it were deception, they would try to hide it at all costs, in every avenue possible, right? Well wouldn't such a deception include hiding it in the CUDA device enumeration data that Anandtech posted that has been around for months, showing reduced cache on 970?

http://www.anandtech.com/show/8935/...cting-the-specs-exploring-memory-allocation/3

Why wouldn't they alter the CUDA reported cache to hide it? Oh, forgetting to make this number match their "deception" was a mistake perhaps?

So let me get this straight... according to people who think NVIDIA is lying, NVIDIA is capable of making a mistake in their attempt to hide this "deception", but they are not capable of making a mistake in their reviewer guide?

It was a mistake, and has no effect on the benchmark results anyone published.

I disagree, as the card is advertised as 256-bit, but effectively operates in 224-bit mode most of the time (only 7 of 8 channels). They knew what they were pulling here from the get-go.
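
Rough numbers, assuming the stock 7.0 Gbps effective GDDR5 that the advertised figure is based on:

    256-bit bus (8 × 32-bit channels): 256 bits × 7.0 Gbps ÷ 8 bits/byte = 224 GB/s (advertised)
    3.5 GB segment (7 channels):       7/8 × 224 GB/s = 196 GB/s
    0.5 GB segment (1 channel):        1/8 × 224 GB/s = 28 GB/s

And per the follow-up coverage, the two segments can't be read at the same time, so in practice you see 196 GB/s or 28 GB/s, never the full 224.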
 
Still waiting for a reason I should care about something that has NO EFFECT ON PERFORMANCE. The GTX 970 performed the same before and after this information came to light.

Performance was not misrepresented. You knew exactly what performance level you were paying for. Not seeing why this is a big deal.

Surely AMD made a bigger fuckup by releasing cards that actually performed slower than what they handed to review sites. This seems benign by comparison.

AMD may have "cherry-picked" GPUs for reviewers, but they didn't lie about the specs. I don't buy for a second that Nvidia gave out the wrong specs months ago and never noticed until now.

So if they hadn't been called out on this, they would never have noticed the specs were wrong? The only reason they are admitting this is because people noticed an issue. I don't care who you are, AMD, Nvidia, Intel: if you lie about something, people will find out and call you on it.
 
I disagree, as the card is advertised as 256-bit, but effectively operates in 224-bit mode most of the time (only 7 of 8 channels). They knew what they were pulling here from the get-go.


You still haven't addressed the question. How do you so conclusively say that one mistake is more probable than the other?


The question isn't:
"Was this an intentional design by engineering?". We know that answer is yes.

The question is:
"Isn't it just as likely that technical marketing made a mistake in a document as it is likely that they were blatantly lying about the product specs (despite the performance being enough to sell it) and forgot to change CUDA to lie about the cache size specs also?"

Either possible story has a mistake involved, but somehow people assume one of these mistakes is possible and the other isn't.
 
You still haven't addressed the question. How do you so conclusively say that one mistake is more probable than the other?


The question isn't:
"Was this an intentional design by engineering?". We know that answer is yes.

The question is:
"Isn't it just as likely that technical marketing made a mistake in a document as it is likely that they were blatantly lying about the product specs (despite the performance being enough to sell it) and forgot to change CUDA to lie about the cache size specs also?"

Either possible story has a mistake involved, but somehow people assume one of these mistakes is possible and the other isn't.

That's called PR spin. No one believes Nvidia, especially after 4 months' time. Companies do it all the time to save face. Nvidia is no different.
 
That's called PR spin. No one believes Nvidia, especially after 4 months' time. Companies do it all the time to save face. Nvidia is no different.

Anyone reading should note that I presented a rational, neutral question and he replied with clear bias.
 
You still haven't addressed the question. How do you so conclusively say that one mistake is more probable than the other?


The question isn't:
"Was this an intentional design by engineering?". We know that answer is yes.

The question is:
"Isn't it just as likely that technical marketing made a mistake in a document as it is likely that they were blatantly lying about the product specs (despite the performance being enough to sell it) and forgot to change CUDA to lie about the cache size specs also?"

Either possible story has a mistake involved, but somehow people assume one of these mistakes is possible and the other isn't.

OK, let's agree it was a mistake. Are you telling me Nvidia didn't notice the mistake with the specs until Sunday? I find that really hard to believe.

When everyone started reporting problems, why would Nvidia have to "look into it"? They knew exactly what the problem was. They just needed a way to cover their asses and come up with the best PR response.
 
You still haven't addressed the question. How do you so conclusively say that one mistake is more probable than the other?


The question isn't:
"Was this an intentional design by engineering?". We know that answer is yes.

The question is:
"Isn't it just as likely that technical marketing made a mistake in a document as it is likely that they were blatantly lying about the product specs (despite the performance being enough to sell it) and forgot to change CUDA to lie about the cache size specs also?"

Either possible story has a mistake involved, but somehow people assume one of these mistakes is possible and the other isn't.

I have two questions for you:

1. If this info had come out in the early reviews, would it have hurt sales?

2. Once reviews started posting the wrong specs, why didn't anyone at Nvidia correct it?
 
OK, let's agree it was a mistake. Are you telling me Nvidia didn't notice the mistake with the specs until Sunday? I find that really hard to believe.

When everyone started reporting problems, why would Nvidia have to "look into it"? They knew exactly what the problem was. They just needed a way to cover their asses and come up with the best PR response.

So am I to believe that every day at work, you go back over everything you've done for the past several months? How does the amount of time that has gone by change anything? Does your work involve countless technical specifications in tables, where it's not obvious when a number is wrong? It's not like spell-checking a document, lol.

When you do work, you do it and move on, especially if you are under time pressure to do it. What matters most is that the price and performance are right, and there were no mistakes there. Those are both still accurate.

There are thousands of people involved in launching a product, and most people in the chain only know a lot about their one little slice of the process. One missed email, or an email that was never sent can have significant implications downstream, when people make assumptions while doing their work. A large percentage of people who work at NVIDIA could easily look at the specs publicly listed and not know they weren't correct. It's not their job to know.

So it's absolutely possible, maybe even likely.
 
I have two questions for you:

1. If this info had come out in the early reviews, would it have hurt sales?

2. Once reviews started posting the wrong specs, why didn't anyone at Nvidia correct it?

1. No. Performance is performance is performance.

2. Because comparing the wrong specs against bad source data would not send up any red flags! The team responsible for feeding the incorrect data to review sites was also the one responsible for checking it on review sites against their provided data. They are the technical marketing team that works with reviewers.

It's clearly possible.
 
So am I to believe that every day at work, you go back over everything you've done for the past several months? How does the amount of time that has gone by change anything? Does your work involve countless technical specifications in tables, where it's not obvious when a number is wrong? It's not like spell-checking a document, lol.

When you do work, you do it and move on, especially if you are under time pressure to do it. What matters most is that the price and performance are right, and there were no mistakes there. Those are both still accurate.

There are thousands of people involved in launching a product, and most people in the chain only know a lot about their one little slice of the process. One missed email, or an email that was never sent can have significant implications downstream, when people make assumptions while doing their work. A large percentage of people who work at NVIDIA could easily look at the specs publicly listed and not know they weren't correct. It's not their job to know.

So it's absolutely possible, maybe even likely.

Oh, please. Are we really to believe that NOT A SINGLE person who knew the true specs had read any of the thousands of reviews and articles on one of the company's most important products?



1. No. Performance is performance is performance.

...as long as no more than 3.5 GB is required, at least.
 
So am I to believe that every day at work, you go back over everything you've done for the past several months? How does the amount of time that has gone by change anything? Does your work involve countless technical specifications in tables, where it's not obvious when a number is wrong? It's not like spell-checking a document, lol.

When you do work, you do it and move on, especially if you are under time pressure to do it. What matters most is that the price and performance are right, and there were no mistakes there. Those are both still accurate.

There are thousands of people involved in launching a product, and most people in the chain only know a lot about their one little slice of the process. One missed email, or an email that was never sent can have significant implications downstream, when people make assumptions while doing their work. A large percentage of people who work at NVIDIA could easily look at the specs publicly listed and not know they weren't correct. It's not their job to know.

So it's absolutely possible, maybe even likely.

Did you really just compare the work of an IT person in health care to Nvidia's marketing department?

You really think Nvidia, as large as they are, doesn't have people checking these kinds of things?
 
AMD may have "cherry-picked" GPUs for reviewers, but they didn't lie about the specs.
AMD's fuckup resulted in consumers getting GPUs that under-performed compared to review samples. They got hardware that throttled more often than review samples.

Nvidia's fuckup resulted in no performance difference from reviews. It's purely a clerical error.

I'd be MUCH more pissed about AMD's fuckup. Nvidia got a number on a piece of paper wrong (who the hell cares, really?); AMD actually gave consumers poorer performance than expected.
 
AMD's fuckup resulted in consumers getting GPUs that under-performed compared to review samples. They got hardware that throttled more often than review samples.

Nvidia's fuckup resulted in no performance difference from reviews. It's purely a clerical error.

I'd be MUCH more pissed about AMD's fuckup. Nvidia got a number on a piece of paper wrong (who the hell cares, really?); AMD actually gave consumers poorer performance than expected.

Nvidia has done the same thing with review samples.
 