GTX 970 flaw

I've got news for y'all. Nvidia and AMD have both been stuffing more RAM into cards than the card is capable of using for ages. It's called marketing. You got what you paid for. Don't shame the seller, shame the buyer for not knowing what they were buying.

Soooo how exactly was the buyer supposed to know what they were buying when nVidia lied about the specs?
 
I've got news for y'all. Nvidia and AMD have both been stuffing more RAM into cards than the card is capable of using for ages. It's called marketing. You got what you paid for. Don't shame the seller, shame the buyer for not knowing what they were buying.

Also 100% wrong.
 
Ahh, so they made a marketing mistake. Did you not buy your cards based on the reviews?

Do you see people going ape shit over every car manufacturer that advertises their vehicles as having 400HP and only dynoing 330HP? No.

They made a mistake. Your card still performs the same as it did the day it was reviewed, before you made your purchase. Now you just want more than what you paid for.
 
Soooo how exactly was the buyer supposed to know what they were buying when nVidia lied about the specs?

Most people don't understand the specs and technical limitations at all anyway (as is clearly evidenced by people asking for a firmware update to fix this). You should buy a card based on reviews and other users' feedback, not because one card says it has 12 unicorn teeth in it and another card only has 11 Bigfoot teeth.
 
please remember the rules....particularly #1



(1) Absolutely NO FLAMING, NAME CALLING OR PERSONAL ATTACKS. Mutual respect and civilized conversation is the required norm.
 
Are you John W.? If so, why were you giving the customer service rep such a hard time? Kenny's OC suggestion (or non-suggestion) was absurd, but I fucking hate seeing people pull this "I'm the customer and you're a doormat" bullshit.

The rep Kenny actually wasn't suggesting overclocking, but making sure the card was running at its top stock boost bin. People heckling CSRs like that are part of why people with legitimate concerns have such a hard time getting a serious response on a wide variety of issues :(... that CSR has nothing to do with the VRAM "issue"/concern (which has been shown so far to have no appreciable impact except on spec sheets, compared to what we expected to have gotten when buying off of reviews... looking forward to FCAT and frame-time data to confirm, though). Hassling him over it is just ridiculous.
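
For anyone wondering what "top stock boost bin" actually means in practice: watch the core clock under load and compare it against the card's rated boost. Here's a minimal sketch of that check, assuming nvidia-smi is on your PATH; the clocks.gr query field comes from nvidia-smi's --help-query-gpu list (verify it on your driver version), and the 1178 MHz figure is the reference 970 rating, so an AIB card's number may differ.

```python
# Sample the GPU core clock once a second while a game/benchmark runs in the
# background, then compare the peak against the rated boost clock.
import subprocess
import time

def core_clock_mhz() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=clocks.gr", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])  # first GPU only

RATED_BOOST_MHZ = 1178  # reference GTX 970 boost; check your card's own rating

samples = []
for _ in range(60):
    samples.append(core_clock_mhz())
    time.sleep(1)

print(f"observed {min(samples)}-{max(samples)} MHz (rated boost: {RATED_BOOST_MHZ} MHz)")
if max(samples) < RATED_BOOST_MHZ:
    print("Card never reached its rated boost bin under load.")
```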
 
Ahh, so they made a marketing mistake. Did you not buy your cards based on the reviews?

Do you see people going ape shit over every car manufacturer that advertises their vehicles as having 400HP and only dynoing 330HP? No.

They made a mistake. Your card still performs the same as it did the day it was reviewed, before you made your purchase. Now you just want more than what you paid for.

You realize the issue is very subtle, and only happens when games actually require >3.5GB of VRAM, right? Most reviews don't go to the trouble of forcing a game to use up VRAM by cranking up AA like crazy, so it's likely most (all?) reviews failed to catch this (it's easy to watch VRAM usage yourself while playing; see the sketch at the end of this post).

To answer your second question: absolutely. And we're talking about goddamn Hyundai here. People who buy Hyundais aren't exactly performance nuts, but they still have every right to be outraged at false advertising.

As for getting what we paid for, well you can tell it to this guy:


And finally lemme ask you something: if this was a genuine "mistake", why did it take nVidia 4 months after release to correct said mistake?
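
For what it's worth, nobody has to take a review site's word on when a game crosses 3.5GB; nvidia-smi will report usage while you play. A rough sketch, assuming nvidia-smi is on your PATH and that the memory.used query field (from its --help-query-gpu list) behaves the same on your driver:

```python
# Log VRAM usage once a second while a game runs, flagging samples that land
# past the GTX 970's fast 3.5GB segment (3584 MiB).
import subprocess
import time

while True:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    used_mib = int(out.strip().splitlines()[0])  # first GPU only
    flag = "  <-- past the fast 3.5GB segment" if used_mib > 3584 else ""
    print(f"{used_mib} MiB in use{flag}")
    time.sleep(1)
```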
 
You realize the issue is very subtle, and only happens when games actually require >3.5GB of VRAM, right? Most reviews don't go to the trouble of forcing a game to use up VRAM by cranking up AA like crazy, so it's likely most (all?) reviews failed to catch this.

To answer your second question: absolutely. And we're talking about goddamn Hyundai here. People who buy Hyundais aren't exactly performance nuts, but they still have every right to be outraged at false advertising.

As for getting what we paid for, well you can tell it to this guy:



And finally lemme ask you something: if this was a genuine "mistake", why did it take nVidia 4 months after release to correct said mistake?

Most reviews ran the card up to the point where the game was unplayable and backed off. So, people DID get an idea of what the card was capable of on a given setup.

Again, people that bought based on reviews already had an expectation of the card's limitations.
 
Do you see people going ape shit over every car manufacturer that advertises their vehicles as having 400HP and only dynoing 330HP? No.

They made a mistake. Your card still performs the same as it did the day it was reviewed, before you made your purchase. Now you just want more than what you paid for.

Wrong on both counts.

400 HP on the dealer sticker/fact sheet is rated at the flywheel, while 330 wheel HP is what a dyno pull will net you, usually 15-20% less depending on the dyno used. This is standard. No one goes ape shit over this because they are two different ratings (330 out of 400 is a 17.5% drop, right in that window); your WHP and flywheel HP will never be the same.

Want more than I paid for? Try: I want exactly what I paid for. Not more, and certainly not less, which is what I'm now aware I've been sold. 4GB =/= 3.5GB + a much slower 0.5GB.

Yes, the reviews and CURRENT benchmarks show the card capable of handling most things thrown at it. It's FUTURE benchmarks I (and many others) are concerned with. I bought this card to handle today's and tomorrow's games, not just this year's. (If you want to see the slow segment for yourself, there's a rough probe sketched below.)
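
On the "3.5 + a much slower .5" point: this is directly measurable, which is how the whole thing was discovered (the "Nai's benchmark" tool that's been going around allocates VRAM chunk by chunk and times writes to each). Below is a minimal sketch of the same idea, assuming PyTorch with CUDA. The driver hands out the slow segment last, so on an affected card the final few chunks show a clear bandwidth drop; treat the numbers as illustrative only, since whatever else is resident in VRAM skews them.

```python
# Allocate VRAM in 128 MiB chunks and time a full write to each chunk.
# On a GTX 970, chunks landing in the upper 0.5 GiB segment should show
# markedly lower write bandwidth than the first ~3.5 GiB.
import torch

assert torch.cuda.is_available()
CHUNK_MIB = 128
chunks = []

while True:
    try:
        buf = torch.empty(CHUNK_MIB * 1024 * 1024 // 4, device="cuda")  # float32
    except RuntimeError:  # CUDA out of memory: card is full, stop allocating
        break
    chunks.append(buf)

    start = torch.cuda.Event(enable_timing=True)
    end = torch.cuda.Event(enable_timing=True)
    start.record()
    buf.fill_(1.0)  # bandwidth-bound write across the whole chunk
    end.record()
    torch.cuda.synchronize()

    gib_per_s = (CHUNK_MIB / 1024) / (start.elapsed_time(end) / 1000)
    print(f"chunk {len(chunks):3d} (~{len(chunks) * CHUNK_MIB} MiB total): "
          f"{gib_per_s:6.1f} GiB/s")
```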
 
Nothing misleading about reviews since they clearly spell out the expectations at specific resolutions and graphics settings.
 
Most reviews ran the card up to the point where the game was unplayable and backed off. So, people DID get an idea of what the card was capable of on a given setup.

Again, people that bought based on reviews already had an expectation of the card's limitations.

Then you don't understand the issue with segmented vram and why it would REALLY hurt at high resolutions.
 
Most reviews ran the card up to the point where the game was unplayable and backed off. So, people DID get an idea of what the card was capable of on a given setup.

Again, people that bought based on reviews already had an expectation of the card's limitations.

Most reviewers run an automatic benchmark and report the average framerate, which is exactly the kind of number that hides this problem (quick demonstration below). Not to mention the specifics of this issue: VRAM usage depends on the game, resolution, and settings, and future games have an increased chance of being affected...

But none of that matters in the end. They've sold the card with false specifications and that alone is enough of a problem.
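
To make the "average framerate hides it" point concrete, here's a toy calculation over a frame-time log (one millisecond value per line; "frametimes.csv" is a stand-in name for e.g. a FRAPS-style dump). A handful of 80 ms spikes when the card crosses 3.5GB barely moves the average FPS, but they show up immediately in the 99th-percentile frame time.

```python
# Average FPS vs. 99th-percentile frame time from a frame-time log.
with open("frametimes.csv") as f:  # hypothetical log: one frame time (ms) per line
    times_ms = [float(line) for line in f if line.strip()]

avg_fps = 1000 * len(times_ms) / sum(times_ms)
p99_ms = sorted(times_ms)[int(len(times_ms) * 0.99)]  # 99th percentile

print(f"average: {avg_fps:.1f} FPS")
print(f"99th-percentile frame time: {p99_ms:.1f} ms "
      f"(~{1000 / p99_ms:.1f} FPS during the spikes)")
```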
 
I sometimes wonder who Nvidia markets these cards to. I mean, any normal consumer product with a high chance of coil whine would be booed to no end. I guess most buyers of these cards just don't care? Maybe they're between the ages of 10-16 and already have a billion fans running in their systems. Kind of puzzling. Nvidia doesn't really scream quality the way ATI used to when they made their own cards.
 
Most people don't understand the specs and technical limitations at all anyway

The "most people are idiots and as such deserve to be screwed into buying a product" is an excelent argument
 
A lawsuit would be more effective. But also completely ineffective, so both of these ideas are a total waste of time.
If people are going to buy video cards from a monolith of a company that has no stake in the industry beyond its own interests, they need to expect it to take any chance to shaft them whenever possible.
 
Most reviewers run an automatic benchmark and report the average framerate, which is exactly the kind of number that hides this problem. Not to mention the specifics of this issue: VRAM usage depends on the game, resolution, and settings, and future games have an increased chance of being affected...

But none of that matters in the end. They've sold the card with false specifications and that alone is enough of a problem.

Obviously you haven't read the reviews here at [H], then, or you'd understand they pushed the card to its limit, then backed off to a playable state. So, yes, the expectations HAVE been set in both single and SLI configs.
 
Obviously you haven't read the reviews here at [H], then, or you'd understand they pushed the card to its limit, then backed off to a playable state. So, yes, the expectations HAVE been set in both single and SLI configs.

They even overlay the graph with a competing product so any major discrepancy would be immediately obvious.

All this drama reminds me of Columbus announcing the world is not actually flat but round. Did it affect anyone before or after? No and no.
 
They even overlay the graph with a competing product so any major discrepancy would be immediately obvious.

All this drama reminds me of Columbus announcing the world is not actually flat but round. Did it affect anyone before or after? No and no.

For real. The 970 stomps a mud hole in the 290X which sells for roughly the same amount. All the games reviewed on [H] were playable at 2560x1440 with high, if not higher settings, than the 290X. Everyone KNEW what they were buying as far as how the card performed. A lousy oversight in the marketing material has everyone's panties in a twisted knot like they were cheated.
 
For real. The 970 stomps a mud hole in the 290X which sells for roughly the same amount. All the games reviewed on [H] were playable at 2560x1440 with high, if not higher settings, than the 290X. Everyone KNEW what they were buying as far as how the card performed. A lousy oversight in the marketing material has everyone's panties in a twisted knot like they were cheated.

I am happy with the card's performance, but I would like my 4GB of RAM.
 
For real. The 970 stomps a mud hole in the 290X which sells for roughly the same amount. All the games reviewed on [H] were playable at 2560x1440 with high, if not higher settings, than the 290X. Everyone KNEW what they were buying as far as how the card performed. A lousy oversight in the marketing material has everyone's panties in a twisted knot like they were cheated.
Lol back in the real world, the 970 most certainly does not "stomp a mud hole in the 290X". In fact it is SLOWER than the 290x in newer games.

And you are pretty naive if you think not listing the actual 970 specs by now was just a "marketing oversight".
 
Lol back in the real world, the 970 most certainly does not "stomp a mud hole in the 290X". In fact it is SLOWER than the 290x in newer games.

And you are pretty naive if you think not listing the actual 970 specs by now was just a "marketing oversight".

Yeah, I wasn't getting the 290X mud hole stomping either. The 970 sits at or below the 290X. The 980 is above but far from mud-hole stomping too.
 
I'm a happy owner of an AMD HD 7950. I got a good deal on it when I bought it. I know it's a cut down version of the HD 7970, and I'm ok with that. It has 3 GB of memory. I was happy to get so much, as I was coming from a 1 GB card, and I figured that much memory was more about future proofing.

Because I have a 1080p monitor, downsampling from higher resolutions doesn't work with newer Catalyst drivers, and I don't play the latest AAA games, I rarely see memory usage come close to hitting my card's 3 GB limit. But who knows? I might download some textures, or buy a very demanding game, or in the future a game built around insanely high quality textures will come out and I'll want it.

If, after happily having my HD 7950 for years, I was unable to use all 3 GB of memory (the way HD 7970 users could) because of some newly discovered gimmick that AMD used to push out the 7950s at their listed specs, I'd be pretty annoyed.

So, if I'm being honest with myself about that, my take on this brouhaha is this: it all boils down to whether there are scenarios where a 970 is embarrassed next to a 980 solely because of the gimmick in question. If that scenario exists then, imo, nVidia is due some grief*.

But, if it's only a case of nVidia having been clever enough, even if by way of lots of gimmickry, to work around the issue by rerouting work, assets, etc., then nVidia merely has some egg on their face. Though the amount of egg would, imo, still call for some serious accountability. I also think some game coupons would be a good gesture.

*As to the "grief" scenario: This is tricky as the 970 remains a great deal even if the worse case scenario applies. But if it is a worst case scenario, for people who do manage to constantly play at and beyond the hypothetical "danger zone" of this card, then, yeah, I won't be surprised to see legal implications.

It's not a joke for nVidia if some people suffered tangible harm. I think we need some real gaming gurus/tech wizards, who are totally independent, to look at these supposed worst case scenarios that produce hiccups compared to the performance of the 980, which supposedly doesn't suffer them. Naturally, the added power of the 980 has to be allowed for.
 
Lol back in the real world, the 970 most certainly does not "stomp a mud hole in the 290X". In fact it is SLOWER than the 290x in newer games.

And you are pretty naive if you think not listing the actual 970 specs by now was just a "marketing oversight".

You must be reading different reviews than I am. The two on [H] for the Asus and MSI cards show the Asus beating the 290X and the MSI royally beating the 780 and 290.

http://www.hardocp.com/article/2014...ix_directcu_ii_video_card_review#.VMgUYivF8W4

http://www.hardocp.com/article/2014...70_gaming_4g_video_card_review/1#.VMgVhCvF8W4
 
No stomping found in those reviews. Reference 290X vs. overclocked Strix 970s. Ok.


Move on to 4k and watch the 970 drop like a rock.

970 goes from being faster than a 290x in 1080p to being slower than a 290 in 4k. Already tested that.
 
You must be reading different reviews than I am. The two on [H] for the Asus and MSI cards show the Asus beating the 290X and the MSI royally beating the 780 and 290.

http://www.hardocp.com/article/2014...ix_directcu_ii_video_card_review#.VMgUYivF8W4

http://www.hardocp.com/article/2014...70_gaming_4g_video_card_review/1#.VMgVhCvF8W4
You might want to actually look at your own links a little closer. Overall, the stock reference 290X is faster than the factory-OC'd 970 at apples-to-apples settings.

And how about a current summary of many games which shows the 290x is actually faster?

http://www.techpowerup.com/mobile/reviews/Palit/GeForce_GTX_960_Super_JetStream/29.html

Bottom line: in no way, shape, or form does a 970 "stomp a mud hole in a 290x" except in your own mind. In reality the 970 is SLOWER overall than the 290x.
 
Lol back in the real world, the 970 most certainly does not "stomp a mud hole in the 290X". In fact it is SLOWER than the 290x in newer games.

And you are pretty naive if you think not listing the actual 970 specs by now was just a "marketing oversight".

Nah, stock-to-stock it is similar, but OC-to-OC the GTX 970 clocks very high for virtually everyone, resulting in large speed gains, while the 290X varies wildly in results and adds an unreasonable amount of heat/noise when doing so for many people. Adding in PhysX/ShadowPlay/MFAA/etc. just widens the gap between the cards' value propositions, as do the lower heat output and noise profile of the GTX 970 (MFAA in particular: 4x MFAA vs. 4x MSAA is a 20-30% performance gain alone, looks essentially identical according to all the reviews, and now works with just about every DX10/11 title in existence!).

I've yet to see anything showing true losses compared to what we had expected and been experiencing with the GTX 970 cards, just a lot of rampant hyperbole and speculation, with a few articles stating it's nothing to write home about. My only real concern, other than it being false advertising and nvidia needing to take heat for that, is whether >3.5GB use can contribute abnormally to microstutter in SLI, and without a review site using FCAT we simply don't know that yet, though the technical description suggests it shouldn't.

Here's some oc-to-oc #'s I found a while back...

___________________
"
Here's a review someone found that I thought was worth posting, as this kind of comparison is often cited but rarely has data behind it so far:
http://www.reviewstudio.net/2028-asus-gtx-970-strix-oc-review-bring-the-maxwell-to-the-owl

It compares a 1228mhz GTX 780 Ti OC vs. 1300mhz GTX 780 OC vs. a 1530mhz GTX 970 OC, for the curious (boost clocks as stated in the text).

The 780 Ti in the review is stated to be running at 1228mhz boost in-game with 1975mhz memory (7900mhz qdr)
(http://www.reviewstudio.net/1179-as...iew-best-performance-dead-silent/overclocking) and you can find the same reference in the 780 OC review on their site.

BASE CLOCKS are listed by them in the graph. Boost clocks according to the review are as stated above :). This is actually the best oc-to-oc comparison between a 780 Ti at good clocks, a 780 at good clocks, and a GTX 970 at good clocks that I have seen yet.

Keep in mind the gap in other reviews narrows a little at 4K, while remaining similar to 1080p when run at 2560x1440. (They have a 3DMark score in there too, but they list the combined rather than the GPU score, which makes it meaningless :().) The GTX 970 OC'd stacks up very favorably to the GTX 780 Ti OC'd and consistently beats out the 780 OC'd.
"

_______________

Additionally, in upcoming games the picture looks grim on the 290X front compared to a GTX 970.

Don't get me wrong, I'm annoyed with the whole misadvertising of the specs, but at the end of the day I still have the exact same performance I paid for and expected at the time of purchase 4+ months ago, and I have been enjoying my pair of GTX 970 cards in SLI ever since. AMD has focused so much on their Mantle marketing stunt that they neglected their DX11 driver optimization, and it's taking its toll... DX12 will completely outmode Mantle when it hits, too.

EDIT: Oh, and just for a very easy demonstration of how good DX11 optimization is for these...

 
Yep, no doubt 1080p is faster. The story changes once the resolution is increased. The 290/290X and 980 are excellent 4K performers. The 970 starts to chug once it hits the 3.5GB VRAM threshold.

I always thought it was the shader deficit of the 970 but it appears that there was more to the story.
 
Yet more evidence that Nvidia feels contempt even for its customers. The stinking herd of fanboys most of all. Useful idiots for Nvidia, as Lenin old boy might say.
 
Yep, no doubt 1080p is faster. The story changes once the resolution is increased. The 290/290X and 980 are excellent 4K performers. The 970 chugs due to the 3.5GB limit.

Wrong:

http://www.pcgameshardware.de/Geforce-GTX-970-Grafikkarte-259503/Specials/zu-wenig-VRAM-1149056/

Please do your research first... articles are already coming out showing there's no extra adverse impact due to the segmentation. And before that, we had no proof one way or the other, so stating it as fact is mind-boggling.

Oh, and just to mention, GTX 970 perf in 4K is beautiful ;)


I ended up shipping my monitor back at this point, but I will probably go 4K again once a G-Sync IPS model launches :D. At 60Hz I definitely want no input lag or tearing.
 
Must yet another topic be polluted with Nvidia vs AMD fanboyism?


Please do your research first... articles are already coming out showing there's no extra adverse impact due to the segmentation. And before that, we had no proof one way or the other, so stating it as fact is mind-boggling.

Except that frametime tests do show there is an adverse impact going over 3.5GB.
 
I have already done my research, thanks. Now back to the topic.

I like both companies. Give me more VRAM and you get my money.
 
Looking back, I blamed my GTX 970 SLI random instability on the voltage discrepancy bug, but I wonder if the random crashes/stuttering were potentially caused by the fact that I had settings turned up high at 2560x1600 and was often in the 3.5GB or more range. Can't test now, since I sold them.
 
Verdict's in: the VRAM wall is real and does negatively impact smoothness.

First frame-time results are coming in, from PCGamesHardware.de.

Translation:
However, once more than 3.5 gigabytes of memory is actually needed and the driver can no longer dodge it (in Ultra HD with 4x MSAA and "High" textures it sits at up to 3,980 MiB), it becomes clear that tricks are required to address the full four gigabytes. The frame times become far more uneven compared to the GTX 980 (which, incidentally, can allocate about 70 MiB more), and this shows not only in the diagram; sensitive users can notice it in a direct comparison in-game.

Away from "normal" benchmarks, the differences between the GTX 970 and 980 are clearly larger than the previously known specifications would imply, at least in our benchmarks. While the driver's behavior and its heuristics may achieve good application-specific results, a stale aftertaste remains as to whether some of the stuttering in this border area would have been avoided with the configuration Nvidia had originally announced.
 
Nah, stock-to-stock it is similar, but OC-to-OC the GTX 970 clocks very high for virtually everyone, resulting in large speed gains, while the 290X varies wildly in results and adds an unreasonable amount of heat/noise when doing so for many people.

That's a big blanket statement to put out there.
Saying any GPU is sure to get great OCs is just bad form. They don't all OC well just because 30 or 40 people could hit those clocks; we are talking about thousands of boards being shipped and sold here.
The only products where I see pretty similar OC results would be CPUs.
 
Verdict's in: the VRAM wall is real and does negatively impact smoothness.

Well shit, I totally misread that graph set before :eek: :mad:. Guess my 4k monitor wasn't at fault after all... which sucks, because I already sent it back for a refund and had gotten a killer deal on it originally.

That's a big blanket statement to put out there.
Saying any GPU is sure to get great OCs is just bad form. They don't all OC well just because 30 or 40 people could hit those clocks; we are talking about thousands of boards being shipped and sold here.
The only products where I see pretty similar OC results would be CPUs.

Maxwell 2.0 chips are basically the 2600K of GPUs :p. In general I agree though.
 
Well now you know why I'm pissed :p

But seriously though, sucks about the 4K monitor, and there go my ROG Swift plans.
 
I'm pissed

Me too, now.

But seriously though, sucks about the 4K monitor, and there go my ROG Swift plans.

Yeah... I guess I'm going to have to see what comes up later in the year at this point. I don't want to sell these cards on eBay given their brand spanking new 180-day return policy implemented through PayPal... and I can't replicate the deal I had on the 4K monitor anyway :( ($675 for the 32" Acer B326HK).
 