Petitioning for 970 Refund

Let's make the asshole lawyers even richer!....


.....Please don't fall for this. :(

Yeah, fixed that for ya. Class action bait, and someone's lawyers are going to make bank no matter who wins if this happens. And you can buy a Slurpee.

...and you could say that for ALL class action suits

Sometimes it truly is about the principle. If it doesn't cost me anything besides a bit of my time, and ends up costing nVidia tens of millions in legal fees and bad publicity, that's already good enough for me.

And you guys realize if we just rolled over and took it, it basically tells nVidia what they're doing is ok right?
 
This. I think a lot of people are either just clowning on this or have no idea what they are doing. Who the hell would play BF4 at settings that get 25FPS @ 4K and then complain about stutter? People, stutter is the least of your worries if you are only managing 25FPS. Also, the GTX 980 only gets 27FPS (with less stutter), which is still completely unplayable for many people. People don't even care that the game is basically a slideshow; they're like OMG LOOK AT THOSE FRAME TIMES!

Is anyone really surprised a $300 card can't run 4K resolution with 1.5x DSR, which means the card is actually rendering 5760 x 3240, or 18.66 million pixels? I don't think people even understand how DSR works.
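For what it's worth, the pixel math in that post checks out. Here's a quick sketch (my own code, treating the quoted 1.5x as a per-axis scale, which is how you arrive at 5760 x 3240):

```python
# DSR renders at a higher internal resolution and downsamples to the display.
# The 1.5 here is treated as a per-axis scale, matching the post's numbers.
def dsr_render_target(width, height, axis_scale):
    """Return (render_w, render_h, megapixels) for a given DSR axis scale."""
    rw, rh = int(width * axis_scale), int(height * axis_scale)
    return rw, rh, rw * rh / 1e6

w, h, mp = dsr_render_target(3840, 2160, 1.5)
print(w, h, round(mp, 2))  # 5760 3240 18.66
```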

This is exactly what I've been saying all along, and people have been burning me up.

I think this entire issue has been blown out of proportion. Yes, nVidia should have published the exact specs, I fault them for that.

HOWEVER, what people are claiming as a reason to dump the card and all the hate on nVidia is just stupid. I think it's a deeper psychological issue. People love the drama. NV has been crushing AMD for years now and people want any reason they can get to hate the big guy and rush to the underdog. It's a mob mentality. The card still performs great, and I keep seeing people in this thread say "Well the 970 has the best bang for the buck, so what card should I get when I get my refund???"

:rolleyes::rolleyes:

It's a mental issue, not a hardware issue. The card performs great no matter what. :eek:
 
"Well the 970 has the best bang for the buck, so what card should I get when I get my refund???"

:rolleyes::rolleyes:

It's a mental issue, not a hardware issue. The card performs great no matter what. :eek:

the 290 is by far the best bang for the buck.
anyone who knows cards knows that.

a class action suit is what nvidia is heading for.
 
the 290X is the best bang for the buck now...but when the 970 launched, prices on the 290X weren't as low as they are now, so it wasn't as good of a deal...I don't mind the heat/power consumption, but I do mind that most games nowadays seem to be much more optimized for Nvidia
 
I keep seeing people in this thread say "Well the 970 has the best bang for the buck, so what card should I get when I get my refund???"
I would have to disagree and say that the 290/290X is the better bang for the buck. The MSI 290X Lightning is only $300 AR and you can get an XFX 290 Double Dissipation model for $220 AR. The cheapest 970 is still around $340.

It's a mental issue, not a hardware issue.
Wrong. The 970 does have memory issues when utilizing more than 3.5GB of memory. It has been proven.

The card performs great no matter what.
Wrong. The 970 has performance issues when utilizing more than 3.5GB of memory. It has been proven.
 
This is exactly what I've been saying all along, and people have been burning me up.

I think this entire issue has been blown out of proportion. Yes, nVidia should have published the exact specs, I fault them for that.

HOWEVER, what people are claiming as a reason to dump the card and all the hate on nVidia is just stupid. I think it's a deeper psychological issue. People love the drama. NV has been crushing AMD for years now and people want any reason they can get to hate the big guy and rush to the underdog. It's a mob mentality. The card still performs great, and I keep seeing people in this thread say "Well the 970 has the best bang for the buck, so what card should I get when I get my refund???"

:rolleyes::rolleyes:

It's a mental issue, not a hardware issue. The card performs great no matter what. :eek:

Pushing a card beyond its limits and then complaining about it is one thing, and as far as that point is concerned, I agree.

But the complaints I've witnessed, and where I stand on the problem, concern people showing comparisons between the 970 and the 980. Both are rated as 4GB VRAM cards. A side-by-side video comparison shows the 980 with smooth gameplay between 3.5 and 4GB of memory used; the same situation on a 970 produces really bad stuttering. And not just video, there are graphs showing this as well.

Is that not an issue? Oh, and the stuttering doesn't happen on the 970 when less than 3.5GB of VRAM is used. Yeah, it's important to separate this stuttering issue from blatantly pushing the card beyond its limits... but I don't think that's what's going on, which makes that point irrelevant.
 
This is exactly what I've been saying all along, and people have been burning me up.

I think this entire issue has been blown out of proportion. Yes, nVidia should have published the exact specs, I fault them for that.

HOWEVER, what people are claiming as a reason to dump the card and all the hate on nVidia is just stupid. I think it's a deeper psychological issue. People love the drama. NV has been crushing AMD for years now and people want any reason they can get to hate the big guy and rush to the underdog. It's a mob mentality. The card still performs great, and I keep seeing people in this thread say "Well the 970 has the best bang for the buck, so what card should I get when I get my refund???"

:rolleyes::rolleyes:

It's a mental issue, not a hardware issue. The card performs great no matter what. :eek:

What a load of crap. Try SLI @ 1440p (these cards can push decent framerates at 1440p, so don't try to weasel out of that; I'm running my RoG Swift on 780 SLI). Or don't; even without SLI, 30FPS is more than playable for many people. So no, 25FPS isn't that big of a deal; some people are more than willing to deal with that framerate for the visuals. I can tell you right now, if I had bought this card I would be rather angry at this point.

Furthermore, you realize the 290/X is a better price/performance deal than the 970 right? That's been true since the day the 970 was released. The reason people go for the 970 is that (all other things being equal) it has lower power draw and much lower temps (well and obviously any nvidia features you might want, such as Shadowplay or G-Sync).

Well, all other things are not equal. Nvidia just tacked on 500MB of crap RAM to make their card look equivalent to the competition. For many, that 500MB was probably a large part of their decision to go with this over AMD.


Frankly, either way you don't get the product you paid for. That's not mere drama, that's a fact. Nvidia lied. If you don't see this, you're either a troll or a fanboy (in which case, probably worth putting on the ignore list). When I get a product that is not what I paid for, I get a bit angry too. Especially when it costs over $300.
 
So, how about this here guru3d article people saying there's no problem keep linking? What an interesting read-- uhoh!


http://i.imgur.com/fAsBumR.png

Hmmmmmmm... that's why I said uhoh above ;).
 
I don't post on [H] much anymore. I just came to say I have been researching an upgrade from my slightly aging 8800GT from years ago (great card BTW, it just started showing its age recently :p)

I sure am glad I didn't hit the order button too soon for a 970 after researching cards. Just purchased an AMD 290 on Newegg for $219 after a $30 rebate. This is my first AMD card, and it makes me a little sad, as I've always gone Nvidia. The GeForce 3 was my first Nvidia card.

This was a really hard decision to make. Even after considering the power draw and heat production of the AMD 290, I still decided to go with it over any Nvidia offering. This current fiasco rubbed me the wrong way, and I am glad I went with AMD. Performance-wise, I think the 290 will be a great upgrade over the 8800GT I currently have.
 
I sure am glad I didn't hit the order button too soon for a 970 after researching cards. Just purchased an AMD 290 on Newegg for $219 after a $30 rebate. This is my first AMD card, and it makes me a little sad, as I've always gone Nvidia. The GeForce 3 was my first Nvidia card

should be a nice improvement over your 8800GT...enjoy!
 
people need to disclose whether they are using 4k or SLI...those are valid reasons for returning the 970...everyone else is just getting caught up in the hysteria...placebo effect...been playing the Battlefield Hardline beta and it's a solid 60fps...same with the Evolve beta...big difference in fps from my previous GTX 580...the card is an excellent performer at 1920 x 1200
 
Golden Tiger, what you linked to is sadly all too common in hardware journalism. Nvidia really makes these reviewers not want to bite the hand that feeds. I wish we had a hardware website that had some balls.
 
people need to disclose whether they are using 4k or SLI...those are valid reasons for returning the 970...everyone else is just getting caught up in the hysteria...placebo effect...been playing the Battlefield Hardline beta and it's a solid 60fps...same with the Evolve beta...big difference in fps from my previous GTX 580...the card is an excellent performer at 1920 x 1200

Both. :( Well, it was both until I returned the 4K monitor thinking it was the issue, not the GTX 970 SLI :mad:. I do use DSR at 1440p to higher resolutions anyway, until I get a 4K monitor back at some point.
 
people need to disclose whether they are using 4k or SLI...those are valid reasons for returning the 970...everyone else is just getting caught up in the hysteria...placebo effect...been playing the Battlefield Hardline beta and it's a solid 60fps...same with the Evolve beta...big difference in fps from my previous GTX 580...the card is an excellent performer at 1920 x 1200

Or are going to use it for 4k or SLI. Just because someone isn't yet doesn't mean they don't have intentions to. I can realistically see myself getting a 4k monitor within the next year. And I plan to use my 970 for at least two years which is about the norm for me. It'd be great if it were future proof enough for three years.
 
So I have two separate GTX 970 cards (one FTW and one SC in two different systems) manufactured by EVGA.

A few weeks after purchasing them, EVGA released an updated FTW+ and SSC and I originally planned to step up to the newer versions.

However with the memory issue that is well documented, should I just step up right to the GTX 980?

One system will be 1080P for the time being (new monitor purchased a year ago), meanwhile I have plans for my build to go with a 1440P display (currently running at 1680x1050). I'm thinking the 1080P will be good for the GTX970 as it is for a pre-teen who uses it for the Lego games and FIFA games.

My system I am debating. Currently I play FSX and TF2, but I will eventually buy GTA 5. Am I better off staying with the 970 or going with the 980? I'm even tempted to put the cash I'd spend on stepping up to the 980 aside and upgrade when the next-generation cards come out, as I almost feel that would be money better spent.
 
people need to disclose whether they are using 4k or SLI...those are valid reasons for returning the 970...everyone else is just getting caught up in the hysteria...placebo effect...been playing the Battlefield Hardline beta and it's a solid 60fps...same with the Evolve beta...big difference in fps from my previous GTX 580...the card is an excellent performer at 1920 x 1200

Me, I have 4K/SLI 970's. I've got an email in to Newegg. Hopefully they make good.
 
Or are going to use it for 4k or SLI. Just because someone isn't yet doesn't mean they don't have intentions to. I can realistically see myself getting a 4k monitor within the next year. And I plan to use my 970 for at least two years which is about the norm for me. It'd be great if it were future proof enough for three years.

You need more than 4GB if you want three years of use.
 
You need more than 4GB if you want three years of use.

Maybe, but the absolute max I'd ever be willing to spend on a GPU would be ~$550. And I was debating on a 980 but I just couldn't talk myself into it because of the price. If it's rather noticeable that I need something better after about two years so be it. If not and I can get by, by sacrificing some of the ultra settings to make three I wouldn't be against doing that.

My thinking on this comes from the fact that I've been using a 2GB VRAM card since 2010 (Radeon HD 5850, and GTX 680) without much issue. Granted that's all been on 1080p. I'm hopeful nearly doubling the RAM at this point for me personally will last around three years. I'll actually be surprised if an 8GB VRAM GPU is in my price range two to three years down the road anyway. Likely I'll be looking at 6GB versions.
 
It's not a better or more effective use of the VRAM to let it sit there unused when it can be used to improve performance, which seems to be the reason devs are doing it; improving performance by utilizing the resources better is optimization by the definition you provided.

If the VRAM isn't being efficiently used to actually improve the experience or visual fidelity of the game I'd rather have it sit there unused than trying to seek out every last MB the card says it has and use it poorly. The other part of that is the coding and engine optimizations that exist in the game's programming lines. Just because something uses 4GB doesn't make it more optimized or more efficient over something using 3GB if the 3GB actually provides a better quality image and frame buffer - which is exactly the issue with the majority of games and console ports now. They'll seek out every ounce of VRAM available and use it about as efficiently as filling up an Olympic swimming pool with a garden hose.

If fewer games and major titles were like that, there would be a lot less fuss over the slow 512MB on the 970. In fact, we would probably still be okay with 3GB cards at non-4K resolutions. I realize, though, that unfortunately this isn't the reality, and the PC gaming master race should be ashamed for letting this bullshit drag on for so long.
 
VSR/DSR has made native resolutions irrelevant with regards to VRAM. Nothing looks better than DSR/VSR with some AA.

970 SLI could definitely use the full 4GB if you're using it right. If it wasn't needed, why are SLI/tri-SLI users asking for 8GB, and why are nVidia/AMD using 4GB?

Regardless of usage/function, my real issue is the complete lack of ethical conduct by nVidia before and after the issue was found.
 
NV runs MSRP/MAP pricing
blame Amazon / people returning fully working cards

the card is just as fast as it was when it was reviewed
ITT: a bunch of entitled kids

Speed has nothing to do with RAM exhaustion. You missed the whole point of this issue.
 
VSR/DSR has made native resolutions irrelevant. Nothing looks better than DSR/VSR with some AA.

Well, the real-deal high resolution does. :D But yes, ever since I discovered downsampling+AA, I have been kicking myself for going with the 2GB GTX 770 back then; the 4GB models were not THAT much more expensive. It looks so pretty, but VRAM starvation kicks in fast with current-gen games.
 
Well, the real-deal high resolution does. :D But yes, ever since I discovered downsampling+AA, I have been kicking myself for going with the 2GB GTX 770 back then; the 4GB models were not THAT much more expensive. It looks so pretty, but VRAM starvation kicks in fast with current-gen games.

I edited it and added "with regards to VRAM" :)

I figured you got what I meant, but other people like to nitpick in a negative way. Not saying we can all go back to 800x600, heh.
 
http://www.hardocp.com/article/2014...x_directcu_ii_video_card_review/#.VNDVzS7ClWQ

not seeing a 'bait and switch'; you got what you paid for
unless you can prove EVERY reviewer got a golden card...

how many reviewers were running high-res modern games trying to exceed 3.5GB of RAM? If you buy a car advertised as able to do 140mph, but it turns out it can do 20mph in 1st and 120mph in 4th, and that's how they got 140, then the fact that none of the reviewers noticed because none tried to push it to 140 does not make it any less of a lie
 
how many reviewers were running high-res modern games trying to exceed 3.5GB of RAM? If you buy a car advertised as able to do 140mph, but it turns out it can do 20mph in 1st and 120mph in 4th, and that's how they got 140, then the fact that none of the reviewers noticed because none tried to push it to 140 does not make it any less of a lie

This is probably the worst analogy since this discussion started. It's more like you are doing 140 and your car wants to downshift to first gear.
 
This is probably the worst analogy since this discussion started. It's more like you are doing 140 and your car wants to downshift to first gear.

Kid, you'll fly over the handlebars!
 
This is probably the worst analogy since this discussion started. It's more like you are doing 140 and your car wants to downshift to first gear.

how is it a bad analogy? The memory can't be read in parallel; they took the speed of the fast memory, added it to the slow memory, and called that the memory speed... even though it's impossible to use both at once. I think it's an excellent analogy
 
how is it a bad analogy? The memory can't be read in parallel; they took the speed of the fast memory, added it to the slow memory, and called that the memory speed... even though it's impossible to use both at once. I think it's an excellent analogy

Of course you do because you wrote it.
 
Probably a better (but still crappy) analogy is once the car goes above 140mph, the intake manifold starts leaking and causes the engine to misfire.
 
Ugh, this has been covered multiple times. It is not so much a problem of how much RAM games need, but how much of the available RAM they use. A 3GB 780 will run smoothly at the same settings where a 3.5/4GB 970 will stutter. People are going to the 290X and 980 not because those have 4GB, but because they won't stutter when it is used. If the 970 were a 3.5GB card, very few people would need a 980/290X.

Thanks for reminding me of that; I *keep* forgetting that it's not a 3.5GB vs 4GB argument, it's an advertised-4GB vs actually-3.5GB + 0.5GB issue.
 
Probably a better (but still crappy) analogy is once the car goes above 140mph, the intake manifold starts leaking and causes the engine to misfire.

Yours has to be the most accurate... But the one about downshifting to 1st at 140 mph made me chuckle.
 
Of course you do because you wrote it.


so am I wrong that they took the speeds of the slow and fast memory, summed them, and called that the total speed, even though there is no way to ever achieve that, since they are used independently and not in parallel?
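For what it's worth, the arithmetic behind that claim is easy to sketch. The segment figures below are assumptions taken from the commonly cited 970 breakdown (a ~196 GB/s 3.5GB segment on 7 of 8 memory controllers, plus a ~28 GB/s 0.5GB segment), not anything I've measured myself:

```python
# Sketch of why summing the two segments' speeds is misleading: the segments
# can't be read in parallel, so time spent in one is time not spent in the other.
FAST_BW = 196.0  # GB/s, assumed figure for the 3.5GB segment
SLOW_BW = 28.0   # GB/s, assumed figure for the 0.5GB segment

naive_sum = FAST_BW + SLOW_BW  # 224 GB/s -- the headline spec

def effective_bw(frac_fast):
    """Weighted harmonic mean: achievable bandwidth when a frac_fast share
    of traffic hits the fast segment and the rest hits the slow one, serially."""
    return 1.0 / (frac_fast / FAST_BW + (1.0 - frac_fast) / SLOW_BW)

print(naive_sum)                      # 224.0
print(round(effective_bw(0.875), 1))  # 112.0 -- far below the advertised sum
```

The point being: as soon as any real share of traffic lands in the slow segment, the achievable number drops well below the sum of the two.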
 
so am I wrong that they took the speeds of the slow and fast memory, summed them, and called that the total speed, even though there is no way to ever achieve that, since they are used independently and not in parallel?

Lol, you are wrong about your analogy, which is what I was talking about. If you can't see the flaw, well, that is on you.
 
So, how about this here guru3d article people saying there's no problem keep linking? What an interesting read-- uhoh!

http://i.imgur.com/fAsBumR.png

Hmmmmmmm... that's why I said uhoh above ;).

That depends on what that 3.6GB number represents...

If they are talking about 3.6 out of 4GB, 3.6 * 1024MB = 3686.4MB
or
If they are seeing 35xxMB in usage and are rounding up to 3.6GB.
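A quick sanity check on those two readings (just arithmetic I ran myself, nothing card-specific):

```python
# If "3.6GB" is an exact binary figure, it corresponds to 3686.4MB.
print(3.6 * 1024)  # 3686.4

# If it's a rounded display value, any MB reading in roughly this range
# would show as "3.6GB" at one decimal place.
print(3.55 * 1024, 3.65 * 1024)  # 3635.2 3737.6
```

So the screenshot is ambiguous between "genuinely past 3.5GB" and "a mid-3500s MB reading rounded up", which is exactly the poster's point.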
 
http://www.pcper.com/files/review/2015-01-29/BF4_3840x2160_PLOT_0.png

Well, isn't that a nice graph. And no, it doesn't matter that those were unrealistically high settings for BF4. The article is proof enough that there will be huge issues when the 970's RAM is filled, and that is something to be more and more expected as newer games come out. There are already games out there that show the very same thing at 1080p.

I think people very easily forget, too, that an extra 20ms swing past the average frametime (when a frame typically takes 18-25ms) is pretty much the same as skipping a frame outright in terms of the visible hitch, and when that happens a bunch of times per second, the resulting stuttering isn't pretty. In SLI particularly, where these cards would otherwise have the horsepower to run demanding settings, it's a major issue, and I don't think it's any kind of stretch to say that people running two or three high-end video cards in one system are generally doing so to run high settings and/or resolutions :).
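That "extra 20ms swing" heuristic is easy to put in code. A toy sketch (my own made-up sample numbers, not data from any of the linked reviews):

```python
# Count visible hitches: frames whose time exceeds the average by more than
# 20ms, which at a ~20ms/frame pace looks about the same as dropping a frame.
def count_hitches(frametimes_ms, threshold_ms=20.0):
    avg = sum(frametimes_ms) / len(frametimes_ms)
    return sum(1 for t in frametimes_ms if t - avg > threshold_ms)

sample = [18, 22, 19, 21, 55, 20, 48, 19]  # two big spikes past the average
print(count_hitches(sample))  # 2
```

This is roughly what the frame-time plots in those reviews are visualizing: average FPS can look fine while the spike count tells the real story.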

Golden Tiger, what you linked to is sadly all too common in hardware journalism. Nvidia really makes these reviewers not want to bite the hand that feeds. I wish we had a hardware website that had some balls.

Can't agree more. I can't exactly blame these guys for wanting to keep out of a pretty big and heated situation, given that they depend on ad income and free review samples primarily for their revenue stream to keep the lights on and presumably earn a living... but it doesn't make it ethical. I have a hard time believing that such a basic, fundamental thing could be accidentally overlooked on an article they have plastered to their front page (guru3d homepage) for days and had set to show up near the top every single day. Especially on such a big issue. Even if they had, it's almost just as bad. We'll see if they update their article with properly designed testing or not from here ;).

So, how about this here guru3d article people saying there's no problem keep linking? What an interesting read-- uhoh!

http://i.imgur.com/fAsBumR.png

Hmmmmmmm... that's why I said uhoh above ;).

Considering Guru3d has banned me from their forums (member since 2002 with many positive contributions) for pointing it out... I wouldn't imagine they care too much about their recklessness.

A funny aside: they edited out my 250-kilobyte image, claiming in a PM that it was "too big" a file size and was being removed (with a warning that a forum infraction would be handed out if I posted it again), minutes before banning me, before I even saw the email. Given that a file that size typically takes around a tenth of a second to download on a broadband connection, I couldn't help but chuckle at that one.

http://i.imgur.com/GFphKC1.png
 
Sounds like they got butthurt about writing a shitty analysis and getting called out on it.
 
I filed a complaint as a consumer and purchaser of a GTX 970 on 1/4/15 for two reasons: false advertising (64 ROPs, later disclosed to be 56) and deceptive advertising (4GB VRAM, which implies 4 FULL gigs).

Signing a petition is fine BUT if you go on the FTC web page you can also file a consumer complaint.

Sent the GTX970 back.
 
So I tried to make a graphical summary of all the events that have happened thus far. It's not in chronological order, but hopefully it's satisfactory for those who just want a tl;dr summary, and I hope some will find some much-needed comic relief in it. (Yes, there is some text; sue me. No wait, on second thought, sue nVidia instead :p)

Let's start with honest Europeans:


http://i.imgur.com/lHdU7oa.png


http://i.imgur.com/qg93AHQ.png
http://i.imgur.com/DJrIvi4.png

http://i.imgur.com/p0oDb0Q.png
http://i.imgur.com/Xew0wIU.png


Source

Translation of the last two paragraphs:
However, once the memory above 3.5 gigabytes is really needed and the driver cannot shirk it, as in Ultra HD with 4x MSAA and "High" textures using up to 3,980 MiB, it becomes clear that tricks are required to make use of the full four gigabytes. The frametimes are far more uneven compared to the GTX 980 (which, incidentally, can allocate about 70 MiB more), and this shows not only in the diagram; sensitive natures can notice it in a direct in-game comparison. The Radeon, after shaky performance in Full HD, here performs for long stretches at the level of the (downclocked) GTX 980, but runs into memory-management problems toward the end of our benchmarks. Away from "normal" benchmarks, the differences between the GTX 970 and 980 are clearer than the previously known specifications would imply, at least in our benchmarks. While the driver's heuristics may achieve good application-specific results, a stale aftertaste remains: whether one stutter or another in this border area would have been avoided with the configuration Nvidia originally announced.

Frame times of other games:

(three frame-time graphs)

Detailed results here

http://i.imgur.com/4OhMiW5.png


http://i.imgur.com/7zIdwaa.png

 