Petitioning for 970 Refund

And then there is this: what has occurred is my card mopping the floor with every card on the market except the 980.
The 970, while a good card, hardly mops the floor with a 290X or overclocked 290. And the 290X and 290 can be had for much less than a 970.
 
[S]kynet;1041418073 said:
Here
Here
Here

The slides supplied by Nvidia are wrong (64 ROPs, etc.). The GPU-Z shot is wrong as well, although I don't blame GPU-Z for that.

Here are the SPECIFIC links

1
2
3

There could be other reviews as well but those are the ones I've read.

I checked with the newer GPU-Z 0.8.1 and it displays the same information as in the screenshot.

We have requested updated slides from NVIDIA.
 
So Nvidia didn't voluntarily send out corrected spec sheets to reviewers after it was proven that the info they initially sent was wrong?

Probably just another "miscommunication" within Nvidia, I'm sure. ;)
 
Resale value isn't going to be much different. When a GTX970 is old and losing value, the 980 will be too. People aren't going to spend another $200 to get a used 980 over a 970 for a small performance gain. Just like right now no one pays a huge premium to get a used 670 over a 660, even though the 670 is faster.

Oh, I'm sure someone wanting to SLI with their current Gigabyte G1 980 will pay for that 980 over a 970 ;).

It's been mentioned way too many times that running 970 SLI at high res, you will most likely see the issues with the hacked-off memory controller.

Another reason to run from the 970.. no fun with SLI. I like to keep the option open to grab another card down the road if I feel the need, because I'm seriously itching to run at 1440 res.
 
So Nvidia didn't voluntarily send out corrected spec sheets to reviewers after it was proven that the info they initially sent was wrong?

Probably just another "miscommunication" within Nvidia, I'm sure. ;)

Yup.. just like from the beginning.. they didn't want to hurt those sales figures by stating it's less than a 4GB card, so they will continue on with the master plan. There should at least be an asterisk next to the 4GB claim.
 
Yup.. just like from the beginning.. they didn't want to hurt those sales figures by stating it's less than a 4GB card, so they will continue on with the master plan. There should at least be an asterisk next to the 4GB claim.
While I don't like the way Nvidia conducts a lot of their business, the 970 is a 4 GB card. Despite the fact that the last 500 MB sometimes runs slower than the other 3.5 GB, it still all adds up to 4 GB. Plain and simple.
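To put rough numbers on "sometimes runs slower": here is a toy model of the split memory pool, assuming the commonly cited figures of about 196 GB/s for the 3.5 GB fast partition (7 of 8 channels) and about 28 GB/s for the 0.5 GB slow partition behind the remaining 32-bit channel. These are illustrative approximations, not measurements.

```python
# Toy model of the GTX 970's segmented memory pool. The bandwidth
# figures below are commonly cited approximations, NOT measured values.
FAST_GB, FAST_BW = 3.5, 196.0   # fast partition: size (GB), bandwidth (GB/s)
SLOW_GB, SLOW_BW = 0.5, 28.0    # slow partition behind the shared channel

def avg_bandwidth(working_set_gb: float) -> float:
    """Average streaming bandwidth for a working set that fills the
    fast partition first, then spills into the slow one."""
    fast = min(working_set_gb, FAST_GB)
    slow = max(working_set_gb - FAST_GB, 0.0)
    seconds = fast / FAST_BW + slow / SLOW_BW  # time to touch every byte once
    return working_set_gb / seconds

print(avg_bandwidth(3.0))  # ~196 GB/s: entirely in the fast partition
print(avg_bandwidth(4.0))  # ~112 GB/s: the last 0.5 GB nearly halves the average
```

So the card is a 4 GB card, but under this simple model a working set that spills past 3.5 GB pays a steep average-bandwidth penalty, which is consistent with the stutter reports in this thread.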
 
Oh, I'm sure someone wanting to SLI with their current Gigabyte G1 980 will pay for that 980 over a 970 ;).

It's been mentioned way too many times that running 970 SLI at high res, you will most likely see the issues with the hacked-off memory controller.

Another reason to run from the 970.. no fun with SLI. I like to keep the option open to grab another card down the road if I feel the need, because I'm seriously itching to run at 1440 res.


Pretty much. I've been doing a lot of testing since last night with Dying Light, Shadow of Mordor, etc. GoldenTiger's research is absolutely on the money; the results I'm seeing are clearly FUBAR with 970 SLI, even at 1440p at ultra settings, in any games that use more than 3.5 GB. Those spreading the notion that this issue affects only people running at 4K resolution are completely wrong.

So I decided that if I'm going to have to reduce settings in SLI anyway, what the hell do I need the second card for? That puppy is going to find a new home and I'll just stick with a single well overclocked 970 running slightly reduced settings at least until the next round of GPUs come out. No way am I giving NV more money to upgrade to a 980 after this mess, just can't bring myself to do it.
 
While I don't like the way Nvidia conducts a lot of their business, the 970 is a 4 GB card. Despite the fact that the last 500 MB sometimes runs slower than the other 3.5 GB, it still all adds up to 4 GB. Plain and simple.

Hence the 4GB* asterisk I mentioned. Or better would be 3.5GB + a 0.5 "old man with a hat driving in the slow lane" GB, for a total stuttering 4GB experience, The Way We Want You to Play!
 
The specs said 4GB at 256 bit width and 7,000 MHz. If that was true nVidia wouldn't have admitted they messed up and this thread would not exist. I am curious how nVidia responds to Kyle's request if there are updated spec slides.
 
I'll tell you exactly how Nvidia will respond - by sitting on their hands for 6 months and waiting for the furor to die down.

Most of you will forget this even happened the next time Nvidia showcases a new line of cards.
 
The specs said 4GB at 256 bit width and 7,000 MHz. If that was true nVidia wouldn't have admitted they messed up and this thread would not exist. I am curious how nVidia responds to Kyle's request if there are updated spec slides.

I'd bet the response is going to be... "Meh."

NV isn't going to say or do anything because they know they've basically got nothing to fear in the long run.

The torch-wavers and pitchfork-holders of today are still very likely going to be next cycle's customers if NV has the best price/performance out there. Plenty have already swapped out 970s for 980s.

I'll wait for AMD to ship its 3XX cards and see what NV offers up (if anything) to counter punch, then I'll decide what to buy.
 
I'd bet the response is going to be... "Meh."

NV isn't going to say or do anything because they know they've basically got nothing to fear in the long run.

The torch-wavers and pitchfork-holders of today are still very likely going to be next cycle's customers if NV has the best price/performance out there. Plenty have already swapped out 970s for 980s.

I'll wait for AMD to ship its 3XX cards and see what NV offers up (if anything) to counter punch, then I'll decide what to buy.

I agree with you guys. nVidia doing anything would probably increase their risk if by some off chance they are taken to court.

It will be hard to ignore better performance. I was just reading in the AMD forums that AMD is neglecting crossfire lately. That doesn't really inspire confidence.

I am hoping DX12 actually levels the field a little; hopefully better SLI/CrossFire frametimes all around. If AMD offers similar value and performance in 2016, I'll jump ship. Maybe 10% under the Titan II but $600 rather than $1,350? Like the 290X and Titan... A bit of speculation there. By a bit, I mean I made up pretty much all of it.
 
I'll tell you exactly how Nvidia will respond - by sitting on their hands for 6 months and waiting for the furor to die down.

Most of you will forget this even happened the next time Nvidia showcases a new line of cards.

That's exactly how they are responding right now. FWIW I still remember Bumpgate, but I agree with what you're saying.

I agree with you guys. nVidia doing anything would probably increase their risk if by some off chance they are taken to court.

It will be hard to ignore better performance. I was just reading in the AMD forums that AMD is neglecting crossfire lately. That doesn't really inspire confidence.

I am hoping DX12 actually levels the field a little; hopefully better SLI/CrossFire frametimes all around. If AMD offers similar value and performance in 2016, I'll jump ship. Maybe 10% under the Titan II but $600 rather than $1,350? Like the 290X and Titan... A bit of speculation there. By a bit, I mean I made up pretty much all of it.

Apart from the mining craze days, AMD has historically had better price/perf than nVidia. The problem is that while the hardware was good, software support was a bit lacking. There's also the perception that nVidia is somehow more premium, so people were willing to pay more. (I liken the situation to Gigabyte mobos -- great hardware hampered by a lackluster BIOS.)

That being said, I honestly can't say nVidia is so much better than AMD recently given how shitty the Maxwell drivers have been. I'd say at this rate for single card solutions AMD might achieve parity with nVidia, and only for multi-GPU setups would nVidia pull ahead due to at least an attempt at pushing out SLI profiles regularly.
 
Most of you will forget this even happened the next time Nvidia showcases a new line of cards.


 
So Nvidia didn't voluntarily send out corrected spec sheets to reviewers after it was proven that the info they initially sent was wrong?

They are probably too busy trucking wheelbarrow after wheelbarrow of cash to the nearest bank to be bothered with sending out updated spec sheets.

There are only so many hours in the day, after all.
 
So Nvidia didn't voluntarily send out corrected spec sheets to reviewers after it was proven that the info they initially sent was wrong?

Probably just another "miscommunication" within Nvidia, I'm sure. ;)

Ah, the card didn't get slower.... and most people have no clue what 'level 2 cache' is, let alone what a 'ROP' is.

And if you bought the card purely off the PR spec sheet, sight unseen... well, I've got some other things to sell you too.
Cash only please, small bills.
 
Originally Posted by SRTie4k

Most of you will forget this even happened the next time Nvidia showcases a new line of cards.

I already forgot about it.
 
Opinions are already fairly cemented on the topic. It's a bit of a black eye on their reputation. It won't ruin the company, but they already lost some consumers.
 
I'd venture to say that the VAST majority of 970 buyers haven't a fucking clue this issue even exists.
 
I'd venture to say that the VAST majority of 970 buyers haven't a fucking clue this issue even exists.

Funny story. The reference 970 I bought went into a mini-ITX 4690K build for my sister's family's Christmas present. I set up the software/hardware, moderately OC'd everything, shipped it 1,000 miles, and told her to Skype me to set it up. She didn't. Her husband hooked it up and they loved it. After a month I finally got them to run 3DMark and take temps, just for my own sanity. She got a 7xx Fire Strike score instead of 10k. I also helped her set up Minecraft, and it only got 44 fps. It dawned on me she had been running off the IGP for a month and thought it was great...

Yeah people probably don't notice!! Hehe
 
I'd venture to say that the VAST majority of 970 buyers haven't a fucking clue this issue even exists.

Or don't care. I know I don't. Obviously there are situations where it could matter, but mine isn't one of them. As I've said, Nvidia could have been more upfront about this, but the card performs very well as is. By the time a game that's actually worth anything comes out, and requires a metric-eff-ton of non-segmented memory, I'll have moved onto a new card anyway.
 
Recent power management features have limited overclocking on desktop parts as well.
It used to be that you could push the voltage quite high and work out for yourself your max stable thermal limit.
Using more advanced cooling allowed you to take advantage of that.
Now there are voltage and TDP limits that don't extend with better cooling.
You can get a higher overclock with better cooling, but without being able to boost the TDP, the TDP ends up being your wall, not the overclock stability.
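The reason the TDP cap becomes the wall so quickly is the first-order CMOS dynamic-power relation, P ~ C * V^2 * f: a voltage-assisted overclock costs power much faster than it buys clocks. A purely illustrative sketch (static/leakage power is ignored, so real draw scales even worse):

```python
# First-order dynamic power model: P ~ C * V^2 * f.
# Leakage/static power is ignored, so this UNDERSTATES the real cost.
def relative_power(v_scale: float, f_scale: float) -> float:
    """Power draw relative to stock for given voltage and clock multipliers."""
    return v_scale ** 2 * f_scale

# A +10% overclock that needs +10% voltage costs roughly 33% more power,
# which is why a fixed TDP cap hits before overclock stability does.
print(relative_power(1.10, 1.10))  # ~1.33x stock power
```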

It's understandable that, with novice users pushing the boundaries without realizing the consequences, they have seen the need to do this.
So it's not surprising that laptops are even more restricted, tbh.
The number of laptops I have had to improve the cooling on to keep them alive is silly, and those were business users who didn't overclock; they just didn't let them vent very well.
 
The limitations you mentioned for desktop cards can be overridden by flashing your own BIOS. I finally fixed the damn SLI voltage bug and gave myself as much power as 8+6 pin could deliver, so the only thing holding back my overclock is the silicon and my balls.

I agree that most mainstream (and even some business class) laptops have poor stock cooling. But performance laptops that house a 970M or 980M are designed to handle such high TDP cards without overheating. I have a laptop that comes with TWO 780M cards, each with a 100W TDP rating. I can give both a 10% overclock and still keep temps under 85C. After tweaking the cooling system I've managed to keep temps below 75C.

These are not your el cheapo sub $1000 junkbooks, these are $3500+ 17.3"+ DTRs that can go toe to toe with a gaming desktop.
 
I think you could pin it on them legally, but who is going to take the time and trouble?

Exhibit A) A 256-bit GTX 980 running an application at 3GB: how many channels are at greater than 50% utilization? Eight. That's good.

Exhibit B) A 256-bit GTX 970 running an application at 3GB: how many channels are at greater than 50% utilization? Seven. Ah, okay, isn't that 224-bit then for the majority of uses?
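The Exhibit B arithmetic is easy to verify. A quick sketch, assuming the spec sheet's 7.0 GT/s effective GDDR5 data rate (each 32-bit channel contributes 32 bits of bus width):

```python
# Peak bandwidth = effective data rate (GT/s) x bus width in bytes.
# 7.0 GT/s is the spec-sheet effective GDDR5 rate assumed here.
def peak_bandwidth_gbs(bus_bits: int, gtps: float = 7.0) -> float:
    return gtps * bus_bits / 8  # divide by 8 to convert bits to bytes

print(peak_bandwidth_gbs(256))  # 224.0 GB/s: all 8 x 32-bit channels (the 980)
print(peak_bandwidth_gbs(224))  # 196.0 GB/s: 7 of 8 channels, as in Exhibit B
```

So for any workload that stays on the seven fully wired channels, the effective figure is 224-bit / 196 GB/s, not the advertised 256-bit / 224 GB/s.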

 
Ah, the card didn't get slower.... and most people have no clue what 'level 2 cache' is, let alone what a 'ROP' is.

And if you bought the card purely off the PR spec sheet, sight unseen... well, I've got some other things to sell you too.
Cash only please, small bills.
What does your comment have to do with anything relevant?

Nvidia knows that the 970 specs they sent to the various review websites are wrong. Yet they couldn't be bothered to send out updated spec sheets so that the articles' technical details regarding the 970 could be corrected?

It's simple honesty and courtesy. If you make an error, you try to fix it.
 
Apart from the mining craze days, AMD has historically had better price/perf than nVidia. The problem is that while the hardware was good, software support was a bit lacking. There's also the perception that nVidia is somehow more premium, so people were willing to pay more. (I liken the situation to Gigabyte mobos -- great hardware hampered by a lackluster BIOS.)

That being said, I honestly can't say nVidia is so much better than AMD recently given how shitty the Maxwell drivers have been. I'd say at this rate for single card solutions AMD might achieve parity with nVidia, and only for multi-GPU setups would nVidia pull ahead due to at least an attempt at pushing out SLI profiles regularly.

One of the big problems with AMD is that power usage on their cards is getting out of control. Nvidia cards use almost half the power a comparable AMD card will under load.
 
One of the big problems with AMD is that power usage on their cards is getting out of control. Nvidia cards use almost half the power a comparable AMD card will under load.

Tru dat.

When I had the two 290X Lightning cards, I could get my rig to pull 1,100+ watts under full load. In the same system with an OC'd 980, it pulls about 420 watts. Adding a second OC'd 980 would probably bring it up to no more than 700 watts. HUGE difference in both power requirements and heat!
 
This was posted over on reddit.

I just wanted to make a quick post saying that I finally received a response from Newegg about my G1 Gaming GTX 970.
I inquired about returning the card two weeks ago when the news broke about the VRAM issue. Yesterday I received an email from them with a pasted Nvidia response. I found this unsatisfactory in that I believe their test case shows a situation in which 4GB is going to be reached, so a similar bottleneck would be experienced on both cards. A better test would show something like 3.7GB of VRAM utilization, where the 980 still has full memory performance but the 970 has fallen off. Someone feel free to correct me if I am wrong here.
Anyways, I just received a call today from a Newegg rep, Rosa. She offered me two choices:
1. A full refund in the form of store credit, returning the card.
2. 30% of the initial purchase price credited back to my card, and I keep it.
I chose the 30%. The card was $360, so I should be receiving over $100 back. I feel like for that money the card is a great value and the issue can be overlooked. That said, I plan to sell it once the new AMD 3XX cards come out.
I do feel bad for Newegg in this case. It wasn't their screw up and now they are having to pay for it. I hope nVidia is compensating them for each instance of this. I know Newegg has taken a lot of flak recently for customer service related issues, but in this case I think they handled it right and I wanted to let that be known.
It's also worth noting that Gigabyte basically told me to blow them when I asked about the issue. So not everyone is responding in kind across the industry.
Feel free to discuss/ask questions,
Cheers!
 
Most people on that post are saying that Newegg will NOT give them the same deal.
 
[L]imey;1041422308 said:
Most people on that post are saying that Newegg will NOT give them the same deal.

I didn't get jack from them. Sold my 970s on the open market. Canceled my Newegg Premier membership. Buying all my graphics cards from Amazon from now on.
 
I decided to keep mine under the current circumstances, as I am planning to stay at 1080p until QHD or UHD monitors are 150 bucks.
 
[L]imey;1041422308 said:
Most people on that post are saying that Newegg will NOT give them the same deal.


That is the reason I shared this here; someone must be feeling some heat. If you're looking to return your 970 to NE, you might reference this.
 
I do feel bad for those wanting to return their cards who are hitting walls with the etailers. NVIDIA and the manufacturers throwing up the big middle finger while letting the retail outlets take the hits says something about their willingness to support the products they create. Shitty all around.

That being said, it might be prudent to make future online purchases with AmEx. At least they give you a 90-day window for returns regardless of what the etailers' policies are.
 
I do feel bad for those wanting to return their cards who are hitting walls with the etailers. NVIDIA and the manufacturers throwing up the big middle finger while letting the retail outlets take the hits says something about their willingness to support the products they create. Shitty all around.

That being said, it might be prudent to make future online purchases with AmEx. At least they give you a 90-day window for returns regardless of what the etailers' policies are.

The fact is that the retailers will NOT take the hit outside of actually having to handle the return. The AIBs will have to take heat on those returns and that will be passed back to NVIDIA.
 
Got this last night from newegg regarding my request to return my Gigabyte G1 970. They gave me full store credit.

Thank you for requesting a Return Merchandise Authorization (RMA) number from Newegg.

Your RMA number is valid for 14 days, and will expire on 2/25/2015.

Please print this page and enclose the top half with the item(s) you are returning. Also, please remember to cut out the included RMA Identification Label and firmly tape it to the outside of the shipping box with clear tape. It is important to attach this label to ensure that your RMA is processed in a timely manner. Failure to tape the RMA Identification Label to the shipping box may result in a processing delay. The RMA ID Label is for Newegg internal use and is not a valid UPS or FedEx Shipping Label.

Thank you for shopping at Newegg.

We do our best to provide you with an unbeatable shopping experience, but we recognize that sometimes you have to return a product you've bought from us.

Your UPS RMA Return Shipping Label has been created. Below is your Shipping Label Order confirmation, along with instructions for accessing and printing your label. Please keep a copy of this email for your records.
 
[L]imey;1041422326 said:
I decided to keep mine under the current circumstances, as I am planning to stay at 1080p until QHD or UHD monitors are 150 bucks.

Not an option for me as I am already at 1440p and was experiencing issues.
 
So many people returning these and buying 980s.

NVIDIA is going to make a killing off this fubar.

At this point I'm beginning to wonder if the whole controversy was fabricated by them.
 
I don't want to start an argument, but I don't see how that is possible given the stance of NVIDIA and the AIBs. Unless the etailers have Net 180 payment terms for the product and will just start actively refusing payment for each card returned.

Do you think that AIBs are not already contracted to "reimburse" for returned cards?
 
So many people returning these and buying 980s.

NVIDIA is going to make a killing off this fubar.

At this point I'm beginning to wonder if the whole controversy was fabricated by them.

No, just geeks getting themselves worked up over nothing. People who look at synthetic and component specs over actual benchmarks.
 