Petitioning for 970 Refund

I agree that your testing methods show the settings that will work well.
But at the time, you weren't aware of why you had to resort to all of the lower settings.
If it were simple for owners of this card to set it up so it never stutters, this wouldn't be as big an issue.

I thought that if any site would publish a full-on test of this, it would be [H].
I saw that you deferred ownership of the problem to someone else, but that hasn't stopped you from investigating other issues.
It's quite perplexing for us.
 
complaining because they are actually experiencing issues themselves, or complaining to join this crusade for the principle of it?...Nvidia purposely lied about the specs...OK...I've moved past that...they would be incredibly dumb to ever try that again because they know people will be paying even closer attention to every small detail on their future cards...seems like most people got their feelings hurt because they lied but don't really have anything else to show how this is affecting them...I looked at the benchmarks after they re-tested...made my decision to keep the 970 easy

* I have a return label set up with Amazon, which expires on March 2nd...so technically I'm keeping my options open...I have a few weeks to make a 'final' decision if some new facts come to light

They're complaining because Nvidia took a shortcut and still sold them a 4GB card. Even if the 970 performs well, it's still not the card that people thought they were getting. This attitude of "well, it's still a great deal for the performance, so you don't need a refund" is pure nonsense. People don't want to buy something only to find out it isn't what they thought they were getting, and then be told "well - it's still a great deal - so shut the fuck up - you'll hit the GPU limit anyways..." (Blah, blah - as if we know every usage case the card will endure.)

Does it really matter that, at the moment, only a fringe user base will experience the issues the gimped memory system causes? It shouldn't.

If you end up keeping the 970 - good for you. But don't patronize the people who are complaining. It's insulting and rude.

EDIT: I apologize for the 'tude myself (we just had a baby this weekend and I'm running on little sleep). I do want to applaud your keeping your options open though. Try to realize though, that even if the complaints may not seem legitimate to you, they're very legitimate to the people who make them. I myself recommended this card to a co-worker, and I feel bad that he didn't get what he paid for either. Now I'm going to have to send him a link to all the hooplah and feel a little guilty about it.
 
complaining because they are actually experiencing issues themselves, or complaining to join this crusade for the principle of it?...Nvidia purposely lied about the specs...OK...I've moved past that...they would be incredibly dumb to ever try that again because they know people will be paying even closer attention to every small detail on their future cards...seems like most people got their feelings hurt because they lied but don't really have anything else to show how this is affecting them...I looked at the benchmarks after they re-tested...made my decision to keep the 970 easy

* I have a return label set up with Amazon, which expires on March 2nd...so technically I'm keeping my options open...I have a few weeks to make a 'final' decision if some new facts come to light

I'm not singling you out, but this is exactly the sort of thing I've been talking about. nVidia pretty much wants exactly this: for the whole incident to just blow over by itself without any consequences for them. (I'm not counting a 5% return rate, if even that, as a "consequence".) If anything, they'll take this as proof that "we can do whatever the fuck we want and these guys will still buy our products regardless". That sends a very wrong message.
 
Try to realize though, that even if the complaints may not seem legitimate to you, they're very legitimate to the people who make them. I myself recommended this card to a co-worker, and I feel bad that he didn't get what he paid for either. Now I'm going to have to send him a link to all the hooplah and feel a little guilty about it.

I'm not saying the complaints are not legitimate...they are...Nvidia lied...I was on the other side of the fence initially and wanted a refund or a new card but after reading the facts about this issue from numerous trusted websites I changed my mind...if people feel duped then by all means try and get a refund...I just think too many people are losing sight of the real issue of how the wrong specs are tied into performance for their particular system

congrats on the new baby!

I'm not singling you out, but this is exactly the sort of thing I've been talking about. nVidia pretty much wants exactly this: for the whole incident to just blow over by itself without any consequences for them. (I'm not counting a 5% return rate, if even that, as a "consequence".) If anything, they'll take this as proof that "we can do whatever the fuck we want and these guys will still buy our products regardless". That sends a very wrong message.

Nvidia is too big a company to take a major hit from this...I mean, people only really have two options for graphics cards - AMD or Nvidia (integrated graphics aside)...AMD has never had the same reputation as Nvidia, even during their ATI 9800 Pro heyday...this will blow over...it will always be a stain on Nvidia's reputation, but when the next amazing card comes out people will forget...

in the land of the blind, the one-eyed man is king
 
I'm pretty sure if people who wanted refunds actually were allowed refunds, we wouldn't get half as many complaints. A lot of the anger is directed at how nVidia handled the fallout, and rightfully so. They did initially offer to help those who wanted a refund, but quickly reneged and instead gave a canned response to everyone who took up their offer of "assistance". Needless to say, many who wanted to return their cards were simply pointed to nVidia's official press release and basically told "see, there's no problem, it's by design".

I dare say, had nVidia handled this properly -- official apology + unconditional returns for the 970 -- the whole situation would've played out a lot better. Then people could actually say "OK, yes, they lied and kept hush about it for 4 months, but at least they owned up to their mistake, and if you return the card you basically got to use it for free for 4 months, so I guess I'll give them a second chance". But no, right now there's nothing but silence, and all indications are that they simply want to sweep this under the rug and pretend nothing ever happened. Tying this into the too-big-to-fail point, I see this as all the more reason why this needs to be in the spotlight.

As for US-based sites saying everything is A-OK, well, the Germans seem to have something to say about that.

I posted a bunch of pics earlier in this thread so I'm not gonna spam them again.
 
I'm pretty sure if people who wanted refunds actually were allowed refunds, we wouldn't get half as many complaints. A lot of the anger is directed at how nVidia handled the fallout, and rightfully so. They did initially offer to help those who wanted a refund, but quickly reneged and instead gave a canned response to everyone who took up their offer of "assistance". Needless to say, many who wanted to return their cards were simply pointed to nVidia's official press release and basically told "see, there's no problem, it's by design".

I dare say, had nVidia handled this properly -- official apology + unconditional returns for the 970 -- the whole situation would've played out a lot better. Then people could actually say "OK, yes, they lied and kept hush about it for 4 months, but at least they owned up to their mistake, and if you return the card you basically got to use it for free for 4 months, so I guess I'll give them a second chance". But no, right now there's nothing but silence, and all indications are that they simply want to sweep this under the rug and pretend nothing ever happened. Tying this into the too-big-to-fail point, I see this as all the more reason why this needs to be in the spotlight.

As for US-based sites saying everything is A-OK, well, the Germans seem to have something to say about that.

I posted a bunch of pics earlier in this thread so I'm not gonna spam them again.

I totally agree. If they had handled it right, it could have been a reinforcement for loyal customers. Instead, my next build in 2016, unless AMD completely mauls their lineup, will likely be 3x high-end AMD cards. Principles, ethics and dignity might be old-fashioned, but that's how I was raised.
 
I don't think it is one particular thing that has people upset.
It is a multitude of reasons and how this whole thing played out, i.e. the way Nvidia handled it. If they had admitted they did something wrong and apologized, that would have gone a long way. The fact that they lied to your face for 4 months, and then, when everyone found out, flipped you off and told you to kiss their ass... while people in the EU actually have valid recourse... it doesn't feel like they actually care about their customers or their image.

What I can't understand is the people who are upset and outraged at the GTX970 owners who have an issue with this.
The majority aren't directly affected by this incident and don't own a GTX970; some who do own one feel the need to tell other GTX970 owners they are dumb or stupid or ridiculous for using their cards a certain way.
 
I totally agree. If they had handled it right, it could have been a reinforcement for loyal customers. Instead, my next build in 2016, unless AMD completely mauls their lineup, will likely be 3x high-end AMD cards. Principles, ethics and dignity might be old-fashioned, but that's how I was raised.

Principles are also the reason why I never have and never will use IC Diamond as my thermal paste, after the CEO acted like a total jackass during the TechPowerUp debacle.
 
What I can't understand is the people who are upset and outraged at the GTX970 owners who have an issue with this.
The majority aren't directly affected by this incident and don't own a GTX970; some who do own one feel the need to tell other GTX970 owners they are dumb or stupid or ridiculous for using their cards a certain way.

I'm not outraged at people who are upset, or anything like that. I just think we should keep it real here and try to stay within the realm of the facts, rather than -- come on, let's be honest; the temptation exists -- getting more and more worked up as you sit there and dwell on it endlessly. Before too long you're convinced you're a legitimate victim.

The fact is, there is practically no performance impact from this segmented memory in real-world scenarios.

If people feel slighted because they believe they were "lied to" or that NVIDIA did something wrong, I have no bone to pick with that. That's an emotional response where reason isn't going to get much play anyhow.
 
I'm not outraged at people who are upset, or anything like that. I just think we should keep it real here and try to stay within the realm of the facts, rather than -- come on, let's be honest; the temptation exists -- getting more and more worked up as you sit there and dwell on it endlessly. Before too long you're convinced you're a legitimate victim.

The fact is, there is practically no performance impact from this segmented memory in real-world scenarios.

If people feel slighted because they believe they were "lied to" or that NVIDIA did something wrong, I have no bone to pick with that. That's an emotional response where reason isn't going to get much play anyhow.

Again... what about in 2 years, as games advance and larger-resolution displays become the norm? No one is doubting the NOW. What happens when, in 2 years, someone wants to purchase a cheap used second 970 for SLI to "upgrade" to that higher resolution? Now he is screwed, because he purchased a gimped card that he had no knowledge of from the get-go.

Why is everything NOW NOW NOW... Again, not everyone upgrades constantly, and some people will utilize their hardware for years. Hell, I've been on my 2600K since day 1 - 4+ years now - and plan to keep it probably another year. This is not the year 2000, when technology was doubling in speed every release. We are beginning to hit limits until some breakthrough technology comes about.
 
Again... what about in 2 years, as games advance and larger-resolution displays become the norm? No one is doubting the NOW. What happens when, in 2 years, someone wants to purchase a cheap used second 970 for SLI to "upgrade" to that higher resolution? Now he is screwed, because he purchased a gimped card that he had no knowledge of from the get-go.

Why is everything NOW NOW NOW... Again, not everyone upgrades constantly, and some people will utilize their hardware for years. Hell, I've been on my 2600K since day 1 - 4+ years now - and plan to keep it probably another year. This is not the year 2000, when technology was doubling in speed every release. We are beginning to hit limits until some breakthrough technology comes about.

Cards barely last 2 years for 60 fps at popular resolutions. Buying a video card for the THEN is stupid.
 
Buying a video card for the future is not stupid. Some of us don't have the money to upgrade often; we want hardware that will have the longest lifespan possible.
Also, plenty of the GPUs I have owned have lasted more than two years at popular resolutions. Maybe those GPUs didn't hit 60 fps in every title, but every title was playable, and it allowed me to skip upgrades for a while.
 
Buying a video card for the future is not stupid. Some of us don't have the money to upgrade often; we want hardware that will have the longest lifespan possible.
Also, plenty of the GPUs I have owned have lasted more than two years at popular resolutions. Maybe those GPUs didn't hit 60 fps in every title, but every title was playable, and it allowed me to skip upgrades for a while.

I'm confused - are you agreeing that, like your previous cards, the 970 should decline in performance as it ages when it comes to playing future titles at 60 fps at popular resolutions?
 
Samm, do you not understand that the GTX970 issue isn't an issue of framerates but of frametimes?
The experience I brought up was related to a card that didn't stutter, because games weren't trying to access RAM that my GPU couldn't use effectively. So yeah, I am agreeing that GPUs will decline in performance as they age...

What I do not agree with is the notion that companies can market GPUs as having 4GB of RAM on a 256-bit bus when in reality that's not the case. It upsets me because Nvidia effectively lured in every consumer who wanted a good gaming GPU with 4GB of RAM while not actually delivering what they promised. That led to problems for SLI users, heavy DSR users, 4K monitor users, and it's even led to problems for content creators who needed the full 4GB. Believe it or not, people do pay attention to the amount of VRAM on a card, and it DOES influence their purchase. Anyone who wants to pretend that suddenly losing 1/8th of your VRAM is a non-issue is just burying their head in the sand.

Oh, and please do not even quote me if you are going to use the "but you technically have 4GB!" crap line on me; anyone with a brain knows that VRAM that slow is useless.
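To put a number on that framerate-versus-frametime distinction, here's a minimal sketch (Python, with made-up numbers rather than actual 970 measurements) of why an average-fps chart can hide exactly the kind of hitch being described:

```python
# Toy numbers, not 970 measurements: one 100 ms hitch among otherwise
# smooth ~60 Hz frames barely moves the average framerate.
frame_times_ms = [16.7] * 59 + [100.0]

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
worst_fps = 1000.0 / max(frame_times_ms)

print(f"average: {avg_fps:.0f} fps")        # ~55 fps -- looks fine on a chart
print(f"worst frame: {worst_fps:.0f} fps")  # 10 fps -- the hitch you feel
```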
 
Samm, do you not understand that the GTX970 issue isn't an issue of framerates but of frametimes?
The experience I brought up was related to a card that didn't stutter, because games weren't trying to access RAM that my GPU couldn't use effectively. So yeah, I am agreeing that GPUs will decline in performance as they age...

What I do not agree with is the notion that companies can market GPUs as having 4GB of RAM on a 256-bit bus when in reality that's not the case. It upsets me because Nvidia effectively lured in every consumer who wanted a good gaming GPU with 4GB of RAM while not actually delivering what they promised. That led to problems for SLI users, heavy DSR users, 4K monitor users, and it's even led to problems for content creators who needed the full 4GB. Believe it or not, people do pay attention to the amount of VRAM on a card, and it DOES influence their purchase. Anyone who wants to pretend that suddenly losing 1/8th of your VRAM is a non-issue is just burying their head in the sand.

Oh, and please do not even quote me if you are going to use the "but you technically have 4GB!" crap line on me; anyone with a brain knows that VRAM that slow is useless.
Do you own one or two GTX 970 cards, and are you running them in SLI with a 4K monitor? I was looking at your signature, but I'm figuring it hasn't been updated.
 
People fail to realize this is misrepresentation. It's like buying a used car with the odometer rolled back. It's still a nice car, but it's not what I bought. BTW, people who roll back odometers go to jail.
 
Is this thread even still about petitioning for a refund, or is it just people talking about the same thing over and over with used-car analogies? Why doesn't this thread get locked like the rest?
 
People fail to realize this is misrepresentation. It's like buying a used car with the odometer rolled back. It's still a nice car, but it's not what I bought. BTW, people who roll back odometers go to jail.

That's not the best analogy. That would imply that Nvidia is selling cards as brand new that are actually used.

A better analogy is one already posted: a car is advertised as having 200 hp. A bunch of people then dyno it and realize it's only outputting 180 hp. Then the car company comes back and says, "Oops, we screwed up and mismarked the sales literature. The car actually has 180 hp."

AFAIK, no car company has ever admitted to mismarked specs, but they do regularly inflate their products' specs.
 
A better analogy is one already posted: a car is advertised as having 200 hp. A bunch of people then dyno it and realize it's only outputting 180 hp. Then the car company comes back and says, "Oops, we screwed up and mismarked the sales literature. The car actually has 180 hp."

AFAIK, no car company has ever admitted to mismarked specs, but they do regularly inflate their products' specs.

Yes, they have. I posted this yesterday, though I missed the original post comparing it to a car. But that's exactly what happened with the ca. 1999 Ford Mustang SVT Cobra. It was marketed with 320 hp - a 20 hp bump from the previous model year. Dyno tests across the country could not (even under the most optimistic drivetrain-loss assumptions) duplicate the quoted output. Eventually it was revealed that it actually had only between 300 and 310 hp. A class action suit was filed. Unfortunately, it was denied and kicked back to a lower court. Eventually Ford owned up to it and issued a recall. They corrected EVERY SINGLE ONE of the Cobras sold under the 320 hp marketing specs, including a new air intake, a reflash of the car's ECU, and exhaust modifications to bring the actual output to 320 hp. You can be sure they haven't made that mistake again. Here's an article detailing these events. And another

This is probably also why German auto manufacturers almost always rate their cars' engine outputs very conservatively.

Nvidia should offer refunds to anyone who wants one. This "fix" would (1) not even impact that many folks, as only the enthusiast market is even aware of this issue, and a large portion of them simply don't care; and (2) likely be a very minimal expense for them - there's no engineering or redesign cost, and they're not providing any hardware to customers at no cost. It would also turn all that negativity into goodwill and positive publicity.
 
Are you seriously asking that question? If NV sold a card that had 3.5GB of RAM (just disable that extra 512MB), people would have questioned it and said WTF, then seen the benchmarks and moved on and purchased one happily.

They are not selling a card with only 3.5GB of RAM. They are selling a 4GB card that starts stuttering and causing performance issues once that 512MB is accessed. It isn't just a 3.5GB card. Do you really not understand the issue here?

As far as never purchasing NV products again, I personally never said that. Every company makes mistakes, and it's foolish to cut out competition, as that is what drives more performance for less money. My issue here is strictly with this card. I can assure you that if this had been a published issue, I would never have considered the card. Why would I want to buy a card that will potentially be an issue later? Why would I want to buy a card whose value could potentially be lower due to this issue (in a couple of years, when people want to purchase these used, they will think twice, and I'll bet the resale is lower than typical)? What if I want to go SLI with a used 970 in 2 years to support a higher resolution and can't now? I wouldn't have bought it. I would have purchased either a GTX 980 or a 290X. It's crazy to me how many people feel this is a non-issue just because they can't currently push the card.

They would have questioned the $220-cheaper card for having 512MB less VRAM, when going back to the GTX 200 series there has only been one generation where the runner-up card had the same amount of VRAM as the "flagship" card (GTX 670/680)? I don't really see your reasoning, considering it's the replacement for the GTX 770, which had 2GB of VRAM, and it still has more usable VRAM (as in usable without any performance hindrance) than the GTX 780 and GTX 780 Ti.

Yes, I understand the issue, but if 1/8 of your 4GB total VRAM is so slow that it's unusable, it's effectively a 3.5GB card. Realistically, the only thing I see Nvidia possibly doing is a driver update that disables the last 512MB to prevent applications from writing to that area and causing the issues users have been experiencing.
 
They would have questioned the $220-cheaper card for having 512MB less VRAM, when going back to the GTX 200 series there has only been one generation where the runner-up card had the same amount of VRAM as the "flagship" card (GTX 670/680)? I don't really see your reasoning, considering it's the replacement for the GTX 770, which had 2GB of VRAM, and it still has more usable VRAM (as in usable without any performance hindrance) than the GTX 780 and GTX 780 Ti.

Yes, I understand the issue, but if 1/8 of your 4GB total VRAM is so slow that it's unusable, it's effectively a 3.5GB card. Realistically, the only thing I see Nvidia possibly doing is a driver update that disables the last 512MB to prevent applications from writing to that area and causing the issues users have been experiencing.

Not saying you don't have valid points, but as far as what nVidia has done in the past goes - completely irrelevant. What matters is what nVidia advertised the 970 as: a 4GB card.
 
Not saying you don't have valid points, but as far as what nVidia has done in the past goes - completely irrelevant. What matters is what nVidia advertised the 970 as: a 4GB card.

I hear ya. Just wanted to know if people felt 3.5GB for the newest-generation X70 card at $330 is unreasonable, disregarding the fact that Nvidia didn't correctly state the size of the L2 cache or the number of ROPs and never disclosed that 1/8th of the VRAM was much slower than the rest.

IMHO, Nvidia should have cut the current 970 down to 3GB of VRAM (unsegmented, at full speed for the entire amount) and released a 970 variant (a Ti, for example) with the original 970 specs in the $400 range. Ideally, though, I wish my 970s had the original specifications, but I'm not going to be writing any letters to my congressman.
 
Buying a video card for the future is not stupid. Some of us don't have the money to upgrade often; we want hardware that will have the longest lifespan possible.
Also, plenty of the GPUs I have owned have lasted more than two years at popular resolutions. Maybe those GPUs didn't hit 60 fps in every title, but every title was playable, and it allowed me to skip upgrades for a while.

Buy a video card that performs the best today. Anyone who claims to know the future is talking out of their exhaust port.
 
I hear ya. Just wanted to know if people felt 3.5GB for the newest-generation X70 card at $330 is unreasonable, disregarding the fact that Nvidia didn't correctly state the size of the L2 cache or the number of ROPs and never disclosed that 1/8th of the VRAM was much slower than the rest.

IMHO, Nvidia should have cut the current 970 down to 3GB of VRAM (unsegmented, at full speed for the entire amount) and released a 970 variant (a Ti, for example) with the original 970 specs in the $400 range. Ideally, though, I wish my 970s had the original specifications, but I'm not going to be writing any letters to my congressman.

Considering the benchmarks/performance - no. I would have thought 3.5GB was still a very good deal, and I'm betting many review sites and other enthusiasts would have too.
 
Yes, I understand the issue, but if 1/8 of your 4GB total VRAM is so slow that it's unusable, it's effectively a 3.5GB card.

But hasn't testing shown that the performance impact is negligible, if not practically non-existent? Terms like "so slow" and "unusable" appear to go against the facts here.

Realistically, the only thing I see Nvidia possibly doing is a driver update that disables the last 512MB to prevent applications from writing to that area and causing the issues users have been experiencing.

How would that be an improvement? If an application required more than 3.5GB, then it's just going to pull that data over PCIe rather than from the last memory chip. That wouldn't be any better.

And the "issues" users have been experiencing are apparently unrelated to this memory config. Am I wrong?
 
How would that be an improvement? If an application required more than 3.5GB, then it's just going to pull that data over PCIe rather than from the last memory chip. That wouldn't be any better.

And the "issues" users have been experiencing are apparently unrelated to this memory config. Am I wrong?

Some recent games will take a look at the available VRAM and move to use everything they can. This causes framerate problems when they hit the slow partition. I was getting some bad stuttering in Dragon Age: Inquisition on my GTX 970 at 1080p; now I know why.
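A hypothetical sketch of that "use everything it can" behavior (an illustration, not any real engine's code; the segment sizes are the commonly reported 970 figures):

```python
# Hypothetical illustration -- not any real engine's code. A game that
# budgets "all reported VRAM minus some headroom" will cross the 970's
# 3.5 GB boundary, because the driver reports the full 4 GB.
REPORTED_VRAM_MB = 4096  # what a GTX 970 advertises
FAST_SEGMENT_MB = 3584   # the full-speed 3.5 GB portion
HEADROOM_MB = 256        # hypothetical slack left for the OS/driver

budget_mb = REPORTED_VRAM_MB - HEADROOM_MB
spill_mb = max(0, budget_mb - FAST_SEGMENT_MB)
print(f"budget: {budget_mb} MB, spills {spill_mb} MB into the slow segment")
```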
 
Buy a video card that performs the best today. Anyone who claims to know the future is talking out of their exhaust port.
It's not so much about knowing the future of games as about more options for the future of your equipment. If you had a choice at launch between two cards, and the only difference people noticed was approximately 15-20% in performance at most because of a few disabled shaders, you went with the cheaper one. Why? Because you saved $220 and had options for the future. That $220 could buy a used copy of the same card to SLI later, if performance became an issue then. Or you could sell your current card to help fund whatever the next card on the market is. By buying the 970, the SLI option is out because of the VRAM issue, which means selling is really the only option. This tanks its resale price, because there won't be many buyers due to the SLI problem.

Just FYI, not everyone can buy the video card that performs best today. And even if they could, given the performance and price difference, the ones that went for the 980 were the ones with more disposable income. I had the money for a 980, but I went with the 970 because the price/performance difference made it idiotic for me to spend that extra $200+ on the more expensive solution when I could use it for other things. If, back when I had that spare $200+ allocated for a video card, I had known the 970 was a 3.5GB card, I would have spent it and bought the 980, because at the time 4GB was the minimum I was willing to accept for the graphics card. I still have the option of upgrading to a 980 thanks to EVGA, but at this point it's not exactly easy, as I've already spent the extra money I had originally allocated toward a video card, so it becomes one of those unplanned expenditures. Not to mention the idea of giving Nvidia more money after they defrauded me with the 970 feels morally reprehensible. The funny thing is, I probably would have upgraded to the 980 if Nvidia had just kept their word and set up a return policy, but the whole backpedaling thing really made me not want to support them.
jwcalla said:
How would that be an improvement? If an application required more than 3.5GB, then it's just going to pull that data over PCIe rather than from the last memory chip. That wouldn't be any better.

And the "issues" users have been experiencing are apparently unrelated to this memory config. Am I wrong?
The issue is that using that 512MB pool causes stuttering. It affects SLI users more, but it can be seen on single-card systems also. It's apparent in a game like Shadow of Mordor using DSR and ultra textures. They say Ultra textures require a 6GB video card, but for the most part on a 970 it only hits 3.5GB with the latest driver; it does occasionally hit 3.6-3.7GB, though, and when it does there is a stuttering effect. If you leave all the other settings the same and change textures to "High", the stuttering disappears, because the card never exceeds about 3GB. Pulling from PCIe would be slower, but consistent. In other words, all the other textures in VRAM are still being pulled locally at the full 196 GB/s, so while the card waits for the new textures to arrive over the PCIe bus, what's being rendered still runs at full speed. With the slow pool of VRAM, by contrast, anything the card needs to pull from there essentially reduces the speed of the fast pool to the speed of the slow pool. So that 196 becomes something like 22 GB/s for every access of that slow section, and that causes stuttering.
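For what it's worth, those figures roughly check out as a back-of-the-envelope calculation; a minimal sketch, using the commonly cited 970 bus widths and GDDR5 pin rate:

```python
# Back-of-the-envelope check on the numbers above. The 970's GDDR5 runs
# at an effective 7 Gbps per pin; bandwidth = bus width x pin rate / 8.
PIN_RATE_GBPS = 7.0

fast_gbs = 224 * PIN_RATE_GBPS / 8  # 3.5 GB segment on a 224-bit path
slow_gbs = 32 * PIN_RATE_GBPS / 8   # last 512 MB on a single 32-bit path

print(f"fast segment: {fast_gbs:.0f} GB/s")  # 196 GB/s theoretical peak
print(f"slow segment: {slow_gbs:.0f} GB/s")  # 28 GB/s peak; ~22 GB/s measured
# For comparison, PCIe 3.0 x16 tops out around ~15.8 GB/s each way.
```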
 
Buy a video card that performs the best today. Anyone who claims to know the future is talking out of their exhaust port.

Extrapolation has been used for centuries.
It's a fundamental part of scientific endeavour and is a major reason why learning is useful.
I use it to work out which PC components will last me a good length of time - odd, I know, who would have thought.

If you know there is a trend and the reasons are well established, it's easy to make a judgement based on that.
Thinking is quite useful.
 
Some recent games will take a look at the available VRAM and move to use everything they can. This causes framerate problems when they hit the slow partition. I was getting some bad stuttering in Dragon Age: Inquisition on my GTX 970 at 1080p; now I know why.

That's my understanding of it as well, unless I'm mistaken (please let me know; I'm always down for learning). I haven't experienced the issue yet because I've only had enough time to play Battlefield 4 with my two cards (maybe a total of 3-4 hours), and at 2560x1440 the highest VRAM usage I saw in MSI Afterburner was 2900-3000 MB, so I haven't been able to hit the pool beyond 3584 MB.
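If you want to watch for that boundary without Afterburner, here's a minimal sketch that polls nvidia-smi (which ships with the NVIDIA driver) for VRAM usage; it assumes nvidia-smi is on your PATH and that your driver supports these standard query flags:

```python
# Poll VRAM usage once a second and flag anything past the 3584 MiB
# fast-segment boundary. Sketch only; stop it with Ctrl+C.
import subprocess
import time

def used_vram_mib(gpu=0):
    out = subprocess.check_output([
        "nvidia-smi", f"--id={gpu}",
        "--query-gpu=memory.used",
        "--format=csv,noheader,nounits",
    ])
    return int(out.decode().strip())

while True:
    used = used_vram_mib()
    marker = "  <-- past the 3.5 GB fast segment" if used > 3584 else ""
    print(f"{used} MiB{marker}")
    time.sleep(1)
```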
 
I'm not outraged at people who are upset, or anything like that. I just think we should keep it real here and try to stay within the realm of the facts, rather than -- come on, let's be honest; the temptation exists -- getting more and more worked up as you sit there and dwell on it endlessly. Before too long you're convinced you're a legitimate victim.

The fact is, there is practically no performance impact from this segmented memory in real-world scenarios.

If people feel slighted because they believe they were "lied to" or that NVIDIA did something wrong, I have no bone to pick with that. That's an emotional response where reason isn't going to get much play anyhow.

For most people, you're right - they're not running into a performance issue. But there are some who are experiencing the performance issues that come from the memory segmentation. Sure, it only shows up in the most particular and most "demanding" situations (I put demanding in quotes, because the demand must be made on the memory system), but there IS a real-world performance issue that will rear its head due to this.

The thing that should make one concerned is this: if people are seeing issues now, what does the future hold? And I'm not even talking about a couple of years down the road. Will any big releases this year show issues with the gimped memory? If so, then that's a shame. No one buys an enthusiast-class GPU with the idea that it can't even handle the latest games.

And someone has already mentioned it, but it bears repeating: if Nvidia simply left the remaining 512MB off the map, would games still experience stuttering? If the system only sees 3.5GB and not 4GB, it won't try to fill the 4GB - at least, that's the theory. Not sure if it holds water or not.
 
I bought one. I love the card, but I'm not running a high resolution. I feel like NV fucked up with this. I'm most likely going to keep the card, because for what I do it's great. I am pissed off at NV, though, and I'll remember this the next time I make a video card purchase.
 
Some recent games will take a look at the available VRAM and move to use everything they can.
Yup, found this out in Dying Light. "High" texture quality filled up all the available VRAM on my GTX 780 in short order.

Dying Light is an interesting case, however. "Medium" texture quality loads exactly the same textures as "High" texture quality. The only difference between Medium and High is how much VRAM the game attempts to use for texture cache.

Medium = ~2GB of VRAM usage.
High = all the VRAM it can get its hands on.

Texture pop-in is more likely on Medium, but I resolved that by moving the game to an SSD. It feels like "High" is built for systems that need to use a boatload of video RAM to compensate for a bandwidth bottleneck somewhere (laptop graphics are a great example).
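As a guesswork illustration of that Medium-versus-High difference (this is not the game's actual code; the budget numbers just mirror the behavior described above):

```python
# Guesswork illustration -- not Techland's actual code. Both settings
# stream the same textures; they only size the resident cache differently.
def texture_cache_budget_mb(setting, reported_vram_mb):
    if setting == "Medium":
        return 2048                    # a fixed ~2 GB cache
    if setting == "High":
        return reported_vram_mb - 256  # "all the VRAM it can get its hands on"
    raise ValueError(f"unknown setting: {setting}")

# On a 970 reporting 4096 MB, "High" lands well past the 3584 MB boundary.
print(texture_cache_budget_mb("High", 4096))    # 3840
print(texture_cache_budget_mb("Medium", 4096))  # 2048
```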
 
The 970 is a good card otherwise. They should have been all, "check out what we can do with 224 bits and 3.5GB." I don't think anyone would have minded, as benchmarks do the talking. But now we have a card with a lurking performance landmine, eh.
 
The 970 is a good card otherwise. They should have been all, "check out what we can do with 224 bits and 3.5GB." I don't think anyone would have minded, as benchmarks do the talking. But now we have a card with a lurking performance landmine, eh.

"landmine" is a bit of a stretch.

"splinter" would be a more accurate term.

This is what I'm kind of referring to with the overreacting.
 
"landmine" is a bit of a stretch.

"splinter" would be a more accurate term.

This is what I'm kind of referring to with the overreacting.
Landmine is too grandiose; splinter is way too timid. Most accurate is a speed bump, because that is more representative of what the actual stuttering effect is like. You're going down a nice flat stretch of deserted highway, then boom, you hit your head several times against the ceiling of your vehicle because that pool of memory got accessed.
 
Landmine is too grandiose; splinter is way too timid. Most accurate is a speed bump, because that is more representative of what the actual stuttering effect is like. You're going down a nice flat stretch of deserted highway, then boom, you hit your head several times against the ceiling of your vehicle because that pool of memory got accessed.

There is no stuttering.

Forget about YouTube videos.

Numerous sites have run objective tests and there is practically no impact.
 
There is no stuttering.

Forget about YouTube videos.

Numerous sites have run objective tests and there is practically no impact.

Oh, like this one?

[image: fAsBumR.png]


Sorry, but you should discount many of the sites running so-called "objective tests" on this issue, as they clearly are not objective or done properly :p. I don't need to watch YouTube videos to see the problem, which includes what I think was false advertising on the specs - the L2/ROP counts and 3.5GB of VRAM with 512MB of dog-slow VRAM. I've done my own tests, and it is readily visible even if you don't run any logging. Additionally, a couple of sites' results have shown the issue when proper test parameters were used, including PCGamesHardware.de's, and even PCPer's SLI test had it manifest very visibly in the frametimes, despite them then whitewashing it with plentiful praise in their text conclusion.

[image: PHaofek.png]


[image: mhOevfQ.png]


Still waiting on Newegg to contact me back at this point. I'm extremely disappointed by how this major issue has been "handled" (or rather, not even touched) by Nvidia to date.
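For anyone wanting to reproduce that kind of chart themselves, here's a minimal sketch of the analysis behind frametime plots; "frametimes.csv" is a placeholder for whatever FRAPS/FCAT-style log you have, with one frame time in milliseconds per line:

```python
# Summarize a frametime log: median, 99th percentile, and a crude
# stutter count (frames taking at least twice the median frame time).
import statistics

with open("frametimes.csv") as f:
    times_ms = sorted(float(line) for line in f if line.strip())

median = statistics.median(times_ms)
p99 = times_ms[int(0.99 * (len(times_ms) - 1))]
spikes = sum(1 for t in times_ms if t >= 2 * median)

print(f"median {median:.1f} ms | 99th pct {p99:.1f} ms | {spikes} spikes")
```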
 