MavericK
Zero Cool
- Joined
- Sep 2, 2004
- Messages
- 31,944
Guess I should have ordered from Amazon...from what it sounds like, Newegg is telling us to go fuck ourselves.
I highly doubt they would ever offer any sort of compensation for this issue as that would be admitting fault.
YES!...I was able to get a full refund from Amazon on my GTX 970 purchased in November...YES!...no hassle at all...I didn't even need to explain the technical details of the GTX 970 3.5GB VRAM issue...apparently, since I ordered the card in November, Amazon extended the return window for the holidays until January 31st...so I had a few hours remaining in my return window when I called
I'm still not sure about returning the card, but I'm glad I have the option to do so if I choose (the return label is good for 7 days, so I have 1 week to decide)...the Gigabyte G1 Gaming 980 is now $579.99 after rebate, so even though I don't think it's worth $200 more than a 970, I might bite...the fact that Amazon has a 12-month interest-free promotion on orders above $599 also helps
will Nvidia's board partners be able to design their own custom 970 card that gives access to the full 4GB of VRAM like the 980?...or is it not possible?
Yep. I think I've only heard of 1 person who said they were able to get a refund. I've tried 3 times myself and no deal.
I would buy at Amazon if it weren't for the WA state tax I'd have to pay :/
I read a couple of 4K comments about how the GTX 970 is inadequate for 4K gaming. The problem I have is that it's inadequate right now for 1080p gaming in the latest games (Mordor, COD: AW with caching options, Dying Light) with all the bells and whistles turned up in the in-game menus.
Probably a little of both, to be honest, once they realized the mistake. Side note unrelated to your thoughts (directed at another user): really...dismissing the [H] crew's SLI performance review because they used 'old games' like Crysis 3, Tomb Raider, and BF4?
God damn those bastards for using games that actually function properly and aren't terrible fucking console ports with the most unoptimized pieces of shit for drivers and coding since the pre-21st-century days, when swapping graphics cards could be the difference between a 2 fps slideshow and a cool 30-60 fps. Fuck those guys running sideways up a hill for not picking the most artificially demanding games and using those as 'quality' test pieces for their grand graphics card experiment conspiracies.
so what should I do [H]?...I keep reading that at my resolution (1920x1200) any VRAM issues won't show up...but won't more VRAM-hungry games become the norm in 2015?...should I 1) return my GTX 970 and buy a 980 to replace it...2) return my GTX 970 and wait for the rumored new 8GB GTX 970 cards coming in April...3) return my 970 and go back to my previous GTX 580 and wait for next-gen cards...4) keep my GTX 970 as it's still an amazing card
MSI NVIDIA GeForce GTX970 4GD5T OC 4GB (3,5GB+0,5GB) GDDR5 256bit (224bit + 32bit) 2*DVI/HDMI/DP Pci-Ex 3.0
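Quick back-of-the-envelope sketch on what that 256bit (224bit + 32bit) split means for peak bandwidth, assuming the reference 7 Gbps effective GDDR5 the 970 ships with (the exact memory clock varies by board, so treat the numbers as ballpark):

```python
# Rough peak-bandwidth math for the GTX 970's split memory bus.
# Assumes 7 Gbps effective GDDR5 (reference spec; partner cards vary).

def bandwidth_gbs(bus_width_bits, data_rate_gbps=7.0):
    """Peak bandwidth in GB/s for a given bus width."""
    return bus_width_bits / 8 * data_rate_gbps

full_256 = bandwidth_gbs(256)  # the advertised figure: 224.0 GB/s
fast_224 = bandwidth_gbs(224)  # 3.5 GB fast partition: 196.0 GB/s
slow_32  = bandwidth_gbs(32)   # 0.5 GB slow partition:  28.0 GB/s

print(full_256, fast_224, slow_32)
```

So the last 0.5GB runs at 1/7th the bandwidth of the main partition, which is why that listing breaks the spec out the way it does.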
Found this on another forum. Looks like some resellers are starting to change their marketing for the 970.
http://www.nexths.it/v3/flypage.php...urce=TrovaPrezzi&sembox_content=GTX+970+4GD5T
It depends. If you aren't having any issues, then you should be good. If you plan to future-proof yourself for 3-4 years, it might become an issue, with future games using more and more VRAM.
If you want to use DSR as well, it might cause you issues.
But at your res right now you shouldn't see any issues (unless you use a lot of VRAM, like with Skyrim and Fallout mods, etc.).
Americans are fucked.
In Europe they have no choice. If they advertise false info on a product, people can return it. Which in this case anyone can (in Europe).
I'm all about lasting value, so future-proofing would be great...then again if games start using more VRAM, will it really matter if I have 3.5GB or 4GB?...most likely at that point a new card with 6-8GB VRAM will be necessary
I thought I heard EVGA was accepting 970 Step-Ups past the normal window
It could be that some games that require 4 GB could get by with 3.5 GB too, but there is no way to tell for sure. Given how optimization is almost non-existent in today's game development, it is not likely that game developers would account for a niche non-standard memory capacity. Then there are games that don't need even close to 4 GB, but simply use what is given to them. They fall under the previous special optimization problem too. The latter could be helped if Nvidia would restrict the 970 to be a 3.5 GB card, which would be the best solution, but that doesn't seem likely as they would have to give up the "true 4GB" claim.
I've been thinking along the same lines. If Nvidia were to restrict the memory reported to the game engine to 3.5GB, the engine might use those resources better. Game engines today are meant to be scalable, since they often target many platforms.
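To sketch the idea of why the reported figure matters: a scalable engine typically carves its streaming budget out of whatever VRAM the driver reports. The function below is purely hypothetical (the names, reserve size, and fraction are made up for illustration, not any real engine's logic), but it shows how a 3.5GB report would shrink the budget automatically:

```python
# Hypothetical sketch of an engine sizing its texture-streaming pool
# from the VRAM figure the driver reports. Reserve and fraction values
# are invented for illustration.

def texture_budget_mb(reported_vram_mb, reserve_mb=512, fraction=0.7):
    """Reserve some VRAM for framebuffers/geometry and budget a
    fraction of the remainder for streamed textures."""
    usable = max(reported_vram_mb - reserve_mb, 0)
    return int(usable * fraction)

print(texture_budget_mb(4096))  # card reports 4 GB   -> 2508 MB
print(texture_budget_mb(3584))  # card reports 3.5 GB -> 2150 MB
```

With a 3.5GB report, the engine would naturally keep its working set inside the fast partition, no per-game driver tricks needed.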
They already said they optimize the drivers for that extra 500MB, so it really does make you wonder what is really going on when they "optimize" games to stay below 3.5GB
I don't think they can optimize games specifically for that 512MB partition without it affecting all the other video cards that don't have that configuration...Titan 6GB VRAM users, 980 4GB users, etc. would be pissed off as well if a game is gimped to use less VRAM
If I'm not mistaken, people have already shown games using 3.5GB while the 980 GTX was using the full 4GB.
This also makes you wonder whether Nvidia is cheating or cutting corners if they have to change a game's profile to use less VRAM.
I think Scott Wasson from The Tech Report was right on the mark when he speculated that the 500MB was more of a marketing number, there to sell the cards as 4GB, since many people buy cards according to the amount of memory.
Some have defended the use of the 500MB, since it's faster than using system memory. Sure, it might be in some cases, unless it stalls the rest of the 3.5GB of memory, creating stuttering. As you insinuate, I don't believe either that Nvidia would "optimize" games to stay below 3.5GB if that weren't better than having the full 4GB available for the game engine to use.
Until the memory issue surfaced, the supposed difference between a 4GB 980 and a 4GB 970 was that the 970 just had somewhat less raw power; by buying two, you could get the raw power back and then some. Obviously, that has never actually been the case with the 970.
Something I was thinking about on this topic while grinding through run-throughs tonight: I think memory segmentation, and preferred lanes of memory chunks, are most likely technologies we have to look forward to as GPU technology evolves. This is new technology in Maxwell, after all, which wasn't possible in previous generations. Think of when consumer GPUs finally implement eDRAM, or another type of memory close to the GPU. Games will prefer to use the eDRAM pool (or whatever type it is) first, before diving out to the external VRAM on the board. Who knows what other implementations of memory segmentation we may see. This could be a foreshadowing of future GPU technology and the evolution of how memory is accessed. Food for thought.
Interesting. I wonder who decided on that spec description, themselves or NVidia?
Yep, and that's exactly how Nvidia advertised it and the press copy they provided reviewers to post up: "the same memory subsystem as the GTX 980", simply with fewer shaders/TMUs as a result of a few SMM units being disabled. As we all know by now, that was false advertising (it in fact has less L2 cache, fewer ROP units enabled, and lower effective memory bandwidth), as was the "4GB", which is actually 3.5GB plus a slow 0.5GB pool that is accessible but of no practical use in the majority of situations.
I've sent off an email to Newegg; unsure what will happen, as I'm hearing very mixed results on various forums, and Nvidia still hasn't released a global policy taking responsibility for their problems.
I just got done chatting with Amazon. I showed the rep several links with the details of the issue as well as pointed them to the most recent customer reviews. They issued a refund without much hesitation, and this was from a purchase from Oct. 2014. If you guys purchased from Amazon, you should be good to go if you really want to return the cards.
Amazon just did the best thing any company has done in this situation. They did what they should for their customers. This is going to win them lots of new loyal customers.
Anyone that defends consumers in this situation is going to gain more of them.
I doubt this will happen. But if it does, the eDRAM or ESRAM or whatever is used will be built into the GPU, or at least onto the same substrate; see the Xbox One as an example. Microsoft did this to try to make up for the slow DDR3 memory, and no such issue exists for discrete GPUs. The PS4 doesn't have eDRAM because it doesn't need it.
Most likely the reseller. Consumer laws are stronger in the EU, and false/misleading specs can be grounds for a full refund.