Surely they must have had guidance from Nvidia, though, to ensure that this new description was legally correct.
An admission of guilt from Nvidia?
Hope it works out for you!
I got "lucky". I had a MSI 780 Gaming OC card, which I was very pleased with the sound profile on (I prefer quiet cards over performance). That one I put in a SFF case and gave to my nephews. Bought an Asus Matrix 290X Platinum on "black friday" for cheap that I hate the sound profile on (great card otherwise). I´ve managed to make it more quiet with some tweaks and repaste, and its now "acceptable" for shorter period of gaming or for headset gaming. Probably a great card for those less sensitive to noise.
I had finished my research (especially on coil whine) on the 970s and had decided to buy two of those (also MSI Gaming) to hold me over until the aftermarket HBM cards are available. The reason I wanted the extra GPU power is that I am going to buy a curved 34" 3440x1440 display, and I am uncertain whether to just buy one now or get one with G-SYNC or FreeSync capabilities for some "future proofing" (I change vendors often and have several computers, so I can use either).
Because of the memory issue, I'm unsure if I should buy one or two 980s instead, or wait to see if the upcoming 380X or 980 Ti are quiet enough. The memory issue spoiled my upgrade plans, but luckily I don't have to deal with returns. Lots of time wasted on research though.
Something I was thinking about on this topic while I was grinding through run-throughs tonight: I think memory segmentation, and preferred lanes of memory chunks, are technologies we can expect to see more of as GPUs evolve. This is new technology in Maxwell after all, which wasn't possible in previous generations. Think of when consumer GPUs finally implement eDRAM, or another type of memory close to the GPU. Games will prefer to use the eDRAM pool (or other type) first, before diving out to the external VRAM on the board. Who knows what other implementations of memory segmentation we may see. This could be a foreshadowing of future GPU technology and the evolution of how memory is accessed. Food for thought.
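The pool-preference idea described above could be sketched like this (a toy illustration only, not how any real driver works; `TieredVram` and the pool sizes are made up for the example):

```python
# Toy sketch of "preferred pool" allocation: requests go to the fast,
# close-to-GPU pool (eDRAM-like in the post's hypothetical) until it is
# full, then spill over to the larger, slower external VRAM pool.
class TieredVram:
    def __init__(self, fast_mb: int, slow_mb: int):
        self.free = {"fast": fast_mb, "slow": slow_mb}

    def alloc(self, size_mb: int) -> str:
        # Prefer the fast pool; fall back to the slow pool only when needed.
        for pool in ("fast", "slow"):
            if self.free[pool] >= size_mb:
                self.free[pool] -= size_mb
                return pool
        raise MemoryError("out of VRAM")

vram = TieredVram(fast_mb=512, slow_mb=3584)
print(vram.alloc(400))   # fast
print(vram.alloc(400))   # slow (only 112 MB left in the fast pool)
```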
Narcosis2009 said:
Nvidia are not being much help either. I contacted PeterS through their forums after he said he would help people wanting to return their cards by contacting the companies we bought them from, if we gave him details such as order numbers. Instead, in a private message, he asked me if I wanted help with my game settings, among other totally unhelpful replies.
I originally was not going to RMA my cards, but Nvidia kept deleting my calm and justified concerns that I posted on their forums.
Well I let PeterS know that OCUK were willing to take back one of my cards due to their exemplary customer service. I got an extremely helpful reply to contact ASUS for the second card (as if that was not stating the obvious) and "thanks for the feedback" and message again if I hit a dead end.
Nvidia's offer of "help" is "outstanding".
The whole experience has left me with an extremely bad feeling for Nvidia.
Unless I'm missing something here, segmented memory isn't anything new. In fact, both the GTX 660 and GTX 660 Ti had segmented memory; the difference is that in those cases Nvidia disclosed this fact at launch, so there was no fuss.
Report of Nvidia turning their back on customers:
I wonder if Nvidia was caught slightly off guard by gamers moving to 4K a bit earlier than they expected. When these cards were being manufactured, if I'm not mistaken, the only 4K monitors around were MST ones that cost $1,500+? As in, they knew it was coming, but didn't really expect it to be a reasonable option for most people until well into 2015.
4K is still far less common than other resolutions, but it isn't extremely rare anymore, and I think that's the only reason anyone even noticed, or cares much about, the 970 issue.
Again, not excusing Nvidia at all, I was just thinking that when they were originally planned, maybe the 980 and 970 were cards Nvidia didn't think would have to be driving Acer/BenQ/Asus etc. 4K monitors for under $1,000?
(cynical side of me thinks we'll be seeing a 6/8GB Ti version soon for the 4k)
I find it amusing that people think 4GB is somehow perfect for 4K gaming anyway. It is barely enough for some current games and most certainly not going to be enough in the near future. 3.5 vs 4GB isn't going to make a big difference in "future proofing" like some people keep referring to. Cards will be moving on to 6 / 8GB configurations when 4K becomes more common, then it won't make much difference if you had an extra 512MB or not.
Wow, I haven't been paying attention to GPUs lately, but god damn, Nvidia seems to be doing some shady stuff with the 970. How did they think they'd get away with it, when people who buy cards as expensive as these would obviously research them?
It's worse.
They did offer to sort out people's refunds/exchanges, then edited the posts, removed the offer of help, and left only praise for themselves.
KitGuru.net news article said:
Nvidia Corp. on Thursday retracted its promise to improve performance of the GeForce GTX 970 using new drivers. According to the company, one of its representatives made an incorrect statement and there is no new driver with a fix for the graphics card incoming.
As discovered by numerous enthusiasts and confirmed by Nvidia earlier this week, the GeForce GTX 970 graphics card cannot access all four gigabytes of onboard memory at full speed. Due to limitations of the cut-down GM204 graphics processor used on the GTX 970, only 3.5GB of memory can be accessed at maximum bandwidth, whereas the remaining 512MB pool can only be accessed at considerably lower speed, which results in performance degradation in certain cases.
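For context, the widely reported figures behind those two pools work out roughly like this (back-of-the-envelope only, assuming the commonly cited configuration of eight 32-bit controllers at 7 Gbps effective GDDR5, with seven controllers striping the 3.5GB segment and one serving the 512MB segment):

```python
# Back-of-the-envelope bandwidth figures for the GTX 970's two pools,
# using the commonly reported layout (assumptions, not measurements):
# 8 x 32-bit memory controllers at 7 Gbps effective GDDR5, with the
# 3.5GB pool striped across 7 controllers and the last 512MB on one.
GDDR5_GBPS = 7           # effective data rate per pin
BUS_WIDTH_BITS = 32      # per controller

per_controller = GDDR5_GBPS * BUS_WIDTH_BITS / 8   # GB/s per controller
fast_pool = 7 * per_controller                      # 3.5GB segment
slow_pool = 1 * per_controller                      # 512MB segment

print(per_controller, fast_pool, slow_pool)  # 28.0 196.0 28.0
```

On those assumed numbers the slow segment runs at roughly one seventh of the fast segment's bandwidth, which matches the "considerably lower speed" description above.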
Good luck. People are STILL waiting on their money from the last class action that NV lost, when they burnt up people's laptops.
Isn't it still about 4x faster than going over PCIe to system memory?

Segmentation in the 970's context is dividing up memory that would normally be accessed uniformly. The extra 0.5GB might as well not be there. It's not a feature.
Isn't it still about 4x faster than going over PCIe to system memory?
Seems like a benefit, to some extent.
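As a rough sanity check on the "4x faster" figure (using reported numbers, not measurements; the ~28 GB/s slow-pool figure and the theoretical PCIe 3.0 rate are both assumptions for the sake of the arithmetic):

```python
# Compare the reported slow-segment bandwidth against PCIe 3.0 x16.
# PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding, i.e. roughly
# 985 MB/s per lane theoretical; real-world throughput is lower.
SLOW_POOL_GBPS = 28.0         # commonly reported slow-segment figure, GB/s
PCIE3_X16_GBPS = 16 * 0.985   # ~15.76 GB/s theoretical

ratio = SLOW_POOL_GBPS / PCIE3_X16_GBPS
print(round(ratio, 1))  # ~1.8x on theoretical numbers; "4x" would only
                        # hold against lower real-world PCIe throughput
```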
Word has it their G-Sync monitors are a scam as well. They are charging an insane fee for a feature that could be had with simple software (AMD mentioned this in the past as well). Be VERY careful with what Nvidia is really trying to sell you. I bet a lot of people will be leery of the next Nvidia GPU launch.
http://www.reddit.com/r/pcgaming/comments/2uco43/boris_vorontsov_from_enbseries_has_a_theory_on/
WOW.....is all I can say....
Yeah, I've seen that being mentioned, but I've refrained from posting it until it is definitely confirmed, because it is just too bad to be true. To blatantly lie after being caught lying would be too much even for Nvidia, so there is probably an explanation of why this happens.
RyviusARC said:
Most games won't show the stuttering issue when going over 3.5GB because most games that even reach that high are just caching a lot of stuff and needlessly filling up vRAM.
If you want a true test then you need to find a game that has to constantly load large amounts of new data into vRAM.
That is why Skyrim modded with only high resolution textures is a good test, because those textures can be 4K or 8K and really push the vRAM over 3.5GB.
Skyrim constantly has to load in more textures to vRAM as it updates the landscape when you move through it.
Most games just load in a large portion of the map and you don't really see a stuttering issue because it's not needing to constantly access new data in vRAM.
So go ahead and try starting out with a non modified Skyrim and only add in high res texture mods.
I tried this with Skyrim and it will run smoothly while running through the game if it stays below 3.5GB of vRAM. Once it goes above 3.5GB it will remain smooth until it has to load in a new grid of land and it will freeze completely until that new area is loaded in.
This is my usage in Skyrim when going over 3.5GB and having to load in a new grid.
My friend's PC has no stuttering in game with the same texture mods, and he is using a GTX 770 4GB version.
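If you want to reproduce this kind of test yourself, one low-tech way is to log VRAM usage while playing and watch for the moment it crosses the 3.5GB mark. A minimal sketch using `nvidia-smi` (the 3584 MiB threshold is just 3.5GB expressed in MiB; run it in a second window alongside the game):

```python
# Poll VRAM usage once a second so stutters can be correlated with
# crossing the ~3.5GB boundary. Uses nvidia-smi's CSV query output.
import subprocess
import time

def parse_used_mib(csv_line: str) -> int:
    """Parse one line of `nvidia-smi --query-gpu=memory.used --format=csv,noheader`,
    e.g. "3584 MiB" -> 3584."""
    return int(csv_line.strip().split()[0])

def poll_vram(interval_s: float = 1.0) -> None:
    while True:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader"], text=True)
        used = parse_used_mib(out)
        flag = "  <-- past 3.5GB" if used > 3584 else ""
        print(f"{used} MiB{flag}")
        time.sleep(interval_s)
```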
res1i3js said:
Unrelated to what you guys are currently discussing, I just want to say something about this whole 3.5GB fiasco.
Ever since I got my 970 in October I did not experience any performance or stuttering issues. Yes, I had issues, but I was able to fix those, and they weren't related.
When I first started seeing these reports of games having stuttering issues I began to investigate. I tried BF4 and several of my MMOs because people were reporting issues with those. I played for hours without any stuttering at all. So I thought, well must be something else wrong with their rig. So I tried to help people, I did that for a while until it became too much and stopped.
When members of our community first posted about the 3.5GB issue, I wanted to believe Nvidia was telling the truth and that it wasn't causing a problem. They ignored the point that the concern was about stuttering, not fps loss. Once it hit the news and shit started hitting the fan, and I saw people reporting that the GTX 970 would not be as future proof as I had hoped, I decided not to risk it and contacted EVGA to step up to a GTX 980, my only option besides simply throwing it away. Boy, was I right to do so. While waiting for my GTX 980 I got Dying Light.
Dying Light plays like sh*t when the card reaches 3.2-3.5GB of VRAM usage. Stuttering, fps drops, terrible and plain to see. Tomorrow is the day I send back my 970, and I'm sad (or glad) to say my original GTX 570 is handling Dying Light perfectly. Also, I have two friends with GTX 980s who have no issues either, so I don't doubt the 900 series as a whole, just the 970.
I am sorry I ever doubted those who have experienced these problems before me and I hope this post relays to others how real this issue is and how it's going to affect those who still hold onto their 970s for the future.