"landmine" is a bit of a stretch.
"splinter" would be a more accurate term.
This is what I'm kind of referring to with the overreacting.
How about "surprise"? My goodness, the word-mincing ...
"landmine" is a bit of a stretch.
"splinter" would be a more accurate term.
This is what I'm kind of referring to with the overreacting.
Oh, like this one?
Sorry, but you should discount many of the sites running so-called "objective tests" on this issue, as they clearly are not objective or done properly.
There is no stuttering.
Forget about YouTube videos.
Numerous sites have run objective tests and there is practically no impact.
That's a fresh new take on the issue, I'll give you that....
I've done my own tests and it is very readily visible, even if you don't run any logging.
If there was an issue, Kyle would be the first one to call NV on their bullshit.
If you have been here long enough, you know he is not afraid of NV or anyone else, and he HAS called NV, ATi, AMD, Intel and others on shit like this.
If Kyle says it's not a big deal, it's just not. End of story.
Buying a video card for the future is not stupid. Some of us don't have the money to upgrade often; we want hardware that will have the longest lifespan possible.
Also, plenty of the GPUs I have owned have lasted more than two years at popular resolutions. Maybe those GPUs didn't hit 60fps in every title, but every title was playable, and that let me skip upgrades for a while.
I'm not looking to pick on you or anything, but the problem I have with your test is that you're comparing a 4K rendered resolution plus a scaler (which adds latency) to a 1440p native resolution. Now, 4K is more than twice the resolution of 1440p. You experience some performance hits and conclude that it must be related to the segmented memory. But why? To me the reasonable expectation is that more than doubling the resolution is going to introduce performance issues. Throw in the fact that SLI is not such a great technology for smoothness, and your results don't look particularly peculiar to me.
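For what it's worth, the pixel math behind "more than twice the resolution" is easy to check (a trivial sketch; nothing here is specific to the 970):

```python
# Raw pixel counts: "4K" UHD vs. 1440p QHD.
uhd = 3840 * 2160   # 8,294,400 pixels
qhd = 2560 * 1440   # 3,686,400 pixels

print(uhd / qhd)    # 2.25 -- 4K pushes 2.25x the pixels of 1440p
```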
The thing is, the technical "drawback" of the 970's memory controller that people are criticizing is basically replicated, in super-magnified form, by SLI tech itself.
IIRC, Kyle hasn't really gone either way about his personal feelings. He's pointed to the review of the 970s where the 980s (and I remember one saying 290Xs as well) were suggested for 4K due to stuttering. They basically recognized this issue in an indirect way before we even knew what it really was. Kudos for that.
He's also said that if you feel wronged you should pursue a refund, etc.
I've been reading [H] for 13 years and I can't remember ever seeing bias in the reviews, and that is why I continue to come here and use commissioned links.
If your post was responding to mine, I was mainly talking about other sites. The most obvious is Guru3D, which barely (if at all) utilized the affected RAM and said everything was dandy. I'll trust GoldenTiger over that review.
I have to say, we may not have gotten along throughout our history on this forum, but I have new-found respect for you.
You've really gone above and beyond for these issues and more than proven yourself.
You're not the golden boy.
You're the GoldenTiger.
Stay [H] and keep up the good work!
Considering how US review sites reacted, or failed to react, I trust GoldenTiger over any of them. As previously mentioned, the Germans had no trouble reproducing the issue in multiple titles (at playable frame rates) either.
But as noted earlier, in the [H] review they did suggest 980 SLI over 970s due to smoothness.
Gaming at 4K vs. NV Surround 5760x1200
There is something we need to discuss, and that is what cards are more appropriate for what resolutions of gaming. After this evaluation, we strongly feel that if you are going to be gaming on a 4K display you should spend the extra cash and spring for GeForce GTX 980 SLI. While GeForce GTX 970 SLI can deliver an "OK" 4K gaming experience, we did have to make image quality sacrifices in every game. GeForce GTX 970 SLI cannot "max out" 4K gaming. You will simply have a much better gameplay experience going with GeForce GTX 980 SLI.
Now, if you don't want to spend the extra money for two GTX 980 GPUs, or just don't have the cash, then at least AMD Radeon R9 290X CrossFire would be the better alternative at 4K for the same money. Since R9 290X CF is the same price as GTX 970 SLI, all things being equal we'd go for R9 290X CF for 4K. Performance was faster, we were able to have higher in-game settings, and you'll just have a better experience. However, if you do have the extra cash and go for GTX 980 SLI, then that will definitely be faster and better than R9 290X CF. Basically, for 4K, go with 980 SLI if you can; if you can't due to money, then go with R9 290X CF. We would not go below that for a 4K experience.
I like the title.
On the performance thing, though, the issue is primarily the frametimes, despite the low framerate impact. In SLI especially, but even on single cards in some sites' testing like PCGH's (and even PCPer's SLI article shows extremely high numbers of double-frametime hits, which is practically the same effect you'd see with frame skipping), the frametimes become very, very erratic, and this results in even 50fps, for example, becoming unplayable due to the hitching, pausing, and stuttering. You may still have 50 frames inside that second, but as testing shows, you get them very unevenly and get stuck on the same image for far more time, and far more often, each second than normal performance-level changes produce on other video cards, or even on the same chip with no SMMs disabled (GTX 980). That shows the GTX 970's problems aren't within the realm of typical variation. While most cards will lose some percentage of their framerate going up in resolution, their frame timing remains about the same relative to the longer render times, with minimal extra swing. On GTX 970 setups, once you've gone into using the last 512MB of VRAM (in actual use, not just cached), the swings in frametimes become far more frequent and far more severe. The same goes for multi-card setups like SLI/CF, where frametimes normally vary at a pretty consistent level regardless of resolution.
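As a rough illustration of what those "double-frametime hits" mean, here is a minimal sketch of that style of analysis. The function and the numbers are hypothetical, invented purely for illustration; real testing like PCPer's works from captured frametime logs rather than made-up samples.

```python
from statistics import median

def double_frametime_hits(frametimes_ms):
    """Count frames that took at least 2x the median frametime --
    the hitches and pauses a player perceives as stutter."""
    m = median(frametimes_ms)
    return sum(1 for t in frametimes_ms if t >= 2 * m)

# Two hypothetical runs, both averaging 20 ms/frame (50fps):
smooth = [20, 21, 19, 20, 21, 20, 19, 20]   # even pacing
spiky  = [12, 13, 41, 12, 40, 13, 16, 13]   # same average, erratic pacing

print(double_frametime_hits(smooth))  # 0
print(double_frametime_hits(spiky))   # 2 -- same "50fps", far worse feel
```

Both runs deliver the same 50 frames in a second, but the spiky one leaves you staring at a single image for 40 ms at a time, which is exactly the hitching being described.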
I don't think the frame time variance here could be reasonably attributed to an issue in shading performance or general performance. Increasing the average frame time (through "natural" means) will always increase frame time variance, but the graphs do tend to look very "night and day" when it comes to exceeding the memory threshold. Not clear-cut, but still compelling.
It's a conclusion not found in direct evidence, but I think it's a reasonable conclusion in this instance that the segmentation is partially to blame. In other tests, where differences in frame time variance are less pronounced, I think it's very difficult to say what the exact combination of causes is.
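A toy way to see the distinction being drawn here (numbers invented): a uniform, "natural" slowdown scales every frametime by the same factor, so the absolute swings grow with the mean while the proportional spread stays put. Spikes that grow far out of proportion to the slowdown are what point at something else, like the memory segmentation.

```python
from statistics import mean, stdev

base   = [16.0, 17.0, 15.5, 16.5, 16.0]   # ms per frame, even pacing
slowed = [t * 2.0 for t in base]          # uniform 2x slowdown

print(stdev(base), stdev(slowed))         # absolute spread doubles...
print(stdev(base) / mean(base),
      stdev(slowed) / mean(slowed))       # ...but is unchanged relative to the mean
```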
I want to ask this question of those in the community who have purchased GTX 970s: Are you happy with the performance you are getting in your games with the 970? Does it deliver a good gameplay experience for you?
Just like any card, I think this is the most important question to ask yourself. If it doesn't provide this for you, and you have the money, then by all means find something that does satisfy this.
GoldenTiger,
Could you clarify why you think Guru3D is not utilizing the last 512 MB of VRAM in their testing? I'm not understanding your explanation.
4 GB = 4,096 MB
3.6 GB = 3,686.4 MB
3.5 GB = 3,584 MB
When they say that they are seeing almost 3.6 GB (or almost 3,686 MB) of VRAM usage, how is this not utilizing the last 512 MB of VRAM? 3.6 GB is not 3,600 MB (though even 3,600 MB is technically still into the last 512 MB). Am I missing something?
I think they probably could have pushed it a bit past "almost" 3.6 GB, but I believe this is still using VRAM in the last 512 MB.
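For anyone wanting to check these conversions, a quick sketch (binary units throughout; the usage readings fed in are hypothetical):

```python
TOTAL_MIB = 4096                   # 4 GiB of VRAM, in MiB
FAST_POOL_MIB = TOTAL_MIB - 512    # 3,584 MiB before the slow 512 MiB segment

def touches_slow_pool(used_mib):
    """True once a VRAM usage reading (in MiB) passes 3,584 MiB."""
    return used_mib > FAST_POOL_MIB

print(touches_slow_pool(3686.4))   # True  -- 3.6 GiB is inside the last 512
print(touches_slow_pool(3600))     # True  -- even 3,600 MiB is past 3,584
print(touches_slow_pool(3500))     # False -- comfortably in the fast pool
```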
I might be able to clear it up for you. When they say they are using almost 3.6 GB of VRAM, they mean decimal GB (see: https://en.wikipedia.org/wiki/Gigabyte) rather than GiB (see: https://en.wikipedia.org/wiki/Gibibyte). 1 GB = 1,000 MB, so they are using ~3,600 MB of VRAM. 1 GiB = 1,024 MiB, so the 4 "GB" is actually 4 GiB, or 4,096 MiB of VRAM. The cards have eight 512 MiB VRAM modules to make up 4 GiB. Take the 512 MiB, subtract it from 4,096 MiB, and you get 3,584 MiB. With that confusion you can get to "almost 3.6 GB" and not experience any issues, because you technically have to get into the last 512 MiB pool, which is past the 3,584 MiB marker.
Why do you think they are referring to 1,000 MB per GB vs. 1,024 MB per GB? Do they indicate this anywhere in the article? How do you know what they "mean"? When companies sell hard drives and flash storage it is common to use 1,000 MB per GB, but I have never heard anyone refer to the GB of memory on a video card as 1,000 MB per GB. Do you think that Guru3D thinks 3.6 GB of memory on this video card is 3,600 MB?
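If that reading is right, the unit gap is bigger than it looks: under a decimal interpretation, "almost 3.6 GB" never even reaches the 3,584 MiB boundary. A quick check, assuming the conversions from the post above:

```python
decimal_3_6_gb = 3.6 * 1000**3    # 3,600,000,000 bytes, if GB means decimal GB
fast_pool = 3584 * 2**20          # 3,584 MiB = 3,758,096,384 bytes

print(decimal_3_6_gb < fast_pool)            # True -- still below the boundary
print((fast_pool - decimal_3_6_gb) / 2**20)  # ~150.8 MiB of headroom left
```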
To further back this up, look at what HardOCP said in the Watchdogs portion of that review:
We now know one possible reason why HardOCP had to turn the GTX 970 SLI settings down to Medium instead of High, but since the settings that ended up playable still matched 290 CrossFire, no red flags went up. 970 SLI performed so well in all the other games in the review that no reasonable person would have suspected a problem with the 970 and gone looking for it.
If anyone noticed that 970 SLI wasn't working quite as well with Watchdogs as it did with all the rest of the games, they would have wondered what was wrong with Watchdogs. Sure, 20/20 hindsight tells us this was an early detection of the 970 VRAM issue, but neither HardOCP nor its readers could have known that.
Yeah, unfortunately, because I value personal experience much more than business principle, I often have to deal with the devil. I am not the kind of person to switch teams (and gimp my gaming experience as a result) just on business grounds.
I used to like nVidia; now I am starting to like only their cards, and even that is pushing it. AMD simply isn't offering a viable alternative for me either (and I am pretty much stuck on nVidia for both my monitor and the games I play).
So far, the G-Sync thing, from what I can understand, is that it only works on monitors with inherent adaptive-sync (meaning, currently, only laptop panels), and G-Sync desktop monitors definitely predate DP 1.2a anyway, so I am not entirely sure what the G-Sync fiasco is really about.
Can anyone enlighten me on this? Because I haven't found any evidence of G-Sync working without the module on a DESKTOP monitor.
Yeah, and it looks like G-Sync might even be able to work without that special monitor.
I learned from Nvidia's marketing bullshit when they tried to push PhysX. G-Sync is the same bullshit to me.
Educated guess, but from what it seems they saw mid-to-high 3,500 MB and called it "almost 3.6 GB", which would explain why the issues didn't surface. I think a lot of people would have said the same thing. If you saw 3,000 MB of usage, you'd probably say 3 GB instead of mathing it out, since it's close enough.
I'm going to assume that he was not referring to something just short of 3,600 MB, because of the nature of the article. In the paragraph after the "almost 3.6 GB" statement he also refers to using the last 512 MB of memory. I find it highly unlikely that a site like Guru3D, writing an article about going above 3.5 GB of memory, would actually not go above 3.5 GB. I have sent Hilbert, the author of the article, an e-mail to clarify.