From 2x 570's to 1x 7970. Could NOT be happier.

Again, you misunderstand the concept of microstutter. You can't "see" it. All it is is an effective framerate that's lower than what's being reported as FPS. If you're getting 90 FPS, for example, you may actually be rendering at an effective framerate closer to 70 FPS; however, that's still more than most monitors' 60Hz refresh rate, so you can't tell the difference. Or you could be reporting 50 FPS while the actual framerate is 40 FPS, and many people can't detect or feel the difference between 40 FPS and 50 FPS.
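To put rough numbers on that distinction, here's a minimal Python sketch; the alternating 5 ms / 17 ms frame gaps are made up for illustration, not measured from real hardware:

# Minimal sketch: why an FPS counter can overstate how smooth AFR feels.
gaps_ms = [5, 17] * 30  # 60 frames whose delivery gaps alternate (hypothetical)

avg_gap = sum(gaps_ms) / len(gaps_ms)   # 11 ms average gap
reported_fps = 1000 / avg_gap           # ~91 FPS, what the counter shows
effective_fps = 1000 / max(gaps_ms)     # ~59 FPS, closer to how motion looks

print(f"reported {reported_fps:.0f} FPS, effective {effective_fps:.0f} FPS")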

So now I can't see it and I can't monitor its effects using the MSI Afterburner overlay....

....Yet, theoretically, due to the way AFR works it exists and my min FPS isn't really accurate....

I've even heard people in other forums say that you can't record it and play back its effects using a video camera, as sampling rates and refresh rates mask the effects of microstutter in a recording....

....Yet it exists and is still a problem on modern cards quite possibly because some unqualified professional on a review site said so....

The odd thing here is, as I have stated many times in this thread, that I have seen microstutter in the early days of CF/SLI; I have not, however, seen it on modern cards for a long time now.

Personally, I honestly think that a select few people may be perceiving an issue caused by a piece of hardware not directly related to the GPUs, and labeling it microstutter because it sounds a little like an issue that was present in the early days of CF/SLI.
 
Dude, get over it.

Stop trying to push your opinion as fact.

My most recent GTX 480 SLI setup certainly was exhibiting microstutter. Everybody's eyes and ability to detect things are different.
 
Had a 470, a 5870, a GTX 260, and an 8800GT in dual-card configs, and they all microstuttered. I noticed it when my FPS was in the lower range; at higher FPS without vsync it wasn't noticeable. All was fine when the FPS was high.

The main reason I got rid of my dual-card setups, though, was that when the FPS did dip below, let's say, your monitor's refresh rate or so, you'd get a kind of stutter, very similar to how Fallout was at launch a while back, if you guys remember it. Also, waiting for an SLI/CF profile to show up for launch-day titles was a PITA; it made me feel like my second card was a waste. Crysis looked so good back then that I didn't care; all I cared about back then was going into forums and posting benchmarks.

Since then I've tossed away my e-peen for a simpler, more enjoyable gaming experience, since gaming is all I do with it, along with a mix of work. Especially since I'm competitive now.

Newer drivers are probably different now; I haven't tried it since. One thing is for sure: the room is cooler with one card than with two, lol.
 
Newer drivers will do nothing for it. Unless they push other frame-rendering methods, that is.
 
Kinda pessimistic to say they will never be able to fix it.

Here's what "uneven frame time distribution," aka "microstutter," is (simplified):
The problem is that GPU 1 renders a frame, then GPU 2 renders the next frame. Frame 1 shows up on your screen at :01 on the clock, frame 2 shows up at :07, frame 3 at :09, frame 4 at :15, frame 5 at :18, frame 6 at :26, frame 7 at :35.

When there is only one GPU that renders every frame, instead of every other frame, it goes like this:

GPU 1 renders a frame, then the next, and the next, and so on. Frame 1 shows up on your screen at :01, frame 2 at :05, frame 3 at :10, frame 4 at :15, frame 5 at :20, frame 6 at :25, frame 7 at :30, and so on.

And here it is in real time:

http://www.youtube.com/watch?v=zOtre2f4qZs&hd=1&t=1m17s
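For the numerically inclined, here's the same comparison as a minimal Python sketch, using the made-up timestamps from the example above (arbitrary clock ticks, illustrative only):

from statistics import mean, stdev

afr_times    = [1, 7, 9, 15, 18, 26, 35]   # two GPUs alternating frames (AFR)
single_times = [1, 5, 10, 15, 20, 25, 30]  # one GPU rendering every frame

def pacing(times):
    # frame-to-frame gaps, then their average and their jitter (stdev)
    gaps = [b - a for a, b in zip(times, times[1:])]
    return mean(gaps), stdev(gaps)

for name, times in (("AFR", afr_times), ("single GPU", single_times)):
    avg, jitter = pacing(times)
    print(f"{name}: average gap {avg:.1f}, jitter {jitter:.1f}")

Both sequences deliver seven frames in roughly the same total time, but the AFR sequence has several times the jitter, and it's the jitter you feel as microstutter.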
 
Microstuttering is like Santa Claus... we know it exists even though we can't always see it.
 
Single 7970 vs the 570 SLI:
Plus for 7970: less power, less noise (depending on setup), less heat, more consistent performance (i.e. no SLI optimization lagging behind), more overclocking headroom (on average), more onboard VRAM, and cheaper (if choosing to buy either one).
Plus for 570 SLI: more powerful.... and that's about it? Seems like the 7970 is the winner here :D
 
By that logic there is no reason to get any lesser card than the top end, because all of them are going to be less powerful but also have less noise and a lower TDP.
 
....Yet it exists and is still a problem on modern cards quite possibly because some unqualified professional on a review site said so....

The developers of the WWI flightsim 'Rise of Flight' have said more than once that SLI/CF causes stutter in their game in certain situations. I would hardly call game developers "unqualified".
 
By that logic there is no reason to get any lesser card than the top end, because all of them are going to be less powerful but also have less noise and a lower TDP.
I suppose I should've added "marginally" better performance. Yes, lesser cards have a lower TDP, but they also have a large performance delta. In this particular case, the performance delta between an OC'd 7970 and 570 SLI is not that large ;)
 
The developers of the WWI flightsim 'Rise of Flight' have said more than once that SLI/CF causes stutter in their game in certain situations. I would hardly call game developers "unqualified".

That claim would have more merit in general if it were corroborated by more than one developer, IMO.

I suppose I should've added "marginally" better performance. Yes, lesser cards have a lower TDP, but they also have a large performance delta. In this particular case, the performance delta between an OC'd 7970 and 570 SLI is not that large ;)

Lol...true. I guess "large" is relative but yeah it's probably somewhere around 10-20%.
 
That claim would have more merit in general if it were corroborated by more than one developer, IMO.



Lol...true. I guess "large" is relative but yeah it's probably somewhere around 10-20%.

And it is; both AMD and Nvidia are on record saying that multi-GPU can cause stuttering in certain situations. Look it up.
 
I'd like to test out the 3x GPU theory, but that would be expensive. I currently have two green cards for gaming and PhysX, and a dual-GPU red card for number crunching. So far my setup works well. I've locked in to a 1080p setup and don't have room for 2 more monitors, so my needs are different from the OP's. Although, if 3 identical cards and two more 47" screens showed up at my door tomorrow, I'd be happy to accommodate them :D
 
Also, regarding your statements about a 7970 not being able to achieve 1300 on average: I ordered my XFX Black Edition 7970 from eBay about 2 weeks ago, and on my first try I was able to hit 1300 and hold it through an entire 3DMark 11 run, and I have a screenshot to prove it. I'd say that if I was able to do so with a random card from eBay, there are many others who can do so as well. I've seen people hitting 1340 and higher on AIR as well.

I couldn't care less whether trolls like TroyX believe I hit 1300 on air or not, as that isn't the point of this discussion. Performance is the point of this discussion.

My 7970 @ 1300MHz outperforms 6990s and 590s, and where it doesn't, it's dead even. With that said, I know for a fact that a 6990 and a 590 outperform 2 560s in SLI, without a shadow of a doubt. Putting two and two together, you have your answer: 2 560s are just NOT enough to compete with a heavily OC'd 7970.

Sure, 2 570s in SLI should be able to beat the OC'd 7970 in a few titles, but there will be some they won't win, such as those with bad or less than adequate scaling. You always have to take those into consideration, because not everyone plays nothing but mainstream titles.
 
In BF3, 2x 570's are much, much faster than a single 7970.

I prefer to use the BF3 benchmark because it was designed to run on PC hardware.

Console ports are never good for benchmarking PC hardware.
 
You're missing the whole point of this thread. Go and re-read the first post.
 
But my 7970 DOES outperform my 2 570's in Eyefinity. Significantly. I couldn't touch ultra on my 570's. Now Ultra is my friend.
 
And it is; both AMD and Nvidia are on record saying that multi-GPU can cause stuttering in certain situations. Look it up.

You seem quite confident in your claim; is it too difficult for you to provide a link?

Stop trying to push your opinion as fact.

There are people who believe microstutter exists on modern hardware, and there are people who do not. I am not pushing my opinion as fact; I am merely providing an alternate point of view.

But my 7970 DOES outperform my 2 570's in Eyefinity. Significantly. I couldn't touch ultra on my 570's. Now Ultra is my friend.

At Eyefinity resolutions or 1920 x 1080?
 
http://techreport.com/articles.x/21516/11

"Naturally, we contacted the major graphics chip vendors to see what they had to say about the issue. Somewhat to our surprise, representatives from both AMD and Nvidia quickly and forthrightly acknowledged that multi-GPU micro-stuttering is a real problem, is what we measured in our frame-time analysis, and is difficult to address. Both companies said they've been studying this problem for some time, too. That's intriguing, because neither firm saw fit to inform potential customers about the issue when introducing its most recent multi-GPU product, say the Radeon HD 6990 or the GeForce GTX 590. Hmm."

Right from the horse's mouth.
 
Game, set, match.
 
You seem quite confident in your claim; is it too difficult for you to provide a link?



There are people who believe microstutter exists on modern hardware, and there are people who do not. I am not pushing my opinion as fact; I am merely providing an alternate point of view.

The link you wanted has been posted by Lavaheadache.

Nothing wrong with having an opinion; the problem is when you keep ignoring other people telling you differently. Microstutter does exist, and it has nothing to do with other pieces of hardware. Your alternate point of view is wrong.
 
I'm running 2x MSI GTX 570 Twin Frozr II's and I love it. Sure, I'd like a little more VRAM; in fact, I was about to upgrade to 2x 7970 in CF, but I'm leaning towards keeping my 570s for now. Everything is totally solid on Ultra with high-res textures (Crysis 2, Skyrim, Alan Wake, etc.). The only problem I run into with the VRAM is that in BF3 on Ultra with 4xAA I get very short random FPS drops due to the VRAM capping out, even though otherwise it's a solid 60 at all times. It sucks that all their power is held back by the VRAM, but if I drop the AA down to 2x it's nothing but a perfect playing experience. So I'll hang on to them until next gen. Single 1080p 27-inch, btw, nothing fancy.
 
Yep, that is what a VRAM limit is like. I was getting that in modded Skyrim with my GTX 580 1.5GB with AA @ 2560x1600.
 
Getting "short stutters" with 60 FPS at all other times is not really indicative of VRAM limitations. Also, I have the same setup (570 SLI) and can play BF4 Ultra 4x FSAA just fine.

Try playing Skyrim with 2k texture mods and 8x FSAA on 1 GB VRAM and you'll see what a real VRAM limitation is like.
 
Getting "short stutters" with 60 FPS at all other times is not really indicative of VRAM limitations. Also, I have the same setup (570 SLI) and can play BF4 Ultra 4x FSAA just fine.

Try playing Skyrim with 2k texture mods and 8x FSAA on 1 GB VRAM and you'll see what a real VRAM limitation is like.

What res were you playing at, and were you using multi-display (i.e. surround)? Also, which version of the card were you using (the reference 1280MB card or the 2560MB model)?

I ask because I had two 570's at first on a single screen @ 1920x1200 (do you play @ 1080?). Ultra with 4xAA was playable on non-crowded maps (single player is perfectly fine and less stressing), but as soon as lots of things happened (i.e. many explosions and on-screen players), everything went to shite because of the VRAM wall. Huge dips in FPS and stuttering (think 80+ FPS down to the 30's).

I suspect that the 2560MB EVGA model would suffice, but those are not good value right now as Kepler approaches (and other cards have been released with more memory at similar price points).

Either way, the consistency of FPS with SLI'd 570's was simply not there for me in BF3. The GPUs were certainly capable; the VRAM capacity was not.
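For a rough sense of why the 1.25GB cards hit that wall with MSAA, here's a back-of-the-envelope Python sketch; the render-target count and per-sample size are assumptions for illustration, not BF3's actual renderer internals:

# Rough, assumption-laden estimate of multisampled render-target memory.
width, height = 1920, 1200
msaa_samples = 4       # 4x MSAA: four stored samples per pixel
bytes_per_sample = 4   # assume 32-bit color per sample
num_targets = 4        # assume a deferred renderer with ~4 G-buffer targets

per_target = width * height * bytes_per_sample * msaa_samples
total_mb = per_target * num_targets / 2**20
print(f"~{total_mb:.0f} MB for multisampled render targets alone")  # ~141 MB

And that's before textures, geometry, and shadow maps; everything has to fit in each card's VRAM, since SLI mirrors memory rather than pooling it.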
 
1920x1080, normal 1.25 GB models. I'm not sure that x1080 vs x1200 would make a huge difference, but who knows.

Even on 64-player maps they work fine for me. None of the slowdown that you are describing.
 
http://techreport.com/articles.x/21516/11

"Naturally, we contacted the major graphics chip vendors to see what they had to say about the issue. Somewhat to our surprise, representatives from both AMD and Nvidia quickly and forthrightly acknowledged that multi-GPU micro-stuttering is a real problem, is what we measured in our frame-time analysis, and is difficult to address. Both companies said they've been studying this problem for some time, too. That's intriguing, because neither firm saw fit to inform potential customers about the issue when introducing its most recent multi-GPU product, say the Radeon HD 6990 or the GeForce GTX 590. Hmm."

Right from the horse's mouth.

If you actually read the article fully, they explain how there is hardware in place (on the Nvidia side, anyway) to lessen or eliminate the issue, but it may not always work for all configurations and applications.

So, basically, you may or may not actually get microstutter. They're certainly not saying that EVERYONE experiences it.
 
Actually, I have read the article many times.

I think maybe you have not. Yes, Nvidia says they have hardware in place to combat the issue, but nowhere is there even an implication that microstutter is eliminated or even consistently lessened. Nice try at making me look like an idiot, though.
 
Lol, calm down. I wasn't trying to "make you look like an idiot"; I was simply stating that this is not justification that microstutter is apparent on ALL multi-GPU setups.

No one is saying microstutter doesn't exist. But saying that people who don't have it on their particular setup simply "don't notice it" isn't true, either.

I think this is a good quote to point out from that same article:

Ultimately, though, the user experience should be paramount in any assessment of graphics solutions. For example, we still need to get a good read on a basic question: how much of a problem is micro-stuttering, really? (I'm thinking of the visual discontinuities caused by jitter, not the potential for longer frame times, which are easier to pinpoint.) The answer depends very much on user perception, and user perception will depend on the person involved, on his monitor type, and on the degree of the problem.

So, yes, it can be a perception issue. But it is related to hardware and setup as well.
 
I can agree with that statement in its entirety. Microstutter is very perception-based, as is frames per second in general. For example, I have a very hard time readjusting to my U3011 after playing on my 120Hz Asus VG236H. People claim that they can't tell the difference between 120Hz and 60Hz on an LCD, and I almost can't stand 60Hz LCDs anymore. Completely perception-based.

Now you can probably tell why I'm complaining about microstutter. I will admit that running with vsync makes it less noticeable, especially with a 120Hz monitor, but it is definitely still there. I have had many friends try out my past SLI and CrossFire systems and be completely blown away by how smooth they were, while I stood there hating it.
 
Here is a quick benchmark of GTX 560 SLI in Serious Sam 3: BFE. Microstutter was plain as day in this title.

[Screenshot: Fraps frame-time calculations for Serious Sam 3: BFE]
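For anyone wanting to reproduce that kind of analysis, here's a minimal Python sketch. It assumes a Fraps frametimes CSV with a header row and cumulative millisecond timestamps in the second column; the file name is hypothetical:

import csv

def load_frame_durations(path):
    # read cumulative timestamps and convert them to per-frame durations (ms)
    with open(path) as f:
        rows = list(csv.reader(f))
    stamps = [float(row[1]) for row in rows[1:]]  # skip the header row
    return [b - a for a, b in zip(stamps, stamps[1:])]

durations = load_frame_durations("frametimes.csv")    # hypothetical file name
avg_fps = 1000 * len(durations) / sum(durations)
p99 = sorted(durations)[int(len(durations) * 0.99)]   # 99th-percentile frame time
print(f"avg {avg_fps:.1f} FPS, 99th-percentile frame time {p99:.1f} ms")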
 
So, yes, it can be a perception issue. But it is related to hardware and setup as well.

Some people think a game running at 20 FPS is acceptable, and others think it is horrible and that you need 40 FPS or more for an acceptable frame rate. That does not mean the game is not actually running at 20 FPS.
 
Again, that does not mean that microstutter actually affects every multi-GPU setup, or at least to the same extent on every setup.
 
So if I run my 560 Ti SLI with vsync on my 60Hz monitor, my game is actually running at 30 FPS, because I'm using SLI and all SLI has microstutter whether I realize it or not?

 