Interesting data on Crossfire vs SLI

I want to see more. What, if any, "fixes" are out there that would limit the CF instability? I am considering a CF 7970 setup, and I want to know I can get consistent frame rates.
 
We have been saying this for years.

Ryan has worked long and hard on getting to a point where he could collect meaningful data, and kudos to him for doing so. The fact of the matter is that I did not want to put all the resources into collecting it for a few graphs to tell me what we already know. The data is hard to present and extremely difficult to collect properly. Ryan has even stated that this will not be a regular review metric for them due to how resource intensive it is.

The CF experience is not as good as the SLI experience.

Like it or not, we will stand by our subjective analysis of real-world gameplay on this front.
 
Interesting. How come all the 4GB 680s on the market are so steeply priced or completely out of stock (like on NewEgg) these days?

I have purchased all the parts for my new build EXCEPT the video cards. Trying to decide on CF or SLI for a 2560x1600 setup. This kind of information is making my decision harder.
 
Wow, that graph of framerate with the "runt" frames omitted is incredibly damning. You're effectively no better off using Crossfire than a single card.
 
Wow, that graph of framerate with the "runt" frames omitted is incredibly damning. You're effectively no better off using Crossfire than a single card.

I wouldn't call it "damning" by any means. Limit the frames to 59 FPS and you have a different story, no? What's the point of the 80-100 frames on a monitor with a 60Hz refresh rate, anyway?

Yes, there would be a greater disparity when you are gaming at 120Hz, but we are talking about high-resolution or multi-monitor setups. People aren't going to be trying for 120 fps in games like BF3 on maxed settings, but if you are for some reason, I guess this article is telling you to go Nvidia.
 
I wouldn't call it "damning" by any means.

You wouldn't? Based on this graph, you get no real advantage to CF 7970 over a single card.

Limit the frames to 59 FPS and you have a different story, no? What's the point of the 80-100 frames on a monitor with a 60Hz refresh rate, anyway?

Not really, no. Based on the graph I posted, most of the time you're getting less than 60 "real" FPS.
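
For anyone wondering how a "runts omitted" number even gets computed, here's a rough sketch of the idea in Python (my own guess at the approach, not necessarily how the article actually does it; the 21-scanline cutoff and the capture format are assumptions):

```python
# Rough sketch (my own, not the article's actual method) of how an
# "FPS with runts removed" number could be computed. Assumes a capture
# tool that reports, for every frame that reached the display, how many
# scanlines of output it occupied; the 21-scanline cutoff is an assumption.

RUNT_THRESHOLD = 21  # scanlines; anything shorter is treated as a runt

def fps_without_runts(scanlines_per_frame):
    """Raw vs. 'real' FPS for a one-second slice of captured output."""
    raw_fps = len(scanlines_per_frame)          # what FRAPS would report
    real_fps = sum(1 for s in scanlines_per_frame
                   if s >= RUNT_THRESHOLD)      # runts don't count
    return raw_fps, real_fps

# Made-up numbers: the two GPUs' frames alternate long/short on screen.
capture = [1060, 20] * 40
print(fps_without_runts(capture))  # (80, 40): 80 fps reported, ~40 fps you actually see
```

With made-up numbers like that, a counter would happily say 80 fps while only about 40 frames per second actually occupy any meaningful chunk of the screen, which is exactly the gap the graph is showing.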
 
Don't SFR (split-frame rendering) and supertile rendering avoid these microstutter issues at the cost of slightly fewer total frames per second?
 
I really want to see more evidence regarding this issue. It's baffling to me that this chart basically says adding a 2nd 7970 does nothing.
 
I really want to see more evidence regarding this issue. It's baffling to me that this chart basically says adding a 2nd 7970 does nothing.

Yeah, I'm smelling bullshit here lol. Running 3892x1200 resolution here and that second 7970 is definitely doing something.
 
Yeah, I'm smelling bullshit here lol. Running 3892x1200 resolution here and that second 7970 is definitely doing something.

Take out one of your 7970s and see what your frame rates are like. Don't even go by FRAPS or an on-screen FPS readout; just test the general smoothness. Pretty sure this would have been discovered a LONG time ago if people had upgraded to their second card and not noticed much higher responsiveness at the same settings. I'd really like to know, because I sold my rig and I have purchased everything except the vid cards for my new rig. This data has me NOT making a purchase until it's confirmed or debunked.
 
Take out one of your 7970s and see what your frame rates are like. Don't even go by FRAPS or an on-screen FPS readout; just test the general smoothness. Pretty sure this would have been discovered a LONG time ago if people had upgraded to their second card and not noticed much higher responsiveness at the same settings. I'd really like to know, because I sold my rig and I have purchased everything except the vid cards for my new rig. This data has me NOT making a purchase until it's confirmed or debunked.

Crysis 3 is a stutter fest without the second card. As long as you limit your frame rate to 60 fps (not vsync; use MSI Afterburner to lock the fps at 60), you will not get microstutter.
 
Crysis 3 is a stutter fest without the second card. As long as you limit your frame rate to 60 fps (not vsync; use MSI Afterburner to lock the fps at 60), you will not get microstutter.

That's all well and good, but the BF3 chart from above is showing frame rates in the 80s for SLI'd 680s. It's just hard to believe that data.
 
So single GPU = AMD (excluding the Titan), in the 680 vs. 7970 GHz Edition matchup.

Multi-GPU = Nvidia.

Sound about right?

It's why I went with 680s over the 7970 when I was in the market for new GPUs several months ago. I knew AMD had overtaken Nvidia with their latest drivers, but the lack of smoothness from AMD cards in a dual-GPU configuration made me go with the "inferior" 680 in SLI.
 
Would be interested to see the same level of scrutiny applied to both Nvidia and AMD, with SLI and Crossfire, in terms of input lag. As a layman, I wonder if Nvidia is smoothing things out at the cost of extra input lag... even a couple of ms could make these frames smooth out.
 
Got about a week to make my decision on CF 7970s or SLI 680 4GBs. I guess I'll keep an eye on this and see if something comes up to help me make a clearer decision. As of yesterday I was 100% for CF 7970s; now I'm totally on the fence. (A Titan is a potential option too, but the likelihood of me finding one at just the right time for MSRP is probably slim to none.)
 
EPOQ, I was in the exact same situation about four months ago and went with the 680s.

AMD had already released the driver update that made the 7970 GHz the better card overall compared to the 680, and I still went with the 680s because SLI just feels better than Crossfire. I have the 2GB EVGA FTW 680s, and the performance at 2560x1440 is stunning and buttery smooth. Crysis 3 with no AA is 60+ FPS for the most part; the first level chugs, but that is because of poor coding. Very high settings for everything. Cards are OC'd to 1254 with the memory at stock.

Every other game that isn't optimized like **** runs flawlessly. I am sure you will get similar or better performance with the 7970, but the lack of smoothness is a deal breaker for me.

Just my two cents.
 
EPOQ, I was in the exact same situation about four months ago and went with the 680s.

AMD had already released the driver update that made the 7970 GHz the better card overall compared to the 680, and I still went with the 680s because SLI just feels better than Crossfire. I have the 2GB EVGA FTW 680s, and the performance at 2560x1440 is stunning and buttery smooth. Crysis 3 with no AA is 60+ FPS for the most part; the first level chugs, but that is because of poor coding. Very high settings for everything. Cards are OC'd to 1254 with the memory at stock.

Every other game that isn't optimized like **** runs flawlessly. I am sure you will get similar or better performance with the 7970, but the lack of smoothness is a deal breaker for me.

Just my two cents.

I think I'll try for a Titan if I can get lucky right in my window of purchase, but I may end up going for the 4GB FTW+s. The thinking PRIOR to this article, for me, was that the performance was similar but the price point and "Never Settle" game bundle made the 7970s an easy choice. This has me thinking I should just fork over a few extra bucks for the 680s so I can avoid the potential "I told you so" scenario later. I'll keep tabs on this thread and the OP's thread in the meantime.
 
The F2P cash bundle from Nvidia isn't as good, but it isn't bad either. If you are into PS2 it's a lot of in-game currency.
 
The F2P cash bundle from Nvidia isn't as good, but it isn't bad either. If you are into PS2 it's a lot of in-game currency.

Funny you mention that. I was just looking at it thinking to myself, "Well, I guess I'll get into PS2..."
 
Had a chance to see a Titan in action today. It was slower than the cheaper Crossfired 7970 GHz Editions. Very cool-looking card inside the case, though.
 
Can't argue with the data. It's odd to me; I have played/do play on multi-monitor setups with both Nvidia and AMD cards. Personally I like the AMD setup; to me gameplay has a better feel to it. Maybe it's because I am going blind, I dunno. I like the Nvidia setup, but having both in the same room, it just feels better to me with the AMD cards.

Could it be that AMD's driver team isn't as "on the ball" as we all think? Most likely they already knew about this, but have been so focused on getting Crossfire to work properly that this is the last "bug" or "kink" to work out. Who knows. While I am disappointed, I am not about to ditch my AMD cards. I have been running tri-fire since release and have been through ups and downs over the past year with AMD drivers. I believe AMD has superior hardware; if only the software were at the same level, and this is where Nvidia shines.

Either way, the report should, I hope, at least light a fire under AMD's backside to get things together. In the end, as we say, we all benefit as a whole from reports like this and from competition between AMD and Nvidia. All good stuff :)
 
I read somewhere that AMD has been aware of this "smoothness" problem and has been working quite feverishly on it for quite some time. The article suggested that AMD is likely devoting more of its relative resources to it than Nvidia, since the APU and its "dual graphics" capability is intended to become the cornerstone of AMD's computing platform. I wish I remembered where I read that. As a devotee of multi-GPU since Nvidia resurrected the tech in '05, I've often noticed and commented on how much smoother SLI appears over Crossfire most of the time. Really, the only time both methods performed equally shitty was with the 3870X2 and 9800GX2. I do experience less choppiness when gaming on GTX 680 SLI than with 7970 CF, but the CF setup isn't so poor as to significantly distract from my enjoyment of my games.
 
I have a 7970, and after hearing about next-gen delays I wanted to get another one for CFX, but all these issues I have read about have put a damper on that for now. I'm sure this is hurting AMD's graphics sales, and I hope AMD is working hard on a fix and that we will have it soon.

Speaking of a future CFX fix from AMD: Kyle, do you have any info from AMD regarding these issues?
 
Crysis 3 is a stutter fest without the second card. As long as you limit your frame rate to 60 fps (not vsync; use MSI Afterburner to lock the fps at 60), you will not get microstutter.

Not if you fall below the fps cap, in your case 60 fps. What if you want to increase image quality to the point where you drop to 40 fps in some scenarios? Limit the frame rate to 40?

Maybe CF should be benched with an fps cap at the minimum fps, then :D
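
For what it's worth, a frame limiter like the one in Afterburner or RadeonPro isn't vsync: it just paces when frames are handed off. Conceptually it's something like this toy loop (a sketch of the general idea in Python, not either tool's actual code), and as noted above it can't do anything once you fall below the cap:

```python
import time

TARGET_FPS = 60
FRAME_INTERVAL = 1.0 / TARGET_FPS      # ~16.7 ms between frame submissions

def limited_render_loop(render_frame, frames=600):
    """Toy frame limiter: never submit frames faster than TARGET_FPS.
    Unlike vsync it doesn't wait for the display; and if render_frame()
    itself takes longer than the interval (fps below the cap), the limiter
    does nothing and frame times vary just like without it."""
    deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()                          # stand-in for the game's work
        deadline += FRAME_INTERVAL
        slack = deadline - time.perf_counter()
        if slack > 0:
            time.sleep(slack)                   # ahead of schedule: wait it out
        else:
            deadline = time.perf_counter()      # below the cap: don't try to catch up

# Example: a fake 10 ms per-frame workload, comfortably above the 60 fps cap.
limited_render_loop(lambda: time.sleep(0.010), frames=120)
```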
 
The question I have is what the runt frames really mean/show. Their data collection is showing the time each frame is on the screen (or the portion of the screen each frame gets, more accurately, I guess), so why are some frames being displayed for such a short time? Is it a case where a frame was partially rendered and then dropped so the next frame could be worked on, or is it just bad load balancing between the cards, resulting in two frames being completed almost simultaneously?
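
If the capture really is per-frame on-screen time, you could at least tell those two cases apart: with two-card AFR, even and odd frames come from different GPUs, so if the runts always land on the same parity, that points at load balancing between the cards rather than random one-off hiccups. A quick sketch of that check (the data format and the 21-scanline runt cutoff are my assumptions, same caveat as above):

```python
RUNT_THRESHOLD = 21  # scanlines; assumed cutoff, not the article's

def classify(scanlines_per_frame):
    """Label each captured frame as 'dropped', 'runt', or 'full'."""
    labels = []
    for s in scanlines_per_frame:
        if s == 0:
            labels.append("dropped")      # never made it to the screen at all
        elif s < RUNT_THRESHOLD:
            labels.append("runt")         # on screen for only a sliver
        else:
            labels.append("full")
    return labels

def runts_alternate(labels):
    """True if nearly all runts fall on the same parity (every other frame),
    which with 2-card AFR means the runts are always one card's frames."""
    runt_idx = [i for i, label in enumerate(labels) if label == "runt"]
    if not runt_idx:
        return False
    same = sum(1 for i in runt_idx if i % 2 == runt_idx[0] % 2)
    return same / len(runt_idx) > 0.9

labels = classify([1060, 20] * 40)   # same made-up capture pattern as earlier
print(runts_alternate(labels))       # True: every runt is the same card's frame
```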
 
From my own personal experience a few months ago: after using two MSI GTX 670 Power Editions in SLI, they could not handle certain games at max. Sleeping Dogs, Far Cry 3 and The Witcher 2 were the games I tested at the time. I could not max the settings out in any of those games, nor could I get a stable overclock with these cards. I read that Nvidia has locked the voltage on their cards now as well?

The one game that really pissed me off was NFS Most Wanted. I couldn't even hit a constant 60 fps - this was mostly on the developer, Criterion, for a shitty port, but the issue has since been fixed. I did notice in this particular game that the Nvidia cards felt smoother than the AMD cards when the frame rate would take a dive - before the fix, that is.

The only other game that felt smoother than AMD at the time was Far Cry 2. After returning the cards for two 7970s, I was able to run all the games mentioned above at max at no less than 60 fps, but had to run RadeonPro <-- (awesome tool) on Far Cry 2 to get it smooth.

A few weeks later, AMD started releasing drivers that seem to address all these stuttering issues. The only game that I have stuttering issues with right now, using the latest beta 6 drivers, is Rage. Crysis 3 runs at 51-60 fps with everything set to max and no stuttering (no third-party tool used).

The bottom line is I am very happy with my setup, and I think AMD is doing a great job improving their drivers on a bi-weekly basis. After all, the PS4 is being powered by AMD, as are the other next-gen consoles. Nvidia does make great hardware, but why did all the next-gen systems choose AMD over Nvidia? The answer is simple: AMD makes excellent hardware as well, and Nvidia just can't compete with the price-to-performance ratio.


 
See, there are all these articles out claiming frame latency problems with AMD cards in Crossfire. While there may be an issue here (one which AMD is addressing with the memory management rewrite), a lot of end users are not experiencing it in games.
 
Yeah, I'm starting to see that. Perhaps I'll just go for broke, buy a pair of 680s AND a pair of 7970s, do the damn comparison myself, and make a final decision based on what I see with my OWN eyes. All this he-said-she-said is not helping.
 
Interesting, though I question the presentation, as first impressions are, well, first impressions. The very first example screen captures are a best-case Nvidia SLI and a worst-case AMD Crossfire, as can be seen when looking at the screen captures further into the review.

I don't question their information, as I don't know enough about it. But it is either badly presented, or it gives the appearance of an agenda. If you want to appear unbiased, that isn't how you do it.
 
...but why did all the next-gen systems choose AMD over Nvidia? The answer is simple: AMD makes excellent hardware as well, and Nvidia just can't compete with the price-to-performance ratio.
No, I don't believe the answer is that simple.
 
I've known this for over 6 years and made my own custom projects with the Nvidia-based drivers. I've also done research on the laptop variants of high-end cards. Crossfire is only pre-tuned for a few select games that are very popular; other than that it's utter garbage. And this is not including the defect anomalies I've seen more often with Crossfire systems.

In some cases from my tests, Crossfire would work half of the time: you could see a double fps gain, then turn your mouse around and see only a 10% gain (and there was no CPU bottleneck). I redid the same level tests with Nvidia and got a fluid doubling of frames. I wouldn't be surprised if some hardware review sites get paid to lie... anyways.

I will not go ATI; I tried them out once more last year, sorry... Nvidia's ahead of the game in these departments, and half of these reviews don't even test properly...
 
I had Crossfire Eyefinity right out of the gate and always felt there were horrible issues with smoothness that went beyond framerates. Switching to SLI was night and day. I would be very reluctant to touch Eyefinity or Crossfire again without seeing it first-hand.
 
I had Crossfire Eyefinity right out of the gate and always felt there were horrible issues with smoothness that went beyond framerates. Switching to SLI was night and day. I would be very reluctant to touch Eyefinity or Crossfire again without seeing it first-hand.

You have seen the light; it is GREEN, literally $$$, pay to play.
 
The stuttering and smoothness issues in these posts have been addressed by AMD in the last month; I can assure you, from my personal experience, there has been a vast improvement in less than one month.

All I can say is I can play the latest and greatest games with no stuttering in Xfire, outside of Rage.
 
The stuttering and smoothness issues in these posts have been addressed by AMD in the last month; I can assure you, from my personal experience, there has been a vast improvement in less than one month.

All I can say is I can play the latest and greatest games with no stuttering in Xfire, outside of Rage.


Fixing a few select games is not fixing the whole problem; it's just acknowledging and addressing certain games to make it look like it's getting fixed. I've been saying this for years as well.
 
I just don't get it. If you are capable of seeing high 50s to 60 fps at all times in CF, why the hell can't you just play with vsync and/or use the frame limiter in RadeonPro or Afterburner and nip this whole thing in the bud and end the conversation? The stuttering only seems to be an issue when the frame rates are all over the place. Yes, you may have to change the cap for certain games where you aren't capable of maintaining that same number of frames, but it doesn't seem like that big of a deal. Not to mention, the more this is brought to AMD's attention, the more likely they are to create a permanent fix.
 
Vsync sets a cap on frame rate. It doesn't have any impact on frame times.

This doesn't appear to be a frame rate-specific issue.
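
To put it another way: two captures can report the exact same average FPS and feel completely different, because the stutter lives in the frame-to-frame variation rather than the average. A quick illustration in Python, with made-up frame times:

```python
# Two made-up one-second traces of frame times (ms), both "60 fps" on average.
smooth  = [16.7] * 60          # every frame takes ~16.7 ms
jittery = [5.0, 28.4] * 30     # alternating fast/slow frames

def summarize(frame_times_ms):
    """Average fps, the single worst frame, and the biggest frame-to-frame jump."""
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = max(frame_times_ms)
    jumps = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return round(avg_fps), worst, max(jumps)

print(summarize(smooth))   # (60, 16.7, 0.0)   -> same fps, perfectly even delivery
print(summarize(jittery))  # (60, 28.4, ~23.4) -> same fps, visible judder
```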
 
Fixing a few select games is not fixing the whole problem; it's just acknowledging and addressing certain games to make it look like it's getting fixed. I've been saying this for years as well.

+1 on this comment

I just don't get it. If you are capable of seeing high 50s to 60 fps at all times in CF, why the hell can't you just play with vsync and/or use the frame limiter in RadeonPro or Afterburner and nip this whole thing in the bud and end the conversation? The stuttering only seems to be an issue when the frame rates are all over the place. Yes, you may have to change the cap for certain games where you aren't capable of maintaining that same number of frames, but it doesn't seem like that big of a deal. Not to mention, the more this is brought to AMD's attention, the more likely they are to create a permanent fix.

Here's the most interesting part: me personally, I always game with vsync on, because otherwise you don't get fluid action. If you had random fps, and in one section you turned around and got 400 fps (which I'm sure you did when you stared at the sky), but then when you turned toward more intensive environments the fps dipped, you could obviously notice the difference in fluidness/smoothness.

The funny part is that some of these stutter reports aren't taking into account the fact that somebody's monitor just can't natively handle that much fps (if you recall, some old games would run super, super fast if you didn't turn on vsync), so it chugs and skips frames, aka microstutter. Other causes: shitty coding by the game devs themselves, which gets fixed later in patches, slow drives, or improperly configured BIOSes. I would even point out a test from a long time ago, which I believe was done on an AMD system (way back): by adjusting the memory timings from 3 to 2.5 CAS latency, I ended up going from 30 fps to 60 fps, as simple as that.

Obviously my system was bottlenecked. It's all these factors: people today don't have properly configured rigs, so there are instant reactions and bogus claims, and everybody jumps on the bandwagon without properly doing tests. I've yet to see any microstutter in the games I've played; I keep my system tight and configured... and I cherry-pick game IQ and detail like a whore... just some food for thought about the claims you hear. Sometimes it is just improperly implemented SLI drivers that cause those brief pauses in a particular game, because it's not working in sync properly, but other times it's just people's systems. You shouldn't play games with vsync off; enable triple buffering. Vsync off is for people who can't sustain an fps that matches their monitor's refresh rate.

Edit: here's another good example: conducting SLI microstutter tests on various motherboard models and vendors. Not every board is built the same, even though they can take the same processor and have the same chipset, and we have never seen actual tests like this... so until then it's he-said-she-said, and I say I have no issue with microstutter on 4-way SLI GTX 680 4GBs. Oh yeah, one more thing I forgot to mention: VRAM should be taken into account as well... but I won't get into that, got to go...
 