Bad news for Eyefinity/multi-7970 users

Multi-GPU solutions are just hackjob stop-gaps. A single GPU with that many cores doesn't exist yet, so they slap two together in order to accomplish similar performance gains as best they can. If you want a real, proper upgrade, wait for the next beastly single-GPU card.
I would disagree with that assessment. Multi-GPU cards are an effective way, from a space point of view, to get multiple GPUs into a system, not to mention increasing performance.
Multi-GPU on one card has almost all the same problems as multi-GPU on two cards. All my arguments about added complexity still hold.

A single GPU with more cores is still a lot simpler than a single card with two GPUs, two banks of memory, the added complexity of a PCIe hub, and an internal PCIe bus between the two sets of graphics hardware (which means even more redundant hardware; you're now duplicating a chunk of the motherboard as well).

Putting both GPUs on one card helps with physical space requirements, and somewhat with performance, but it's still not as effective as just putting more cores on one GPU.

So... what exactly are you disagreeing with? It's still just as much of a stop-gap solution, holding position until a single GPU can be released that performs at the same level.
 
Multi-GPU on one card has almost all the same problems as multi-GPU on two cards. All my arguments about added complexity still hold.

A single GPU with more cores is still a lot simpler than a single card with two GPUs, two banks of memory, the added complexity of a PCIe hub, and an internal PCIe bus between the two sets of graphics hardware (which means even more redundant hardware; you're now duplicating a chunk of the motherboard as well).

Putting both GPUs on one card helps with physical space requirements, and somewhat with performance, but it's still not as effective as just putting more cores on one GPU.

So... what exactly are you disagreeing with? It's still just as much of a stop-gap solution, holding position until a single GPU can be released that performs at the same level.

I think what is being asked is, essentially: what's your point? You say that multi-card solutions are a stop-gap to get more power because the current single-card solution doesn't have enough. The solution to this is a new GPU that performs better, so you don't have to use a multi-card solution.

But unless a single-card solution is ever released that provides enough power to run every crazy resolution with all the bells and whistles (unlikely, unless software/game development completely halts), there'll always be a reason to want more power than the current single card offers.

At the end of the day, you're more or less asserting that the solution to multi-GPU issues is simply to have current tech magically skip a few generations and come out about five years ahead of where it is.
 
At the end of the day, you're more or less asserting that the solution to multi-GPU issues is simply to have current tech magically skip a few generations and come out about five years ahead of where it is.

No, I'm saying that (because of its poorly implemented, stop-gap nature) multi-GPU is simply not worth it, and the solution to all of its problems is simply to avoid it like the plague. The vast majority should simply wait for a better single-GPU card instead of trying to glue two graphics cards together.
 
Multi-GPU on one card has almost all the same problems as multi-GPU on two cards. All my arguments about added complexity still hold.

A single GPU with more cores is still a lot simpler than a single card with two GPUs, two banks of memory, the added complexity of a PCIe hub, and an internal PCIe bus between the two sets of graphics hardware (which means even more redundant hardware; you're now duplicating a chunk of the motherboard as well).

Putting both GPUs on one card helps with physical space requirements, and somewhat with performance, but it's still not as effective as just putting more cores on one GPU.

So... what exactly are you disagreeing with? It's still just as much of a stop-gap solution, holding position until a single GPU can be released that performs at the same level.

It's not a stop-gap solution if you need more performance than a single GPU offers at the time of release. Two cards in the same slot space is ideal if you are trying to drive as many monitors as you can in that space; a single 6990 is (in theory) capable of driving 12 displays at once...
 
It's not a stop-gap solution if you need more performance than a single GPU offers at the time of release. Two cards in the same slot space is ideal if you are trying to drive as many monitors as you can in that space; a single 6990 is (in theory) capable of driving 12 displays at once...

That's the definition of a stop-gap... it's a less-than-ideal solution that holds you over until a better solution can replace it.
 
^It's always going to be like that, man: one card comes out, and then later another, faster one comes out... the number of GPUs on the card is irrelevant. To you it might be a stop-gap because of microstuttering, but even single-GPU cards stutter; it's all part of the way it goes.
 
To you it might be a stop-gap because of microstuttering, but even single-GPU cards stutter; it's all part of the way it goes.
Normal game-engine stutter and loading stutter are simply facts of game engines; all cards will experience those. Microstutter on top of that is the fault of the hardware solution. Very different things.

And it doesn't just "seem" stop-gap, it is stop-gap. If AMD could have feasibly released a single-GPU card as fast as the 6990 when the 6990 launched, you can bet they would have (and they would have lorded its performance over Nvidia, big time). In that hypothetical scenario, and considering what the power draw and heat output would have been like for such a card on the larger process size the 6990 used, why even bother releasing a ridiculous dual-GPU card when your single-GPU cards keep up with the competition, right?
 
Most people don't even understand what microstuttering is. If somebody who claims to notice microstuttering is asked to define it, he'd most probably say the game jitters or stutters. That's true as far as it goes, but the real problem is the inconsistency of the graphics cards.
Both are drawing frames as listed, but one frame might be drawn in 10 ms while the next takes 26 ms. Averaged out that might still be 55 FPS, but the frames are not uniformly spread across the entire "timeline".

I've noticed it myself, but as stated in this thread, the only time it gets noticeable is when the FPS is below the refresh rate (for me that means below 60 FPS, as I never noticed it above 60 FPS).
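To make the distinction concrete, here's an illustrative sketch (not from the thread; the frame-time numbers are invented) of how an average-FPS figure hides uneven frame pacing:

```python
def avg_fps(frame_times_ms):
    """Average FPS implied by a list of per-frame render times in milliseconds."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# A smooth sequence and a jittery, AFR-style alternating sequence:
smooth = [18.2] * 10           # every frame takes ~18 ms
jittery = [10.0, 26.4] * 5     # short/long/short/long...

# Both average out to roughly 55 FPS, yet the second one microstutters,
# because the longest frames (26.4 ms) set the perceived smoothness.
print(avg_fps(smooth), avg_fps(jittery))
print(max(smooth), max(jittery))
```

An FPS counter shows the same number for both sequences; only a per-frame-time plot (like the ones in the TechReport article linked later in this thread) exposes the difference.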
 
BFBC2 also had mad stutter with CF when it first came out and AMD dealt with it with driver and CAP fixes. Give it some time perhaps?
 
So is this primarily an Eyefinity issue, or will Crossfire 7970s give bad microstuttering regardless? I was under the impression that the 69XX series did a good job of removing it in single-display setups.
 
I am very happy with my 69XX series' scalability and lack of microstutter. I had it in spades, along with other display issues, with 58XX CF.
 
BFBC2 also had mad stutter with CF when it first came out and AMD dealt with it with driver and CAP fixes. Give it some time perhaps?

The point of the main article is to ask why AMD doesn't have better support before game launches. AMD is always lagging behind in driver support for their new cards or new games.
Giving them time doesn't help the customer who can't play their game as advertised. The fix could come in the next driver update, several updates later, or never.
 
No, I'm saying that (because of its poorly implemented, stop-gap nature) multi-GPU is simply not worth it, and the solution to all of its problems is simply to avoid it like the plague. The vast majority should simply wait for a better single-GPU card instead of trying to glue two graphics cards together.

Sorry, but I have to disagree. Multi-GPU isn't a stop-gap solution. No matter how big and sophisticated you make a die, there is ALWAYS a limit. Multi-GPU is a PERMANENT solution that extends single-GPU power by letting individual GPUs work in an array. Arrays of devices are used in all facets of computing: memory, storage, GPUs, and CPUs.

Again, the limits of a single device will ALWAYS exist, and arrays will ALWAYS let us extend the capabilities of single devices.
 
So is this primarily an Eyefinity issue, or will Crossfire 7970s give bad microstuttering regardless? I was under the impression that the 69XX series did a good job of removing it in single-display setups.

Yes, it's mainly an Eyefinity issue. While running my 2x 6990s on a single screen I've never had any stuttering. It's under Eyefinity that the problems really arise, just as the reviewer pointed out.
 
Yes, it's mainly an Eyefinity issue. While running my 2x 6990s on a single screen I've never had any stuttering. It's under Eyefinity that the problems really arise, just as the reviewer pointed out.

AMD doesn't seem to be able to scale across GPUs and monitors as well as Nvidia. AMD just got HD3D working with CF and Eyefinity, a capability Nvidia had out of the gate with its multi-monitor solution. However, AMD does have much better support for more than 3 monitors.
 
I've had 3x 5870, 3x 6970 reference, and 2x 6970 Asus DC II on an Eyefinity setup, so as you can see I know the problems with multi-GPU/multi-monitor gaming, and believe me, both AMD and Nvidia have problems.
If I can give you advice: the best way to play games on an Eyefinity/Surround setup with no stuttering is to stay with two GPU cores.
Three or four cores cause problems depending on the game.
I played the whole CoD series fine, but in Dirt 2/3 and BF3 I had stuttering with TriFire.
My friend had the same problem with 580 3-way SLI, but he preferred to play with vsync on to solve it.
Another friend has your setup, 6990 CF, and plays BF3 on 3x1 monitors fine, but when he went 5x1 the quad-GPU setup caused problems.
I don't know if you prefer to play with vsync, but I don't.
I want all the power from my GPU going to my screen, and no mouse lag.
Both companies, AMD and Nvidia, have their pluses and minuses.

I have actually used both, for several months. I can tell you Nvidia is better with multi-GPU Surround than AMD: Crossfire stutters and feels choppy, and Crossfire at 100 FPS feels slower than a single GPU at 60 FPS. Nvidia, however, was smooth with dual GPU and Surround. This is one reason why I will wait for Kepler.
 
I have actually used both, for several months. I can tell you Nvidia is better with multi-GPU Surround than AMD: Crossfire stutters and feels choppy, and Crossfire at 100 FPS feels slower than a single GPU at 60 FPS. Nvidia, however, was smooth with dual GPU and Surround. This is one reason why I will wait for Kepler.

My experience is the same. To those who say Nvidia has just as much of a problem with microstuttering and other stuttering in multi-screen setups, I say no way. Having used many cards of each brand and many screen setups, Nvidia comes out ahead in that regard easily.
 
Isn't microstuttering eliminated with split rendering or tile rendering?

I imagine if you had both GPUs working on half a frame each, then assembling and displaying the result, you would completely cut out microstutter. I imagine that would introduce more lag overall, but then again, uneven lag between frames is really what microstutter is.
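For anyone unfamiliar with the two modes being contrasted here, a hypothetical sketch of how AFR and split-frame rendering divide work between two GPUs (the function names and screen dimensions are illustrative, not any driver's real API):

```python
def assign_afr(frame_index, num_gpus=2):
    """Alternate Frame Rendering: whole frames round-robin across GPUs."""
    return frame_index % num_gpus

def assign_sfr(scanline, screen_height=1080, num_gpus=2):
    """Split Frame Rendering: each GPU owns a horizontal band of every frame."""
    return min(scanline * num_gpus // screen_height, num_gpus - 1)

# AFR: GPU 0 draws frames 0, 2, 4, ... and GPU 1 draws frames 1, 3, 5, ...
# Uneven completion times between the two GPUs show up as microstutter.
# SFR: GPU 0 draws the top half of every frame and GPU 1 the bottom half,
# so every displayed frame waits for both GPUs and pacing stays even.
```

As later posts note, SFR trades microstutter for redundant work on geometry that crosses the split line, which is one reason drivers default to AFR.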
 
I have X79, and BF3 is horrendous with 2x 6990. I'm hoping 2x 7990 or 3x 7970 will get the kinks worked out. Hell, people, the 7900 series isn't even officially launched yet, and we have no idea what Catalyst 12.2 will bring us.
 
I have X79, and BF3 is horrendous with 2x 6990. I'm hoping 2x 7990 or 3x 7970 will get the kinks worked out. Hell, people, the 7900 series isn't even officially launched yet, and we have no idea what Catalyst 12.2 will bring us.

I think it's a good idea to wait for Kepler. AMD's multi-GPU setups are a nightmare, and unless by some miracle the 7970 is different, I don't see myself spending $600 on one. Nvidia has the better multi-GPU setup; I have had both, and I speak from experience, not from what I read on review sites. Neither is flawless (no such thing exists in this world; someone will always find a flaw in everything), but Nvidia is the way to go for a multi-GPU setup. For a single GPU, both are great.
 
When using my 2x 6990s in 3x1 portrait, turning on vsync did not make the stuttering any better. With this report, my 5x1 Eyefinity plans are cancelled and I will be going with a single 7990 on my 3x1 setup.

If you are talking about BF3, then turn off vsync and try typing 'gametime.maxvariablefps 60' into the in-game console; it helps my Eyefinity setup run very smoothly.
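The console variable above caps the frame rate so frame intervals stay near a fixed target. A minimal sketch of that idea in Python (illustrative only; BF3 obviously does this inside its engine, and the function names here are invented):

```python
import time

def run_capped(render_frame, target_fps=60, frames=3):
    """Render `frames` frames, sleeping so no two frames are presented
    closer together than 1/target_fps seconds."""
    interval = 1.0 / target_fps
    timestamps = []
    last = None
    for _ in range(frames):
        render_frame()
        now = time.monotonic()
        if last is not None and now - last < interval:
            time.sleep(interval - (now - last))  # pad fast frames up to the cap
        last = time.monotonic()
        timestamps.append(last)
    return timestamps
```

Capping like this sacrifices peak FPS, but it narrows the spread of frame intervals, which is why it can feel smoother than uncapped Crossfire output.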
 
I imagine if you had both GPUs working on half a frame each, then assembling and displaying the result, you would completely cut out microstutter. I imagine that would introduce more lag overall, but then again, uneven lag between frames is really what microstutter is.

I've forced tiling and whatnot instead of alternate frame rendering, and the performance was atrocious. It was slower than going with a single card. AFR is really the only mode used anymore, and that equals stuttering.

I have X79, and BF3 is horrendous with 2x 6990. I'm hoping 2x 7990 or 3x 7970 will get the kinks worked out. Hell, people, the 7900 series isn't even officially launched yet, and we have no idea what Catalyst 12.2 will bring us.

Yeah, 2x 6990 is horrible in BF3. Since you have X79, I would get new cards so you can take advantage of PCIe 3.0: not only additional bandwidth, but PCIe 3.0 is supposed to have lower latency than 2.0. Stuttering is hardware-based; drivers have never helped stuttering problems, so I doubt 12.2 will do anything.

If you are talking about BF3, then turn off vsync and try typing 'gametime.maxvariablefps 60' into the in-game console; it helps my Eyefinity setup run very smoothly.

I have 120 Hz screens.
 
Maybe I will have to wait for Kepler after all, or maybe a single 7990, as my single 6900-series card doesn't have nearly the stuttering problems of 2x 6990s.

I want to see the power of Kepler. :D

1100/1450
[Image: unigine7990.jpg]
 
I have 2x 480 in SLI on 3 Dell U2311H monitors, set up in portrait mode, and I've never had any serious issues with games. I did have an issue with Painkiller (it would not load), but driver 285 fixed it.

My brother-in-law, on the other hand, bought a 5970 and TRIED to play in Surround mode, and he ran into issue after issue after issue until he disabled Crossfire support. He would then complain the resolution was too high. So he got rid of the 2 other monitors and stayed on a single 23-inch.

After seeing that, I am staying away from AMD and CFX. My 2 cents.
 
For folks that say Nvidia has less pronounced microstuttering, there is likely some truth to that. Says Nvidia's Tom Petersen:

In fact, in a bit of a shocking revelation, Petersen told us Nvidia has "lots of hardware" in its GPUs aimed at trying to fix multi-GPU stuttering. The basic technology, known as frame metering, dynamically tracks the average interval between frames. Those frames that show up "early" are delayed slightly—in other words, the GPU doesn't flip to a new buffer immediately—in order to ensure a more even pace of frames presented for display. The lengths of those delays are adapted depending on the frame rate at any particular time. Petersen told us this frame-metering capability has been present in Nvidia's GPUs since at least the G80 generation, if not earlier. (He offered to find out exactly when it was added, but we haven't heard back yet.)

Source: http://techreport.com/articles.x/21516/11 << Good read, by the way.
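As a rough sketch of the frame-metering idea Petersen describes: track the recent average interval between frames and hold back frames that arrive early, so presentation pacing evens out. (The real logic lives in Nvidia's GPU hardware; the moving-average window and names below are assumptions for illustration.)

```python
def meter_frames(arrival_times_ms, window=4):
    """Delay 'early' frames so presentation intervals track the running average."""
    presented = [arrival_times_ms[0]]
    intervals = []
    for t in arrival_times_ms[1:]:
        recent = intervals[-window:]
        avg = sum(recent) / len(recent) if recent else 0.0
        # Don't flip to the new buffer before the average interval has elapsed.
        present_at = max(t, presented[-1] + avg)
        intervals.append(present_at - presented[-1])
        presented.append(present_at)
    return presented

# AFR-style arrivals with alternating ~10 ms / ~26 ms gaps get smoothed
# toward an even ~18 ms cadence:
print(meter_frames([0.0, 10.0, 36.0, 46.0, 72.0]))
```

Note the trade-off implied by the quote: evening out presentation this way adds a small amount of display latency to the frames that are delayed.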
 
For folks that say Nvidia has less pronounced microstuttering, there is likely some truth to that. Says Nvidia's Tom Petersen:



Source: http://techreport.com/articles.x/21516/11 << Good read, by the way.

Thanks for the link. This might also explain something that pops up in game reviews, where CF setups clock higher frame rates but SLI setups play games more smoothly. The explanation of Nvidia's frame-metering technology seems to provide a good reason for that observation.
 
For folks that say Nvidia has less pronounced microstuttering, there is likely some truth to that. Says Nvidia's Tom Petersen:



Source: http://techreport.com/articles.x/21516/11 << Good read, by the way.

Great article. I think a few slides sum it up:

[Image: bc2-gtx580sli.gif]


[Image: bc2-6970cfx.gif]



This is why I feel Kepler is going to win the multi-GPU crown this year. Nvidia has taken active steps to minimize stuttering and AMD has not.
 
View all of the results here: http://techreport.com/articles.x/21516/5

(The multi-GPU results are in the bottom half of the page.) The AMD results look poor across the board.

I read through the entire article, and frankly there's a lot more than just frame times behind everything... At this point, between my own experiences and the quantitative and subjective measurements, I'm going to stick to a single super-fast GPU for a long time now. Excellent read. While Nvidia is better at multi-GPU rendering, neither vendor is perfect. Even ignoring driver issues and going solely off the info in this kind of article, Nvidia is definitely the superior choice for multi-GPU setups where and when they're needed.
 
The problem is for those of us who love multi-monitor. We are forced to go SLI/Crossfire, as there is no single card capable of multi-monitor gaming performance. :(
 
I've forced tiling and what not instead of alternate frame rendering and the performance was atrocious. It was slower than going with a single card. AFR is really the only thing used anymore and that equals stuttering.
.

I want to hear more about this. I don't have SLI or Crossfire. Do you have options to force split-frame rendering or tile rendering in games? Do you get a worse frame rate using this method than with a single card? Does microstutter change?

I'm not sure how split rendering even works. I mean, it can't just literally split the screen down the middle and render only half, right? That sounds like something you could only do with ray tracing. I would imagine one card may end up rendering a model that intersects the other half of the screen, which means there's some redundant rendering going on, or some driver logic that has to decide which card renders which model (like Lucid Hydra technology).
I read through the entire article and frankly there's a lot more than just frametimes behind everything...
Behind "everything"? Behind what? The entire article deals only with the perceived smoothness of frames. How do frame times not completely capture everything to do with video smoothness, and with our specific topic: microstutter?
 
The problem is for those of us who love multi-monitor. We are forced to go SLI/Crossfire, as there is no single card capable of multi-monitor gaming performance. :(

This.

I was ready to buy 2x 7970s for BF3 on ultra across 3 monitors, but I will sit on my hands until I hear more about their Crossfire microstutter. I'd be pissed if $1200 of video cards played worse than my 570s...
 
Behind "everything"? Behind what? The entire article deals only with the perceived smoothness of frames. How do frame times not completely capture everything to do with video smoothness, and with our specific topic: microstutter?

Oh yay, I finally have occasion to use it: "RTFA".
 