AMD Crossfire a scam - Almost no benefit over single card

The issue, the way I read it, is that the frames are mis-timed because the GPU cannot fully render the frame to be displayed before the previous frame has been rendered by the other GPU. So unless you can render the frame faster somehow, the only thing you can do is remove the runt frame (which will most likely cause artifacts similar to the ones already present) or slow down the frame being delivered by the other GPU. Either way you're losing a partially rendered frame that Fraps counts as a whole frame, thus dropping the recorded average fps.

I can assure you, the GPU is not fully rendering all the frames; if it were, there wouldn't be an issue with runt frames.

The way I see it, due to AFR and AMD's lack of frame metering, the fully rendered frames are just not delivered evenly - aka microstutter. Runt frames are still fully rendered on the GPU, but delivered with very poor timing.

I would be very surprised if AMD was only rendering partial frames on purpose for performance advantages, and then timing the next frame to cover it up; it sounds like it's more difficult to do than implementing frame metering in the first place.
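
(For what it's worth, the uneven delivery is easy to sanity-check yourself from a frametimes dump. A rough Python sketch below - the file name and the "one cumulative millisecond timestamp per line" format are my assumptions, adjust to whatever your capture tool actually writes:)

# Toy microstutter check from a FRAPS-style frametimes dump.
# Assumes "frametimes.csv" holds one cumulative millisecond timestamp per line.

def load_timestamps(path="frametimes.csv"):
    with open(path) as f:
        return [float(line.strip()) for line in f if line.strip()]

def analyze(timestamps):
    # Per-frame display intervals in ms.
    deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
    avg = sum(deltas) / len(deltas)
    # How much each interval deviates from its neighbour -- the unevenness
    # that shows up as microstutter even when the average FPS looks fine.
    jitter = sum(abs(b - a) for a, b in zip(deltas, deltas[1:])) / (len(deltas) - 1)
    print(f"average frame time: {avg:.2f} ms  (~{1000 / avg:.1f} fps)")
    print(f"frame-to-frame jitter: {jitter:.2f} ms")

analyze(load_timestamps())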
 
As they have it now, the frames are rendered more immediately. This is actually by design, but they say they didn't realize it was contributing to MS. You can choose not to believe it, if you want.

I don't believe it's by design, as that's just the default behaviour of AFR. Them saying they purposely deliver frames in as little time as possible is another way of saying they don't have frame metering tech. AFR, or any method of frame rendering, tries to deliver frames as quickly as possible; however, Nv noticed long ago that this is not necessarily conducive to smooth gameplay, so they added frame metering to deal with the issue. Basically Nv went that extra step to ensure smooth performance.
 
I would be very surprised if AMD was only rendering partial frames on purpose for performance advantages, and then timing the next frame to cover it up; it sounds like it's more difficult to do than implementing frame metering in the first place.

Yup, this is NOT what is happening. If they were doing the partial frame thing then vsync wouldn't fix it while keeping performance high.


EDIT:

BTW Flu!d, you have the wrong definition of what rendering is. Each frame is fully rendered in the frame buffer and the work is fully done, hence the GPUs actually heat up and have to draw more power. What is happening is that as the card is reading, and thus sending, the info from the frame buffer to the display, the new frame overwrites it.
This is normal for a VSync-off situation; the thing is that with a single GPU you would only experience the stutter of a sudden timing variation if something happened in the render pipeline to suddenly tax the system.
With multi-GPU, let me try to put it graphically; each 1 is an actual frame:
Card1: 10000010000010000010000010
Card2: 01000001000001000001000001

What the user perceives:
Actual: 11000011000011000011000011
Two frames delivered so fast that they feel more like a single one, especially as the first doesn't even get more than a couple of lines drawn before the second one begins to be drawn instead.

Ideal: 1001001001001001001001001001
^^ This is what proper frame metering has to try and achieve, adding delays on the order of milliseconds in order to achieve a smoother experience.
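
If it helps, here's a toy Python model of that same picture. The 20 ms / 2 ms numbers are completely made up; it's only meant to show how a metering delay turns the tight 1-1 pairs into evenly spaced presents:

# Toy AFR timing model: two GPUs each take 20 ms per frame but start 2 ms apart,
# so unmetered presents come out in tight pairs (2 ms, 18 ms, 2 ms, ...).
# Metering holds a finished frame back so presents land ~10 ms apart.

FRAME_TIME = 20.0   # ms each GPU needs per frame (invented)
OFFSET     = 2.0    # ms between the two GPUs starting work (invented)

def present_times(frames, metered):
    times, last = [], 0.0
    for i in range(frames):
        gpu_start = (i // 2) * FRAME_TIME + (i % 2) * OFFSET
        ready = gpu_start + FRAME_TIME
        if metered:
            # Don't present sooner than half a frame after the previous present.
            ready = max(ready, last + FRAME_TIME / 2)
        times.append(ready)
        last = ready
    return times

for metered in (False, True):
    t = present_times(8, metered)
    gaps = " ".join(f"{b - a:4.1f}" for a, b in zip(t, t[1:]))
    print(("metered  " if metered else "unmetered"), "gaps (ms):", gaps)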

edit to add 2:
http://www.rage3d.com/board/showthread.php?t=33972974
Last post by caveman-jim about why AFR was the de facto SLI & Crossfire solution:
"From previous conversations with AMD peeps, it's not worth the effort. while what you say is true, the performance doesnt justify the result, especially when you take into account AO which requires info from what could be in the other card's frame buffer."
 

It works the same way as Nvidia's SLI; AMD just never thought about frame metering, and now they have to play catch-up, as Nvidia has done frame metering in their drivers for quite some time.
 
A runt frame is an Ethernet frame that is less than the IEEE 802.3 minimum length of 64 octets. Possible causes are collisions, underruns, a bad network card, or software.
 
I can assure you, the GPU is not fully rendering all the frames; if it were, there wouldn't be an issue with runt frames.
I can assure you they are, as has been explained numerous times in this and the 19 similar threads, sometimes using pictures. Maybe check that out.
 
I can assure you they are, as has been explained numerous times in this and the 19 similar threads, sometimes using pictures. Maybe check that out.

Some people just believe whatever they want to believe, and this thread is a great example of it.

I am not quite sure how the term "Runt Frame" is being used in this case. It's pretty idiotic in my opinion.

Somehow the word "micro-stuttering" vanished, which I find odd when both camps were suffering from the same issue until nVidia sorted it out lately.

Aside from that, it feels like some viral marketing has come into play lately....:rolleyes:

PS: I don't want to bash anyone, but some trolls are quite obvious. Anyway, I do hope AMD somehow brings an equivalent solution to the table.
 
I am not quite sure how the term "Runt Frame" is being used in this case. It's pretty idiotic in my opinion.


PCPer started misusing the term back in February and got it to stick with the multitude of, ahem, "fans" who didn't know better.
 
Interesting thread. This I know:

I have one Radeon 6950 2GB. It handles things damned well and I can usually run max settings at playable frames on anything. Not locked at 60fps solid, but playable.

My buddy has two 6950s Crossfired. In the same games, he's getting that constant 60fps I wish I had.

If Crossfire were a scam, why would there be such a difference?

Yea, I think someone's out to lunch here.
 
Interesting thread. This I know:

I have one Radeon 6950 2GB. It handles things damned well and I can usually run max settings at playable frames on anything. Not locked at 60fps solid, but playable.

My buddy has two 6950s Crossfired. In the same games, he's getting that constant 60fps I wish I had.

If Crossfire were a scam, why would there be such a difference?

Yea, I think someone's out to lunch here.

You weren't paying very close attention to the content of the article, were you? The gist is that while the reported frame rate may indicate an 80-90% increase in raw output, the end result presented to the user after the scene has gone through the entire pipeline isn't anywhere near as impressive, due to how AMD handles their multi-GPU tech. It's been a commonly noted issue with Crossfire for a long time now; however, people haven't been able to properly quantify it with legit metrics.
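
To make the "reported vs delivered" point concrete, here's a tiny Python sketch in the spirit of what the article's analysis does. The per-frame scanline heights and the 20-line runt cutoff are invented purely for illustration:

# Reported vs observed FPS once runts are filtered out. "heights" is how many
# scanlines each frame occupied on screen during one second of capture
# (made-up numbers); anything under the cutoff is counted as a runt.

heights = [540, 18, 560, 22, 530, 15, 575, 19] * 10   # 80 "frames" in one second

RUNT_CUTOFF = 20  # scanlines; an illustrative threshold, not gospel

reported = len(heights)
observed = sum(1 for h in heights if h > RUNT_CUTOFF)

print(f"reported FPS: {reported}")
print(f"observed FPS (runts dropped): {observed}")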
 
Just a few comments to get this thread back on track:
I don't work for Nvidia; I sell vitamins for a supplement company and consider myself a closet geek.
I have run multi-card setups for about 10 years from both vendors, alternating based on price/performance and the games I'm playing.
I am currently on AMD hardware only.
I build a new system every year and consider myself to be a hardware/gamer junkie in this regard.
I have spent so many years learning about the technology to understand the differences in performance that I could have earned multiple college degrees if I had put the time into that.

I can perceive the difference and unquestionable slop that my crossfire cards are displaying. It is not a plug for nVidia, who have issues on their side, but simply the attempt to bring more information about this situation into the open so that AMD may 'fix' what is unequivocally broken.

I cannot run at 2560x resolution with one card comfortably. I prefer to run near 60fps average if possible, as I have always had enough horsepower in my rigs to approach or exceed that. I don't like minimum frame rates under 45, and I was a competitive gamer for many years, active in many competitions, ladders, and leagues.

My expectation when buying two AMD cards to crossfire has always been that I will get a substantial performance boost in quality, playability and raw speed, with the limitation of some microstutter. Unfortunately that expectation has been misguided, as the gameplay just hasn't felt very good this generation of games using crossfire. I just didn't understand it. Some games were fine and rarely showed any hiccups while delivering a substantial increase in gameplay quality. Unfortunately, for the last year virtually every game I have put time into has been a nightmare compared to the feel of a single card.

I don't use AA in some games because I feel it creates latency lag in how AMD delivers frames (see BF3 with MSAA enabled).

I try not to use VSYNC or TRIPLE BUFFERING as I feel an increase in latency lag (you can feel it when mousing around the screen) in a large number of titles.

Sometimes I'll set my frame limit to 60 in Afterburner's RivaTuner utility. (RadeonPro won't install right on my Win7 for some reason.)

A lot of people don't comprehend the ridiculous amount of power you need to game above 1920x resolutions. This adds more processing time, resources, communication, etc., which detracts from the end-user feel. Eyefinity is even more profoundly affected by these inconsistencies, so I haven't had an interest in dealing with that much overhead and chopping down textures, settings and other options to get a 40fps average that looks less than ideal for thousands of dollars.

From what it sounds like, July will be the next real update to this story, as by then AMD is advertising that they will have similar metering built into their software like NVIDIA has been using for a while now. Will this be enough to fix a completely broken system? Who knows, but from the actual quotes I've read from AMD they aren't exactly advertising it as the solution, so don't hold your breath.

Almost every major game I've played this cycle has been a real letdown in crossfire: BF3, Crysis 3, Bioshock Infinite (what good is 200fps if it's a choppy mess in battles?), Far Cry 3.

My friends and I have joked for a long time that NVIDIA vacuums up the uninformed buyer via marketing and other metrics meant to create an aura of Apple-esque magic around their brand. Ever since my 9700 Pro blew my mind and let me just max the FK out of everything for the first time, I have been completely open to any vendor's product based on advertised/perceived price/performance ratios. As I moved into higher resolutions, AMD has held that perceived seat for quite some time. However, this was the first generation where I simply felt utterly unsatisfied across the board with AMD.
 
Not even a little. Too few nvidiots are willing to accept that CF does improve performance, perceived as well as actual, and too few AMD fanatics are willing to admit there is a real problem.

However, if 30 to 90% better performance in most games is considered "almost no improvement", then NV's SLI is in a similar, slightly slower-sinking boat.
 
I just installed a 6970 into a customer's machine that was running a 6990 and hooked them all up in crossfire. I noticed a bit of microstutter every few seconds or so, but I can definitely say that there was a huge perceptual benefit. Not just numbers; the games felt much more responsive. I don't believe the line "almost no benefit over single card", but I can understand the room for improvement, as SLI setups feel a bit more 'smooth'. Though the 6990/70 tri-fire was definitely a step up from 660 Ti SLI in terms of what visual quality I could crank.
 
I know plenty of folks who gave up on multi-card configs this generation alone, on both sides, who also run very high resolutions.

"Runt frames" was/is very much misused by the reviewers and editors. Unless that particular term was used in the context of their test system being streamed over the net to the recording system, in which case it could make sense, but that does not seem to be the case. I still think Nvidia is doing something; heck, even that "capture" card could be tricked out. It definitely would not be the first time Nvidia or Intel or even AMD f***ed with the results, now would it :p
 
But doesn't this just invalidate any review comparing Crossfire to SLI?
Look at [H]; we have comparisons that almost always show AMD with higher frames than Nvidia.
But by this, those numbers mean nothing.

I think that N knew about this, and the only way they could jump up and say "look, they are cheating" was to come out with a tool that showed it.

I was looking at the 680 vs 7970 dual-card review and comparison, and nowhere did they mention stuttering. Did they just not say it, or did they not see it?
 
Some are more prone to it than others, but I believe PCper and even Nvidia should be helping AMD figure it out to give users a better experience, not doing it to hurt AMD's bottom line like is very much routine. Hell, even if they developed an add-in chip to help, it would be better than nothing - something agnostic to both parties so no one is "hurt" and both sides are "helped".

It's fine to say "you are not getting your money's worth", but it's not good enough without also pointing out what AMD cards are very good at, which is why I stick with them. They all have issues; in my personal opinion, I like the way AMD cards render shadows, lighting, smoke and water vs Nvidia cards in the vast majority of games.

It is a known issue. If Nvidia cared about their consumers as much as they claim to, they would put aside their petty fighting and work with AMD to get this kind of stuff sorted out so the best man could win :)
 
But doesn't this just invalidate any review comparing Crossfire to SLI?
Look at [H]; we have comparisons that almost always show AMD with higher frames than Nvidia.
But by this, those numbers mean nothing.

I think that N knew about this, and the only way they could jump up and say "look, they are cheating" was to come out with a tool that showed it.

I was looking at the 680 vs 7970 dual-card review and comparison, and nowhere did they mention stuttering. Did they just not say it, or did they not see it?

Except AMD isn't cheating; they just had piss-poor quality control with their crossfire support.

The stuttering isn't in every game and it doesn't affect everyone. I had a 4870 CF setup and I didn't notice stuttering on supposedly the worst-stuttering setup. Even looking for it I wasn't seeing it.
 
But doesn't this just invalidate any review comparing Crossfire to SLI?
Look at [H]; we have comparisons that almost always show AMD with higher frames than Nvidia.
But by this, those numbers mean nothing.

I think that N knew about this, and the only way they could jump up and say "look, they are cheating" was to come out with a tool that showed it.

I was looking at the 680 vs 7970 dual-card review and comparison, and nowhere did they mention stuttering. Did they just not say it, or did they not see it?


[H] doesn't use the numbers to review; they do "Highest Playable Settings", meaning they play and feel how the game plays. Kyle and Brent have been saying that "FPS" numbers are misleading for years.

In fact, in [H]'s reviews you can't just look at the numbers or you won't know what's going on; I look at the resolution and game settings to determine how well a card reviewed.

That is why this kind of analysis already shows up in reviews here: if an SLI/XFire setup sucks compared to the single card, regardless of the framerate numbers, they'll say it and it'll show up in the review.
 
It is a known issue. If Nvidia cared about their consumers as much as they claim to, they would put aside their petty fighting and work with AMD to get this kind of stuff sorted out so the best man could win :)

Nvidia helps their customers by doing whatever frame metering/latency control they already do - how would helping AMD be helping Nvidia's customers? And I'm sure that by Nvidia's reckoning the best man already won, and it was them. This isn't fluffy-bunny world, this is corporate competition.
 
Now that I know what it is and how much of the screen it occupies, you can see the 'runt' frame when playing games. Since it is usually rendered in the same location, it creates a tearing-like look that really takes away from the smoothness. The runt location shifts up and down from time to time but is usually 1/3-2/3 of the way up the screen. It looks like this:

1----------------------------------------- 4-----------------------------------------
1----------------------------------------- 4-----------------------------------------
1----------------------------------------- 4-----------------------------------------
1----------------------------------------- 4-----------------------------------------
2///////////////////////////////////////// 5/////////////////////////////////////////
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------

1,3,4,6 = beautifully rendered frame
2,5 = misaligned dwarf of a frame called runt


Despite the comments that they are 'too small to count', it's more like they are big enough to be an annoyance and burden, but not small enough to be insignificant.
It appears almost as a blurred line across the screen because the runt being drawn doesn't quite match up with the frame alignment. (This is in crossfire.)
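
A rough back-of-envelope in Python (numbers invented) on why some runts show up as a visible band while others really are only a couple of pixels tall: a frame's height on screen is roughly its display time as a fraction of the scanout period.

# With vsync off, a frame's on-screen height is roughly its display time divided
# by the scanout period (blanking ignored). Durations below are made up.

REFRESH_HZ   = 60
SCREEN_LINES = 1600            # e.g. a 2560x1600 panel
scanout_ms   = 1000 / REFRESH_HZ

for duration_ms in (8.0, 2.0, 0.5, 0.05):
    lines = SCREEN_LINES * duration_ms / scanout_ms
    print(f"frame shown for {duration_ms:5.2f} ms -> ~{lines:6.1f} scanlines")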
 
Hmmm, I daresay Nvidia cares about the money they will make or not make more than anything else. If that means they have to screw with the competition then they will, even if it means "their" customers suffer at the end of the day. OMG, they did one possibly positive thing to showcase what they are doing, but they will deny all wrongdoings they have ever done :p
 
I bought crossfire because of the generic fps performance vs heat/power consumption.

But I do see some games where crossfire blows, though most of the time it is significantly better than with one card.

I will probably play with vsync options more now that I've read this article (I usually just turn it off).
 
Now that I know what it is and how much of the screen it occupies, you can see the 'runt' frame when playing games. Since it is usually rendered in the same location, it creates a tearing-like look that really takes away from the smoothness. The runt location shifts up and down from time to time but is usually 1/3-2/3 of the way up the screen. It looks like this:

1----------------------------------------- 4-----------------------------------------
1----------------------------------------- 4-----------------------------------------
1----------------------------------------- 4-----------------------------------------
1----------------------------------------- 4-----------------------------------------
2///////////////////////////////////////// 5/////////////////////////////////////////
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------

1,3,4,6 = beautifully rendered frame
2,5 = misaligned dwarf of a frame called runt


Despite the comments that they are 'too small to count', it's more like they are big enough to be an annoyance and burden, but not small enough to be insignificant.
It appears almost as a blurred line across the screen because the runt being drawn doesn't quite match up with the frame alignment. (This is in crossfire.)
Lol. They're 2 pixels high and gone in 16ms. Anytime your framerate doesn't match refresh rate you're going to get tearing.
 
:p So many point it out once they see a little fire and call it a blaze. Nvidia has these problems as well; they make a big stink about how "oh, we've had frame metering for quite some time" - umm, yeah, my arse. I have known folks to switch sides constantly, to go to single cards over the issues they had with almighty SLI and crap quality control from Nvidia as well.

Stuttering and tearing are acceptable terms, but this "runt" frame crud has to go; it is way out of context as a description. The one thing I have to give Nvidia credit for this generation is dynamic Vsync (it's a start) and the boost that all the "new" cards are using. That, folks, is advancing a design, just like AMD did with Eyefinity and pushing forward with GDDR3/4/5, among other things.

They all have things they need to polish out, but I somehow feel this is making a mountain out of a molehill. So many I know use CF and SLI for various reasons, a few I know use high-resolution multi-monitor on a single card for productivity, and many switched to a single card and turned settings down a bit. Is it really that bad to not go balls to the wall just because you "should" be able to?

Hell, even multi-threading for processors in general is nowhere near where it should be, and processors have been out far longer than modern graphics cards with modern features have been. Sometimes you ask for the sun, moon and stars and instead you just end up with a bucket of mud; it is what it is.

RadeonPro apparently helps AMD cards tons, and little things on the Nvidia side, which is "oh so much better and always has been", do not work perfectly 100% either, so don't kid yourselves; there's too much proof against that as well LOL
 
Now that I know what it is and how much of the screen it occupies, you can see the 'runt' frame when playing games. Since it is usually rendered in the same location, it creates a tearing-like look that really takes away from the smoothness. The runt location shifts up and down from time to time but is usually 1/3-2/3 of the way up the screen. It looks like this:

1----------------------------------------- 4-----------------------------------------
1----------------------------------------- 4-----------------------------------------
1----------------------------------------- 4-----------------------------------------
1----------------------------------------- 4-----------------------------------------
2///////////////////////////////////////// 5/////////////////////////////////////////
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------
3----------------------------------------- 6-----------------------------------------

1,3,4,6 = beautifully rendered frame
2,5 = misaligned dwarf of a frame called runt


Despite the comments that they are 'too small to count', it's more like they are big enough to be an annoyance and burden, but not small enough to be insignificant.
It appears almost as a blurred line across the screen because the runt being drawn doesn't quite match up with the frame alignment. (This is in crossfire.)

I have to comment on this. I get this misaligned effect a lot when I play Diablo III on my single card Radeon 7750 at 1920x1200. It's not the lack of FPS, but it seems like something isn't "catching up".
 
The thing I don't understand is that games go from being unplayable to playable at 7680x1600 when going from 1 to 2 cards. I'm not saying that those runt frames aren't happening (because there's evidence to support it), but by no means does the second card NOT help the gameplay experience.

V-sync is botched. Even if I get more than 60 FPS (which I do in many games with the settings I use), V-sync keeps my frame rate at 30 FPS (which is very strange). However, it appears to work if I run the games at 5760x1200 or 5040x1050... so I'm at a loss.
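
For what it's worth, the 30 FPS lock is what plain double-buffered vsync does whenever a frame misses the 16.7 ms window: the frame that isn't ready waits for the next refresh, so the rate snaps to 60/30/20/15. A quick Python sketch (render times made up):

# Why double-buffered vsync quantizes the frame rate: effective fps is the
# refresh rate divided by how many refresh intervals each frame needed.

import math

REFRESH_HZ = 60
period_ms  = 1000 / REFRESH_HZ

for render_ms in (12.0, 17.0, 25.0, 34.0):
    intervals = math.ceil(render_ms / period_ms)
    print(f"{render_ms:5.1f} ms per frame -> presented every {intervals} refresh(es) "
          f"-> {REFRESH_HZ / intervals:.0f} fps")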
 
From the first page of the article ....

"to help expedite a lot of this time consuming testing, some of the code base and applications were developed with NVIDIA and thus were distributed to other editors recently."

"NVIDIA was responsible for developing the color overlay that sits between the game and DirectX (in the same location of the pipeline as FRAPS essentially) as well as the software extractor that reads the captured video file to generate raw information about the lengths of those bars in an XLS file."

"Not only do we NEED to have these tools vetted by other editors, but we also depend on the community to keep us on our toes as well...That is still the goal – with only one minor exception: NVIDIA doesn’t want the source code of the overlay to leak out simply because of some potential patent/liability concerns."

And NO ONE has questioned this!? ... Seriously!?
 
Plenty of people questioned it and realized that all it is doing is putting a color overlay on the image. Guru3D already made their own overlay and got similar results, and new versions of Precision and Afterburner will also generate the overlay. So, in short, not an issue except for people who like to make things into issues.
 
Plenty of people questioned it and realized that all it is doing is putting a color overlay on the image. Guru3D already made their own overlay and got similar results, and new versions of Precision and Afterburner will also generate the overlay. So, in short, not an issue except for people who like to make things into issues.

Got any links to that info?
 
Got any links to that info?

Considering FCAT primarily is an open project, we feel it is a reliable benchmarking methodology. The fact that the ideas and initial software needed originate from NVIDIA doesn't mean it is a subjective methodology. On the contrary, this can become one of the best benchmark methods we have ever tried; it is very time consuming though. From the get-go NVIDIA has been open and transparent about everything related to this testing method. Once the tagged frames arrive at the FCAT machine, honestly, it can't see what graphics card is rendering the game; as such it creates an equal playing field for any brand or type of card you connect to it.

The one piece of software that could be tainted as such would always be on the game machine side, and that's the software overlay - hence here at Guru3D we started implementing our own coded FCAT overlay version into RTSS, the overlay statistics software for MSI AfterBurner and Precision, to immediately kill off that suspicion. Honestly, NVIDIA would shoot itself in the foot if it messed with stuff to gain results in their favor, as the media will detect it, that I guarantee you. But we'd love to hear about it from you guys.

Of course NVIDIA had ulterior motives to start a discussion about FCAT; it exposes the things we see with AMD's AFR-sync multi-GPU solutions versus what needs to be a more constant frame time. If we reverse that, if NVIDIA had had issues at hand, they'd never have introduced this methodology to the public. But that doesn't mean the FCAT methodology is flawed. Now, I took Hitman as a bit of a showcase in this article, as it has extreme (way more than other games) issues with latency spikes that cannot be seen on the monitor. Realistically FCAT didn't show them on AMD single-GPU solutions, so that works out pretty well and is beneficial for AMD, don't you think? So yeah, I believe it is one of the best tools we have had at hand in a long time. But I'm very curious what you guys think about FCAT and will follow and love to listen to you guys in our forums.

http://www.guru3d.com/articles_pages/fcat_benchmarking_review,9.html

And the Precision info was from a comment EVGA-Jacob made over on Overclock.net
 
Plenty of people questioned it and realized that all it is doing is putting a color overlay on the image. Guru3D already made their own overlay and got similar results, and new versions of Precision and Afterburner will also generate the overlay. So, in short, not an issue except for people who like to make things into issues.

Well, I personally don't like making an issue out of things just for the hell of it, but I also don't take things at face value either, and this was screaming at me when I was reading those paragraphs.

Thanks for the Guru3D link.
 
Well, I personally don't like making an issue out of things just for the hell of it, but I also don't take things at face value either, and this was screaming at me when I was reading those paragraphs.

Thanks for the Guru3D link.

I think if the analysis software was closed source it would be a bigger issue, but the color overlay seems pretty generic. The biggest issue is that there is really no way for end-user enthusiasts like us to duplicate the findings - you can enable the color overlay now (I've tried it), but without a capture card it doesn't do anything for you, since locally captured videos don't show it. So there's no way for users to validate the findings, or do other tests beyond what the big websites do.
 
I think if the analysis software was closed source it would be a bigger issue, but the color overlay seems pretty generic. The biggest issue is that there is really no way for end-user enthusiasts like us to duplicate the findings - you can enable the color overlay now (I've tried it), but without a capture card it doesn't do anything for you, since locally captured videos don't show it. So there's no way for users to validate the findings, or do other tests beyond what the big websites do.

60fps video camera? It would show the lines at least. 120fps or more would be better (as the frame isn't visible the whole time).
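
Once you have any capture at all, the extractor side is conceptually simple: walk the overlay column top to bottom and count how many scanlines each colour got. A rough Python sketch, assuming the column has already been decoded into colour labels (with a real capture you'd read the left-edge pixel column of every captured video frame):

# Collapse a top-to-bottom list of overlay colours into (colour, scanlines) runs;
# each run's length is how much of the screen that rendered frame received.

def runs(column):
    out = []
    for colour in column:
        if out and out[-1][0] == colour:
            out[-1][1] += 1
        else:
            out.append([colour, 1])
    return out

# One made-up captured frame: frame A got most of the screen, a runt of frame B
# squeezed in, then frame C took the rest.
captured_column = ["lime"] * 500 + ["red"] * 12 + ["blue"] * 568

for colour, lines in runs(captured_column):
    tag = "runt" if lines < 20 else "ok"
    print(f"{colour:5s}  {lines:4d} scanlines  [{tag}]")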
 
Just chiming in again: I sold a customer a system with an i5 and two HD 7850s (Sapphire Dual-X 2GB). In Unigine Valley at max settings, at 70-110fps, I could not notice any stutter; it was actually quite smooth. I was on a 60Hz screen. I HAVE noticed stutter with other AMD setups, so I'm not trying to be a fanboy here; rather, I'm trying to help drive the point that 'almost no benefit' is a total lie, and I've never seen stutter so bad that I would call it 'no benefit' over a single card. That said, a 660 Ti SLI setup is reliably smooth, and I haven't seen drastic stuttering with an Nvidia setup recently.
 