AMD Radeon RX 480 8GB CrossFire Review @ [H]

I enjoyed the review thank you.

Personally, as one or two others mentioned, I think the frametime analysis should have been its own separate article comparing multiple xfire/SLI setups.

Who would think frametime on a multi-gpu setup would compete with a single card? Has that ever been the case? If not, why build a review around it?

It's okay to say (if warranted) that AMD frametime sucks butt compared to Nvidia frametime. It's also okay to back up your user experience with frametime data if that user experience simply sucked due to the issue. ...but to include a chart and discuss it on every game? It then becomes the focus and, in my mind, changes into a xfire frametime review.

Review page 4:
"After disabling Wind Effected Snow, "Ultra" settings were very enjoyable in this game on RX 480 CrossFire at 1440p with great framerates."

.... and further down, when bringing up frametime:
"There is quite a big difference in The Division in frametime between AMD Radeon RX 480 8GB CrossFire frametime and GeForce GTX 1080 frametime. This coincides with what we felt in this game on CrossFire."

Which is it? I'd argue if the play was "very enjoyable.... ....with great framerates" then putting excessive focus on discussing frametime seems disingenuous.
 
Personally, as one or two others mentioned, I think the frametime analysis should have been its own separate article comparing multiple xfire/SLI setups.

That would not be a bad idea.

Then it could be linked in all caps every single time anything multi-GPU is reviewed. Something like:

WARNING: FRAMERATE NUMBERS IN THIS ARTICLE ARE NOT NECESSARILY REFLECTIVE OF THE ACTUAL GAMING EXPERIENCE DUE TO ISSUES ASSOCIATED WITH MULTI-GPU SETUPS

This statement should be in every SLI and Crossfire article, including articles reviewing products which have multiple GPUs on the same board, as they work EXACTLY the same way, and for some reason many people seem to think they are somehow different from regular SLI/Crossfire.

Who would think frametime on a multi-gpu setup would compete with a single card? Has that ever been the case? If not, why build a review around it?

I don't know, maybe because AMD specifically made this claim in their launch?


[Image: AMD-Radeon-RX-480-Crossfire-vs-Nvidia-GTX-1080.jpg (AMD's launch slide comparing RX 480 CrossFire against the GTX 1080)]



Normally I would agree. SLI/Crossfire compared to single GPU is never fair, because while you might get higher average and max framerates, the min framerates usually suffer, and the frametime inconsistencies/stutter are obnoxious, as is the added input lag. Once AMD put this on the table as their way of competing with Nvidia on the high end, it instantly became a fair test though.
 
I enjoyed the review thank you.

Personally, as one or two others mentioned, I think the frametime analysis should have been its own separate article comparing multiple xfire/SLI setups.

Who would think frametime on a multi-gpu setup would compete with a single card? Has that ever been the case? If not, why build a review around it?

It's okay to say (if warranted) that AMD frametime sucks butt compared to Nvidia frametime. It's also okay to back up your user experience with frametime data if that user experience simply sucked due to the issue. ...but to include a chart and discuss it on every game? It then becomes the focus and, in my mind, changes into a xfire frametime review.

Review page 4:
"After disabling Wind Effected Snow, "Ultra" settings were very enjoyable in this game on RX 480 CrossFire at 1440p with great framerates."

.... and further down, when bringing up frametime:
"There is quite a big difference in The Division in frametime between AMD Radeon RX 480 8GB CrossFire frametime and GeForce GTX 1080 frametime. This coincides with what we felt in this game on CrossFire."

Which is it? I'd argue if the play was "very enjoyable.... ....with great framerates" then putting excessive focus on discussing frametime seems disingenuous.
If you think comparing the two is unfair, I would suggest you take that up with AMD, as they were the ones that put forth this comparison first. Obviously AMD thinks it is a fair comparison, so I am unsure why you think it is unfair for HardOCP to hold AMD to its own marketing.

The point is that we had to turn off features to get the game to feel fluid. There are issues with AMD RX 480 CF frametimes that make the game feel NOT fluid at framerates that would be fine on a single card. We have documented this for years, and it is why NVIDIA invented FCAT.
 
Who would think frametime on a multi-gpu setup would compete with a single card? Has that ever been the case? If not, why build a review around it?
HardOCP reviews have focused on "real world gaming" for a very long time. Frame rates can be deceiving if you don't also have frame time data, because it's possible for a game to post a very playable framerate in raw benchmarking yet in practice be plagued with stutters. There is also a palpable difference in feel between a game running 60fps with a consistent 16.6ms frametime (perfect timing @ 60Hz) and one that shows 60fps but has frametimes swinging up to 30-40ms.
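As a rough illustration of that last point (a minimal Python sketch with made-up numbers, not anything from HardOCP's toolchain), two runs can report the same average FPS while the frametime distribution tells a very different story:

[CODE]
# Minimal sketch (illustrative numbers, not data from the review): two runs that
# both average ~60 fps, one with steady frametimes and one with periodic 40 ms spikes.
smooth_run = [16.6] * 60                  # every frame ~16.6 ms -> steady 60 fps
stutter_run = [14.0] * 54 + [40.0] * 6    # same ~16.6 ms average, but with spikes

def summarize(frametimes_ms):
    """Return average fps, 99th-percentile frametime, and worst frametime."""
    avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
    ordered = sorted(frametimes_ms)
    p99 = ordered[min(len(ordered) - 1, int(0.99 * len(ordered)))]
    return avg_fps, p99, ordered[-1]

for label, run in (("smooth", smooth_run), ("stutter", stutter_run)):
    avg_fps, p99, worst = summarize(run)
    print(f"{label}: {avg_fps:.0f} fps avg, 99th percentile {p99:.1f} ms, worst {worst:.1f} ms")
[/CODE]

Both runs print roughly 60 fps average, but the stuttery one spends its worst frames at 40 ms, which is exactly the kind of thing a frametime plot catches and a framerate bar chart hides.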

The point of the article was vetting AMD's claim that two RX 480s deliver better performance than the GTX 1080. Frame times are an important metric in validating that.
 
Your reviews are centred around gameplay.
I really wish this kind of information was in the review.
This is perhaps one of the most important elements when comparing with multi-card.
We are all about the gameplay, which is a large part of why we come here!

Sorry for the confusion. What I meant was, I could tell the difference when I put the cards in and started gaming. Once I found the highest playable settings (which required me to get the framerates high enough), it smoothed out. If we had reversed the settings and put the RX 480 CF at the settings the GTX 1080 is playable at, then for sure you would be able to tell the difference just by feel. I started each game by trying to play at the highest in-game settings, and at those settings the games were laggy and choppy until I dialed in what was playable. This has happened with CrossFire in the past: you need to obtain a high FPS to negate the choppiness, smooth it out, and make it playable.
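As a rough back-of-the-envelope illustration of why high FPS masks the problem (illustrative numbers, not figures from the review): at an average of 60 FPS the mean frametime is about 16.6 ms, so a frame that takes twice as long lands around 33 ms and reads as a visible hitch on a 60Hz display, whereas at an average of 120 FPS the mean is about 8.3 ms, so even a doubled frame still fits inside a single 16.6 ms refresh window and is far harder to feel.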
 
I enjoyed the review thank you.

Personally, as one or two others mentioned, I think the frametime analysis should have been its own separate article comparing multiple xfire/SLI setups.

Who would think frametime on a multi-gpu setup would compete with a single card? Has that ever been the case? If not, why build a review around it?

It's okay to say (if warranted) that AMD frametime sucks butt compared to Nvidia frametime. It's also okay to back up your user experience with frametime data if that user experience simply sucked due to the issue. ...but to include a chart and discuss it on every game? It then becomes the focus and, in my mind, changes into a xfire frametime review.

Review page 4:
"After disabling Wind Effected Snow, "Ultra" settings were very enjoyable in this game on RX 480 CrossFire at 1440p with great framerates."

.... and further down, when bringing up frametime:
"There is quite a big difference in The Division in frametime between AMD Radeon RX 480 8GB CrossFire frametime and GeForce GTX 1080 frametime. This coincides with what we felt in this game on CrossFire."

Which is it? I'd argue if the play was "very enjoyable.... ....with great framerates" then putting excessive focus on discussing frametime seems disingenuous.

The game had poor FPS with Wind Effected Snow enabled; for some reason it hurt frame rates and created some stutter. Turning it off gave a big boost to both, and as we stated, we had great frame rates then. However, the frame times revealed it still wasn't exactly perfect. Before finding our playable settings the game was very choppy, and it took a high FPS to smooth it out. Knowing how bad the frame times were after testing was one reason why we felt the game was choppy in CrossFire and needed settings like Wind Effected Snow adjusted to be playable.
 
The game had poor FPS with Wind Effected Snow enabled; for some reason it hurt frame rates and created some stutter. Turning it off gave a big boost to both, and as we stated, we had great frame rates then. However, the frame times revealed it still wasn't exactly perfect. Before finding our playable settings the game was very choppy, and it took a high FPS to smooth it out. Knowing how bad the frame times were after testing was one reason why we felt the game was choppy in CrossFire and needed settings like Wind Effected Snow adjusted to be playable.

Are you saying that you chose the final IQ taking the choppiness/frametime/latency into consideration, i.e., the IQ used in your review did not produce frametime-induced stutter?

Or are you saying the IQ led to enjoyable playing, but frametime still affected play on occasion?
 
Are you saying that you chose the final IQ taking the choppiness/frametime/latency into consideration, i.e., the IQ used in your review did not produce frametime-induced stutter?

Or are you saying the IQ led to enjoyable playing, but frametime still affected play on occasion?

The frame time that is graphed is from the playable settings, so technically the frame time shown is for the playable settings.

Did I feel the stutter at the playable settings? No. That was the point of lowering settings in the game to find what is playable and smooth.
 
There may be hope for DX12 mGPU.


Microsoft Refines DirectX 12 Multi-GPU with Simple Abstraction Layer

Microsoft is sparing no efforts in promoting DirectX 12 native multi-GPU as the go-to multi-GPU solution for game developers, obsoleting proprietary technologies like SLI and CrossFire. The company recently announced that it is making it easier for game developers to code their games to take advantage of multiple GPUs without as much coding as they do now. This involves the use of a new hardware abstraction layer that simplifies the process of pooling multiple GPUs in a system, which will let developers bypass the Explicit Multi-Adapter (EMA) mode of graphics cards.

This is the first major step by Microsoft since its announcement that DirectX 12, in theory, supports true Mixed Multi-Adapter configurations. The company stated that it will release the new abstraction layer as part of a comprehensive framework into the company's GitHub repository with two sample projects, one which takes advantage of the new multi-GPU tech, and one without. Exposed to this code, game developers' learning curve will be significantly reduced, and they will have a template on how to implement multi-GPU in their DirectX 12 projects with minimal effort. With this, Microsoft is supporting game developers in implementing API native multi-GPU, even as GPU manufacturers stated that while their GPUs will support EMA, the onus will be on game-developers to keep their games optimized.


From what I have seen of DX12 multi-GPU thus far, it is cool that it works with mixed GPUs from mixed vendors, but the stuttering and other multi-GPU issues (as tested in AotS) are actually WORSE than in SLI and Crossfire.
 
I think there is another aspect that's hard to pin down: different folks perceive frame time changes differently. What is smooth or comfortable for Brent may not be for you. To me, if dual GPUs are not very consistent in producing frames, you are not gaining much, because you have to lower settings and hit an even higher frame rate (if you can) to get smooth gameplay, defeating the purpose of the second card.

Also, WTH happened to AMD frame pacing? It looks like it went out the window, back to the poorer frame time variance that only looks good in FPS numbers but does not actually benefit the user (it actually makes things worse for the user).

I also noticed, at least with the 16.6.2 drivers (these were bad drivers for me, doing funky stuff with the Nano fan), that frame rate target basically did not work on anything I tested. It helps smooth out CFX gameplay, but you shouldn't have to do that if you have good frame pacing.

I wonder if this is just growing pains with the RX 480's frame times and whether other CFX configurations have better, smoother frame times. Yes, maybe a separate CFX/SLI review dealing with smoothness, frame times, etc. would help identify the broader picture. Do Fiji GPUs have this gross problem with frame pacing? Nvidia cards? etc.
 
Likely frame pacing will be better once AMD refines the drivers for the cards. They've only been out about two weeks, so there are still probably a few tweaks that AMD can make.
 
Likely frame pacing will be better once AMD refines the drivers for the cards. They've only been out about two weeks, so there are still probably a few tweaks that AMD can make.
How long do you think people should wait before buying one?
 
If you're only buying one, it won't matter. Frame pacing is only a problem in CFX.
That much was certain.
What about crossfire?

edit: I suppose I did say one, my error.
 
It was a rhetorical question, soz.
Algrim acknowledged the problem but was defending the release of the card, touted as a 1080 beater in crossfire, even though the support isn't there.
I wondered how long he thought was acceptable for support to arrive.
 
Nope, a 1070 or 1080 is a lot more appealing. CFX and SLI have been getting pushed to the side the last few years (SLI scaling on the 1070 is a joke so far), so I'm going back to a single card. If CF 480s just destroyed the competition it would be one thing, but they don't, so there's no reason to saddle myself with multi-card problems anymore. Bring on the 1080ti and let's call it a day.
 
Unfortunately the review did not test DX12 EMA with the recent ROTTR update. I wonder if the frame times with DX12 will be a big improvement. At least two data points could be tested: AotS and ROTTR. There is an opportunity to do this with the 1060, though, if HardOCP actually gets two 1060s for review.
 
I thought EMA mGPU didn't need bridges, that it goes through PCIe and can mix vendors? Or is that just in AotS? I'm getting confused by all the different mGPU types...
 
I thought EMA mGPU didn't need bridges, that it goes through PCIe and can mix vendors? Or is that just in AotS? I'm getting confused by all the different mGPU types...
According to the spec, yes, you do not need bridges when mGPU is used.
 
I have heard that using a FreeSync monitor helps mitigate the effects of the frametime issue, but I have never seen an article test it. Any thoughts on this?
 
How long do you think people should wait before buying one?

I'd wait for AIB custom cards that won't have any uncertainty in regard to the power issues. AMD, so far this year, has been very responsive in regard to patches so I don't think we'd need to wait too much longer.
 
I'd wait for AIB custom cards that won't have any uncertainty in regard to the power issues. AMD, so far this year, has been very responsive in regard to patches so I don't think we'd need to wait too much longer.
AIB cards won't have an impact on crossfire function though.
 
That is true. If you're trying to get a 1080 on the cheap, can afford the power budget (which I can't), and are willing to a) lower the details so that the stutter isn't as noticeable (actually, AIB custom cards could help here if they can clock higher, so that you can keep more eye candy up whilst gaming), and b) wait a bit for improved drivers to help address the mGPU frame time issues, then I don't see an issue with doing it now.

Or, get one card now, enjoy it in reduced eye candy mode, and pick up another card when AMD addresses the frame time issues.

Probably buried somewhere in this thread, but does having a FreeSync monitor mask or mitigate the frame stutter? I would assume so, but as I have neither a FreeSync nor a G-Sync monitor I really don't know.
 
It was a rhetorical question, soz.
Algrim acknowledged the problem but was defending the release of the card, touted as a 1080 beater in crossfire, even though the support isn't there.
I wondered how long he thought was acceptable for support to arrive.


Multi-GPU setups of any kind (Crossfire/SLI/multi-GPU boards) are like Toyota Supras.

Dyno queens that can post some fantastic numbers in ideal circumstances but fall flat on their face in practical use on the track.

If you just run canned benchmarks, and look at peak and average framerates, Multi-GPU solutions look great. If you actually want to play games with them, not so much.

Games are either broken and don't run at all, or stutter like hell, or suffer from crippling minimum framerates, even when their max and average framerates look good.

From the early testing I've seen of DX12 multi-GPU, this isn't going to change.

I've used both Crossfire and later SLI. SLI was certainly a better experience than crossfire, but in the end they both stunk. Nothing beats having a single strong GPU, and it's well worth any added cost.
 
That is true. If you're trying to get a 1080 on the cheap, can afford the power budget (which I can't), and are willing to a) lower the details so that the stutter isn't as noticeable (actually, AIB custom cards could help here if they can clock higher, so that you can keep more eye candy up whilst gaming), and b) wait a bit for improved drivers to help address the mGPU frame time issues, then I don't see an issue with doing it now.

Or, get one card now, enjoy it in reduced eye candy mode, and pick up another card when AMD addresses the frame time issues.

Probably buried somewhere in this thread, but does having a FreeSync monitor mask or mitigate the frame stutter? I would assume so, but as I have neither a FreeSync nor a G-Sync monitor I really don't know.


Freesync barely even works with Crossfire, period. It makes the stuttering 10x worse, hence one of the reasons why I sold my RX 480s.

Crossfire R9 290s and the R9 295X2 behaved the same way. It got fixed in one driver, then broken again in the next.
 
That is true. If you're trying to get a 1080 on the cheap, can afford the power budget (which I can't), and are willing to a) lower the details so that the stutter isn't as noticeable (actually, AIB custom cards could help here if they can clock higher, so that you can keep more eye candy up whilst gaming), and b) wait a bit for improved drivers to help address the mGPU frame time issues, then I don't see an issue with doing it now.

Or, get one card now, enjoy it in reduced eye candy mode, and pick up another card when AMD addresses the frame time issues.

Probably buried somewhere in this thread, but does having a FreeSync monitor mask or mitigate the frame stutter? I would assume so, but as I have neither a FreeSync nor a G-Sync monitor I really don't know.
I agree that using multi-card needs a much higher framerate than the target to become smooth enough; it doesn't work in all cases though.
It's quite a penalty, and in many cases it is hard or impossible to achieve, especially with slower cards.
Recommending crossfire based on a driver fix is naughty unless you know for sure what is being fixed and when.
 
I agree that using multi-card needs a much higher framerate than the target to become smooth enough; it doesn't work in all cases though.
It's quite a penalty, and in many cases it is hard or impossible to achieve, especially with slower cards.
Recommending crossfire based on a driver fix is naughty unless you know for sure what is being fixed and when.

Compensating for the flakiness of multi-GPU setups with higher framerates helps eliminate the problems of input lag and low minimum framerates, but it does very little (if anything at all) for stutter, and if your hardware is fast enough to produce framerates in multi-GPU high enough to avoid added input lag and low minimum frame rates, you are probably better off just using one of the cards in single-GPU mode anyway :p
 
Compensating for the flakiness of multi-GPU setups with higher framerates helps eliminate the problems of input lag and low minimum framerates, but it does very little (if anything at all) for stutter, and if your hardware is fast enough to produce framerates in multi-GPU high enough to avoid added input lag and low minimum frame rates, you are probably better off just using one of the cards in single-GPU mode anyway :p
I agree.
A lot of work is needed to make it viable for me.
 
The frame time that is graphed is from the playable settings, so technically the frame time shown is for the playable settings.

Did I feel the stutter at the playable settings? No. That was the point of lowering settings in the game to find what is playable and smooth.

Guys, this is what I was trying to get at... and I don't think it was clear.

Brent took frametime stutter into consideration.

The last few comments make me wonder if folks understand this. The IQ and the resolutions Brent used in testing TOOK XFIRE STUTTER INTO CONSIDERATION. The IQ and FPS you see in the review already accounted for it.

I don't mean to say it's not an issue, or to say whether it's worth it or not; I just wanted to be clear on that matter for myself and everyone else. Sorry, I don't mean to rant, done.
 
Looong time lurker (since 2000), so forgive the noobiness, but I have a question and this is obviously the best place to ask.

I see in the reviews and discussion that SLI/X-fire have input delays and stutter. My last build was a pair of GTX 280s to finally max out Crysis (1) at max settings and 1920x1200. Yes, it has been a while, and only recently have I had time to look at a new build. I am actually leaning 1080, and 1 or 2 based on responses...

Understand, I am not a competitive gamer, but I enjoy FPS games and I am quite sensitive to stutter and tearing, though I do not have a good feel for how sensitive I am to input lag. That said, I have found that setting up SLI and, in the control panel, forcing v-sync and 0 prerendered frames always gave me smooth-as-silk performance with no noticeable input lag.

So, my questions: 1) Does forcing V-sync to the monitor's refresh rate help with the pacing? 2) Is zero prerendered frames something you guys are using with multi-card setups, and do you find that it increases or decreases input lag?
 
Looong time lurker (since 2000), so forgive the noobiness, but I have a question and this is obviously the best place to ask.

I see in the reviews and discussion that SLI/X-fire have input delays and stutter. My last build was a pair of GTX 280s to finally max out Crysis (1) at max settings and 1920x1200. Yes, it has been a while, and only recently have I had time to look at a new build. I am actually leaning 1080, and 1 or 2 based on responses...

Understand, I am not a competitive gamer, but I enjoy FPS games and I am quite sensitive to stutter and tearing, though I do not have a good feel for how sensitive I am to input lag. That said, I have found that setting up SLI and, in the control panel, forcing v-sync and 0 prerendered frames always gave me smooth-as-silk performance with no noticeable input lag.

So, my questions: 1) Does forcing V-sync to the monitor's refresh rate help with the pacing? 2) Is zero prerendered frames something you guys are using with multi-card setups, and do you find that it increases or decreases input lag?

Just buy one good 1080 and be done with it. I've sworn I'd never do CF/SLI ever again, each time I do I regret it until I dump it for one good card.
 
Looong time lurker (since 2000), so forgive the noobiness, but I have a question and this is obviously the best place to ask.

I see in the reviews and discussion that SLI/X-fire have input delays and stutter. My last build was a pair of GTX 280s to finally max out Crysis (1) at max settings and 1920x1200. Yes, it has been a while, and only recently have I had time to look at a new build. I am actually leaning 1080, and 1 or 2 based on responses...

Understand, I am not a competitive gamer, but I enjoy FPS games and I am quite sensitive to stutter and tearing, though I do not have a good feel for how sensitive I am to input lag. That said, I have found that setting up SLI and, in the control panel, forcing v-sync and 0 prerendered frames always gave me smooth-as-silk performance with no noticeable input lag.

So, my questions: 1) Does forcing V-sync to the monitor's refresh rate help with the pacing? 2) Is zero prerendered frames something you guys are using with multi-card setups, and do you find that it increases or decreases input lag?

I would wait a little longer just for prices to correct on custom AIB models, and also to see which partners release a custom BIOS to increase voltage to, say, 1.5V from under 1.2V.
The problem is the 1080 is not well priced (same situation as the 980 at launch), and it depends how much this affects you in terms of value.
Also, most of us are expecting the Titan/1080ti to really benefit from having more GPCs and SMs, and this can matter more moving to latest-generation games, but that looks to be at least 4 months away even for the Titan model (and the Ti could be several months later).
And then there are the rumours of the 490/490X equivalent releasing, probably in Q4 as well.

But then thinking like that would get into the mindset there is never a good time to buy a card :)
It comes down to how much of a concern value is to you for getting that good performance.
At a minimum it's worth waiting another 3-4 weeks if you can, to get the best 1080 that suits you, and by then we should have a better idea whether custom BIOSes will be released by partners.
Asus has released their XOC BIOS now, but that is designed for a good watercooling solution and does not have much flexibility, as it is more for the extreme OCer.

Cheers
 
RX 480 CrossFire looks bad. Looks like a single GTX 1080 is the way to go.
Or you can wait for the 1080Ti or RX 490. But with the RX 480 performing around 1060 level or slightly below, the best we could hope for is the RX 490 competing with the 1070.
 
FWIW,

I love my Fury X crossfire pair. I never have issues in the titles I play. I'm not juggling turning crossfire on and off. It just works well! I'm assuming the real world driver experience should be equally smooth on the RX480.

The games I've played since getting the crossfire pair:

Doom
Shadow of Mordor
Path of Exile
Dirt Rally
Star Wars Battlefront
Battlefield 1
Mad Max
Witcher 3

I've not had to juggle crossfire on and off for any of these. Granted I only started playing with Crossfire around Black Friday of 2016, but my experience has been painless.
 
FWIW,

I love my Fury X crossfire pair. I never have issues in the titles I play. I'm not juggling turning crossfire on and off. It just works well! I'm assuming the real world driver experience should be equally smooth on the RX480.

The games I've played since getting the crossfire pair:

Doom
Shadow of Mordor
Path of Exile
Dirt Rally
Star Wars Battlefront
Battlefield 1
Mad Max
Witcher 3

I've not had to juggle crossfire on and off for any of these. Granted I only started playing with Crossfire around Black Friday of 2016, but my experience has been painless.

Consider yourself lucky. Crossfire had terrible scaling for NFS 2015 and had to be turned off for CSGO. It worked great in RoTR and Hitman though.
 