AMD Radeon RX 480 8GB CrossFire Review @ [H]

Discussion in 'AMD Flavor' started by Kyle_Bennett, Jul 11, 2016.

  1. Nenu

    Nenu Pick your own.....you deserve it.

    Messages:
    16,909
    Joined:
    Apr 28, 2007
    Your reviews are centred around gameplay.
    I really wish this kind of information was in the review.
    This is perhaps one of the most important elements when comparing with multi card.
    We are all about the gameplay which is a large part of why we come here!
     
    The Lamb and Lord Risky like this.
  2. Biostud

    Biostud n00bie

    Messages:
    18
    Joined:
    Nov 12, 2012
    There may be hope for DX12 mGPU.


    Microsoft Refines DirectX 12 Multi-GPU with Simple Abstraction Layer

    Microsoft is sparing no efforts in promoting DirectX 12 native multi-GPU as the go-to multi-GPU solution for game developers, obsoleting proprietary technologies like SLI and CrossFire. The company recently announced that it is making it easier for game developers to code their games to take advantage of multiple GPUs without as much coding as they do now. This involves the use of a new hardware abstraction layer that simplifies the process of pooling multiple GPUs in a system, which will let developers bypass the Explicit Multi-Adapter (EMA) mode of graphics cards.

    This is the first major step by Microsoft since its announcement that DirectX 12, in theory, supports true Mixed Multi-Adapter configurations. The company stated that it will release the new abstraction layer as part of a comprehensive framework in the company's GitHub repository with two sample projects, one which takes advantage of the new multi-GPU tech, and one without. With access to this code, game developers' learning curve will be significantly reduced, and they will have a template on how to implement multi-GPU in their DirectX 12 projects with minimal effort. With this, Microsoft is supporting game developers in implementing API-native multi-GPU, even as GPU manufacturers stated that while their GPUs will support EMA, the onus will be on game developers to keep their games optimized.
     
    NemesisX likes this.
  3. collegeboy69us

    collegeboy69us [H]ardness Supreme

    Messages:
    4,792
    Joined:
    Jul 27, 2003
    This review mirrors my experience with CrossFire 290s from last generation, playing BF4 and GTA5 @ 1440p.... even though the horsepower was there (along with boatloads of heat), the game experience was still overall crappy due to the frametiming issues. Yeah yeah, it's all a software problem apparently, but in the 3 or so years I had that setup, there was *always* some sort of issue going on with either crappy drivers or crappy implementation from the dev. When AMD or a dev says "oh yeah we are working on that issue", I laugh; they've said the same line for years and years, and nothing changes.

    My requirement for next gen was a single card solution -- and I got exactly that in the form of a 1070. Words can't describe how much smoother everything is @ 1440p with a single card. G-sync is icing on the cake as well with my S2716DG. Sure, Crossfire 480's can get you close to the level of a GTX 1080, my first question though... why would you want to go that route and have so many other problems? (not to mention increased power, increased cooling needs, and heat thrown off into the room)

    Maybe it's because I've been around the block more than a few times with GPUs, but raw FPS horsepower means less to me these days vs a quality smooth experience. I'm the type of consumer that would actually rather spend more money for a single card solution and not have to deal with the many issues that still plague multi-GPU setups. The issues aren't limited to AMD; I know SLI has its own set of issues as well.
     
    GoodBoy and The Lamb like this.
  4. harmattan

    harmattan 2[H]4U

    Messages:
    3,762
    Joined:
    Feb 11, 2008
    I get you on including gameplay experience as the empirical data doesn't really describe how bad the situation is. These subjective descriptions are important since people get all googly-eyed when they see high FPS averages that mGPU can provide. It's very easy to "feel" this stuttering when you're playing, and a large frame variance completely nullifies any decent performance you may be getting.

    I do know from my last experience with a 295x2, while the FPS averages were generally terrific in games where Crossfire was supported, most games just felt jittery (some worse than others). Witcher 3 was the absolute worst, felt horrible.
     
    Zarathustra[H] and The Lamb like this.
  5. GeEl2088

    GeEl2088 n00bie

    Messages:
    6
    Joined:
    Apr 15, 2014
    A short video in 60/120fps of the stuttering vs no stuttering might help demonstrate the effect to ppl that haven't experienced it before.
    More work incoming!
     
  6. The Lamb

    The Lamb n00bie

    Messages:
    31
    Joined:
    Nov 11, 2014
    Thanks for the info guys, it's very helpful to have these anecdotes since I don't have any mGPU experience!
     
  7. 10e

    10e 2[H]4U

    Messages:
    3,370
    Joined:
    Jul 20, 2006
    Hmmm, great review guys, and thanks for highlighting the frame pacing issues. Although the RX 480 cards are much cheaper than my R9 Nanos in Crossfire, it's shocking to see them perform more slowly and use the same, or more power.

    I tested my twin R9 Nanos on my i7 4790k at 4.4 GHz on an Asus Mark S board with 16GB of RAM and a single SSD, and the most this setup used according to my Kill-A-Watt was 470 watts.

    Other than the price and 8GB of RAM, this seems like an (only) OK setup comparatively. I'll wait for 1080Ti or Vega late this year or early next.

    And yes, when reading this review, I was thinking of HardOCP and Kyle's information put forth in an older article indicating that Polaris would run hotter and use more power than originally expected. This put a half-smile on my face.
     
    Kyle_Bennett likes this.
  8. Armenius

    Armenius [H]ardForum Junkie

    Messages:
    9,474
    Joined:
    Jan 28, 2014
    It is said in the review that what the data shows coincides with their experience while playing the game.

    The Division 1440p
    There is quite a big difference in The Division in frametime between AMD Radeon RX 480 8GB CrossFire frametime and GeForce GTX 1080 frametime. This coincides with what we felt in this game on CrossFire. It seemed that we needed to obtain higher framerates in order for the game to feel smooth in gameplay. The frametime reveals the game is actually very erratic in frametime on CrossFire. With the single GeForce GTX 1080 the frametime is very tight and consistent.
     
  9. Zarathustra[H]

    Zarathustra[H] [H]ard|News Staff Member

    Messages:
    22,144
    Joined:
    Oct 29, 2000
    Going to be hard to capture it on a 30hz or even 60hz camera.
     
  10. Zarathustra[H]

    Zarathustra[H] [H]ard|News Staff Member

    Messages:
    22,144
    Joined:
    Oct 29, 2000
    Coincides, yes. Looking at the graphs one can see it. But looking at the graphs, it also looks minor, like one wouldn't notice it much. That is far from the case with any multi-GPU solution I have ever used.

    I quite simply find any multi-gpu setup a horrible last resort, when no single GPU is fast enough for the task.
     
  11. toddw

    toddw [H]ard|Gawd

    Messages:
    1,309
    Joined:
    Sep 9, 2004
    I enjoyed the review thank you.

    Personally, as one or two others mentioned, I think the frametime analysis should have been its own separate article covering multiple xfire/SLI setups.

    Who would think frametime on a multi-gpu setup would compete with a single card? Has that ever been the case? If not, why build a review around it?

    It's okay to say (if warranted) that AMD frametime sucks butt compared to Nvidia frametime. It's also okay to back up your user experience with frametime data if that user experience simply sucked due to the issue. ...but to include a chart and discuss it on every game? It then becomes the focus and, in my mind, changes into an xfire frametime review.

    Review page 4:
    "After disabling Wind Effected Snow, "Ultra" settings were very enjoyable in this game on RX 480 CrossFire at 1440p with great framerates."

    .... and further down, when bringing up frametime:
    "There is quite a big difference in The Division in frametime between AMD Radeon RX 480 8GB CrossFire frametime and GeForce GTX 1080 frametime. This coincides with what we felt in this game on CrossFire."

    Which is it? I'd argue if the play was "very enjoyable.... ....with great framerates" then putting excessive focus on discussing frametime seems disingenuous.
     
    primetime likes this.
  12. Zarathustra[H]

    Zarathustra[H] [H]ard|News Staff Member

    Messages:
    22,144
    Joined:
    Oct 29, 2000
    That would not be a bad idea.

    Then it could be linked in all caps every single time anything multi-GPU is reviewed. Something like:

    WARNING: FRAMERATE NUMBERS IN THIS ARTICLE ARE NOT NECESSARILY REFLECTIVE OF THE ACTUAL GAMING EXPERIENCE DUE TO ISSUES ASSOCIATED WITH MULTI-GPU SETUPS

    This statement should be in every SLI and Crossfire article, including articles reviewing products which have multiple GPUs on the same board, as they work EXACTLY the same way, and for some reason many people seem to think they are somehow different from regular SLI/Crossfire.

    I don't know, maybe because AMD specifically made this claim in their launch?


    Normally I would agree. SLI/Crossfire compared to single GPU is never fair, because while you might get higher average and max framerates, the min framerates usually suffer, and the frametime inconsistencies/stutter are obnoxious as is the added input lag. Once AMD put this on the table as their way of competing with Nvidia on the high end, it instantly became a fair test though.
     
    Last edited: Jul 12, 2016
    AlexisRO, primetime, kalston and 2 others like this.
  13. Kyle_Bennett

    Kyle_Bennett HardOCP MasterChef Editor Staff Member

    Messages:
    43,630
    Joined:
    May 18, 1997
    If you think comparing the two is unfair, I would suggest you take that up with AMD, as they were the ones that put forth this comparison first. Obviously AMD thinks it is a fair comparison, so I am unsure as to why you think it is unfair for HardOCP to hold AMD to its own marketing.

    The point is that we had to turn off features to get the game to feel fluid. There are issues with AMD RX 480 CF frametimes that make the game feel NOT fluid at framerates that would feel fine on a single card. We have documented this for years, and it is why NVIDIA invented FCAT.
     
    Algrim likes this.
  14. Rizen

    Rizen [H]ardForum Junkie

    Messages:
    8,865
    Joined:
    Jul 16, 2000
    HardOCP reviews have focused on "real world gaming" for a very long time. Frame rates can be deceiving if you don't also have frame time data, because it's possible that a game will have a very playable framerate in raw benchmarking but in practice be plagued with stutters. There is also a palpable difference in feel between a game running 60fps with a consistent 16.6ms frametime (perfect timing @ 60hz) and one that shows 60fps but has frametimes swing up to 30-40ms.

    The point of the article was vetting AMD's claim that 2x RX480s delivers better performance than the GTX 1080. Frame times are an important metric in validating that.
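    The consistent-vs-swinging frametime point is easy to demonstrate with a quick back-of-the-envelope script. This is just an illustrative sketch with made-up frame-time traces, not data from the review: two runs average the same FPS, but only one of them would feel smooth.

```python
# Two hypothetical frame-time traces in milliseconds, both "60 FPS" on average.
steady = [16.6] * 12        # consistent 16.6 ms pacing (perfect @ 60 Hz)
jittery = [8.3, 24.9] * 6   # alternating fast/slow frames -- classic mGPU stutter

def avg_fps(frametimes_ms):
    # Average FPS = frames rendered / total elapsed time in seconds.
    return len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)

print(round(avg_fps(steady)))                 # 60
print(round(avg_fps(jittery)))                # 60 -- identical average FPS
print(round(max(jittery) - min(jittery), 1))  # 16.6 ms swing you can feel
```

    Both traces would report "60 FPS" in a benchmark summary, which is exactly why the raw frame-time graphs matter.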
     
    Kyle_Bennett likes this.
  15. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,660
    Joined:
    Apr 17, 2000
    Sorry for the confusion; what I meant was, I could tell the difference when I put the cards in and started gaming. Once I found the highest playable settings (which required me to get the framerates high enough), it smoothed out. If we had reversed the settings and put the RX 480 CF at the settings that the GTX 1080 is playable at, then for sure you would be able to tell the difference just by feel. I started out each game by trying to play at the highest in-game settings, and at those settings the games were laggy and choppy until I dialed in what was playable. This has happened before with CrossFire in the past: the need to obtain a high FPS to negate the choppiness, smooth it out, and make it playable.
     
  16. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,660
    Joined:
    Apr 17, 2000
    The game had poor FPS with Wind Effected Snow enabled; for some reason it hurt frame rates and created some stutter. Turning it off gave a big boost to both, and as we stated, we had great frame rates then. However, the frame times revealed it still wasn't exactly perfect. Before finding our playable settings, the game was very choppy and it took a high FPS to smooth it out. Seeing how bad the frame times were after testing confirmed why we felt the game to be choppy in CrossFire and why we needed to adjust settings like Wind Effected Snow to make it playable.
     
  17. toddw

    toddw [H]ard|Gawd

    Messages:
    1,309
    Joined:
    Sep 9, 2004
    Are you saying that you chose the final IQ taking the choppiness/frametime/latency into consideration? i.e., the IQ used in your review did not produce frametime-induced stutter?

    Or are you saying the IQ led to enjoyable playing, but frametime still affected play on occasion?
     
  18. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,660
    Joined:
    Apr 17, 2000
    The frame time that is graphed is from the playable settings, so the frame time shown is what you get at the playable settings.

    Did I feel the stutter at the playable settings? No. That was the point of lowering settings in the game to find what is playable and smooth.
     
  19. Zarathustra[H]

    Zarathustra[H] [H]ard|News Staff Member

    Messages:
    22,144
    Joined:
    Oct 29, 2000

    From what I have seen of DX12 multi-GPU thus far, it is cool that it works with mixed GPUs from mixed vendors, but the stuttering and other multi-GPU issues (as tested in AotS) are actually WORSE than in SLI and Crossfire.
     
  20. noko

    noko 2[H]4U

    Messages:
    2,274
    Joined:
    Apr 14, 2010
    I think another aspect that is hard to pin down is that different people perceive frame time changes differently. What is smooth or comfortable for Brent may not be for you. To me, if dual GPUs are not very consistent in producing frame rates, then you are not gaining much: you have to lower settings and reach an even higher frame rate (if you can) to get smooth gameplay, defeating the purpose of the second card.

    Also, WTH happened to AMD frame pacing? It looks like it went out the window, back to the poorer-quality variances that only look good in fps numbers but don't actually benefit the user (actually making it worse for the user).

    I also noticed, at least with the 16.6.2 drivers (these were bad drivers for me, doing funky stuff with the Nano fan), that frame rate target basically did not work on anything I tested. Frame rate target helps smooth out CFX gameplay, but you shouldn't have to use it if you have good frame pacing.

    I wonder if this is just growing pains for the RX 480 with frame times, and whether other CFX configurations have smoother frame times. Yes, maybe a separate CFX/SLI review dealing with smoothness, frame times, etc. would help identify the broader picture. Do Fiji GPUs have this gross problem with frame pacing? Nvidia cards? etc.
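    For what it's worth, the pacing inconsistency in question is easy to quantify from any frame-time log. Here's a rough, hypothetical Python sketch (made-up numbers, not FCAT output) of the kind of summary a frame-time analysis produces: the average FPS looks fine while the frame-to-frame jumps give the stutter away.

```python
import statistics

def pacing_report(frametimes_ms):
    # Average FPS over the whole trace.
    avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
    # Frame-to-frame deltas: big jumps between consecutive frames
    # are what register as stutter, regardless of the average.
    deltas = [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]
    return {
        "avg_fps": round(avg_fps, 1),
        "stdev_ms": round(statistics.stdev(frametimes_ms), 1),
        "worst_jump_ms": round(max(deltas), 1),
    }

# Hypothetical poorly paced (CrossFire-like) trace: alternating short/long frames.
print(pacing_report([10.0, 23.0, 11.0, 22.0, 10.0, 23.0, 11.0, 22.0]))
# -> {'avg_fps': 60.6, 'stdev_ms': 6.4, 'worst_jump_ms': 13.0}
```

    A well-paced single-card trace would show a stdev and worst jump near zero at the same average FPS, which is why pacing matters more than the fps counter.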
     
  21. Algrim

    Algrim Gawd

    Messages:
    657
    Joined:
    Jun 1, 2016
    Likely frame pacing will be better once AMD refines the drivers for the cards. They've only been out about two weeks, so there are still probably a few tweaks that AMD can make.
     
  22. Nenu

    Nenu Pick your own.....you deserve it.

    Messages:
    16,909
    Joined:
    Apr 28, 2007
    How long do you think people should wait before buying one?
     
  23. Rizen

    Rizen [H]ardForum Junkie

    Messages:
    8,865
    Joined:
    Jul 16, 2000
    If you're only buying one, it won't matter. Frame pacing is only a problem in CFX.
     
  24. Nenu

    Nenu Pick your own.....you deserve it.

    Messages:
    16,909
    Joined:
    Apr 28, 2007
    That much was certain.
    What about crossfire?

    edit: I spose I did say one, my error.
     
  25. Zarathustra[H]

    Zarathustra[H] [H]ard|News Staff Member

    Messages:
    22,144
    Joined:
    Oct 29, 2000
    I would argue that Never is a good idea.

    A 1080 may cost $200 more, but you will wish you went that route every single day, if you don't.
     
  26. Nenu

    Nenu Pick your own.....you deserve it.

    Messages:
    16,909
    Joined:
    Apr 28, 2007
    It was a rhetorical question, soz.
    Algrim acknowledged the problem but was defending the release of the card, touted as a 1080 beater in CrossFire even though the support isn't there.
    I wondered how long he thought was acceptable for support to arrive.
     
  27. B2BigAl

    B2BigAl [H]ard|Gawd

    Messages:
    2,039
    Joined:
    Mar 23, 2003
    Nope, a 1070 or 1080 is a lot more appealing. CFX and SLI have been getting pushed to the side the last few years (SLI scaling on the 1070 is a joke so far), so I'm going back to a single card. If CF 480s just destroyed the competition it would be one thing, but they don't, so there's no reason to saddle myself with multi-card problems anymore. Bring on the 1080 Ti and let's call it a day.
     
  28. noko

    noko 2[H]4U

    Messages:
    2,274
    Joined:
    Apr 14, 2010
    Unfortunately the review did not test DX12 EMA with ROTTR's recent update. I wonder if the frame times with DX12 will be a big improvement. At least two data points could be tested: AotS and ROTTR. There's an opportunity to do this with the 1060, though, if HardOCP actually gets two 1060s for review.
     
  29. Nenu

    Nenu Pick your own.....you deserve it.

    Messages:
    16,909
    Joined:
    Apr 28, 2007
    mGPU relies on the cards being SLI capable.
     
  30. pendragon1

    pendragon1 [H]ardness Supreme

    Messages:
    4,581
    Joined:
    Oct 7, 2000
    I thought EMA mgpu didn't need bridges, that it was through the pcie and can mix vendors? Or is that just in aots? I'm getting confused by all the different mgpu types...
     
  31. Presbytier

    Presbytier Gawd

    Messages:
    708
    Joined:
    Jun 21, 2016
    According to the spec, yes, you do not need bridges when mGPU is used.
     
  32. quiktake2009

    quiktake2009 n00bie

    Messages:
    53
    Joined:
    Jun 11, 2016
    I have heard that using a FreeSync monitor helps mitigate the effects of the frametime issue, but I have never seen an article test it. Any thoughts on this?
     
  33. Algrim

    Algrim Gawd

    Messages:
    657
    Joined:
    Jun 1, 2016
    I'd wait for AIB custom cards that won't have any uncertainty in regard to the power issues. AMD, so far this year, has been very responsive in regard to patches so I don't think we'd need to wait too much longer.
     
  34. Nenu

    Nenu Pick your own.....you deserve it.

    Messages:
    16,909
    Joined:
    Apr 28, 2007
    AIB cards won't have an impact on CrossFire function though.
     
  35. Algrim

    Algrim Gawd

    Messages:
    657
    Joined:
    Jun 1, 2016
    That is true. If you're trying to get a 1080 on the cheap, can afford the power budget (which I can't), and are willing to a) lower the details so that the stutter isn't as noticeable (actually, AIB custom cards could help here if they can clock higher, so you can keep more eye candy up whilst gaming), and b) wait a bit for improved drivers to help address the mGPU frame time issues, then I don't see an issue with doing it now.

    Or, get one card now, enjoy it on reduced eye candy mode and pick up another card when AMD addresses the frame time issues.

    Probably buried somewhere in this thread, but does having a FreeSync monitor mask or mitigate the frame stutter? I would assume so but as I have neither a FreeSync or Gsync monitor I really don't know.
     
  36. Zarathustra[H]

    Zarathustra[H] [H]ard|News Staff Member

    Messages:
    22,144
    Joined:
    Oct 29, 2000

    Multi-GPU setups (Crossfire/SLI/multi-GPU boards) are like Toyota Supras.

    Dyno queens that can post some fantastic numbers in ideal circumstances but fall flat on their faces in practical use on the track.

    If you just run canned benchmarks and look at peak and average framerates, multi-GPU solutions look great. If you actually want to play games with them, not so much.

    Games are either broken and don't run at all, or stutter like hell, or suffer from crippling minimum framerates even when their max and average framerates look good.

    From early testing I've seen of DX12 multi GPU, this isn't going to change with DX12.

    I've used both Crossfire and later SLI. SLI was certainly a better experience than crossfire, but in the end they both stunk. Nothing beats having a single strong GPU, and it's well worth any added cost.
     
    Last edited: Jul 13, 2016
  37. Rvenger

    Rvenger [H]ard|Gawd

    Messages:
    1,132
    Joined:
    Sep 12, 2012

    Freesync barely even works with Crossfire, period. It makes the stuttering 10x worse, which is one of the reasons why I sold my RX 480s.

    Crossfire r9 290s and R9 295X2 behaved the same way. It got fixed in one driver, then broken again in the next.
     
  38. Nenu

    Nenu Pick your own.....you deserve it.

    Messages:
    16,909
    Joined:
    Apr 28, 2007
    I agree that multi-card needs a much higher framerate than the target to become smooth enough; it doesn't work in all cases though.
    It's quite a penalty, and in many cases it is hard or impossible to achieve, especially with slower cards.
    Recommending CrossFire based on a driver fix is naughty unless you know for sure what is being fixed and when.
     
  39. Algrim

    Algrim Gawd

    Messages:
    657
    Joined:
    Jun 1, 2016
    Okay, I stand corrected. Thank you!
     
  40. Zarathustra[H]

    Zarathustra[H] [H]ard|News Staff Member

    Messages:
    22,144
    Joined:
    Oct 29, 2000
    Compensating for the flakiness of multi-GPU setups with higher framerates helps eliminate the problems of input lag and low minimum framerates, but it does very little (if anything at all) for stutter. And if your hardware is fast enough to produce high enough framerates in multi-GPU to avoid added input lag and low minimums, you are probably better off just using one card in single-GPU mode anyway :p
     
    Nenu likes this.