AMD Radeon RX 480 8GB CrossFire Review @ [H]

Discussion in 'AMD Flavor' started by Kyle_Bennett, Jul 11, 2016.

  1. sparks

    sparks 2[H]4U

    Messages:
    2,927
    Joined:
    Jun 19, 2004
    Kyle your closing remarks were spot on.
    Thanks
     
    Kyle_Bennett likes this.
  2. sirmonkey1985

    sirmonkey1985 [H]ard|DCer of the Month - July 2010

    Messages:
    18,756
    Joined:
    Sep 13, 2008
    With the 1060 possibly not supporting SLI, I don't think we'll see a price drop on the RX 480. CrossFire support becomes a driving selling point for AMD, which will allow the price to stay where it is.

    Agreed, it's a good "tech demo" for showing the potential of what DX12 could be used for, but it doesn't mean much outside of that. Sort of the same problem Supreme Commander suffered from: it was one of the first games with true multi-threaded and 64-bit support, which were really cool features, but almost 10 years later we still have games with only dual-threaded support and no 64-bit usage, so it really didn't change dick.
     
    Last edited: Jul 11, 2016
    The Lamb, Dayaks and DejaWiz like this.
  3. Kyle_Bennett

    Kyle_Bennett HardOCP MasterChef Editor Staff Member

    Messages:
    42,544
    Joined:
    May 18, 1997
    Hehe, I guess I should have kept my mouth shut....not. ;)

    I figured out all of AMD's GPU troubles. Seems to be me. "Because Kyle Bennet of HardOCP is a douche and we should be downvoting this post on principle."

    Hey asshole, that is "Bennett" with two E's, two N's, and two T's! LOL!
     
    Armenius, GoodBoy, DejaWiz and 2 others like this.
  4. Kyle_Bennett

    Kyle_Bennett HardOCP MasterChef Editor Staff Member

    Messages:
    42,544
    Joined:
    May 18, 1997
    An interesting thought.....NVIDIA is aggressively moving away from mGPU. No SLI on the 1060, and support for 3-way SLI removed on the 1070 and 1080. AMD is marketing mGPU as an alternative to NVIDIA's high end. Two very different ways of looking at things.
     
    Armenius, The Lamb, N4CR and 3 others like this.
  5. funkydmunky

    funkydmunky [H]ard|Gawd

    Messages:
    1,501
    Joined:
    Aug 28, 2008
    480 did surprisingly well on a large sample size of games. Good work [H]
    This is what the mid-ranger wants. Capable and affordable NOW, with the option of adding a second card down the road when they can afford it.
    Never saw the logic of high-end dual cards.
     
    Armenius, toddw and N4CR like this.
  6. Kyle_Bennett

    Kyle_Bennett HardOCP MasterChef Editor Staff Member

    Messages:
    42,544
    Joined:
    May 18, 1997
    You need to get a higher resolution screen. :)

    [Attached image: JS9000_First_Setup.jpg]
     
  7. sirmonkey1985

    sirmonkey1985 [H]ard|DCer of the Month - July 2010

    Messages:
    18,756
    Joined:
    Sep 13, 2008

    Knowing how game developers are, I think that's going to be a huge mistake on AMD's part. If there's nothing forcing the developer to use mGPU, then they'll never use it. That was always the beauty of SLI and CrossFire: developers could put half-assed support in their game and force AMD/NVIDIA to fix it.
     
    Armenius likes this.
  8. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    4,236
    Joined:
    Feb 22, 2012
    I think NVIDIA is being responsible and AMD is... trying to appeal to a larger market and might be sabotaging itself again.
     
  9. gordon152

    gordon152 n00bie

    Messages:
    45
    Joined:
    Jun 29, 2016
    That's where marketing dollars come in (if a vendor wants to sponsor the additional work to do so) and it is a PR plus for a developer to do that, as games which offer those features get more enthusiast and vendor attention, which then increases sales. This is largely for the DX12 titles though.
     
    DejaWiz likes this.
  10. Kyle_Bennett

    Kyle_Bennett HardOCP MasterChef Editor Staff Member

    Messages:
    42,544
    Joined:
    May 18, 1997
    Agreed, and NVIDIA has been much better at this overall in the past, although AMD has certainly had its moments. So if mGPU works for AMD in a DX12 title, does it automatically work for NVIDIA in the same title? I think that is the thinking behind this, but I honestly do not know the answer to that question.
     
  11. Quartz-1

    Quartz-1 2[H]4U

    Messages:
    3,888
    Joined:
    May 20, 2011
    Great article. I was particularly interested in the 4K figures, of course.

    I do have one question about the test setup: using the Z170 chipset means that the PCIe slots were running at x8/x8. Since AMD's Crossfire uses the PCIe bus, I wonder if using an X99 chipset with the slots running at x16/x16 might have made a difference? Perhaps you could elaborate on why you chose the Z170 setup in the article?

    Like all great articles, this one begs further questions! In this case, how do three card setups perform? I remember reading many years ago that three card setups made things a lot smoother, and I wonder if that's true for the RX 480? This might be of particular interest should the GTX 1060 induce a price drop in the 480.
     
  12. atp1916

    atp1916 2[H]4U

    Messages:
    2,463
    Joined:
    Jun 18, 2004
    Those frame times on the xfire 480s....no.

    The user experience is king, and that kind of latency really kills it. I know quite a few people who will pay the difference up to a 1080 to get a single card that does better on 90% of the benchmarks.

    Damnit AMD!
     
  13. Shogon

    Shogon Limp Gawd

    Messages:
    300
    Joined:
    May 15, 2013
    I'm 98% certain TW: Warhammer does NOT use Nitrous whatsoever. It is just an updated iteration of the Warscape engine from years ago. I wish it did use the Nitrous engine.
     
  14. Kyle_Bennett

    Kyle_Bennett HardOCP MasterChef Editor Staff Member

    Messages:
    42,544
    Joined:
    May 18, 1997
    That is a PCIe 3.0 bus, and I would suggest that x8/x8 is hardly, if at all, hampering real-world gaming performance. We used our standard desktop system that we do all of our GPU testing on.

    We have spent some time looking at this in the past, so I am not just blowing smoke here.

    Introduction - PCI Express 2.0 vs 3.0 GPU Gaming Performance Review



    Dunno, and I doubt we will be visiting that any time soon. Maybe if we get a lot of dead time to fill. We have purchased 3 RX 480 cards now so we do have those on hand.
     
    DejaWiz likes this.
  15. sirmonkey1985

    sirmonkey1985 [H]ard|DCer of the Month - July 2010

    Messages:
    18,756
    Joined:
    Sep 13, 2008
    If one of the writers is bored and wants to just do one game for shits and giggles, that would be cool, but I don't think an entire article on triple CrossFire is really needed given the market the RX 480 is intended for. Knowing the members here on [H], though, I'm sure someone will post some benchmarks of triple RX 480 CrossFire eventually.
     
  16. GeEl2088

    GeEl2088 n00bie

    Messages:
    4
    Joined:
    Apr 15, 2014
    Wouldn't it also be possible that the GTX 1060 in mGPU explicit mode wouldn't require the use of an SLI bridge?
    While there would be no SLI, there could still be mGPU modes.
     
  17. Kyle_Bennett

    Kyle_Bennett HardOCP MasterChef Editor Staff Member

    Messages:
    42,544
    Joined:
    May 18, 1997
    From my understanding that would be absolutely correct. But I think that seeing NVIDIA back away from any control over SLI at this point says something about the future.
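
    For reference, this is roughly what "explicit mGPU" looks like from the developer's side. The sketch below is illustrative only (it is not from the article, and the helper name CreateDevicesOnAllAdapters is made up): in DX12 explicit multi-adapter the game enumerates whatever adapters DXGI reports and creates an independent D3D12 device on each one, so no SLI bridge or driver CF/SLI profile is involved, and the adapters do not even need to be from the same vendor.

    // Minimal sketch: enumerate hardware adapters and create an independent
    // D3D12 device on each. Link against d3d12.lib and dxgi.lib.
    #include <d3d12.h>
    #include <dxgi.h>
    #include <wrl/client.h>
    #include <vector>

    using Microsoft::WRL::ComPtr;

    std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
    {
        ComPtr<IDXGIFactory1> factory;
        CreateDXGIFactory1(IID_PPV_ARGS(&factory));

        std::vector<ComPtr<ID3D12Device>> devices;
        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
        {
            DXGI_ADAPTER_DESC1 desc = {};
            adapter->GetDesc1(&desc);
            if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
                continue; // skip WARP / software rasterizers

            ComPtr<ID3D12Device> device;
            if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                            IID_PPV_ARGS(&device))))
                devices.push_back(device); // any vendor's GPU can land here
        }
        return devices;
    }

    Whether two adapters are actually used well after that is entirely up to the engine, which is exactly the developer-adoption concern raised above.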
     
  18. primetime

    primetime [H]ardness Supreme

    Messages:
    4,751
    Joined:
    Aug 17, 2005
    Kyle or Brent.....in a truly blind test, can you honestly say you could detect which was running CF and which was running a single GPU? (assuming the settings were already set up accordingly)
     
  19. Taldren

    Taldren Limp Gawd

    Messages:
    354
    Joined:
    Nov 28, 2006
    Any chance for 3440x1440 testing of 480 CF? There seems to be a hard divide between 1440p and 4K that 3440x1440 is right smack in the middle of.
     
  20. SGTGimpy

    SGTGimpy Limp Gawd

    Messages:
    189
    Joined:
    Oct 7, 2009
    Hey Kyle, I just saw this news about a patch for Tomb Raider that enables mGPU in DX12 and Asynchronous Compute for AMD. Did you have this patch installed, and could this be one of the reasons why the Tomb Raider benchmark is so much different than the rest?

    Rise of the Tomb Raider Gets Multi-GPU DirectX 12 Patch
     
  21. jedimasterben

    jedimasterben 2[H]4U

    Messages:
    2,288
    Joined:
    Oct 1, 2010
    To be frank, to get enough PCIe lanes for true x16/x16 you need a $600 CPU (the 5820K only does x16/x8). Combining that with a $200-300 board and $100-200 for RAM, I doubt very much that anyone pouring that kind of money into their main components is going to pick up a pair of $200 GPUs :)
     
    Armenius likes this.
  22. Taldren

    Taldren Limp Gawd

    Messages:
    354
    Joined:
    Nov 28, 2006
    From your link: "Rise of the Tomb Raider was updated today to version 1.0.638.6, which is a bigger deal than its less-than-snappy name suggests. In DirectX 12 mode Multi-GPU support is now available."
    From the test: 1.0.647.2 ... so yes.
     
  23. noko

    noko [H]ard|Gawd

    Messages:
    1,866
    Joined:
    Apr 14, 2010
    Great journalism in testing out a claim and showing the results with empirical data. I do like the frame times for the mGPU tests. AMD looks ugly here, but we probably need an SLI system as a reference point in the future too.

    Now, to even out those frame times, particularly those very high spikes (half and quarter frames) with AMD, set the frame rate limiter to 1 Hz above your monitor's refresh rate. This smooths out CFX and usually gives a much smoother experience. Numbers are pointless if the experience is stuttery. Some would rather see a higher number than a better experience for some reason.

    Speaking of Rise of the Tomb Raider, I hooked up my 1070 with a 290X. In one benchmark run (out of many) the 290X performed abnormally high, smooth as butter with everything maxed out with SMAA at 1440p - the average was 56 fps :bucktooth:. The 1070, with or without the 290X, was consistently at 72 fps for the same settings. The 290X would mostly get 38-41 fps except for that one run. I ran out of time and really don't know if EMA (explicit multi-adapter) was working or not. There are no obvious switches or indications; in the game you can choose which monitor/GPU to run the game on. As a note, it does look like this game's DX12 path exploded the 290X's performance (looking good).

    Anyway, looking at the data, I would say get a 1070 if you need that kind of performance now; if not, but you want options in the future, get a 480. As for two 480s - no, I do not think it is in the same class as a 1080 in performance (real gameplay experience).
     
  24. Algrim

    Algrim Gawd

    Messages:
    534
    Joined:
    Jun 1, 2016
    This test makes the 1070 a no-brainer for my wife and me. We play on the same circuit, so we'd be pulling almost 1000 watts from graphics cards alone if we wanted to play at nice settings at 1440p.
     
  25. Ultima99

    Ultima99 2[H]4U

    Messages:
    4,024
    Joined:
    Jul 31, 2004
    Good lord, what a bloodbath. :eek:
     
  26. Presbytier

    Presbytier Limp Gawd

    Messages:
    411
    Joined:
    Jun 21, 2016
    This is why I dumped SLI as well. The performance gains, when they happen, are just not worth the headache, so here's hoping AMD launches a true 1070/1080 competitor soon.
     
  27. chenw

    chenw 2[H]4U

    Messages:
    2,802
    Joined:
    Oct 26, 2014
    The trend is pretty much what I expected, though the degree was quite different.

    Performance overall is pretty good, but doesn't paint a pretty picture about Crossfire itself.
     
  28. thesmokingman

    thesmokingman 2[H]4U

    Messages:
    3,410
    Joined:
    Nov 22, 2008
    As always, buy big if you can in the beginning, then add another down the line. If you could afford a big card in the first place, you're not the target buyer for an RX 480 and similar cards.
     
  29. Rizen

    Rizen [H]ardForum Junkie

    Messages:
    8,832
    Joined:
    Jul 16, 2000
    It is absolutely, unequivocally mandatory - at least in my mind - to showcase and discuss frametimes in a review of multi-GPU performance. I purchased an AMD 5870 CF setup back in the day because the framerates were excellent in reviews. In practice, the microstuttering drove me insane and I had to push frame rates 20-30 fps higher to get "smooth" performance vs my experience with a single 5870. Framerates don't tell the whole story. And this is true of all multi-GPU solutions. I've run 5870 CF, 580 SLI, 680 SLI, 290X CrossFire, and now a 980 Ti. All of the multi-GPU solutions had some degree of frametime issues, although the NVIDIA solutions were generally smoother.

    Interestingly, my 290X CF setup was actually quite smooth in terms of frametimes, but game support got worse over time which prompted my upgrade to the 980 Ti. I wonder if the RX 480 frame pacing is on par with the Hawaii solutions or if they have regressed.
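
    To put that in concrete terms, here is a toy sketch (made-up numbers, not data from the review) of how two traces with identical average FPS can feel completely different once you look at the worst frametimes:

    // Two frame-time traces with the same average FPS: one steady, one
    // alternating fast/slow frames the way a badly paced mGPU setup does.
    #include <algorithm>
    #include <cstdio>
    #include <numeric>
    #include <vector>

    static void report(const char* name, std::vector<double> ms)
    {
        double avg_ms = std::accumulate(ms.begin(), ms.end(), 0.0) / ms.size();
        std::sort(ms.begin(), ms.end());
        double p99_ms = ms[static_cast<size_t>(ms.size() * 0.99)];
        std::printf("%s: avg %.1f fps, 99th percentile frametime %.1f ms\n",
                    name, 1000.0 / avg_ms, p99_ms);
    }

    int main()
    {
        std::vector<double> smooth(100, 16.7);  // steady ~60 fps
        std::vector<double> stutter;            // same average, alternating 8.4 / 25 ms
        for (int i = 0; i < 50; ++i) { stutter.push_back(8.4); stutter.push_back(25.0); }

        report("single GPU", smooth);   // ~60 fps average, p99 ~16.7 ms
        report("mGPU-style", stutter);  // ~60 fps average, p99 ~25 ms -> feels choppy
    }

    The averages match, but the second trace spends every other frame at 25 ms, which is exactly the kind of alternation that reads as microstutter in play.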
     
  30. GoodBoy

    GoodBoy Gawd

    Messages:
    856
    Joined:
    Nov 29, 2004
    The single-GPU frametime graph was useful in this review so that we could see the massive difference: what it looks like when the game is "smooth" versus what a bad (mGPU) frametime graph looks like. If you just showed the shitty mGPU frametime graph by itself, some people would be like "wtf am I looking at".
     
    Armenius likes this.
  31. JustReason

    JustReason [H]ard|Gawd

    Messages:
    1,411
    Joined:
    Oct 31, 2015
    I think I understand his point though. Based solely on this review, one might conclude this was an AMD vs NVIDIA thing rather than a CF vs single-card thing. Maybe if just one benchmark had included a single 480, to speak to the aforementioned battle, it would have appeared differently.
     
  32. Rizen

    Rizen [H]ardForum Junkie

    Messages:
    8,832
    Joined:
    Jul 16, 2000
    I don't know how you can come to that conclusion when the premise is laid out in the very first sentence of the article. That aside, comparisons between GPU solutions at the same price point, including dual cards vs single cards at that price point, have been pretty common, and I have regularly seen people advocate two cheaper GPUs in CF/SLI over a more expensive single-GPU solution before. I think it's a totally valid comparison.
     
  33. thesmokingman

    thesmokingman 2[H]4U

    Messages:
    3,410
    Joined:
    Nov 22, 2008
    The rx 480 is pretty compelling, but I'd never recommend anyone buy two to start.
     
  34. Rizen

    Rizen [H]ardForum Junkie

    Messages:
    8,832
    Joined:
    Jul 16, 2000
    IMO the only time multiGPU is worth it, is when you already have the high end cards. That way you get the best possible performance whether the game supports multiGPU or not.
     
  35. thesmokingman

    thesmokingman 2[H]4U

    Messages:
    3,410
    Joined:
    Nov 22, 2008
    Concur. On the high end with large resolutions, you can't get enough gpu lol.
     
  36. -=SOF=-WID99

    -=SOF=-WID99 [H]Lite

    Messages:
    101
    Joined:
    Nov 30, 2015
    well done Kyle and Brent ...
     
  37. Compwiz

    Compwiz n00bie

    Messages:
    56
    Joined:
    Feb 14, 2014
    I'm done dealing with CrossFire and with frame stuttering. I really wanted AMD to put out a single GPU that could compete with the 1070 (because of NVIDIA's shady business practices and how it feels like they f'ed over their customers with the Founders Edition), but 2 RX 480s and 1 GTX 1070 are apples and oranges, regardless of whether the prices are comparable, once you add in all of the extra mess you have to deal with in CrossFire and the frame stutter. Personally, I think sacrificing some theoretical performance to not have to deal with that PITA is well worth it. I am not an NVIDIA fanboy by any means, and your opinion may be different, but that's my 2 cents.
     
  38. Lord Risky

    Lord Risky AMDFanboy EchoChamber Member

    Messages:
    126
    Joined:
    Jun 30, 2016
    "AMD RX 480 CrossFire is much less expensive and if you are concentrated on price, then RX 480 CrossFire is a win compared to both GTX 1080 and GTX 1070. Hands down that cannot be argued! But there are caveats that go along with that value. AMD RX 480 runs hotter and consumes a lot more power than GTX 1080 and GTX 1070, 60% more and 87% more respectively. Then only sometimes equals or bests GeForce GTX 1080 performance. RX 480 CrossFire did consistently outperform GTX 1070 however. There are many gaming situations that GeForce GTX 1080 still offers more performance, and it certainly offers better frametime consistency than CrossFire if you are sensitive to that"........[H]

    As someone on the fence choosing between a 480 and a 1060, this conclusion tilts me towards the 480 since I don't care about (relative) heat and power consumption with this generation of cards. Of course frametimes will compare poorly between SLI/CrossFire and a single card, so that is also not an issue for when I decide to max out 1440p gaming.
     
    Last edited: Jul 12, 2016
    N4CR likes this.
  39. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,604
    Joined:
    Apr 17, 2000
    Absolutely yes. To me anyway, the lag and stutter I experienced in Fallout 4, The Division, and The Witcher 3 meant I could easily tell the difference between CF and single-GPU. In Tomb Raider, I probably would not be able to tell the difference, and BF4 was so fast the choppiness was negated.

    Therefore, the answer depends on the game. If you can feel the stutter or lag, you'll know it's multi-GPU.

    This patch was not installed; the prior patch was used, as testing was done before this patch was released.
     
    Armenius, The Lamb and primetime like this.
  40. Brent_Justice

    Brent_Justice [H] Video Card Managing Editor Staff Member

    Messages:
    17,604
    Joined:
    Apr 17, 2000
    And that was basically the point.

    I wanted to make sure people saw what good frametimes are supposed to look like, versus CrossFire. You need a point of comparison, like you said. It made sense to use the 1080 since that is also the performance comparison we were making, so it is all fair game.

    To me at least, having more data than you need is better than not having enough. If the frametime information in this review bothers you for the comparison we made, just look past it, but know that you're arbitrarily throwing out an important piece of information in the buying decision.