AMD Radeon VII Video Card Review @ [H]

Discussion in 'AMD Flavor' started by FrgMstr, Feb 14, 2019.

  1. Cranky1970

    Cranky1970 [H]Lite

    Messages:
    73
    Joined:
    Sep 2, 2017
    As per usual....a great review Kyle!
     
  2. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,298
    Joined:
    May 18, 1997
    Brent_Justice does all the heavy lifting.
     
    ItWasMe, Armenius, GHRTW and 2 others like this.
  3. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,298
    Joined:
    May 18, 1997
    Thanks, got it now.

You can go look here and see that DX11 is actually quite a bit faster than DX12 on the RTX 2080, hence the reason for using it to test with.

We look at DX performance on a game-by-game and card-by-card basis and use the API that best showcases the specific card's performance. On page 11, Deus Ex, you can see that we used DX12 for the RVII and DX11 for the RTX cards.
     
    Armenius, Marees and IdiotInCharge like this.
  4. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
    Damn, you guys nailed it. Even solidly answered the memory question that's been lingering about.

And it's concerning that AMD is still running their cards relatively 'hot' at stock. Undervolting AMD GPUs has been a thing for some time, among miners especially, where performance per watt is king, but also among those just trying to make AMD's GPUs more comfortable to be around in terms of noise and heat.

Do we know if AMD has done this because the average Radeon VII sample is simply that inefficient, needing the juice to keep the average card stable, or did AMD overshoot a bit here?
     
  5. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,497
    Joined:
    Mar 22, 2008
    Awesome review! Looks like a decent but not perfect card. Certainly nice that AMD is hanging at 4K now.

    Quick question: do you think it would be worthwhile to sell my 2x Vega 64s and buy 1x Vega VII (for use on 4K TV)?

    Looks like cost would be even, and I can avoid Crossfire issues/incompatibility but probably get less performance in the "best case" games.
     
    Maddness, noko and Dayaks like this.
  6. _mockingbird

    _mockingbird Gawd

    Messages:
    992
    Joined:
    Feb 20, 2017
    Yes, definitely.

    Do games even support Crossfire these days?
     
    Maddness and cybereality like this.
  7. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,497
    Joined:
    Mar 22, 2008
Some games support Crossfire, yes, but it's hit or miss.

    However, I have this FreeSync TV and previously 1x Vega 64 was not enough for 4K.

    With the VII, though, it looks like 4K is viable, so I might make the swap.
     
    Sulphademus and Maddness like this.
  8. _mockingbird

    _mockingbird Gawd

    Messages:
    992
    Joined:
    Feb 20, 2017
    you should
     
    cageymaru, Maddness and cybereality like this.
  9. Nightfire

    Nightfire [H]ard|Gawd

    Messages:
    1,614
    Joined:
    Sep 7, 2017
Tomb Raider is faster with the RTX 2080 using DX12, and even faster when both cards are using DX12.

    Is that what you wanted to hear?
     
  10. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
    Looks to go from Vega 64's 'almost viable' at 4k to the 1080Ti/2080's 'mostly viable'. It's not that big of an upgrade; perhaps if you were to strap on a liquid cooler?
     
    Marees and cybereality like this.
  11. sabrewolf732

    sabrewolf732 2[H]4U

    Messages:
    3,923
    Joined:
    Dec 6, 2004
Everyone knows you hate the Radeon VII, dude. Enough.
     
  12. _mockingbird

    _mockingbird Gawd

    Messages:
    992
    Joined:
    Feb 20, 2017
    No, I don't want to "hear" anything.

    I also want to see Radeon VII and GeForce RTX 2080 compared in Shadow of the Tomb Raider (DirectX 12).

    Other websites show Radeon VII and GeForce RTX 2080 to be much closer in Shadow of the Tomb Raider, so I am surprised to see this huge performance gap.
     
    Last edited: Feb 14, 2019
  13. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,497
    Joined:
    Mar 22, 2008
Just looked at availability; they are sold out everywhere.

    Probably pick one up when they are back in stock.
     
  14. wootius

    wootius [H]Lite

    Messages:
    105
    Joined:
    Mar 6, 2017
And they use the same settings other than DX?
     
  15. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
    I thought the reasoning in the review was pretty clear- each GPU was tested with the fastest API for that GPU.

What you're asking for is for the GPU that tested faster to be retested with the API that it runs slower with.

    So the question the rest of us have to ask is, why?

    I get the idea of doing a pure 'apples to apples' comparison, but that's not really the stated goal; the goal as expressed is for the GPUs to be tested at their fastest settings just as a gamer would.
     
    mbelue, Armenius, Sulphademus and 3 others like this.
  16. Albanu1800

    Albanu1800 Limp Gawd

    Messages:
    448
    Joined:
    Mar 13, 2008
    And with a new RTX you might have to deal with space invaders.
     
    cybereality likes this.
  17. bildad

    bildad Gawd

    Messages:
    730
    Joined:
    Jun 28, 2004
The notion that DX12 and Vulkan would be AMD's savior died a long time ago.
     
    Armenius likes this.
  18. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,298
    Joined:
    May 18, 1997
    We do not comment on other websites' results. I can tell you that we have repeatable results, many times over, using real gameplay.

EDIT: And worth mentioning, the reason we use real gameplay is that it many times shows us different results than a canned benchmark. I would expect to see differences in our testing compared to many other reviews on the net. Keep in mind that we also usually evaluate the full game and go back and do our runthroughs on the most demanding maps in the game. We do look for the worst-case scenario in terms of performance.
     
    Last edited: Feb 14, 2019
    mbelue, WayneJetSki, Armenius and 3 others like this.
  19. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,298
    Joined:
    May 18, 1997
    Yes.
     
    mbelue, Armenius, Sulphademus and 3 others like this.
  20. Riptide_NVN

    Riptide_NVN [H]ard|Gawd

    Messages:
    1,809
    Joined:
    Mar 1, 2005
    Enjoyed the review thanks.
     
  21. Gideon

    Gideon 2[H]4U

    Messages:
    2,238
    Joined:
    Apr 13, 2006
Enjoyed the review. A shame that the card seems to struggle a bit with some games, but otherwise it seems to be right with a 2080 or a bit faster. Still good to have a solid second option if you're using a 4K screen.
     
    Maddness likes this.
  22. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,298
    Joined:
    May 18, 1997
So hand overclocking the card, I am getting very good GPU clocks, but my HBM2 clocks are tanking; it does not seem to be from heat but rather power limits. Going to take a lot of playing with.
     
    mbelue and jfreund like this.
  23. bpizzle1

    bpizzle1 2[H]4U

    Messages:
    4,076
    Joined:
    Oct 27, 2007
I'm glad AMD finally has a card that can be called "high end" after so many years. Competition is always good. That said, I don't think anyone could really justify buying a Radeon VII at $699 unless they are just completely anti-NVIDIA or have a use for 16GB of VRAM. Outside of that, a 2080 provides you with ever so slightly better performance while using less power and running cooler. A 2080 also has a lot more OC headroom and all of the RTX goodies (as useless as they currently are... you still get them). Like Kyle said, I think if these cards came in about $50 cheaper, they might actually be the way to go. I really wonder if they could shave off one of the HBM stacks, leave the card with 12GB and 3/4 of the bandwidth, and charge $599-$649 for the card. That would be a really interesting option for a lot of people, imo.
     
  24. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,298
    Joined:
    May 18, 1997
    I did talk to AMD about this at CES, and to be exact, I did not get an answer, but the reaction I got did not seem to be a positive one.
     
    cybereality likes this.
  25. Stoly

    Stoly [H]ardness Supreme

    Messages:
    6,257
    Joined:
    Jul 26, 2005
I don't think it's a bad card, but it's hard to recommend over an RTX 2080. It's slower, hotter, noisier, and draws more power, plus it costs the same and is hard to find.

I'm really impressed with what AMD has achieved bringing Vega to these heights, but it's time for something else. Unfortunately Navi may not be up to the task either, but there's hope.

I mean, unless you really hate NVIDIA, the RTX 2080 is a much better choice.
     
    GoldenTiger and Fleat like this.
  26. noko

    noko [H]ardness Supreme

    Messages:
    4,277
    Joined:
    Apr 14, 2010
Finally a sane review with an actual human having some real game time with the hardware and noting the experience, versus just hitting a button and coming back later for some numbers without noticing any other issues. Great job! Thanks.

Shadow of the Tomb Raider plays like crap on Vega 64 and FE. Its DX12 mGPU is outstanding with NVIDIA 1080 Tis, with around 95% scaling at 4K. Vega FE mGPU just crashes. This game gave me the best visual experience I've ever seen using HDR. Speaking of HDR, it will use more RAM, about 1GB more at 4K in this title, and hit memory bandwidth more as well. Not sure if it will make a significant performance difference for lower-RAM cards. Do request that HDR testing be considered in the future. Many of the games tested in this review do offer HDR rendering. While there are many crap HDR monitors out there, there are now good ones available, and from what I've seen, HDR improves IQ more than current RTX effects do.

Looking forward to OCing results in a future review. Silver seems about right; at least it worked, no space invaders, etc. As a note, it is sad if reliability standards have fallen this low in the industry.

Personally I don't see buying any new current-generation card, nor do I feel like I am missing anything. Both NVIDIA and AMD I think fall short; if anything, AMD caught up some to NVIDIA for an enthusiast gaming card.
     
    Pieter3dnow and Marees like this.
  27. _mockingbird

    _mockingbird Gawd

    Messages:
    992
    Joined:
    Feb 20, 2017
    That's why it's called Radeon Vega II.

    You literally just described a rerun of the Radeon RX Vega 64's launch.
     
    Riptide_NVN likes this.
  28. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,298
    Joined:
    May 18, 1997
Have seen some sustained clocks over 2K, but that was far away from any kind of real testing.
     
    psyclist and sabrewolf732 like this.
  29. twzTechman

    twzTechman Limp Gawd

    Messages:
    224
    Joined:
    Apr 14, 2011
    Thanks guys for getting this review out.

If I were buying a $700 GPU today, I would consider this card pretty even with the 2080 and would probably get the Radeon VII just because NVIDIA has been nasty lately. While the RTX 2080 has had plenty of time for its drivers to mature, I think AMD will see significant improvements in the next few months. It is just history repeating itself.
Consider the 1080 vs. the Vega 64. When the Vega 64 came out, all the reviews had it just a bit slower than the 1080, but everything I see now has it just a bit faster. You can see the same thing going back 2-3 generations where AMD was playing catch-up. They delivered a card just a bit slower at launch, but over time, it surpassed its competition.
     
    Last edited: Feb 15, 2019
  30. DNelson

    DNelson n00b

    Messages:
    5
    Joined:
    May 6, 2017
What a freakin' awesome review. Read every word. Unfortunately I still cannot decide. I need a new card; I seem to have messed up my RX 580 in a mining operation. I don't mind giving an atta boy to AMD for all the usual reasons (more competition, etc.), even spending $50-100 more than a card is worth next to its competition. All I want is for my 21:10 monitor to be as glorious as its potential. Late one night I almost talked myself into a gratuitously priced 2080 Ti, until I read 13 pages on these forums about the space invaders. I wish I could wait a while for Navi or cheaper prices (assuming no crazed crypto-charging bulls), but that RX 580 is fried for games (though normal desktop functions still work).
     
    noko and Marees like this.
  31. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
To explore this idea a bit, admittedly somewhat off-topic, a real improvement would be to shave off two HBM stacks so that the size of the interposer could also be shrunk. That would drop the cost both in terms of bill of materials and, very likely, in terms of increased yields.

Now the problem with that is that you'd also be cutting memory back to 8GB, but more importantly you'd be cutting bandwidth to Vega 64 levels as well, basically half of what the Radeon VII has to work with. That would likely erase much of the Radeon VII's performance benefit.
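Back-of-the-envelope, the bandwidth side of this tradeoff is easy to sketch. The figures below assume the Radeon VII's stock HBM2 setup (1024-bit bus per stack, 2.0 Gbps per pin); the 3- and 2-stack configurations are purely the hypotheticals being discussed, not real products.

```python
# Peak HBM2 bandwidth for the stack-count scenarios discussed above.
# bandwidth (GB/s) = total bus width (bits) x per-pin data rate (Gbps) / 8

def hbm2_bandwidth_gbs(stacks, bus_bits_per_stack=1024, gbps_per_pin=2.0):
    """Peak bandwidth in GB/s for a given number of HBM2 stacks."""
    return stacks * bus_bits_per_stack * gbps_per_pin / 8

for stacks, label in [(4, "Radeon VII, 16GB"),
                      (3, "hypothetical 12GB"),
                      (2, "hypothetical 8GB")]:
    print(f"{label}: {hbm2_bandwidth_gbs(stacks):.0f} GB/s")
```

Four stacks give the Radeon VII its 1 TB/s; dropping to two stacks lands right around Vega 64's roughly 484 GB/s, which is the halving described above.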
     
    Armenius likes this.
  32. _mockingbird

    _mockingbird Gawd

    Messages:
    992
    Joined:
    Feb 20, 2017
    AMD would need to significantly update its delta color compression (which hasn't been updated since Tonga) to be able to sustain that performance with half the memory bandwidth.
     
    Armenius likes this.
  33. bpizzle1

    bpizzle1 2[H]4U

    Messages:
    4,076
    Joined:
    Oct 27, 2007
That's why I said to leave it at 12GB of HBM. It might keep the bandwidth high enough not to neuter the gains.
     
  34. FrgMstr

    FrgMstr Just Plain Mean Staff Member

    Messages:
    48,298
    Joined:
    May 18, 1997
    Let's get back on topic. Enough threads about hypotheticals already.
     
    mbelue and Nightfire like this.
  35. harmattan

    harmattan [H]ardness Supreme

    Messages:
    4,225
    Joined:
    Feb 11, 2008
    Excellent review. Worth the wait.

And I don't think AMD is going to be re-running this card with 12 or 8GB of HBM: that would take engineering, and the point of this card is to move MI50s that didn't pass muster, without much work.

Good to see AMD has been rectifying issues quickly. Also, as is tradition with Vega, I'm seeing many reports of good undervolting results. Will be interesting to see where this card is in a few months.
     
  36. noko

    noko [H]ardness Supreme

    Messages:
    4,277
    Joined:
    Apr 14, 2010
Testing using the fastest API for the system is the most honest way to evaluate. Anyway, on my 4.1GHz 2700, DX12 is usually the faster API, and at times way faster. That is not universal; very fast Intel processors seem to do better with DX11. I think [H] has already proven that more cores does not mean better performance over a quad core in general, if clocked high, as speed is still king for games.
     
    Legendary Gamer, Marees and FrgMstr like this.
  37. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
Also a bit off-topic, but while this is currently true for machines set up for benchmarking, once you start loading up all of the stuff a regular user might have running in the background, more cores can keep things from bumping game processes out of focus and causing performance inconsistencies.

While using an 8700K with its six Skylake cores or a 9900K with its eight Skylake cores would be useful for a non-benchmarking system, at the same clockspeed as a 7700K they're not going to push the Radeon VII or any other GPU harder with the games used in this review.
     
    Hakaba and noko like this.
  38. NKD

    NKD [H]ardness Supreme

    Messages:
    7,589
    Joined:
    Aug 26, 2007
Nice review. Yeah, it's surprising that in some games it just falls behind all of a sudden, and in some games it beats the 2080, lol. Maybe driver updates will close the gap in those games, but honestly it seems some games will never really be optimized for AMD. Maybe they just don't have the manpower to optimize for every game.
     
  39. noko

    noko [H]ardness Supreme

    Messages:
    4,277
    Joined:
    Apr 14, 2010
The Shadow of the Tomb Raider benchmark that other sites use is mostly a flyby. Larger areas are shown, which NVIDIA is better at handling by discarding and not rendering unseen items. Real gameplay is on the ground, not flying around, where you have more stuff blocking each other. In other words, NVIDIA does this game better by rendering only what is needed, while AMD wastefully renders more stuff that is not needed.

Dirt 4 is the game where the Vega VII beat even a 2080 Ti. High memory bandwidth can help in certain games.
     
    cybereality likes this.
  40. Nightfire

    Nightfire [H]ard|Gawd

    Messages:
    1,614
    Joined:
    Sep 7, 2017
    Brent & Kyle, thanks for all of the hard work doing this review. It was well worth the wait!

    Now take your team out for a few beers and get some rest!
     
    AlphaQup likes this.