AMD Radeon RX Vega 56 Video Card Review @ [H]

At least AMD fans looking at 1070 performance aren't too disappointed. Thanks for the review Kyle. I assume once the AIB cards start being reviewed, you'll test vs their 1070 counterparts? Can't wait to see how they hold up there.
 
Very good stock vs. stock performance; it would almost be nice to see the 1080 plots as well, to see how close the Vega 56 gets.
 
dang I guess a 1000W PSU isn't overkill after all.
Overclock a TR and one of these in the same system and it might even be too small.
 
dang I guess a 1000W PSU isn't overkill after all.
Overclock a TR and one of these in the same system and it might even be too small.

Come on, man. Last I checked, Threadripper pulls something like 150W more from the wall than a 7700K when both are overclocked. The entire RX 56/7700K system was pulling a bit over 500W when overclocked. A 750W PSU should be more than enough for an overclocked RX 56 and Threadripper.
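For what it's worth, the back-of-envelope math holds up. A quick sketch using the wall-draw figures quoted in this thread, with the ~90% PSU efficiency being my own assumption:

```python
# Rough PSU headroom check using the numbers quoted above:
# ~500 W at the wall for the overclocked RX 56 / 7700K system, and an
# overclocked Threadripper drawing roughly 150 W more than a 7700K.
# PSU efficiency of 0.90 is an assumed 80 Plus Gold-ish figure, not measured.

WALL_DRAW_RX56_7700K = 500   # W, overclocked system, measured at the wall
THREADRIPPER_DELTA   = 150   # W, extra wall draw of an OC'd TR over a 7700K
PSU_EFFICIENCY       = 0.90  # assumed efficiency at this load

wall_draw = WALL_DRAW_RX56_7700K + THREADRIPPER_DELTA  # total draw at the wall
dc_load   = wall_draw * PSU_EFFICIENCY                 # what the PSU actually delivers

print(f"Estimated wall draw: {wall_draw} W")
print(f"Estimated DC load:   {dc_load:.0f} W")
print(f"750 W PSU headroom:  {750 - dc_load:.0f} W")
```

Even with everything overclocked, a 750W unit keeps well over 100W of headroom on these estimates.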
 
I sure want one! Hopefully there is plenty of stock on monday!
 
Hmm - one of the few times where I can't follow the reasoning in the conclusion. True, Vega 56 looks much better compared to its counterpart from NV than the 64 does, but at roughly the same price and performance as the 1070 it still looks kind of bad because of power consumption and the resulting issues (heat dumped into the case/room, noise, and cost of operation). For some people that might not be an issue, but when you pay around 30 cents (euro) for a kWh, that can change quickly.
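To put a number on the running-cost point, here's a quick sketch. The ~100W delta and the 0.30 EUR/kWh rate come from this thread; the gaming hours are a made-up assumption, so adjust to taste:

```python
# Back-of-envelope extra running cost of a ~100 W hungrier card.
# EXTRA_DRAW_W and RATE_EUR_KWH come from the discussion above;
# HOURS_PER_WEEK is an assumed usage figure for illustration only.

EXTRA_DRAW_W   = 100   # W, roughly Vega 56 over a GTX 1070 under load
RATE_EUR_KWH   = 0.30  # EUR per kWh
HOURS_PER_WEEK = 20    # assumed gaming hours per week

kwh_per_year  = EXTRA_DRAW_W / 1000 * HOURS_PER_WEEK * 52
cost_per_year = kwh_per_year * RATE_EUR_KWH

print(f"Extra energy per year: {kwh_per_year:.0f} kWh")
print(f"Extra cost per year:   {cost_per_year:.2f} EUR")
```

At 20 hours a week that works out to around 30 EUR a year; real money, but over a card's lifetime it matters mostly if the purchase prices are otherwise equal.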

And then there is the whole AMD pricing nonsense going on right now, whereas with NV cards I get the impression that the inflated prices are starting to come down a bit.
 
Great review!

So, I was curious about how the extra 100w power draw would affect my PC, before I mentally commit to buying this for my freesync setup. I've got fairly good airflow and an AIO cooler for the CPU. I grabbed a light socket, ran an extension cord, and stuck a 100w light bulb into my PC case. Sitting here (running an R9 390 on this particular build), I turned on the light bulb. Yep, lottsa hot air came out the top exhausts. There was an added bonus, however. If I laid a cold poptart on top of the exhaust (on aluminum foil, of course!), by level 7, the poptart was toasted! Awesome gaming and hot snacks! Woot!

Thanks for the in-depth look. ;)
 
Thanks for the review. I read the conclusion that "We would be remiss however, if we did not mention the fact that add-in-board partner GeForce GTX 1070 cards with custom cooling typically run at much higher GPU frequencies than the Founders Edition." And that was my exact thinking throughout the review given the power discrepancy. So is the statement that "we feel AMD Radeon RX Vega 56 already has a very good performance advantage that it can keep up with factory overclocked GeForce GTX 1070 video cards." a lead in for a future review?
 
Nice review. Do you think the MSAA issues with the Vega are driver related or hardware related?
 
If the performance gap between AIB cards mirrors the gap between the launch/FE cards, then the availability of nice FreeSync monitors for under $250 CAD is more than enough to sway me into the red camp.

Adaptive refresh is incredible to experience, and given all other factors are roughly equal (performance, cost), the very large difference in cost between FreeSync monitors and G-Sync monitors is going to drive a lot of people back to ATi.
 
Heat and power don't confront me none. I'm still running a pair of 1300MHz firmware-modded, voltage-bumped 780 Ti cards and a 5GHz, 1.35V 2600K in a Z77; it's not unusual to see me pulling 650W+ through the PSU when gaming at 1440p (according to Corsair's PSU monitoring fed to AIDA64).

Vega don't seem so bad, comparatively. :)
 
Great review!

So, I was curious about how the extra 100w power draw would affect my PC, before I mentally commit to buying this for my freesync setup. I've got fairly good airflow and an AIO cooler for the CPU. I grabbed a light socket, ran an extension cord, and stuck a 100w light bulb into my PC case. Sitting here (running an R9 390 on this particular build), I turned on the light bulb. Yep, lottsa hot air came out the top exhausts. There was an added bonus, however. If I laid a cold poptart on top of the exhaust (on aluminum foil, of course!), by level 7, the poptart was toasted! Awesome gaming and hot snacks! Woot!

Thanks for the in-depth look. ;)

Considering you already have a 390 in that case, replacing it with a V56 wouldn't be much of a change at all. Your 390 is already a 1070+light bulb. The V56 is not a 390+light bulb.
 
This is definitely the Vega sweet spot for me. The 64 is just too expensive and thirsty compared to the competition. If these are actually $400 at any point in the next few months, it's compelling.

Once again, power efficiency is plaguing AMD here a bit. Thanks for the review as always, [H].
 
Yeah, AIB Vega 56 is gonna be great honestly - a true successor to the 290, haha.
Kyle - any chance you explore HBM overclocking as a later supplement? Buildzoid mentioned that core clock scaling already tapers off past 1600MHz core, and that's assuming HBM is at 1GHz (instead of the 800MHz stock). I believe Steve at GamersNexus got about 945-950MHz as well.
 
That last line is the most important line in the entire review. If somehow these things become obtainable at MSRP, then it looks to be the value leader right now. However, even $50 more and you might as well just buy the GTX 1080.
 
Wow, do I see a competitive product finally?

Let's see how bad the pricing is, though... I bet that will be the undoing of it all. Not that the 1070 isn't being gouged and marked up either. As long as they are within 50 bucks or so, it should be a wash.

Edit: What Nimisys said too. If the 1070's price is not reduced and Vega is competing with that, you're so close to a 1080 already... which means muh mining may make these not worthwhile all over again.
Although, this may change again when adaptive sync TVs are out. That's really what I'm attracted to for gaming in the future: a cheap, reasonable-quality adaptive sync TV that does both TV and gaming...

I wonder (just like with 10-bit desktop use) when JHH will finally allow use of adaptive sync.
 
This looks like a pretty promising card. I REALLY hope AMD sticks to MSRP on this one and there aren't a bunch of bullshit shenanigans with rebates. The bulk of sales in the video card market is in the lower price tiers, so if AMD has a compelling offer at the $399 price point, they could do some nice business with this card.
 
This looks like a pretty promising card. I REALLY hope AMD sticks to MSRP on this one and there aren't a bunch of bullshit shenanigans with rebates. The bulk of sales in the video card market is in the lower price tiers, so if AMD has a compelling offer at the $399 price point, they could do some nice business with this card.

It is not a pretty picture however you look at it. AMD is paying a good deal for HBM2, performance for Vega is not the best and does not scale, and if "they" can come through on price it becomes palatable, but only for a small segment of the market.

If the price is not right, then why bother purchasing a Vega-based card?
 
And that's about all HBM2 is doing for AMD with Vega: making it more expensive.

Vega can't use the extra bandwidth, if there is any, that HBM2 in their implementation provides over GDDR5X; though on balance, with GDDR5X, Vega would be even more power-inefficient.
 
Given I have a freesync monitor, I may try to sell my 1070 at some not-so-obscenely high price and get a Vega 56. I'll wait and see.
 
Nice to see AMD at least releasing something that gets a Silver Award.

Looking forward to see if their next gen stuff can be truly competitive.
 
I just may have to upgrade my RX 580 down the line. Surprised at the performance of the RX Vega 56.
 
Nice to see AMD at least releasing something that gets a Silver Award.

Looking forward to see if their next gen stuff can be truly competitive.

Their Ryzen and Threadripper CPUs are amazing as well. Not sure if they got any awards though, can't recall. Will stick with my i5 for now, however.
 
It is not a pretty picture however you look at it. AMD is paying a good deal for HBM2, performance for Vega is not the best and does not scale, and if "they" can come through on price it becomes palatable, but only for a small segment of the market.

If the price is not right, then why bother purchasing a Vega-based card?
I'm aware of the limitations of the architecture. At $399 Vega56 is still a good deal as a consumer if you can live with the power consumption - which frankly should be most of the users here, anyway - and it offers a good mid-range GPU for FreeSync users.

For AMD's bottom line, yeah, I'm sure the profit margin is razor thin because of the GPU size + HBM, but they need market share badly.
 
Kyle, do you have any insight into why the drivers for Vega feel so rushed? Not having fully functional CrossFire isn't entirely unexpected, but the lack of overclocking functionality in the release drivers is unusual. They were working on this for a very long time, so what gives?
 
Well, this is more like the performance I was expecting from Vega 56. Please do an OC follow-up against the GTX 1070. My GTX 1070 FE can easily do 2GHz core / 9.2GHz mem while drawing less power than a stock Vega 56.

I'm wondering why NVIDIA chose not to let the GTX 1070 have 9GHz memory like the GTX 1060.
 
Well, this is more like the performance I was expecting from Vega 56. Please do an OC follow-up against the GTX 1070. My GTX 1070 FE can easily do 2GHz core / 9.2GHz mem while drawing less power than a stock Vega 56.

I'm wondering why NVIDIA chose not to let the GTX 1070 have 9GHz memory like the GTX 1060.

Then the GTX 1070 would step on the GTX 1080's toes. Probably the same reason that AMD has a BIOS check that won't allow people to flash homebrew BIOSes onto their cards as we've done in the past. If the Vega 56 got any closer to the Vega 64, there would be no reason for the Vega 64 to exist.
 
Brent/Kyle, any plans for an undervolt test?

Hardwareluxx had some interesting results with the 56: a higher clock rate while using 73W less power. Heck, the Vega 64 had more stable clocks and lower power for more performance... makes me wonder WTF AMD is doing.


https://translate.google.de/translate?sl=de&tl=en&js=y&prev=_t&hl=de&ie=UTF-8&u=https://www.hardwareluxx.de/index.php/artikel/hardware/grafikkarten/44084-amd-radeon-rx-vega-56-und-vega-64-im-undervolting-test.html&edit-text=

The Witcher 3 result is a bit of a whoa!
 
Brent/Kyle, any plans for an undervolt test?

Hardwareluxx had some interesting results with the 56: a higher clock rate while using 73W less power. Heck, the Vega 64 had more stable clocks and lower power for more performance... makes me wonder WTF AMD is doing.

My guess is they are using the lowest common denominator, which in this case means whatever voltage is required to achieve stock clocks reliably across a standard-bin chip. Undervolting may work differently depending on your silicon lottery number, just like overclocking does.

I'm sure if AMD could reliably get away with lower voltage and higher clocks, they would...
 
And that's about all HBM2 is doing for AMD with Vega: making it more expensive.

Vega can't use the extra bandwidth, if there is any, that HBM2 in their implementation provides over GDDR5X; though on balance, with GDDR5X, Vega would be even more power-inefficient.

Here's someone explaining what is going on with Vega: scaling being dependent on HBM clocks. ***timestamped
 
My guess is they are using the lowest common denominator, which in this case means whatever voltage is required to achieve stock clocks reliably across a standard-bin chip. Undervolting may work differently depending on your silicon lottery number, just like overclocking does.

I'm sure if AMD could reliably get away with lower voltage and higher clocks, they would...

They don't need higher clocks, though; on the 64 they lowered the power consumption, improved performance, and ran a less aggressive boost clock. Seems to me like they are chasing the highest possible max FPS number rather than a more stable long-term framerate.

Binning the 56s that undervolt well would probably be good for their reputation too; getting near-1070 power usage with 1080 performance would have made it quite hard for anyone to question them for coming to the party late.

I wonder what their memory straps are on the HBM2 as well now. Apparently 1025MHz gets great mining results, but going over that leads to a performance decline, and Eth really loves aggressive memory straps, as everyone who BIOS-mods their Radeons is aware.
 
Great review as always Kyle, thanks for the read.

Knocked it out of the park with the conclusion, especially that last line about price. It'll be the make or break point for this card.
 
I'm all for AMD, but damn, that power draw is crazy. Seems like in this day and age things would get more efficient, not less, but I am no expert on video cards.
 
I'll wait for AIB partner cards and to see where prices fall. I've been thinking of upgrading from the GTX 1060 that replaced my 290X when it was worth more to sell it to a miner. The only really appealing card at the moment is the 1080 Ti, as its prices aren't dumb compared to MSRP and it provides pretty decent perf/$; I feel it's still reasonably on the curve. However, I would love to make good use of my 1440p/144Hz FreeSync panel.
 
I assume once the AIB cards start being reviewed, you'll test vs their 1070 counterparts?
As we have for the last 20 years....

Thanks for the review. I read the conclusion that "We would be remiss however, if we did not mention the fact that add-in-board partner GeForce GTX 1070 cards with custom cooling typically run at much higher GPU frequencies than the Founders Edition." And that was my exact thinking throughout the review given the power discrepancy. So is the statement that "we feel AMD Radeon RX Vega 56 already has a very good performance advantage that it can keep up with factory overclocked GeForce GTX 1070 video cards." a lead in for a future review?
We always do followup reviews with AIB custom PCB and cooling cards.

Nice review. Do you think the MSAA issues with the Vega are driver related or hardware related?
I think it is hardware related.

Kyle - any chance you explore HBM overclocking as a later supplement?
As always, we will push both GPU and memory clocks with AIB cards.

Nice to see AMD at least releasing something that gets a Silver Award.
Been a good number of awards for AMD products in the last couple of years.

Kyle, do you have any insight into why the drivers for Vega feel so rushed?
Nope. I would expect CrossFire drivers in a month or so.

Brent/Kyle any plans to undervolt test?
No.

Have you guys tinkered with undervolting?
No.
 
Just in case this is a democracy... I vote for undervolt testing as well! Would be very curious to see if your card can shave off 70 watts like hardwareluxx.de found.
 
Good luck finding an RX Vega 56 at $399; never gonna happen. The real price is going to be $499, and it is at that price that the comparison should have been done. Disappointing review from HardOCP...
 
They don't need higher clocks, though; on the 64 they lowered the power consumption, improved performance, and ran a less aggressive boost clock. Seems to me like they are chasing the highest possible max FPS number rather than a more stable long-term framerate.

I haven't had an AMD card since my 6790, but my understanding of PowerTune was that it worked nearly the opposite of NVIDIA's Boost.

PowerTune has a default high clock that it maintains, and it only underclocks if it exceeds a power or temperature limit. Boost has a lower default clock and ramps up as power and temperature allow, up to a maximum clock speed.

PowerTune underclocks when needed, Boost overclocks when possible.
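The contrast can be caricatured in a few lines of Python. To be clear, this is a toy model of the two starting-point philosophies as described above; the clock numbers, the power limit, and the linear ramp are all made up for illustration and bear no relation to either vendor's actual firmware:

```python
# Toy model: PowerTune starts high and backs off; Boost starts low and ramps up.
# POWER_LIMIT_W and all clock figures are invented for illustration.

POWER_LIMIT_W = 210  # assumed board power limit, watts

def powertune_clock(default_mhz: int, draw_w: float) -> int:
    """Hold the default (high) clock; underclock only when over the power limit."""
    if draw_w > POWER_LIMIT_W:
        # back off proportionally to how far over budget we are
        return int(default_mhz * POWER_LIMIT_W / draw_w)
    return default_mhz

def boost_clock(base_mhz: int, max_mhz: int, draw_w: float) -> int:
    """Start at the base (low) clock; ramp toward the max while headroom remains."""
    headroom = max(0.0, POWER_LIMIT_W - draw_w)
    # toy linear ramp: more power headroom -> closer to the max clock
    return min(max_mhz, base_mhz + int(headroom * (max_mhz - base_mhz) / POWER_LIMIT_W))

# Same power situation, opposite starting points:
print(powertune_clock(1471, draw_w=230))        # over budget: clocks down
print(boost_clock(1506, 1900, draw_w=120))      # headroom left: clocks up
```

Same knob, opposite defaults: one policy promises a clock and retreats from it, the other promises a floor and opportunistically exceeds it.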

So I'm not sure, when you say a less aggressive boost clock, if you are talking about the default clock, or a less aggressive PowerTune profile (which would more or less amount to the same thing as a more aggressive Boost profile), or what.

Sure, AMD could have lowered the default voltage and clocks on any of their cards, the 56 is no exception. The state of power management in chips today is that you can pretty much pick whatever TDP/Voltage and/or temperature profile you want to run the chip at, and the properties of your design and silicon and process node and so forth will then determine what clocks you can run for that given setting. AMD obviously wants to be able to compete versus the 1070 in gaming, this review shows they succeeded there. The TDP/Voltage is what it needed to be to achieve that with an acceptable yield.
 