AMD Radeon RX Vega 56 versus AMD Radeon R9 Fury @ [H]

Great comparison, Kyle. Still slightly disappointed given the hype, but at least AMD is heading back in the right direction.
 
Great comparison, Kyle. Still slightly disappointed given the hype, but at least AMD is heading back in the right direction.

I am just wondering what they've used all the transistors for; is there something missing?
Something is up; it's either horribly bad, or something we'll see addressed in drivers over the next two years.

One thing is for certain: AMD cards age well, but I do have a feeling something is a bit messed up this time. Anyway, the price is sweet, and if you undervolt this guy you have a GTX 1070 killer. It's currently priced well below any 1070 in my country, so it's quite a sweet card.
 
Very nice comparison for anyone on the fence debating an upgrade. The Vega 56 is the sweet spot in AMD's lineup this year.
 
Well, that's good news for AMD then! Let's hope they sell plenty and have enough funds in the pot to develop the next-gen card and be competitive with Nvidia in a more timely manner.
 
I am just wondering what they've used all the transistors for; is there something missing?
Something is up; it's either horribly bad, or something we'll see addressed in drivers over the next two years.

I think the extra transistors are for the extra HBM; Vega cards have 8GB where the Fury cards had 4GB.
 
I am just wondering what they've used all the transistors for; is there something missing?
Something is up; it's either horribly bad, or something we'll see addressed in drivers over the next two years.

One thing is for certain: AMD cards age well, but I do have a feeling something is a bit messed up this time. Anyway, the price is sweet, and if you undervolt this guy you have a GTX 1070 killer. It's currently priced well below any 1070 in my country, so it's quite a sweet card.

The extra transistors were spent lengthening the pipeline to squeeze the clocks as high as possible. That's all.

On a side note:

What's interesting is the evidence of AMD marketing striking again, fooling people by stating the TDP for the GPU only, without the rest of the card and components. So now we know for sure that the TBP (Total Board Power) for Vega 56 is even more than the 275W of the Fury: 40% more clocks for ~25% more performance.
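To put rough numbers on that claim, here is a back-of-the-envelope sketch using the approximate ~40% clock and ~25% performance deltas cited above (exact figures vary by game), showing how far below linear the clock scaling lands:

```python
# Rough per-clock scaling efficiency from the figures cited above:
# ~40% higher clocks yielding ~25% higher performance (approximate).
clock_gain = 1.40  # Vega 56 clock relative to Fury
perf_gain = 1.25   # observed performance relative to Fury

# With perfectly linear clock scaling this ratio would be 1.0.
efficiency = perf_gain / clock_gain
print(f"Per-clock scaling efficiency: {efficiency:.2f}")  # ~0.89
```

In other words, on those numbers roughly 11% of the clock gain fails to translate into frames.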
 
Right, it would be interesting to downclock the memory first so the bandwidth is the same, and then drop the core clock to match the Fury.
That way we could see which change matters most, considering the clocks increased 40% with the RX Vega 56.
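For anyone attempting that, a minimal sketch of the bandwidth math (assuming reference memory configurations: 4096-bit HBM1 at 500 MHz on the Fury, 2048-bit HBM2 at 800 MHz on the Vega 56). Note that at reference clocks the Vega 56 actually has less peak bandwidth than the Fury, so matching them means pulling the Fury's memory down rather than the Vega's:

```python
# Peak bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8.
# HBM is double data rate: two transfers per memory clock.
def peak_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int) -> float:
    data_rate_gbps = mem_clock_mhz * 2 / 1000  # DDR: 2 transfers per clock
    return data_rate_gbps * bus_width_bits / 8

fury_bw = peak_bandwidth_gbs(500, 4096)    # HBM1 -> 512.0 GB/s
vega56_bw = peak_bandwidth_gbs(800, 2048)  # HBM2 -> 409.6 GB/s
print(f"Fury: {fury_bw:.1f} GB/s, Vega 56: {vega56_bw:.1f} GB/s")
```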
 
I second the desire for a clock-for-clock look at how GCN compares to NCU in terms of geometry processing and shader throughput.
 
I second the desire for a clock-for-clock look at how GCN compares to NCU in terms of geometry processing and shader throughput.

Vega 56 will perform slower than Fiji at the same clocks and the same bandwidth. NCU is marketing terminology to rename GCN as a new architecture. There are a few changes to the front end that will offer better geometry output than Fiji, but that will only be noticeable in those very specific scenarios where geometry processing can bottleneck beyond ROP performance (one of the reasons we don't see linear scaling with clocks).
 
OK, so the Vega 56 has up to a 47% clock speed advantage over the R9 Fury?

Does it perform up to 47% faster?

There is something fishy about Vega. It's hot, and it's slow. If they had done a simple die shrink and a power-optimization pass on the R9 die, it would have worked out better than Vega.
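For reference, the 47% figure falls straight out of the reference clocks (assuming 1000 MHz for the R9 Fury and the 1471 MHz boost clock for the RX Vega 56):

```python
# Where the "up to 47%" clock speed advantage comes from (reference clocks).
fury_core_mhz = 1000     # R9 Fury reference core clock
vega56_boost_mhz = 1471  # RX Vega 56 reference boost clock

advantage = vega56_boost_mhz / fury_core_mhz - 1
print(f"Clock advantage: {advantage:.1%}")  # 47.1%
```

As the benchmarks in the article show, the actual performance gap is well short of that.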
 
I believe this is a typo:

The AMD Radeon R9 Fury X debuted on June 24th, 2015 for $649 MSRP. The AMD Radeon R9 Fury debuted on July 10th, 2015 for $399 MSRP. The AMD Radeon R9 Fury was to be an add-in-board partner supported video card. We evaluated the ASUS STRIX Radeon R9 Fury, which launched at $579 and carried reference AMD Radeon R9 Fury specifications and clock speeds.
Wasn't it $549?

Yes it was, fixed. -Brent
 
OK, so the Vega 56 has up to a 47% clock speed advantage over the R9 Fury?

Does it perform up to 47% faster?

There is something fishy about Vega. It's hot, and it's slow. If they had done a simple die shrink and a power-optimization pass on the R9 die, it would have worked out better than Vega.
You do have a point here... I mean, it's still a great card... but something didn't come out quite right, lol.
 
Clock speed is generally a trade-off; you have to give up something in the architecture of the chip (lengthening the pipeline, etc.) to get that increased clock speed. From what I have read on the development of Vega, they did a lot of work on the architecture to make it clock higher. But that came at a cost, so a 47% clock speed increase is not going to give a raw 47% speed increase.

That said, something clearly is off, because they did a die shrink, spent two years and 4 billion transistors, and the performance is relatively weak given the development time frame.
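To put the transistor remark in perspective, here is a small sketch using the commonly cited die figures for the two chips (Vega 10 adds roughly 3.6 billion transistors over Fiji):

```python
# Commonly cited die figures for Fiji (Fury) and Vega 10 (Vega 56/64).
chips = {
    # name: (transistors, process, die size in mm^2)
    "Fiji":    (8.9e9, "28 nm", 596),
    "Vega 10": (12.5e9, "14 nm", 486),
}
for name, (transistors, process, die_mm2) in chips.items():
    print(f"{name}: {transistors / 1e9:.1f}B transistors, {process}, {die_mm2} mm^2")

extra = chips["Vega 10"][0] - chips["Fiji"][0]
print(f"Extra transistors: {extra / 1e9:.1f}B")  # ~3.6B
```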
 
I find this article's conclusion about the architecture being improved dubious. I enjoyed a different article that clocked the Vega at the same clock as the Fury to see if Vega really was a design achievement. The performance was almost exactly the same. So we got a die size drop, a frequency increase, and a memory size increase. For those of us with overclocked Furys at 1200 MHz plus, the Vega just isn't that much of an upgrade for the money. Thanks.
 
I find this article's conclusion about the architecture being improved dubious. I enjoyed a different article that clocked the Vega at the same clock as the Fury to see if Vega really was a design achievement. The performance was almost exactly the same. So we got a die size drop, a frequency increase, and a memory size increase. For those of us with overclocked Furys at 1200 MHz plus, the Vega just isn't that much of an upgrade for the money. Thanks.

I found that article, and give or take some minor variances, the two chips performed the same.

I think AMD have lied about, or deliberately misrepresented, what Vega actually is. Vega seems to be not much more than a die shrink (or is it?) with minor compute-related changes. Tessellation performance is the same, and geometry culling seems no different, despite the new "geometry cores". And don't get me started on Raja and his total BS about the driver guys hating him because the architecture is so different from anything they have made before.

This explains a lot about these cards to my ears. Maybe I'm wrong, but this also explains why the cards use so much power and produce so much heat.

So the bottom line is that I'm calling AMD out for BS. Maybe Vega was a true next-gen architecture at one point, but something happened (maybe they ran out of money, or there were huge bugs that there was simply no time to fix, etc.), and they had to go with a die shrink with some feature band-aids added instead, and now they're trying to market it as something new.

No wonder nVidia are not bothered about Vega: there will be no mystical new AMD drivers that suddenly unlock 25% more performance in a month or two, because there are no new meaningful hardware features present in the die to expose!
 
Tbh, I wouldn't call the fps you got "unplayable". Yeah, it's not 60 all the time, but at least with FreeSync, playing with average fps in the 40s and 50s is no issue at all. Still much better than a console's 30 fps.
 
Tbh, I wouldn't call the fps you got "unplayable". Yeah, it's not 60 all the time, but at least with FreeSync, playing with average fps in the 40s and 50s is no issue at all. Still much better than a console's 30 fps.
It's kind of subjective. I'm so used to 80-90fps now that anything <60fps feels awkward and jarring. I say this as someone running GSYNC. Even if there is no tearing or input lag, there is still a noticeable decrease in fluidity at lower frame rates.
 
It's kind of subjective. I'm so used to 80-90fps now that anything <60fps feels awkward and jarring. I say this as someone running GSYNC. Even if there is no tearing or input lag, there is still a noticeable decrease in fluidity at lower frame rates.
Not that fluid, sure. But definitely not unplayable.
 
Could not find any for $399. Newegg? Amazon? Best Buy had them listed at $499. Totally disappointed with RTG.

Good comparison review.

Why are you blaming RTG for merchant gouging? They're just scalpers with a fancy e-store; they priced the 64 the same as a Ti here. AMD didn't.

A few weeks later, surprise surprise, they're still in stock, since no one is that stupid, and they're already down 200 bucks from launch.
We'll see how the V56 changes that further.

Great review and interesting performance changes.
 
Thanks for the review!

The 56 performs better than I expected, especially since most of the raw specs are identical. Heck of an improvement.
 
I have 2x R9 Fury X in CrossFire and a 3440x1440 FreeSync monitor. Do I upgrade to a Vega 56/64 or wait for the next cycle?
 
The power draw seems so very high to me. What is the power draw when you undervolt it?
 
I have 2x R9 Fury X in CrossFire and a 3440x1440 FreeSync monitor. Do I upgrade to a Vega 56/64 or wait for the next cycle?
Just having double the VRAM ought to be nice in itself. I couldn't go back from 8GB these days!
 
Excellent comparison, thanks. I do not game all that much anymore, even though I have a couple of hundred games on just my PC. (This does not include my Xbox One or 360.) :D Therefore, I am going to stick with my R9 Fury just a little while longer.
 
Right, it would be interesting to downclock the memory first so the bandwidth is the same, and then drop the core clock to match the Fury.
That way we could see which change matters most, considering the clocks increased 40% with the RX Vega 56.

It's funny, because if you do this with Pascal, it gets obliterated by Maxwell.
 
I have a 2x R9 Fury X in cross fire and a 3440x1440 freesync monitor. Do I upgrade to Vega 56/64 or wait for the next cycle?

Depends on whether your current setup is feeling long in the tooth. If you are experiencing any VRAM limits, or you game at 1440p+ with very high image quality settings and are unhappy with your current results, then upgrading to the faster 8GB parts might be warranted.
 
I have heard that CrossFire is not enabled for Vega in the current drivers. I am trying to decide if I should upgrade my 390 8GB to a CrossFire Vega 64 setup. Waiting for non-reference cards, driver improvements, and reviews of CrossFire Vega specifically.

Also, AMD should be flogged for disabling SR-IOV on consumer Vega cards.
 
Clock vs. clock should be interesting. Any links out there that people have found? 1 GHz Fiji vs. 1 GHz Vega.

You could downclock Pascal and put it up against same-clock Maxwell. You would be disappointed too. The bottom line is Fiji can't do 1.4 GHz and Vega can. That's all. This clock-vs-clock crap is getting old. Pascal wasn't an IPC miracle either; it was a beast to be reckoned with regardless. And that's all that matters. A higher clock is an architectural improvement.
 
You could downclock Pascal and put it up against same-clock Maxwell. You would be disappointed too. The bottom line is Fiji can't do 1.4 GHz and Vega can. That's all. This clock-vs-clock crap is getting old. Pascal wasn't an IPC miracle either; it was a beast to be reckoned with regardless. And that's all that matters. A higher clock is an architectural improvement.

Why is it "getting old"? Why are you offended by raw data?
 
Why is it "getting old"? Why are you offended by raw data?
Because it does nothing in the real world. In the real world you won't downclock Vega, and you couldn't make Fury run at Vega speeds, if you know what I mean. I am not offended; it's a scenario that seems like a waste of time. I would rather have the reviewers focus on something else than downclock and bench against Fury. If you had a Vega, would you downclock it and run it like a Fury? It really serves no real-world purpose.
 