AMD Radeon RX Vega 64 versus Radeon R9 Fury X @ [H]

Great comparison, Kyle.

Just for kicks, because I'm benching a 780 Ti at the moment: I have one in my rig, so basically the same specs (7700K @ 4.9 GHz, 3600 DDR4, full custom water loop, 2x HDDs, 2x SSDs, EVGA 750 G2 PSU). In Unigine Heaven I get 400 W wall power with this card at stock clocks (it's a Kingpin edition card on an LN2 BIOS, so it's not quite stock, but close enough, right?). Maxed out on water, this system will pull 500-600 W in 3DMark (Heaven will pull about 530 W).
 
Straight away it is clear that the AMD Radeon RX Vega 64 consumes more power than the AMD Radeon R9 Fury X. The Fury X can reach some high wattages as well, up to 400 W, but the RX Vega 64 on average pulls well over 400 W all the time, in every game, whereas the Fury X stays below 400 W in many games.
Given the fact that AMD Radeon RX Vega 64 is providing 30-40% higher performance, it logically makes sense. At least the power increase isn't being wasted; the RX Vega 64 really is providing a much better gameplay experience, using that higher wattage to deliver better performance.

It might be logical if we only look at AMD's standards, but if we compare against the competition's standards, we can see how much NVIDIA dominates the power sector:
Specifically, the GTX 1080 is also about 30% faster than the 980 Ti, but, in contrast to AMD, it consumes 50 W less power than its predecessor ( https://www.hardocp.com/article/2016/05/17/nvidia_geforce_gtx_1080_founders_edition_review/13 ). So we see that one company increases performance while decreasing power, while the other company can manage a similar increase in performance, but only through increased power (which was already high back in the Fury X's day!).
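To put rough numbers on that perf-per-watt argument, here is a minimal sketch. All figures are illustrative assumptions (midpoints of the ranges tossed around in this thread), not measured data:

```python
# Rough perf-per-watt comparison; the performance and wattage figures
# below are illustrative assumptions, not measured data.

def perf_per_watt(relative_perf, watts):
    """Relative performance divided by power draw in watts."""
    return relative_perf / watts

# Assume Fury X = 1.00 baseline at ~380 W, and Vega 64 ~35% faster
# at ~430 W (midpoints of the ranges mentioned above).
fury_x = perf_per_watt(1.00, 380)
vega_64 = perf_per_watt(1.35, 430)

# Efficiency gain of Vega 64 over Fury X, in percent.
gain = (vega_64 / fury_x - 1) * 100
print(f"Vega 64 perf/W vs Fury X: {gain:+.1f}%")
```

Under those assumed numbers Vega 64 still comes out ahead on perf/W versus the Fury X, just nowhere near the generational efficiency jump Pascal made over Maxwell.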

P.S. Great review, thanks :)
 
It might be logical if we only look at AMD's standards, but if we compare against the competition's standards, we can see how much NVIDIA dominates the power sector:
Specifically, the GTX 1080 is also about 30% faster than the 980 Ti, but, in contrast to AMD, it consumes 50 W less power than its predecessor ( https://www.hardocp.com/article/2016/05/17/nvidia_geforce_gtx_1080_founders_edition_review/13 ). So we see that one company increases performance while decreasing power, while the other company can manage a similar increase in performance, but only through increased power (which was already high back in the Fury X's day!).

P.S. Great review, thanks :)

The TL;DR on power use is what we already know: AMD had to push these cards to their max out of the box to compete, hence the massive power use. It's even excessive, given that many people have found they can undervolt the cards and greatly reduce power consumption without taking a performance hit. For overclockers it's really a catch-22: NVIDIA makes the best chips by far, and has for a while now, but does just about everything possible to lock them down. AMD seems to take the opposite path in all respects, but the thing about Vega and the latest drivers is that they now also seem to be taking the NVIDIA path of locking down control of important things on their cards, in AMD's case HBM voltage.

For general end users, though, none of this matters: just buy a 1080 or better if you want efficient performance. If you want bang for your buck, maybe you can get a Vega 56, as it's at least appealing at a $400 price point, if you can actually find one at that price.
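A back-of-the-envelope way to see why undervolting cuts power so much without hurting performance: dynamic power scales roughly with frequency times voltage squared. The voltages below are made-up illustrative values, not actual Vega settings:

```python
# Dynamic power scales roughly as P ~ f * V^2, so even a modest
# undervolt at the same clock cuts power noticeably. The voltages
# here are illustrative assumptions, not actual Vega settings.

def relative_dynamic_power(v_new, v_old, f_new=1.0, f_old=1.0):
    """Dynamic power after a voltage/frequency change, relative to stock."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Dropping the core from a hypothetical 1.20 V to 1.05 V at the same clock:
ratio = relative_dynamic_power(1.05, 1.20)
print(f"estimated dynamic power: {ratio:.0%} of stock")
```

This ignores static leakage and boost-behavior changes, but it shows why a ~12% undervolt can shave well over 20% off the dynamic power.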
 
Why weren't other cards benched along side it?

We know that one is faster than the other, but how does it compare to the green team?
 
Just one question: why didn't you use the 17.8.2 drivers, which are the latest available for Vega 64? (You could have done these tests previously, I know; I just wanted a clear reason.)

He reused the numbers from the first V64 review.
 
A lot of us were counting on another 20% for Vega through the newer GCN revision and tile-based deferred rendering... both of which either don't seem to work or don't do much.
 
While this is a nice review and all, I guess I don't get why it's relevant.

One would expect a new GPU with new architecture to be better and faster.

Unfortunately, AMD has come out with a product that can't compete across the board with NVIDIA, so the only thing left to do is compare their last generation with their new one? (Yeah, I get that the 56 might be able to go toe-to-toe with a GTX 1070.)

And lately price is out of the question... you can't find either an AMD or NVIDIA GPU anywhere near its MSRP, if they are in stock at all.

I was lucky to hit the nvidia website 3 weeks ago when they actually had a few 1080 Ti FE in stock for the true MSRP.
 
He reused the numbers from the first V64 review.

OK. I can't see where that is stated in the text, so I must have missed it.

It's just weird having a section in the conclusion that goes on about the API and drivers needing work, while testing with a month-old beta driver.
The Fury X, on the other hand, is using the latest driver.
 
Unfortunately, AMD has come out with a product that can't compete across the board with NVIDIA, so the only thing left to do is compare their last generation with their new one? (Yeah, I get that the 56 might be able to go toe-to-toe with a GTX 1070.)
The Vega 64 is comparable (give or take) to the GTX 1080. Vega 56 is beating GTX 1070 by somewhere in the range of 20%. Not "toe to toe". I'd consider that competitive.
 
It's just weird having a section in the conclusion that goes on about the API and drivers needing work, while testing with a month-old beta driver.
I've tried all the drivers, and there wasn't really much performance difference outside of a couple games that Kyle wasn't using for the review.
 
They did that in the original review, and everyone at this point kind of knows where things landed.

https://www.hardocp.com/article/2017/08/14/amd_radeon_rx_vega_64_video_card_review/1

Yeah, and they left out the 1080 Ti, which actually costs the same and in a lot of cases is even cheaper than Vega 64.

They compare to the last-gen flagship, but they don't do the same with the current competitor's flagship. It's at the point where it's almost cynical and hypocritical: when the 290X launched it was even TESTED AGAINST the much more expensive GTX Titan, you know, because it was the Titan killer; prices weren't any concern then, right?

Doesn't anyone remember?

Brent said:
The R9 290X is performance comparable to the GeForce GTX TITAN at 2560x1600 (1600p) and lower (1080p.) In every game we played we were able to match the same gameplay experience as the GeForce GTX TITAN. That means that every game played the same between both video cards, the same performance, same in-game settings, and same smoothness. Performance in terms of frames per second was either right under, or right at TITAN FPS performance. The performance differences were so insignificant that those did not translate to any noticeable gameplay differences at all.

The situation changes when the Radeon R9 290X is pushed up to Ultra HD 4K display resolutions. Whatever magic AMD is doing at 4K, it is working. We found that the Radeon R9 290X was faster in every game compared to the GeForce GTX TITAN at 3840x2160. It wasn't just a little bit faster either, it was anywhere from 10-23% faster. This is a significant difference. At 4K resolutions, you need every bit of performance you can get to achieve playable performance at high in-game settings on a single video card. The Radeon R9 290X is clearly the card for Ultra HD 4K.

They even did a 4K-exclusive review to show how much better it was than the GTX Titan, and the same move was made with the Fury X: it was tested against the 980 Ti and the Titan X. Price wasn't a concern either; performance against the competitor's high end is all that matters... right?

Brent said:
However, as we are exploring, as a bigger picture, the impact of single-GPU video card performance at 4K we also cannot leave out the much more expensive, but also marketed for 4K gaming, NVIDIA GeForce GTX TITAN X 12GB video card. This video card is nowhere near the same pricing as the others. It is a $999 video card in its least expensive form. However, it is also a single-GPU video card just like the other two and is the fastest thing NVIDIA has at the moment.

So yeah, the scenario is not the same with Vega. It seems more and more that every review lately is handled carefully by Kyle to avoid conflict between [H] and AMD. We all know Kyle has stated several times that he is playing nice with AMD, but at this point every review just seems to be more "delicate" content, more apologia for the failure AMD represents right now in the high-end segment. The excuse is that "everyone knows how the 1080 Ti and Titan X(P)/Xp compare against the GTX 1080," but they don't have the balls to compare them directly, to avoid conflict with AMD. It's even funny sometimes.
 
Yeah, and they left out the 1080 Ti, which actually costs the same and in a lot of cases is even cheaper than Vega 64.

They compare to the last-gen flagship, but they don't do the same with the current competitor's flagship. It's at the point where it's almost cynical and hypocritical: when the 290X launched it was even TESTED AGAINST the much more expensive GTX Titan, you know, because it was the Titan killer; prices weren't any concern then, right?

Doesn't anyone remember?



They even did a 4K-exclusive review to show how much better it was than the GTX Titan, and the same move was made with the Fury X: it was tested against the 980 Ti and the Titan X. Price wasn't a concern either; performance against the competitor's high end is all that matters... right?



So yeah, the scenario is not the same with Vega. It seems more and more that every review lately is handled carefully by Kyle to avoid conflict between [H] and AMD. We all know Kyle has stated several times that he is playing nice with AMD, but at this point every review just seems to be more "delicate" content, more apologia for the failure AMD represents right now in the high-end segment. The excuse is that "everyone knows how the 1080 Ti and Titan X(P)/Xp compare against the GTX 1080," but they don't have the balls to compare them directly, to avoid conflict with AMD. It's even funny sometimes.

To be fair, that review was done pre-launch and off the SEP. It would have been nice to see flagship vs. flagship, though. Even now I see other places comparing it to the 1080, which is asinine at this point.

Give me the OC vs OC reviews I love...
 
It seems like some people are missing the point of this review and the previous Vega 56 vs. Fury review: it was simply to compare the flagship cards from the previous generation with the latest generation, to see how performance had increased. That is why no NVIDIA cards are included.

On the Vega 56 vs. Fury review, there were comments that the Vega cards are only faster than the Fury/Fury X because they have higher clock speeds. While that might be true if they were tested clock for clock, the fact is that the Fury cards would not overclock anywhere near Vega clocks. It is only through architectural development that the Vega cards can achieve better clock speeds, and therefore better performance.
 
I was hoping for some same-clock comparisons to further probe the new architecture's improvements (if any) over Fury. I found it really interesting that almost all of the graphs look identical for both cards in terms of high and low performance, only shifted according to the increased core clocks and architectural/driver improvements. Obviously games have certain areas that stress GPUs based on the demands of each scene, but most of these graphs could be copied and pasted 25-30% higher from the Fury to give you your Vega readings. That speaks to me of more of the same, and less of the new, in this iteration of chips. It seems to coincide with AMD's statements that they had to focus this design on data centers, not having NVIDIA's R&D budget to create multiple parallel architectures, and that games were not going to see such drastic engineering changes. Thanks for the article, Brent.
 
I was hoping for some same-clock comparisons to further probe the new architecture's improvements (if any) over Fury. I found it really interesting that almost all of the graphs look identical for both cards in terms of high and low performance, only shifted according to the increased core clocks and architectural/driver improvements. Obviously games have certain areas that stress GPUs based on the demands of each scene, but most of these graphs could be copied and pasted 25-30% higher from the Fury to give you your Vega readings. That speaks to me of more of the same, and less of the new, in this iteration of chips. It seems to coincide with AMD's statements that they had to focus this design on data centers, not having NVIDIA's R&D budget to create multiple parallel architectures, and that games were not going to see such drastic engineering changes. Thanks for the article, Brent.

On the first point, the consistency of the frame behavior across each GPU's performance states, if anything, speaks to how good Brent is at finding consistent, repeatable, reproducible real-world gameplay scenarios; it's praise for his excellent work.

About cards being tested at the same clocks: people have to understand that the increased clock speed on Vega is itself part of the architectural changes. A couple of billion transistors were spent lengthening the pipeline to squeeze out roughly 40% higher clocks; the architecture itself was changed to accommodate the increased clock speed. It's the same thing NVIDIA did between Maxwell and Pascal: take a Pascal card with a similar SM arrangement (cores/ROPs/TMUs) to a Maxwell card at the same clocks, and the Maxwell will mostly perform better. With Vega, the same scenario applies versus Fiji.
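One way to frame that clock-for-clock argument as a quick check: under a pure clock-scaling model, Vega FPS should just be Fury X FPS multiplied by the clock ratio. The clocks and FPS below are assumed round numbers for illustration, not figures from the review:

```python
# Pure clock-scaling model for the "copy-and-paste the graphs up 25-30%"
# observation above. Clocks and FPS are assumed illustrative values.

fury_x_clock = 1050    # MHz, Fury X reference clock
vega_64_clock = 1546   # MHz, assumed typical Vega 64 boost clock

fury_x_fps = 60.0      # hypothetical measured average in some game

# Expected Vega 64 FPS if per-clock throughput were unchanged:
expected_fps = fury_x_fps * (vega_64_clock / fury_x_clock)
print(f"clock-scaled expectation: {expected_fps:.1f} fps")

# Measured Vega FPS near this value => same per-clock throughput as Fiji;
# clearly above it => a genuine architectural (IPC) improvement.
```

If the measured results track the clock-scaled expectation closely, that supports the "overclocked Fiji" reading; a clear gap above it would point to real per-clock gains.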
 
Given the fact that AMD Radeon RX Vega 64 is providing 30-40% higher performance, it logically makes sense

Actually no, it doesn't, as Vega 64 was supposed to be more power efficient. Not to mention that Pascal is both faster and more power efficient than Maxwell.
 
The Vega 64 is comparable (give or take) to the GTX 1080. Vega 56 is beating GTX 1070 by somewhere in the range of 20%. Not "toe to toe". I'd consider that competitive.

OK... then compare flagship to flagship, or the 1080 Ti, and see what you have then? These cards are only competitive with the lower-echelon NVIDIA models. I get the dollar-to-dollar comparison too... meh.

Why are we trying to spit-polish a turd here? Vega vs. a Fury X? Why?

I mean, really... might as well dust off an R9 290X and see how Vega does against it. It was the best AMD had for a while... what the heck.
 
OK... then compare flagship to flagship, or the 1080 Ti, and see what you have then?

Why are we trying to spit-polish a turd here? Vega vs. a Fury X? Why?

I mean, really... might as well dust off an R9 290X and see how Vega does against it. It was the best AMD had for a while... what the heck.
I think you missed the whole point of the test: to see if Vega is an updated architecture by clocking Fury and Vega equally. If Vega outputs higher frame rates, then it is updated/improved. If they are equal, then it is not improved architecturally, though in that case there would be some debate about features being implemented or not. The test was to show that, not to compare generations per se.
 
I think you missed the whole point of the test: to see if Vega is an updated architecture by clocking Fury and Vega equally. If Vega outputs higher frame rates, then it is updated/improved. If they are equal, then it is not improved architecturally, though in that case there would be some debate about features being implemented or not. The test was to show that, not to compare generations per se.

I don't see any evidence posted that the same clocks were used. As a matter of fact, I would be aghast if Vega 64 drew over 100 additional watts if it were underclocked so significantly...

What you're suggesting is an IPC test. Neither the Fury vs. Vega 56 or the Fury X vs. Vega 64 tests were conducted that way.
 
I don't see any evidence posted that the same clocks were used. As a matter of fact, I would be aghast if Vega 64 drew over 100 additional watts if it were underclocked so significantly...

What you're suggesting is an IPC test. Neither the Fury vs. Vega 56 or the Fury X vs. Vega 64 tests were conducted that way.
Well, oops. I thought that was this review. I've been reading too damn many of these. But you are right, they didn't clock them the same; definitely a different article.
 
Just one question: why didn't you use the 17.8.2 drivers, which are the latest available for Vega 64? (You could have done these tests previously, I know; I just wanted a clear reason.)



The latest drivers, 17.8.2, allow voltage modification for the HBM2 and for the GPU.
I have undervolted the HBM2 down to 1000 mV (1050 mV stock on my Vega 64), AND overclocked it to 1100 MHz (from 945 MHz stock), perfectly stable.
The GPU, on the other hand, can't have more than 1200 mV pushed through it, but seeing as max clocks increase when you drop the GPU voltage, 1200 mV is too high anyway.

I re-used the numbers from the original review after making sure performance did not change from the press driver to 17.8.2 in the games we are using. Since performance didn't change, and the goal of this review was performance, and not overclocking, there was no need to re-test everything. We often re-use data when we can, if nothing has changed. I do check to make sure however.
 
Well, oops. I thought that was this review. I've been reading too damn many of these. But you are right, they didn't clock them the same; definitely a different article.

Yeah, that article would have put Vega in a very different light. A very bad light...

I STILL maintain that AMD are somehow lying about what Vega is.
 
I re-used the numbers from the original review after making sure performance did not change from the press driver to 17.8.2 in the games we are using. Since performance didn't change, and the goal of this review was performance, and not overclocking, there was no need to re-test everything. We often re-use data when we can, if nothing has changed. I do check to make sure however.
Thank you for the reply Brent. Good to know that the numbers all stack up :)

Thanks a lot for the article, and taking the time to do it.
 
Something is seriously wrong with Fallout 4 on Vega if it uses more power than any other game in your lineup but barely improves performance over the Fury X.
 
Something is seriously wrong with Fallout 4 on Vega if it uses more power than any other game in your lineup but barely improves performance over the Fury X.

Obviously worse dips, too. Vega goes significantly under the Fury X... it's really interesting that the average is higher while the dips are way under. Not sure I've ever seen that before.
 
Brent_Justice

I appreciate this review. I was a previous owner of a pair of Fury X. I now have a single liquid cooled Vega. The overall performance gap between the Vega and Fury X turned out bigger than I thought. Thanks for the article.
 
man, all of these whiny babies on hard now. "Why are you testing against the prior generation, why aren't you comparing flagship vs flagship, why why why why"

Feel free to link to your articles where you test what you want, I'll wait.
 
Vega feels more and more like an overclocked Fury with HBM2, shrunk down to 14 nm. Yeah yeah, I know the sob story: AMD is poor, no money for R&D, boo-hoo. As if it is someone else's fault. Maybe they should just fire their marketing personnel and put THAT money into R&D, at least for a year or so. I'm sure social media and the free publicity provided by the fanboys will be enough to let them coast through that time.
 
man, all of these whiny babies on hard now. "Why are you testing against the prior generation, why aren't you comparing flagship vs flagship, why why why why"

Feel free to link to your articles where you test what you want, I'll wait.

Moronic statement. [H] has always reviewed flagship versus flagship each generation. Why link to other sites or articles when we have their own example right here of what they used to do and what they are doing now? [H] tested the 290X versus all of NVIDIA's flagships; they tested the Fury X versus all of NVIDIA's flagships, regardless of price, and suddenly they test Vega only against the competitor at its price point. Yeah, well... they don't want to publicly expose the disaster and failure that is Vega, we get it, but it's pretty much a shame coming from a site that used to expose harshly how bad a product could be, without soft words, without beautifying anything. That's how [H] has always been known, and now what we have are all these apologist statements and reviews.
 
Moronic statement. [H] has always reviewed flagship versus flagship each generation. Why link to other sites or articles when we have their own example right here of what they used to do and what they are doing now? [H] tested the 290X versus all of NVIDIA's flagships; they tested the Fury X versus all of NVIDIA's flagships, regardless of price, and suddenly they test Vega only against the competitor at its price point. Yeah, well... they don't want to publicly expose the disaster and failure that is Vega, we get it, but it's pretty much a shame coming from a site that used to expose harshly how bad a product could be, without soft words, without beautifying anything. That's how [H] has always been known, and now what we have are all these apologist statements and reviews.

I agree mostly; it's a little harsh in the wording, but it still rings pretty true.

While the article in question was very well done and interesting (and no doubt took a bunch of work), it's just such a diversion from what I usually see here.
I just don't understand the relevance; I'm sorry if I'm missing something... maybe this was just a novel approach. I'm not trying to be hypercritical, and I have no standing to criticize anything.
 
Something is seriously wrong with Fallout 4 on Vega if it uses more power than any other game in your lineup but barely improves performance over the Fury X.

I agree. My opinion is that it's a driver issue: either optimization, or some feature of Vega being disabled at the moment. While the following games aren't as bad, they too are a little lower than expected: Watch Dogs 2 and GTA V.
 
moronic statement.. [H] have always reviewed flagship versus flagship each generation. why link to another sites or articles when we have right now here their own example, on what they used to do and what are they doing right now. [H] tested 290X versus all Nvidia's flagship, they tested Fury X versus all Nvidia's flagship, regardless of price and suddenly they test only Vega according competitor's price. yeah well... they don't want to expose publicly the disaster and failure that is VEGA we get it but is pretty much a shame coming from a site who used to expose harshly how bad a product could be, without soft worlds, without beautify anything, that's how always [H] has been known, and now what we have are all of these apologist statements and reviews..

The 290X was competitive with NVIDIA's flagship. The Fury X was competitive with NVIDIA's flagship.

Vega is NOT competitive with the 1080 Ti. It is competitive with the 1080 in performance but pales in comparison in regard to power consumption, which is what they stated, if you actually read the review.
 