AMD Radeon RX Vega 64 Video Card Review @ [H]

There are a lot of things that factor into noise from a GPU, even with technically equal hardware: environment noise, ambient temp, case options, etc. If you were testing in a different place than where Brent was testing, then noise results won't be directly comparable; if Brent tested on an open bench and you inside a case, results will be entirely different too. But of course, objectively, 45.5 dB is awfully and terribly loud, especially for a GPU.
Not too long ago, I ran phase-change cooling. I'd read reviews about GPU fan noise, but I always found it hard to hear the GPU fan above the noise created by the compressor of the phase-change unit :)
 
Whelp... Can't say I'm surprised, but still disappointed. Good review, gents.
 
But it's barely any faster than a 1070 while consuming massively more power?

http://media.bestofmicro.com/V/L/702273/original/mining.png

They tested stock; others said it's around 40 MH/s with a mem OC.

Also, they aren't undervolting the core, etc. This seems more like a fake chart meant to keep cards available.

 
You can't blow smoke up a company's ass to make them feel better; they knew the numbers before it was even released. They had the cards.

I remember there was one review site (forgot the name and product) that posted bad performance numbers for an AMD product, and AMD was pissed off and cut that review site off.
 
Considering that AMD made reviewers wait until the day of the AMD Vega 64 release to put up their reviews, I had already made up my mind that it was going to be bad. It's good & bad to see that I was spot on with that assumption. My GTX 970 still holds up for the most part but it looks like I will be throwing my name into the pool of those that will be waiting for Volta.
 
I remember there was one review site (forgot the name and product) that posted bad performance numbers for an AMD product, and AMD was pissed off and cut that review site off.

Tech Report or TPU, I think.

They withheld the Nano from them.
 
Substantially hotter?

It's using the same idle wattage and is only 1°C hotter at load than the 1080s (85°C vs. 84°C)... am I missing something?

The thing is, IMO, if this Vega has that much more under the hood, making it a more fleshed-out product for mining and such, then sure, its raw gaming performance right now, with what I'm quite sure is anything but mature drivers, leaves much to be desired. But at the very least it has everything there in hardware. So maybe AMD was more "future looking," like it was/is with Ryzen, and basically always has been with Radeon in general: it costs some performance now in the hope that it will be tuned for the future ahead of time. Kind of like tessellation was (but MSFT in their infinite wisdom shafted AMD and allowed NV to "rewrite the rules").

Alas, the issue with MSAA sucks, but are the games where MSAA was used biased towards Nvidia architectures, or neutral?

Guess the reports of them overclocking the HBM2 show that it probably was not a great idea, ballooning the power required in no small way. O.O
 
nVidia didn't seem to have a problem with it for the 1050 and 1050 Ti...

That's because they aren't clocked near the heat wall. GF/Samsung is fine for non-high-frequency stuff, which is the exact opposite of what a high-end GPU requires.
 
I can forgive the power usage but not the price. We keep our snake and hedgehog in the office where I game, so having a space heater for a GPU helps them, as they prefer warmer temps. But where I live a Vega 64 is $840 plus tax, while I could get an aftermarket GTX 1080 for $780-800. I have a 1200 watt PSU, so to hell with power efficiency, but I couldn't easily justify the Vega 64 on performance per dollar. Luckily for me, I am not GPU shopping until late September/early October, and I was going to be deciding between the Vega 56 and the GTX 1070, so maybe AMD can win me over there.
 
As usual, thx for the review. I will pass this round and stick with my Fury X's for the foreseeable future.
 
It's all OOS on the websites unless you want to pay the jacked-up prices from Micro Center, etc. I am just wondering, for miners, how many cards they have ordered.
 
18 minutes and only 5 other responses?!?!

I guess rumor is better than reality?

I didn't know [H]OCP had gotten off of AMD's blacklist, and I wasn't expecting their review until a few days from now, after they were able to get a retail card.
 
I was pleased to see the comments about noise. As a 4K gamer I was, however, disappointed not to see any 4K testing done.

With regards to the power draw, the measurement was power draw from the wall, right? Because one of the key questions a prospective purchaser is going to ask is, "Will I need to upgrade my PSU?" So it would be helpful to know the efficiency of the PSU at that power draw. I note that the maximum draw was 476W; if the PSU was 80% efficient at that draw - and I note that you haven't reviewed the CX850 - then it is outputting 381W to the PC, so a 450W PSU will be just fine. At 90% efficiency it's 428W, and that's rather too close for comfort.
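To put numbers on that conversion, here's a minimal sketch in Python. The 476W figure is the review's maximum wall draw; the 80% and 90% efficiency values are my assumptions, since the CX850's actual efficiency curve at this load hasn't been reviewed:

```python
# Minimal sketch: convert measured AC wall draw into the DC load the PSU
# must deliver. The efficiency values are assumed, not measured.

def dc_load(wall_watts: float, efficiency: float) -> float:
    """DC power delivered to the PC for a given AC draw at the wall."""
    return wall_watts * efficiency

WALL_DRAW_W = 476  # maximum wall draw reported in the review

for eff in (0.80, 0.90):
    print(f"At {eff:.0%} efficiency the PSU delivers ~{dc_load(WALL_DRAW_W, eff):.0f} W DC")

# At 80% efficiency the PSU delivers ~381 W DC  (a 450 W unit is fine)
# At 90% efficiency the PSU delivers ~428 W DC  (too close for comfort)
```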

As I said, I was disappointed not to see any 4K testing. But with 4K testing I would like to see a GTX 1080 with a G-Sync monitor against a Radeon Vega with a FreeSync monitor.
 
Haven't been impressed with AMD since the Radeon 9800 Pro.

Good grief, how many fails is it gonna take before Nvidia buys out AMD... or have they already, and just use the name as a fake competitor?


hmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm
 
At last!! The one-year hype is finally over!! I still can't believe that we have an actual review for this card... seems like I'm dreaming!! :ROFLMAO:
So, I wanted to make a comment about this card's performance, but in his conclusion Brent said it all:
1) one year later than the GTX 1080, 2) +150 watts more power in order to reach the GTX 1080, 3) exactly the same price as the GTX 1080.
Really, I don't think anything more needs to be said!!
(AMD keeps the Fury X's tradition to the letter, meaning an entire year of hype for no reason!! :yawn:)
 
Given most [H]'ers' hardware specs already, I would say that Vega will end up being a power-hungry sidegrade (except in a very few use cases). I was all aboard the Vega train (just to have one AMD build box), but given the difficulty in buying one, the (marked-up) retail price-to-performance ratio, and the excess power consumption / heat production, I think I'm going to wait and see how Volta / Navi turn out. The 1080 in my main box and the Titan X Maxwell in my NCase (which is 1060-1070 level performance) should hold me over until next year.
 
AMD needs to massively overhaul their GCN architecture, and maybe take a page from NVIDIA's playbook: be practical, not elitist. What I mean by that is AMD being stubborn about using HBM2. It doesn't take an engineer to see that GDDR5X would have been a viable option and would have let them launch their VEGA GPUs much earlier, possibly last year. When I saw the Frontier Edition benchmarks, I knew that VEGA 56 and 64 would not fare well against NVIDIA. They might be excellent compute GPUs, and I can see how AMD wants to save money by using the same silicon for their enterprise and consumer products; however, this strategy cannot be applied across the board. I am not a hardware engineer, but I believe that there are a bunch of transistors in these VEGA GPUs that have no practical use for gaming, yet they consume power. AMD needs to trim the hedges and get rid of those, just like NVIDIA did with Maxwell when they were forced to release a competitive GPU on an outdated 28nm node. Just my two cents.

That being said, every single VEGA 56 and 64 GPU is sold out. It's either miners or enthusiasts buying them, or both. If AMD wants to be disruptive with these non-competitive GPUs, then they need to price them accordingly: $299 for the VEGA 56, $399 for the VEGA 64 and $499 for the VEGA 64 LC.

For as long as SLI and CrossFire are supported, I still like to run my GPUs in pairs. If I get better performance and lower power consumption, saving $200 to $300 on a pair of GPUs that I will keep for the next 3 years won't sway me to buy AMD over NVIDIA. For example, between a pair of VEGA 64 GPUs at a total cost of $999.98 and a pair of GeForce GTX 1080 Ti cards at a total cost of $1399.98, NVIDIA has my money. Never mind when making this comparison against the $599 VEGA 64 "Silver" GPUs. And if we bring the GTX 1080 8GB into the discussion, then there is absolutely no reason to buy a VEGA GPU at all.

The same comparison is valid for VEGA Frontier Edition vs. GeForce Titan Xp. I would spend $200 more to get the best GPU, one that I will keep for the next 3 years and use for productivity and gaming. I can only imagine that in AMD's mind, customers who purchase high-end components will pick their part over the competitor's to save a couple hundred bucks. AMD needs to realize that the proposition has to be compelling for that to happen.

Ryzen and Ryzen Threadripper are a different story. These two CPUs offer very good value over Intel, especially for the folks out there who use them for work, not just for gaming. I already got a Threadripper 1950X CPU and I am waiting for the Gigabyte Aorus Gaming 7 X399 motherboard to arrive. I believe that for what I do (virtualization projects, video editing, and casual gaming) this CPU offers better value than Intel's 6950X and 7900X. And while I do like excess when I can afford it, I don't like it so much that I would give Intel $1700 for a 16-core CPU while still struggling to pick and choose how to best make use of the available PCI-E slots, and sometimes (and how stupid is that) giving up SATA ports just so I can use an extra M.2 drive.

Conclusion: AMD Ryzen and Ryzen Threadripper are awesome. VEGA 56, 64 and Frontier Edition - not so much. These GPUs belong in 2016.
 
Haven't been impressed with AMD since the Radeon 9800 Pro.

Good grief, how many fails is it gonna take before Nvidia buys out AMD... or have they already, and just use the name as a fake competitor?


hmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmmm

You haven't been impressed with AMD since their greatest GPU ever made?

The 7970 was/is a monster.
 
I think the subpar DX11 performance is intentional. They have mentioned before that they don't have the manpower to optimize their drivers for older APIs and were looking to the future. DX12 and Vulkan seem to be catching on, so it might not be a bad move on their part in the long run. This, of course, tends to be AMD's problem.
 
Thanks for the review!
I had hoped for better results but after checking some other reviews, it seems Vega 64 is a good deal faster than my Fury X, so I bought one.
I could have gone for a 1080 instead but considering that my monitor is more expensive than either card and supports FreeSync, staying with AMD is the better option for me.
 
I agree that there is a FreeSync advantage, but that is like saying a Lincoln is better than an Aston Martin because the roads are better in the U.S. than the U.K. (OK, I struggled making an analogy, but hopefully you get the point).
With an AIO cooler, it will be too close to a 1080 Ti. With aftermarket air, it will not o/c as well as a 1080 11 Gbps. Also, you would be dumping a TON of heat into your case.

Still, not the worst card AMD has released, despite some goofball claims in this thread. I guess the new Corvette sucks because it does not compete with the Ford GT (I like cars). AMD hasn't competed at the ultra high end.
 
So even though Vega has the same number of shaders, TMUs, and ROPs as a Fury X, and is also running at clocks about 50% higher than a Fury X, the performance is only, what, 10-20% greater?
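A quick back-of-the-envelope check of that gap, using the rough figures from the post above (the ~50% clock bump and 10-20% uplift are approximations, not numbers from this review):

```python
# With identical shader/TMU/ROP counts, throughput should scale roughly
# with clock, so a ~50% clock increase "should" yield up to ~50% more
# performance. Both inputs are the post's rough figures.

clock_gain = 0.50       # ~50% higher clocks than Fury X (approximate)
observed_gain = 0.15    # midpoint of the quoted 10-20% performance uplift

scaling_efficiency = observed_gain / clock_gain
print(f"Only ~{scaling_efficiency:.0%} of the clock increase shows up as performance")
# Only ~30% of the clock increase shows up as performance
```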

You haven't been impressed with AMD since their greatest GPU ever made?

The 7970 was/is a monster.

The 7970 was released severely underclocked and, unfortunately for me, overpriced. Overclocking at least fixed part of that.
 
I read the reviews (HardOCP for gaming and Anand for compute). So far the card is great from a compute/non-gaming standpoint. I just question why they brought out this card over a big Polaris GPU. I remember people on this particular forum salivating over the Tesla P100, a GPU I believe would have had a similar fate.

The only thing I can think of is R&D budgets.
 
They tested stock; others said it's around 40 MH/s with a mem OC.

Also, they aren't undervolting the core, etc. This seems more like a fake chart meant to keep cards available.



Others who? It seems HBM2 memory doesn't overclock much.
 
I read the reviews (HardOCP for gaming and Anand for compute). So far the card is great from a compute/non-gaming standpoint. I just question why they brought out this card over a big Polaris GPU. I remember people on this particular forum salivating over the Tesla P100, a GPU I believe would have had a similar fate.

The only thing I can think of is R&D budgets.
Because I don't think they can engineer a big Polaris card for the money. Polaris is already a power-eating monster and it is a mid-range GPU. Now imagine how much more power it would need to compete against a 1080.
 
I agree that there is a FreeSync advantage, but that is like saying a Lincoln is better than an Aston Martin because the roads are better in the U.S. than the U.K. (OK, I struggled making an analogy, but hopefully you get the point).
With an AIO cooler, it will be too close to a 1080 Ti. With aftermarket air, it will not o/c as well as a 1080 11 Gbps. Also, you would be dumping a TON of heat into your case.

Still, not the worst card AMD has released, despite some goofball claims in this thread. I guess the new Corvette sucks because it does not compete with the Ford GT (I like cars). AMD hasn't competed at the ultra high end.
I bought a FreeSync monitor earlier in the year thinking maybe Vega could pull ahead, but I already owned a 1080. I don't get to use FreeSync, but running most games v-synced to 144 Hz works fine for the games that can hit it, and for those that fall underneath I notice no screen tearing with v-sync off, as long as the game has stable framerates.
 
Ultimately I've drawn two conclusions from today:

The first is that GCN needs to be scrapped at this point. It was fantastic when it was released in 2012 and I have no doubt that it continues to be a strong contender in the compute space, but it clearly cannot scale and it is not flexible enough to meet the needs of both compute AND gaming.

The second is that we are finally seeing the realities of AMD's financial situation. NVIDIA has the cash to develop multiple chips for each segment - one for HPC, one for gaming, and one further cut down for the budget space. AMD cannot do this, and they are trying to make do with a one-size-fits-all approach. I admire how strong this card is when you realize that NVIDIA spends more on R&D than AMD is worth, but clearly Vega has suffered because it's intended as a dual-purpose card.

I really hope Navi is a clean-slate design or at least a significant departure from GCN 4.0.
 
Ultimately I've drawn two conclusions from today:

The first is that GCN needs to be scrapped at this point. It was fantastic when it was released in 2012 and I have no doubt that it continues to be a strong contender in the compute space, but it clearly cannot scale and it is not flexible enough to meet the needs of both compute AND gaming.

The second is that we are finally seeing the realities of AMD's financial situation. NVIDIA has the cash to develop multiple chips for each segment - one for HPC, one for gaming, and one further cut down for the budget space. AMD cannot do this, and they are trying to make do with a one-size-fits-all approach. I admire how strong this card is when you realize that NVIDIA spends more on R&D than AMD is worth, but clearly Vega has suffered because it's intended as a dual-purpose card.

I really hope Navi is a clean-slate design or at least a significant departure from GCN 4.0.

I would say Nvidia's advantage with their architecture is that it is designed to scale up heavily or be cut down heavily; in essence the GP102 is Tesla/Quadro/GeForce, the GP104 is Tesla/Quadro/GeForce, etc.

The point being, there is not much difference even between the various segmented GPUs on Nvidia's side, the exceptions being the largest dies, P100 and V100, which are purely designed for HPC; these are true multi-precision GPUs, with FP16/32/64 at ideal scaling.
Also, Nvidia is better at segmenting this architecture for its various markets.
The GCN architecture works well up to a point.
Cheers
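For reference, a small sketch of what "ideal scaling" means here, using NVIDIA's published P100 peak rates (the 1:2:4 FP64:FP32:FP16 ratio is from public spec sheets; treat the exact numbers as illustrative):

```python
# Ideal multi-precision scaling on P100/V100-class parts: FP64 runs at
# half the FP32 rate and FP16 at twice the FP32 rate. The base figure
# below is NVIDIA's published P100 peak FP32 throughput.

FP32_TFLOPS = 9.3  # P100 peak FP32, per public specs

rates = {
    "FP64": FP32_TFLOPS * 0.5,  # 1/2 rate vs FP32
    "FP32": FP32_TFLOPS,
    "FP16": FP32_TFLOPS * 2.0,  # 2x rate vs FP32
}
for precision, tflops in rates.items():
    print(f"{precision}: ~{tflops:.1f} TFLOPS")
# FP64: ~4.7, FP32: ~9.3, FP16: ~18.6 TFLOPS
```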
 
I read this review, Tom's Hardware, and Guru3D, and looked at game performance for each of the titles that overlapped between the reviews. It's crazy how much the performance differs between the three. This one definitely skews toward the Nvidia side, while others seem to be even or point to the AMD side.

I noticed this discrepancy too. And looking at Anand, Vega is faster at 4K. But Kyle said it's really not suitable at that res.

All in all, I think it's competent with aftermarket cooling at $500. Above that is a no-go. FreeSync does make the picture more attractive, but not by that much.

I hope VR has improved.
 