AMD Radeon RX Vega 64 Video Card Review @ [H]

I think I'll wait for the drivers to finish baking and see if any holiday sales pop up in 3 months. I'm happy with my RX480/CFG70 combo so far but a few extra FPS in newer titles would be nice.

If better drivers don't improve things and the price doesn't drop I guess I will wait a year and see what the market looks like at that point.
 
> I think I'll wait for the drivers to finish baking and see if any holiday sales pop up in 3 months. I'm happy with my RX480/CFG70 combo so far but a few extra FPS in newer titles would be nice.
>
> If better drivers don't improve things and the price doesn't drop I guess I will wait a year and see what the market looks like at that point.
I know we joke about fine wine drivers, but we can't seriously be waiting for massive driver miracles, can we? If you can wait until Christmas, I would argue you should wait a few more months, as NVIDIA is likely to release something in Q1 of next year.
 
Thanks for the review! I'm a fan of the old review test method format where you would push each card as hard as it could go, but "progress" I guess. Your conclusion is bang on. If anything you were lenient on AMD. What's hilarious is the diehard AMD fanboys defending it with canned benchmarks from other review sites. You have a video card with an MSAA performance bug (which in all likelihood will be fixed and tweaked, so in 2-3 years it will run like an outdated 1080 Ti), that doubles as a heater for winter, and that lacks PhysX (yes, I went there!@#$). What's funnier is AMD bundling so much stuff with it "to fend off miners," but now you have a card close to the price of a 1080 Ti. Defend that, fanboys.
 
If the games you pick will lower the overall performance because there is a clear bias, I don't see why you'd include them. For example, Computerbase has Anno 2205, which favors NVIDIA heavily, and I don't think many people play that game.

If Vega 56 being $100 cheaper than the 1080 isn't great, then I don't know what is. It's late, but wouldn't people buy a $400 card that has 90-95% of the performance of one that's 5-10% faster and costs $100 more?

The results from TPU, Guru3d, TT, and Anandtech put the RX Vega 56 close to the 1080 most of the time. I don't know what review you were looking at.

If Vega 56 was $50-75 cheaper I might consider it for a kids rig. I am sure some kind of deal will roll around shortly... once they actually get stock.
 
> I know we joke about fine wine drivers, but we can't seriously be waiting for massive driver miracles can we? If you can wait till Christmas I would argue to wait a few more months as NVIDIA is likely to release something 1q of next year.

Not really waiting, but if the price/performance is there during the holidays I may bite in order to max out my current 144Hz FreeSync monitor. If it doesn't improve via drivers or a price drop, I'll survive with my current setup until OLED or FALD 144Hz high-res displays are available for a decent price, and pick whatever is top dog at that point.
 
So.... Is me buying a Titan Xp now too close to a Volta based card launch?

I think it's fine to buy one now. Since Vega isn't a challenge, I wouldn't expect Volta until March 2018. And that's not your time to upgrade, because you would want another Titan, so June-August 2018 is when you will upgrade, a full year from now. But I would keep that card and toss it into a backup rig for work, ITX portable gaming, mining, etc.
 
This review makes me glad I bought my EVGA GTX 1080 Ti two months ago. No need for me to change video card or monitor until next year at the earliest (unless NVIDIA actually goes for Volta this year, or a 40" 4K monitor w/G-SYNC comes out around Black Friday).
 
This is a genuine question:

How many people out there (not just [H], but overall) are in the market for a Vega 64/1080 or Vega 56/1070 that own neither a G-Sync nor a FreeSync monitor?
 
> This is a genuine question:
>
> How many people out there (not just [H], but overall) are in the market for a Vega/1080 or Vega56/1070 that own neither a G-Sync or a FreeSync monitor?

I'm running a 40 inch 4K TV as a monitor, I'd buy a Vega 56 with an aftermarket cooler for around $400. Not because I need it, just for something different.
 
Great review, guys, and a rock-solid honest opinion at the end. I actually read most of the AMD literature and it sounded like gains were made in efficiency, but over 100W more power draw than NVIDIA isn't going to get me back on team Red. My 3 UPSes are within 50-100W of their max as it is, and I'm not in the mood to buy beefier ones even though I have plenty of capacity left on my PSUs. Heat is also a factor; I've had to reduce threads for BOINC to keep my i7s from overheating in summer. Custom cooling can address the heat issue but not power. Maybe Vega 56 will offer better power efficiency and less heat. Have to wait and see.
 
Right, completely forgot about the TV crowd.

I was just wondering if there is actually any use in comparing Vega 64/56 and 1080/1070 directly, as I thought a good chunk of the people in that market segment would own either a FreeSync or a G-Sync monitor by this stage, which means that if they want to keep using VRR on their monitor, they have no choice.

Just trying to ascertain whether I am BS'ing myself, or the size of that chunk.
 
Coming from somebody who bought and used a 1080 Ti Founders Edition for a month on air with the stock (and nice-looking) NVIDIA blower cooler: don't buy a blower cooler unless you plan to add a water block. They're just too loud on air, and if you're like me and have your computer at eye level when you're seated at your desk, it's going to be right in your face and totally disrupt gaming when it gets loud.

I thought I could hack it but I thanked God and sacrificed a chicken the day my waterblock arrived.
 
> coming from somebody that bought and used a 1080ti founders edition for a month on air with the stock and nice looking Nvidia blower cooler - don't buy a blower cooler unless you plan to add a water block, they're just too loud on air and if you're like me and you have your computer at eye level when you're seated at your desk, its gonna be right in your face and totally disrupt gaming when it gets loud.
>
> I thought I could hack it but I thanked God and sacrificed a chicken the day my waterblock arrived.
I have the 1080 FE and don't really hear it. Granted, it sits under the desk, but noise-wise I don't think it is too bad, and I use a custom aggressive fan curve to keep an over-2GHz OC.
 
Tom's Hardware is full of shit. I just installed mine: an XFX Vega 64 air-cooled with the black shroud, running on Win 7. Claymore ETH is around 34 MH/s. Dual mining is 32+ MH/s ETH and just over 900 MH/s Sia. The card draws 295 watts at 1633 core and 945 memory, measured with a Kill A Watt meter at the wall.
The blower is much quieter than my 7970's. It's bearable even at 85%.
My local Microcenter had gotten 3 in; I was the lucky 3rd person. Will post pics and benches this evening. :D
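For anyone wanting to sanity-check numbers like these, here's a quick back-of-the-envelope sketch using the figures quoted above. The electricity price is my own assumption, not a number from the post, and the wall reading may include the rest of the system:

```python
# Rough mining-efficiency check using the figures quoted above.
# KWH_PRICE is an assumed electricity rate, not a number from the post.
ETH_RATE_MHS = 34.0    # single-mining ETH hashrate (MH/s)
WALL_WATTS = 295.0     # Kill A Watt reading at the wall (W)
KWH_PRICE = 0.12       # assumed electricity price ($/kWh)

efficiency = ETH_RATE_MHS / WALL_WATTS   # MH/s per watt of wall power
daily_kwh = WALL_WATTS * 24 / 1000       # energy used per day (kWh)
daily_cost = daily_kwh * KWH_PRICE       # electricity cost per day ($)

print(f"{efficiency:.3f} MH/s per watt, ~${daily_cost:.2f}/day in power")
```

Swap in your own $/kWh rate to see whether single or dual mining clears your power bill.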
 
> coming from somebody that bought and used a 1080ti founders edition for a month on air with the stock and nice looking Nvidia blower cooler - don't buy a blower cooler unless you plan to add a water block, they're just too loud on air and if you're like me and you have your computer at eye level when you're seated at your desk, its gonna be right in your face and totally disrupt gaming when it gets loud.
>
> I thought I could hack it but I thanked God and sacrificed a chicken the day my waterblock arrived.

> I have the 1080 FE and don't really hear it. Granted it sits under the desk, but noise wise I don't think it is too bad and I use a custom aggressive fan curve to keep an over 2ghz OC.

I think for those situations, it really comes down to whether you use headphones frequently or not (or if you have a good amount of ambient noise in the room).
Of course, sometimes the game really is that good, too :)
 
All of this power consumption whining is pure fanboy hysteria. Even Kyle says he doesn't give two fawks about it. This card is great for Freesync owners, everyone else feel free to buy as many 1080's and 1080ti's as your heart desires.
 
> It is late but people would buy a 400usd card which has 95-90% performance of one or one has +5-10% and costs 100usd more?

That card is called the 1070, and it's been out for well over a year. Yeah, the 56 beats it in a few games and loses to it in a couple of others, but the 1070 has been out for a whole damn year and uses about half the power. Why is the 56 anything to get excited about? If you're telling me they're going to ship at $399 while the 1070 is selling at $450+, don't kid yourself... prices will be just as inflated because of mining anyway. Sorry, but I'd be pissed if I had sat on a 2x0 or 3x0 series Radeon waiting for this when I could have been experiencing equal-quality gameplay for over a year.

> All of this power consumption whining is pure fanboy hysteria. Even Kyle says he doesn't give two fawks about it. This card is great for Freesync owners, everyone else feel free to buy as many 1080's and 1080ti's as your heart desires.

Meh, it's definitely a tiebreaker for me. Call me a whiner, but my room gets hot enough with 3 monitors and a 1080. I remember my 2x 290Xs in the summer and it was unbearable. I've got central AC, but my office easily runs 10-15°F hotter than the rest of the second floor when gaming on my 1080. I don't need to add another 200W of unnecessary heat unless I'm getting an experience that justifies it.
 
I don't care, I'm stoked. I bought 2 Vegas. Looking at it now, it seems the driver won't support CrossFire at launch, but I guess I can hit the ground running once CF support drops and make a video.
 
> All of this power consumption whining is pure fanboy hysteria. Even Kyle says he doesn't give two fawks about it. This card is great for Freesync owners, everyone else feel free to buy as many 1080's and 1080ti's as your heart desires.

> Meh, its definitely a tiebreaker for me. Call me a whiner, but my room gets hot enough with 3 monitors and a 1080. I remember my 2x 290X's in the summer and it was unbearable. I've got central AC but my office runs easily 10-15f hotter than the rest of the second floor gaming on my 1080. I don't need to add another 200W of unnecessary heat unless I'm getting an experience that justifies it.
I live in the desert and electricity is incredibly expensive so during the day the house has to stay at around 80F between 12pm and 7pm. Power is a big deal to me.
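To put a rough number on that complaint, here's a sketch of what extra GPU draw costs over a year. The wattage delta, hours per day, and $/kWh below are all illustrative assumptions, not figures from the thread:

```python
# Back-of-the-envelope cost of extra GPU power draw.
# All three inputs are assumptions for illustration only.
EXTRA_WATTS = 100    # assumed extra draw vs. a comparable NVIDIA card
HOURS_PER_DAY = 4    # assumed daily gaming time
KWH_PRICE = 0.25     # assumed peak-rate electricity price ($/kWh)

annual_kwh = EXTRA_WATTS * HOURS_PER_DAY * 365 / 1000   # extra energy per year
annual_cost = annual_kwh * KWH_PRICE                    # extra cost per year ($)

print(f"~{annual_kwh:.0f} kWh/year, ~${annual_cost:.2f}/year before extra AC load")
```

And note that every one of those watts also ends up as room heat the AC has to pump back out, so the real cost in a hot climate is higher than the raw kWh figure.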
 
> All of this power consumption whining is pure fanboy hysteria. Even Kyle says he doesn't give two fawks about it. This card is great for Freesync owners, everyone else feel free to buy as many 1080's and 1080ti's as your heart desires.

I can tell you this: a lot of OEMs will not pick up RX Vega because the power consumption is so high, especially if they're going to skimp on the PSU.
 
Here they come.... [photo attachments: the card and total system wattage] :)
 
For single mining it's not, but dual is OK. My guess is drivers and further tweaks will increase it slightly. Future DAG sizes will not affect it.
 
Nice review - no BS, to the point. I like how you guys don't kiss up to the companies (unlike so many others *cough*Guru3D*cough*).

No surprises with the product. Hopefully XFX and Sapphire et al can tweak the design a bit further to improve power and temps, and this could actually end up being "not bad" or maybe even "so so", especially for those stuck with FreeSync.

> Kyle, is the blower on the Vega 64...
What? No way! :p
 
> If the games you put will lower the overall performance because there is a clear bias I dont see why include them for example Computerbase has Anno 2050 which favors Nvidia heavily and I dont think many people plays that game
>
> If Vega 56 being 100usd cheaper than 1080 isnt that great at all well I dont know what it is,It is late but people would buy a 400usd card which has 95-90% performance of one or one has +5-10% and costs 100usd more?
>
> The results From TPU,Guru3d,TT,Anandtech for RX Vega 56 puts it most of the times close to 1080 I dont know what review you were looking at
But you included Doom, which uses proprietary low-level extensions, and that is why AMD has better performance by a fair margin (various reviewers have shown that async compute is not the primary performance gain for AMD in Vulkan Doom).
One could argue that is a biased game. NVIDIA has only recently started to flesh out more of their own proprietary low-level Vulkan extensions, as they were initially more interested in higher-level compatibility functions (probably why their performance is good outside of Vulkan); the new NVIDIA Vulkan extensions can only apply to new games that implement them, and it is too late for them to be added to Doom.
Infinite Warfare is another game heavily favoured towards AMD, so you really should include Shadow Warrior 2 for NVIDIA.
And those are just the most obvious.

Cheers
 