AMD Radeon VII Video Card Review @ [H]

spintroniX

Gawd
Joined
Apr 7, 2009
Messages
974
Great review and it paints the card in a better light than some others I've seen. Other sites have it at around 5% slower in most titles. Here it seems to be much closer, using real world testing.
 
Joined
Apr 29, 2015
Messages
253
Thanks for the review!

I'd have liked to have seen V64 numbers thrown into the mix; it makes sense to me that one would want to directly compare the new and the old in addition to the competitor's offerings.

Also, I'm a lazy git and can't be arsed to go dig up numbers.
 

Hakaba

[H]ard|Gawd
Joined
Jul 22, 2013
Messages
1,026
Thanks for the review as usual, Team [H]. I too would be interested in a future article on Chill and undervolting.
 

Pieter3dnow

Supreme [H]ardness
Joined
Jul 29, 2009
Messages
6,784
What a freakin awesome review. Read every word. Unfortunately I still cannot decide. I need a new card; I seem to have messed up my RX 580 in a mining operation. I don't mind giving an attaboy to AMD for all the usual reasons (more competition, etc.), even spending $50-100 more than a card is worth next to its competition; all I want is for my 21:10 monitor to be as glorious as its potential. Late one night I almost talked myself into a gratuitously priced 2080 Ti, until I read 13 pages on these forums about the space invaders. I wish I could wait a while for Navi or cheaper prices (assuming no crazed crypto-charging bulls), but that RX 580 is fried for games (though normal desktop functions still work).

It is all about how long you think you will have a use for the video card. If you know you can use it for the next 3 years without too much of a compromise on settings, it might not be that bad.
 
Last edited:

ManofGod

[H]F Junkie
Joined
Oct 4, 2007
Messages
12,718
Much better review. I am still curious though, what were the clock speeds on those games where AMD did not come out on top? I wonder if the GPU was even being fully utilized.
 

YeuEmMaiMai

Fully [H]
Joined
Jun 11, 2004
Messages
32,134
Good summary, and as we all know the performance will improve with driver releases, so in the end it is a solid win for AMD.
 

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
54,216
Much better review. I am still curious though, what were the clock speeds on those games where AMD did not come out on top? I wonder if the GPU was even being fully utilized.
Yes, we check to make sure the GPU is being utilized in each game. GPU in-game clocks are addressed on page 4.
 

Nolan7689

[H]ard|Gawd
Joined
Jun 5, 2015
Messages
1,694
for a game like Black Ops 4 where would it be selectable? I've never seen an option like that in any game.
Not all games support more than one API. Many games still support only DX11 or DX12, and some also offer Vulkan. Those that do support multiple APIs typically have an in-game toggle to change the API.
 

sabrewolf732

Supreme [H]ardness
Joined
Dec 6, 2004
Messages
4,778
for a game like Black Ops 4 where would it be selectable? I've never seen an option like that in any game.

I've never played Black Ops 4, so perhaps it's through the command line, but in [H]'s review they don't say they ran one API path vs. the other for Black Ops 4.
 

Deleted member 94167

Guest
As always. Most thoughtful review on the Internets. Thanks guys!
 

sabrewolf732

Supreme [H]ardness
Joined
Dec 6, 2004
Messages
4,778
Reading Reddit, it seems quite a few people are getting a consistent 1.9-2GHz on the core. I'd be really interested in seeing how well it performs at those clock speeds vs. the inconsistent clocks it has at stock.
 

harmattan

Supreme [H]ardness
Joined
Feb 11, 2008
Messages
5,112
Reading Reddit, it seems quite a few people are getting a consistent 1.9-2GHz on the core. I'd be really interested in seeing how well it performs at those clock speeds vs. the inconsistent clocks it has at stock.

Ask and thou shalt receive. Quoting LtMatt at the OCUK forums...

Here is a benchmark of Firestrike Ultra with a nice overclock applied to each GPU.

16% difference on graphics score in this bench.

Radeon VII
2000/1200MHz
19.2.2

SCORE
7 375 with AMD Radeon VII(1x) and AMD Ryzen Threadripper 1950X
Graphics Score 7 495
Physics Score 27 284
Combined Score 3 331
https://www.3dmark.com/3dm/33581449


Vega 64 Liquid
1802/1200MHz
19.2.2


SCORE
6 443 with AMD Radeon RX Vega 64 Liquid(1x) and AMD Ryzen Threadripper 1950X
Graphics Score 6 456
Physics Score 27 450
Combined Score 2 980
https://www.3dmark.com/3dm/33582138

Radeon VII has further overclocking headroom with more voltage and a better cooler. I think it would be great for proper water cooling.
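The 16% figure quoted above can be checked directly from the two graphics scores. A quick sanity-check sketch (the score values are just the ones from the 3DMark links above):

```python
# Graphics scores from the quoted Firestrike Ultra runs
radeon_vii_graphics = 7495
vega_64_graphics = 6456

# Percent advantage of the Radeon VII over the Vega 64 Liquid
pct_diff = (radeon_vii_graphics - vega_64_graphics) / vega_64_graphics * 100
print(f"Radeon VII leads by {pct_diff:.1f}%")  # → Radeon VII leads by 16.1%
```

So the "16% difference" claim lines up with the posted scores.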
 

Rvenger

2[H]4U
Joined
Sep 12, 2012
Messages
2,990
That Vega 64 is nipping at 1080 Ti territory with that massive overclock. None of my Vega chips could come remotely close.
 

Marees

[H]ard|Gawd
Joined
Sep 28, 2018
Messages
1,198
in my 4.1GHz 2700, DX12 is usually the faster API and at times way faster

Normally most sites test GPUs using the most powerful CPU available so that results are GPU limited, not CPU limited.

Are you asking the [H] team to test GPUs using both a high-performing Intel CPU and a multi-core AMD CPU?

I don't mind reading more reviews & looking at more charts :)
 

Nolan7689

[H]ard|Gawd
Joined
Jun 5, 2015
Messages
1,694
Reading Reddit, it seems quite a few people are getting a consistent 1.9-2GHz on the core. I'd be really interested in seeing how well it performs at those clock speeds vs. the inconsistent clocks it has at stock.
I’m not sure I’d call the stock clocks inconsistent. Kyle said he took readings at 0.5 second intervals and Brent at 1 second intervals. So the line looks erratic as hell but it reads as though the GPU just downclocks when full speed isn’t necessary, as opposed to a throttle.

My Vega 64 is much the same, not hot enough to be throttling but not at the max clock all the time in certain games.
 

sabrewolf732

Supreme [H]ardness
Joined
Dec 6, 2004
Messages
4,778
I’m not sure I’d call the stock clocks inconsistent. Kyle said he took readings at 0.5 second intervals and Brent at 1 second intervals. So the line looks erratic as hell but it reads as though the GPU just downclocks when full speed isn’t necessary, as opposed to a throttle.

My Vega 64 is much the same, not hot enough to be throttling but not at the max clock all the time in certain games.

I owned a vega 64, at stock voltage/settings the clocks were inconsistent. After tweaking you could get it to maintain consistent clock speeds.
 

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
54,216
I’m not sure I’d call the stock clocks inconsistent. Kyle said he took readings at 0.5 second intervals and Brent at 1 second intervals. So the line looks erratic as hell but it reads as though the GPU just downclocks when full speed isn’t necessary, as opposed to a throttle.

My Vega 64 is much the same, not hot enough to be throttling but not at the max clock all the time in certain games.
Yes, the control over the GPU clock is so incredibly granular that when you start looking at it at very small time intervals it looks like a mess, but what it seems to be to me is simply the GPU saving power/heat where it can.
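As a rough illustration of the distinction being drawn here (not [H]'s actual methodology): if you sample the core clock at fixed intervals, one simple heuristic for telling opportunistic downclocking apart from a thermal throttle is whether the trace still reaches near the rated clock at some points. The function, threshold, and sample numbers below are all hypothetical:

```python
def classify_clock_trace(samples_mhz, rated_mhz, tol=0.97):
    """Crude heuristic: if the GPU still hits ~rated clock somewhere in
    the trace, the dips are likely power saving under light load; if it
    never gets close, sustained throttling is more plausible."""
    peak = max(samples_mhz)
    if peak >= rated_mhz * tol:
        return "downclocking (power saving)"
    return "possible throttle"

# Simulated 0.5 s samples: erratic-looking, but repeatedly near 1800 MHz
trace = [1802, 1350, 1790, 1200, 1805, 1500, 1798]
print(classify_clock_trace(trace, rated_mhz=1800))  # → downclocking (power saving)
```

A trace that never climbs above, say, 1450 MHz against an 1800 MHz rated clock would come back as "possible throttle" under the same heuristic.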
 

AlphaQup

Gawd
Joined
Oct 27, 2014
Messages
669
Thanks as always for the great review, [H]! Fun read, and more positive than some of the other reviews I've read so far.

Can't help but think this would have been a home-run at 50-100 bucks cheaper.

Also looking forward to the deeper dive into the software/OC'ing side of things.
 

Chebsy

Gawd
Joined
Jan 24, 2013
Messages
523
Thanks for a great review, guys. I had been waiting for yours; having read the others, yours are the reviews I trust the most. I hope AMD can rectify their drivers for the games where performance falls short of the 2080's. It must be a driver/patch issue, given that performance matches the 2080 in other games.
 
Joined
Dec 27, 2006
Messages
983
since you guys have 2 cards, any plans to play around with crossfire/mgpu (does anything even use that yet)?
 

demondrops

Limp Gawd
Joined
Jul 7, 2016
Messages
422
Far Cry 5 even shows an AMD/Ryzen splash on loading; it's been a long time since I could see that. But maybe some games have a bias towards NVIDIA? I know BFV and Tomb Raider are pretty much NVIDIA sponsored, but it seems much more even in BFV than in Tomb Raider. I think the last time I played BF it was BF3, which had Mantle, around the time AMD started to fly out the window.

I don't know how many people will use this card for content creation, but they put in an expensive, huge amount of RAM that most people won't need. If they could produce a cheaper card, I think it would sell very well. They don't have DLSS or RTX either, so no gamer will buy this over a 2080 unless they're an AMD fanboy or a content creator. Either way, good to see AMD getting really good at a lot lately. But prices are crazy now, double the price going from 1080 Ti to 2080 Ti :( Only the 0.5% elite of gamers will get this card. My specs are still in the 98% range on PerformanceTest, lol.
 
Last edited:

cybereality

[H]F Junkie
Joined
Mar 22, 2008
Messages
8,790
That makes sense.

Honestly, that is probably best. Maybe with the legacy option gone, developers would have more reason to support the future of mGPU.

Or it will just die, I don't know, but that may be better than shoddy support.
 

britjh22

Limp Gawd
Joined
Oct 15, 2014
Messages
387
Excellent review as always from [H].

I think what makes this card interesting as a product is that it seems to exist only because of NVIDIA's behavior with 2000-series pricing. If the RTX 2080 had launched at $550 or $600, we may never have gotten the Radeon VII. At $700, AMD can take the Instinct MI50, cut its capabilities back some, and end up with a compelling consumer product. It certainly doesn't blow away the competition, but it's performant enough that there will be a market (all sold out on Newegg at this time).

And the comparison to the RTX 2080 is quite interesting. They both have suspect value adders (RTX/DLSS vs. 16GB HBM2) and are close enough in price/performance to be comparable. If I were to replace my 1080 Ti with either (which would be a side grade at best), I think I'd probably go with the Radeon VII. The possible gains just seem more realistic with the Radeon, and I don't really like how NVIDIA has acted this generation. I am glad there wasn't much time between announcement and the card landing, so the hype train didn't get a chance to reach truly absurd proportions.
 

Cactusj

n00b
Joined
Jun 4, 2018
Messages
62
I came back from lunch, and the in-stock thread said there was an MSI VII available from Amazon, so I took the plunge. I am betting on drivers coming out in the near future that will shore up some of the obvious issues. Funnily enough, it was the game bundles, which some people don't care for, that tipped the scale for me. I have less than 0% interest in BFV, but I thought Anthem would be interesting to play with for a bit. However, Division 2 is a guaranteed day-one purchase for me, and the RE2 remake looked fun as well. That made the VII $60 cheaper than the 2080.
 

Legendary Gamer

[H]ard|Gawd
Joined
Jan 14, 2012
Messages
1,263
It was a very good, deep review as always. I may be looking at one of these coming up if I can scrape together the cash.

I will be interested to see how it performs after AMD dials in the drivers like they always do after some time. It's nice to see AMD produce something competitive again, regardless of the mentioned shortcomings of heat, noise and such.
 

Nightfire

2[H]4U
Joined
Sep 7, 2017
Messages
3,279
I might draw some heat for asking, but I would like to see if the RTX 2080 would pick up a few percent if the system used 32 GB of memory instead of 16 GB.

I have seen benchmarks where this was the case even for the 11GB 1080 Ti.

I doubt the R7 would be affected as it most likely doesn't 'need' the help.
 

Chris_B

Supreme [H]ardness
Joined
May 29, 2001
Messages
5,320
Seems the newer drivers have brought some positive changes for overclocking, junction temp (down from 100C to 80-ish), and fan noise. Why they didn't get these to reviewers I'll never know =/
 

lostinseganet

[H]ard|Gawd
Joined
Oct 8, 2008
Messages
1,205
Maybe some VR games would make use of the whole 16GB of VRAM? Using that RAM to render the scenes for both eyes at high res, like on that Pimax 5K headset?
 