AMD Radeon VII Video Card Review @ [H]

Great review, and it paints the card in a better light than some others I've seen. Other sites have it at around 5% slower in most titles; here it seems much closer, using real-world testing.
 
Thanks for the review!

I'd have liked to see V64 numbers thrown into the mix; it makes sense to me that one would want to compare the new and the old directly, in addition to the competitor's offerings.

Also, I'm a lazy git and can't be arsed to go dig up numbers.
 
Thanks for the review as usual, Team [H]. I too would be interested in a future article on Chill and undervolting.
 
What a freakin' awesome review. I read every word. Unfortunately I still cannot decide. I need a new card; I seem to have messed up my RX 580 in a mining operation. I don't mind giving an attaboy to AMD for all the usual reasons (more competition, etc.), even spending $50-100 more than a card is worth next to its competition; all I want is for my 21:10 monitor to be as glorious as its potential. Late one night I almost talked myself into a gratuitously priced 2080 Ti, until I read 13 pages on these forums about the space invaders. I wish I could wait a while for Navi or cheaper prices (assuming no crazed crypto-charging bulls), but that RX 580 is fried for games (though normal desktop functions still work).

It is all about how long you think you'll have a use for the video card. If you know you can use it for the next 3 years without too much of a compromise on settings, it might not be that bad.
 
Much better review. I am still curious, though: what were the clock speeds in those games where AMD did not come out on top? I wonder if the GPU was even being fully utilized.
 
Good summary, and as we all know, performance will improve with driver releases, so in the end it is a solid win for AMD.
 
Much better review. I am still curious, though: what were the clock speeds in those games where AMD did not come out on top? I wonder if the GPU was even being fully utilized.
Yes, we check to make sure the GPU is being utilized in each game. GPU in-game clocks are addressed on page 4.
 
For a game like Black Ops 4, where would it be selectable? I've never seen an option like that in any game.
Not all games support more than one API; many still support only DX11 or DX12, and some may also have Vulkan support. Typically, those that do will have an in-game toggle to change the API.
 
For a game like Black Ops 4, where would it be selectable? I've never seen an option like that in any game.

I've never played Black Ops 4, so perhaps it's through the command line, but in [H]'s review they don't say they ran one path vs. the other for Black Ops 4.
 
As always, the most thoughtful review on the Internets. Thanks, guys!
 
Reading Reddit, it seems quite a few people are getting a consistent 1.9-2GHz on the core. I'd be really interested in seeing how well it performs at that clock speed vs. the inconsistent clocks it has at stock.
 
Reading Reddit, it seems quite a few people are getting a consistent 1.9-2GHz on the core. I'd be really interested in seeing how well it performs at that clock speed vs. the inconsistent clocks it has at stock.

Ask and thou shalt receive. Quoting LtMatt at the OCUK forums...

Here is a benchmark of Firestrike Ultra with a nice overclock applied to each GPU.

A 16% difference in graphics score in this bench; a quick arithmetic check follows the two score blocks below.

Radeon VII
2000/1200MHz
19.2.2

SCORE
7 375 with AMD Radeon VII(1x) and AMD Ryzen Threadripper 1950X
Graphics Score 7 495
Physics Score 27 284
Combined Score 3 331
https://www.3dmark.com/3dm/33581449


Vega 64 Liquid
1802/1200MHz
19.2.2


SCORE
6 443 with AMD Radeon RX Vega 64 Liquid(1x) and AMD Ryzen Threadripper 1950X
Graphics Score 6 456
Physics Score 27 450
Combined Score 2 980
https://www.3dmark.com/3dm/33582138
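
For reference, a quick arithmetic check of that 16% figure using the two graphics scores quoted above:

# Graphics Score values from the two 3DMark links above
radeon_vii = 7495
vega_64_liquid = 6456
print(f"{(radeon_vii - vega_64_liquid) / vega_64_liquid * 100:.1f}%")  # ~16.1%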

Radeon VII has further overclocking headroom with more voltage and a better cooler. I think it would be great with proper water cooling.
 
That Vega 64 is nipping at 1080 Ti territory with that massive overclock. None of my Vega chips could come remotely close.
 
On my 4.1GHz 2700, DX12 is usually the faster API, and at times way faster.

Normally, most sites test GPUs using the most powerful CPU available so that the results are GPU-limited and not CPU-limited.

Are you asking the [H] team to test GPUs using both a high-performing Intel CPU and a multi-core AMD CPU?

I don't mind reading more reviews & looking at more charts :)
 
Reading Reddit, it seems quite a few people are getting a consistent 1.9-2GHz on the core. I'd be really interested in seeing how well it performs at that clock speed vs. the inconsistent clocks it has at stock.
I’m not sure I’d call the stock clocks inconsistent. Kyle said he took readings at 0.5-second intervals and Brent at 1-second intervals. So the line looks erratic as hell, but it reads as though the GPU just downclocks when full speed isn’t necessary, as opposed to throttling.

My Vega 64 is much the same, not hot enough to be throttling but not at the max clock all the time in certain games.
 
I’m not sure I’d call the stock clocks inconsistent. Kyle said he took readings at 0.5-second intervals and Brent at 1-second intervals. So the line looks erratic as hell, but it reads as though the GPU just downclocks when full speed isn’t necessary, as opposed to throttling.

My Vega 64 is much the same, not hot enough to be throttling but not at the max clock all the time in certain games.

I owned a Vega 64; at stock voltage/settings the clocks were inconsistent. After tweaking, you could get it to maintain consistent clock speeds.
 
I’m not sure I’d call the stock clocks inconsistent. Kyle said he took readings at 0.5-second intervals and Brent at 1-second intervals. So the line looks erratic as hell, but it reads as though the GPU just downclocks when full speed isn’t necessary, as opposed to throttling.

My Vega 64 is much the same, not hot enough to be throttling but not at the max clock all the time in certain games.
Yes, the control over the GPU clock is so incredibly granular that when you start looking at very small time intervals it looks like a mess, but what it seems to be to me is simply the GPU saving power/heat where it can.
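
A minimal sketch of the kind of interval-based clock sampling described above, assuming a stand-in clock-reader callable since the actual logging tool isn't named here:

import time

def sample_core_clock(read_core_clock_mhz, interval_s=0.5, duration_s=60):
    """Poll the GPU core clock at a fixed interval and return (t, MHz) pairs."""
    # Kyle sampled at 0.5 s and Brent at 1 s; at that granularity the trace
    # looks erratic because the GPU drops clocks whenever full speed isn't
    # needed, which is not the same thing as thermal throttling.
    samples = []
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        samples.append((time.monotonic() - start, read_core_clock_mhz()))
        time.sleep(interval_s)
    return samples

# Usage: pass whatever clock reader your monitoring setup exposes, e.g. a
# wrapper around GPU-Z/HWiNFO logging (hypothetical helper, not named above):
# trace = sample_core_clock(my_clock_reader, interval_s=0.5, duration_s=120)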
 
Thanks as always for the great review, [H]! Fun read, and more positive than some of the other reviews I've read so far.

Can't help but think this would have been a home run at $50-100 cheaper.

Also looking forward to the deeper dive into the software/OC'ing side of things.
 
Thanks for a great review, guys. I have been waiting for yours; having read others, yours are the reviews I trust the most. I hope AMD can rectify their drivers for the games where performance falls short of the 2080's. Given that performance matches the 2080 in other games, it must be a driver/patch issue.
 
Since you guys have two cards, any plans to play around with CrossFire/mGPU (does anything even use that yet)?
 
Far Cry 5 even shows an AMD/Ryzen splash on loading; it's been a long time since I've seen that. But maybe some games have a bias towards NVIDIA? I know BFV and Tomb Raider are pretty much NVIDIA-sponsored, though it seems much more even in BFV than in Tomb Raider. I think the last time I played BF it was BF3, which had Mantle, around the time AMD started to fly out the window. I don't know how many people will use this card for content creation, but they put an expensive, huge amount of RAM on it that most people won't need. If they could produce a cheaper card, I think it would sell very well. It doesn't have DLSS or RTX either, so no gamer will buy this over a 2080 unless they're an AMD fanboy or a content creator. Either way, it's good to see AMD getting really good at a lot of things lately. But prices are crazy now, double the price going from the 1080 Ti to the 2080 Ti :( only the 0.5% elite of gamers will get this card. My specs are still in the 98% range on the performance test, lol.
 
That makes sense.

Honestly, that is probably best. Maybe with the legacy option gone, developers would have more reason to support the future of mGPU.

Or it will just die, I don't know, but that may be better than shoddy support.
 
Excellent review as always from [H].

I think the thing that makes this card interesting as a product is that it seems to exist only because of NVIDIA's pricing behavior with the 2000 series. If the RTX 2080 had launched at $550 or $600, we may never have gotten the Radeon VII. At $700, AMD can take the Instinct MI50, cut its capabilities back some, and end up with a compelling consumer product. It certainly doesn't blow away the competition, but it's performant enough that there will be a market (all sold out on Newegg at this time).

And the comparison to the RTX 2080 is quite interesting. They both have suspect value adders (RTX/DLSS vs. 16GB HBM2), and are close enough in price/performance to be comparable. If I were to replace my 1080 Ti with either (which would be a side grade at best), I think I'd probably go with the Radeon VII. The possible gains just seem more realistic with the Radeon, and I don't really like how NVIDIA has acted this last generation. I am glad there wasn't as much time between the announcement and the card landing, as the hype train didn't get lots of time to reach truly absurd proportions.
 
I came back from lunch, and a stock alert said there was an MSI VII available from Amazon, so I took the plunge. I am betting on drivers coming out in the near future that will shore up some of the obvious issues. Funnily enough, it was the game bundle, which some people don't care for, that tipped the scale for me. I have less than 0% interest in BF5, but I thought Anthem would be interesting to play with for a bit. However, Division 2 is a guaranteed day 1 purchase for me, and the RE2 remake looked fun as well. That made the VII $60 cheaper than the 2080.
 
It was a very good, deep review as always. I may be looking at one of these coming up if I can scrape together the cash.

I will be interested to see how it performs after AMD dials in the drivers like they always do after some time. It's nice to see AMD produce something competitive again, regardless of the mentioned shortcomings of heat, noise and such.
 
I might draw some heat for asking, but I would like to see if the RTX 2080 would pick up a few percent if the system used 32 GB of memory instead of 16 GB.

I have seen benchmarks where this was the case even for the 11 GB 1080 Ti.

I doubt the R7 would be affected as it most likely doesn't 'need' the help.
 
Seems the newer drivers have had some positive changes for overclocking, junction temp (down from 100°C to 80-ish), and fan noise. Why they didn't get these to reviewers, I'll never know =/
 
Maybe some VR games would make use of the whole 16GB of VRAM? Using that RAM for rendering the scenes for both eyes at high res, like on that Pimax 5K headset?
 