AMD Radeon VII Video Card Review @ [H]

Sure.

Since the performance difference between Radeon VII and Geforce RTX 2080 is so large in Shadow of the Tomb Raider running in DirectX 11, I wonder if the same performance gap persists when running the game in DirectX 12.
Thanks, got it now.

You can go look here and see that DX11 is actually quite a bit faster than DX12 on the RTX 2080. Hence the reason for using it to test with.

We look at DX performance on a game-by-game and card-by-card basis and use the API that best showcases the specific card's performance. On page 11, Deus Ex, you can see that we used DX12 for the RVII and DX11 for the RTX cards.
 
Damn, you guys nailed it. Even solidly answered the memory question that's been lingering about.

And it's concerning that AMD is still running their cards relatively 'hot' at stock. Undervolting AMD GPUs has been a thing for some time, among miners especially, where performance per watt is king, but also among those just trying to make AMD's GPUs more comfortable to be around in terms of noise and heat.

Do we know if AMD has done this because the average Radeon VII sample is simply that inefficient, needing the extra juice to stay stable, or did AMD overshoot a bit here?
 
Awesome review! Looks like a decent but not perfect card. Certainly nice that AMD is hanging at 4K now.

Quick question: do you think it would be worthwhile to sell my 2x Vega 64s and buy 1x Vega VII (for use on 4K TV)?

Looks like cost would be even, and I can avoid Crossfire issues/incompatibility but probably get less performance in the "best case" games.
 
Awesome review! Looks like a decent but not perfect card. Certainly nice that AMD is hanging at 4K now.

Quick question: do you think it would be worthwhile to sell my 2x Vega 64s and buy 1x Vega VII (for use on 4K TV)?

Looks like cost would be even, and I can avoid Crossfire issues/incompatibility but probably get less performance in the "best case" games.

Yes, definitely.

Do games even support Crossfire these days?
 
Some games support Crossfire, yes, but it's hit or miss.

However, I have this FreeSync TV and previously 1x Vega 64 was not enough for 4K.

With the VII, though, it looks like 4K is viable, so I might make the swap.
 
Silver rating?

The fan is louder than most modern HVAC systems.

Some games showed double-digit percentage performance deficits (one was 30% slower) compared to an equivalently priced card that came out 2.5 years ago.

Your personal system locked up constantly with an X399 motherboard from the same company.

You can’t overclock without crashing.

Silver rating???

Everyone knows you hate the Radeon VII, dude. Enough.
 
Tomb Raider is faster with the RTX 2080 using DX12, and even faster when both are using DX12.

Is that what you wanted to hear?

No, I don't want to "hear" anything.

I also want to see Radeon VII and GeForce RTX 2080 compared in Shadow of the Tomb Raider (DirectX 12).

Other websites show Radeon VII and GeForce RTX 2080 to be much closer in Shadow of the Tomb Raider, so I am surprised to see this huge performance gap.
 
I also want to see Radeon VII and GeForce RTX 2080 compared in Shadow of the Tomb Raider (DirectX 12).

Other websites show Radeon VII and GeForce RTX 2080 to be much closer in Shadow of the Tomb Raider, so I am surprised to see this huge performance gap.

I thought the reasoning in the review was pretty clear- each GPU was tested with the fastest API for that GPU.

What you're asking for is to have the GPU that tested faster be retested with the API that it runs slower with.

So the question the rest of us have to ask is, why?

I get the idea of doing a pure 'apples to apples' comparison, but that's not really the stated goal; the goal as expressed is for the GPUs to be tested at their fastest settings just as a gamer would.
 
Good review. I agree with the conclusion, if AMD came in a bit lower in price ($649) I would have no problem recommending this over the RTX 2080. At the same price though, I'd rather go with the RTX since it offers more consistent performance across different games and you have the option of using the RTX features (not that I think they are a big selling point at the moment, I've found DLSS pretty disappointing, but if I'm paying the same price I might as well get them).

And with a new RTX you might have to deal with space invaders.
 
No, I don't want to "hear" anything.

I also want to see Radeon VII and GeForce RTX 2080 compared in Shadow of the Tomb Raider (DirectX 12).

Other websites show Radeon VII and GeForce RTX 2080 to be much closer in Shadow of the Tomb Raider, so I am surprised to see this huge performance gap.

The notion that DX12 and Vulkan would be AMD's savior died a long time ago.
 
What I am saying is that:

Overclock3D tested Shadow of the Tomb Raider in DirectX 12, and their results show the difference between the Radeon VII and GeForce RTX 2080 to be much smaller.

I am wondering if this is because they were testing the game in DirectX 12.

The review from GameSpot also doesn't show this huge gap.

We do not comment on other websites' results. I can tell you that we have repeatable results, many times over, using real gameplay.

EDIT: And worth mentioning, the reason we use real gameplay is that it many times shows us different results than a canned benchmark. I would expect to see differences in our testing compared to many other reviews on the net. Keep in mind that we also usually evaluate the full game and go back and do our runthroughs on the most demanding maps in the game. We do look for the worst-case scenario in terms of performance.
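
To illustrate the "worst case" idea (this is just a sketch, not [H]'s actual tooling): average FPS hides the stutters that real gameplay exposes, which is why percentile-based frametime stats are a common way to quantify them.

```python
def fps_stats(frametimes_ms):
    """Summarize a captured run of per-frame render times (milliseconds)."""
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    # "1% low" FPS: the framerate at the 99th-percentile (slowest 1%) frametime.
    p99 = sorted(frametimes_ms)[min(n - 1, int(0.99 * n))]
    return avg_fps, 1000.0 / p99

# 99 smooth frames at 10 ms plus one 50 ms hitch:
avg, one_pct_low = fps_stats([10.0] * 99 + [50.0])
# avg is still ~96 FPS, but the 1% low is only 20 FPS -- the hitch a canned
# benchmark average would bury is exactly what a gamer feels.
```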
 
Enjoyed the review. A shame that the card seems to struggle a bit with some games, but otherwise it seems to be right with a 2080 or a bit faster. Still good to have a solid second option if you're using a 4K screen.
 
So, hand-overclocking the card, I am getting very good GPU clocks, but my HBM2 clocks are tanking. It does not seem to be from heat but rather power limits. Going to take a lot of playing with.
 
I'm glad AMD finally has a card that can be called "high end" after so many years. Competition is always good. That said, I don't really think anyone could justify buying a Radeon VII at $699 unless they are just completely anti-Nvidia or have a use for 16GB of VRAM. Outside of that, a 2080 provides you with ever so slightly better performance while using less power and running cooler. A 2080 also has a lot more OC headroom and all of the RTX goodies (as useless as they currently are... you still get them). Like Kyle said, I think if these cards came in about $50 cheaper, they might actually be the way to go. I really wonder if they could shave off one of the HBM units, leave the card with 12GB and 3/4 of the bandwidth, and charge $599-$649 for the card. That would be a really interesting option for a lot of people, imo.
 
I really wonder if they could shave off one of the HBM units, leave the card with 12GB and 3/4 of the bandwidth, and charge $599-$649 for the card.
I did talk to AMD about this at CES, and to be exact, I did not get an answer, but the reaction I got did not seem to be a positive one.
 
I don't think it's a bad card, but it's hard to recommend over an RTX 2080. It's slower, hotter, and noisier, it draws more power, plus it costs the same and it's hard to find.

I'm really impressed with what AMD has achieved bringing Vega to these heights, but it's time for something else. Unfortunately, Navi may not be up to the task either, but there's hope.

I mean, unless you really hate Nvidia, the RTX 2080 is a much better choice.
 
Finally, a sane review with an actual human having some real game time with the hardware and noting the experience, versus just hitting a button and coming back later for some numbers without noticing any other issues. Great job! Thanks.

Shadow of the Tomb Raider plays like crap on Vega 64 and FE. Its DX12 mGPU is outstanding with the Nvidia 1080 Ti, with around 95% scaling at 4K; Vega FE mGPU just crashes. This game gave me the best visual experience I've ever seen using HDR. Speaking of HDR, it will use more RAM, about 1GB more at 4K in this title, and hit memory bandwidth harder as well. Not sure if it will make a significant performance difference for lower-RAM cards. I do request that HDR testing be considered in the future. Many of the games tested in this review do offer HDR rendering. While there are many crap HDR monitors out there, there are now good ones available, and from what I've seen, HDR improves IQ more than current RTX games do.

Looking forward to OCing results in a future review. Silver seems about right; at least it worked, no space invaders, etc. As a note, it would be sad if reliability standards fall this low in the industry.

Personally, I don't see buying any new current-generation card, nor do I feel like I am missing anything. Both Nvidia and AMD fall short, I think; if anything, AMD caught up some to Nvidia for an enthusiast gaming card.
 
I don't think it's a bad card, but it's hard to recommend over an RTX 2080. It's slower, hotter, and noisier, it draws more power, plus it costs the same and it's hard to find.

I'm really impressed with what AMD has achieved bringing Vega to these heights, but it's time for something else. Unfortunately, Navi may not be up to the task either, but there's hope.

I mean, unless you really hate Nvidia, the RTX 2080 is a much better choice.

That's why it's called Radeon Vega II.

You literally just described a rerun of the Radeon RX Vega 64's launch.
 
Thanks guys for getting this review out.

If I were buying a $700 GPU today, I would consider this card pretty even with the 2080 and would probably get the Radeon Vega II just because NVIDIA has been nasty lately. While the RTX2080 has had plenty of time for the drivers to mature, I think AMD will see significant improvements in the next few months. It is just history repeating itself.
Consider the 1080 vs the Vega 64. When the Vega 64 came out, all the reviews had it just a bit slower than the 1080, but everything I see now has it just a bit faster. You can see the same thing going back 2-3 generations where AMD was playing catch-up. They delivered a card just a bit slower at launch, but over time, it surpassed its competition.
 
What a freakin' awesome review. Read every word. Unfortunately, I still cannot decide. I need a new card; I seem to have messed up my RX 580 in a mining operation. I don't mind giving an attaboy to AMD for all the usual reasons (more competition, etc.), even spending $50-100 more than a card is worth next to its competition; all I want is for my 21:10 monitor to be as glorious as its potential. Late one night I almost talked myself into a gratuitously priced 2080 Ti, until I read 13 pages on these forums about the space invaders. I wish I could wait a while for Navi or cheaper prices (assuming no crazed crypto-charging bulls), but that RX 580 is fried for games (though normal desktop functions are there).
 
I really wonder if they could shave off one of the HBM units, leave the card with 12GB and 3/4 of the bandwidth, and charge $599-$649 for the card.

To explore this idea a bit, admittedly somewhat off-topic, a real improvement would be to shave off two HBM stacks so that the size of the interposer could also be shrunk. That'll drop the cost both in terms of bill of materials and very likely in terms of increasing yields.

Now the problem with that is that you'd also be cutting memory back to 8GB, but more importantly you'd be cutting bandwidth to Vega 64 levels as well, basically half of what the Radeon VII has to work with. This will likely erase much of the performance benefit of the Radeon VII.
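
To put rough numbers on that (a back-of-the-envelope sketch, assuming the Radeon VII's 2.0 Gbps HBM2 with 1024-bit-wide stacks):

```python
def hbm2_bandwidth_gb_s(stacks, bus_bits_per_stack=1024, data_rate_gbps=2.0):
    # Total bus width (bits) times per-pin data rate, divided by 8 bits/byte.
    return stacks * bus_bits_per_stack * data_rate_gbps / 8

four = hbm2_bandwidth_gb_s(4)   # Radeon VII as shipped: 1024 GB/s
two = hbm2_bandwidth_gb_s(2)    # 512 GB/s -- roughly Vega 64 territory (484 GB/s at its lower clock)
three = hbm2_bandwidth_gb_s(3)  # 768 GB/s -- the hypothetical 12GB, "3/4 of the bandwidth" option
```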
 
To explore this idea a bit, admittedly somewhat off-topic, a real improvement would be to shave off two HBM stacks so that the size of the interposer could also be shrunk. That'll drop the cost both in terms of bill of materials and very likely in terms of increasing yields.

Now the problem with that is that you'd also be cutting memory back to 8GB, but more importantly you'd be cutting bandwidth to Vega 64 levels as well, basically half of what the Radeon VII has to work with. This will likely erase much of the performance benefit of the Radeon VII.

AMD would need to significantly update its delta color compression (which hasn't been updated since Tonga) to be able to sustain that performance with half the memory bandwidth.
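
As a toy illustration of the delta idea (real DCC operates on pixel tiles in hardware with per-tile compression metadata; this only shows the core intuition): neighboring pixels are usually similar, so storing differences instead of absolute values produces small numbers that compress well and save bandwidth.

```python
def delta_encode(pixels):
    # Keep the first value, then store each pixel as a difference from its neighbor.
    return [pixels[0]] + [b - a for a, b in zip(pixels, pixels[1:])]

def delta_decode(deltas):
    # Rebuild absolute values by accumulating the differences.
    out = [deltas[0]]
    for d in deltas[1:]:
        out.append(out[-1] + d)
    return out

row = [118, 119, 119, 121, 120]      # a smooth run of pixel values
encoded = delta_encode(row)          # [118, 1, 0, 2, -1] -- mostly tiny numbers
assert delta_decode(encoded) == row  # lossless round trip
```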
 
To explore this idea a bit, admittedly somewhat off-topic, a real improvement would be to shave off two HBM stacks so that the size of the interposer could also be shrunk. That'll drop the cost both in terms of bill of materials and very likely in terms of increasing yields.

Now the problem with that is that you'd also be cutting memory back to 8GB, but more importantly you'd be cutting bandwidth to Vega 64 levels as well, basically half of what the Radeon VII has to work with. This will likely erase much of the performance benefit of the Radeon VII.

That's why I said to leave it at 12GB of HBM. It might keep the bandwidth high enough not to neuter the gains.
 
Excellent review. Worth the wait.

And I don't think AMD is going to be re-running this card with 12 or 8GB HBM: that would take engineering, and the point of this card is to move MI50s that didn't pass muster without much work.

Good to see AMD has been rectifying issues quickly. Also, as is tradition with Vega, I'm seeing many reports of good undervolting results. Will be interesting to see where this card is in a few months.
 
I did. It said:

"In this game the differences between DX12 and DX11 on the AMD Radeon VII are much closer than any other game. However, you can still see that technically DX11 is faster. We also tested at 4K to verify, and the 4K results also show DX11 being faster than DX12. Therefore, we will test in DX11 since that shows us the highest performance."
Testing using the fastest API for the system is the most honest way to evaluate. Anyway, on my 4.1GHz 2700, DX12 is usually the faster API and at times way faster. That is not universal; very fast Intel processors seem to do better with DX11. I think [H] has already proven that more cores does not mean better performance over a quad core in general if clocked high, as speed is still king for games.
 
I think [H] has already proven that more cores does not mean better performance over a quad core in general if clocked high, as speed is still king for games.

Also a bit off-topic, but while this is currently true for machines set up for benchmarking, once you start loading up all of the stuff a regular user might have running in the background, more cores can keep stuff from bumping game processes out of focus and causing performance inconsistencies.

While using an 8700K with its six Skylake cores or a 9900K with its eight Skylake cores would be useful for a non-benchmarking system, at the same clockspeed as a 7700K they're not going to push the Radeon VII or any other GPU harder with the games used in this review.
 
Nice review. Yeah, it's surprising that in some games it just falls behind all of a sudden, and in some games it beats the 2080, lol. Maybe driver updates will close the gap in those games, but honestly it seems some games will never really be optimized for AMD. Maybe they just don't have the manpower to optimize for every game.
 
The Shadow of the Tomb Raider benchmark that other sites use is mostly a flyby. Larger areas are shown, which in actual gameplay Nvidia is better at handling by discarding and not rendering unseen items. Real gameplay is on the ground, not flying around, where you have more stuff blocking each other. In other words, Nvidia does this game better, rendering only what is needed, while AMD wastefully renders more stuff that isn't needed.

Dirt 4 is the game where the Vega VII kicked even a 2080 Ti. High memory bandwidth can help in certain games.
 
Brent & Kyle, thanks for all of the hard work doing this review. It was well worth the wait!

Now take your team out for a few beers and get some rest!
 