AMD Radeon RX Vega 64 4K Video Card Review @ [H]

FrgMstr · Just Plain Mean · Staff member
Joined: May 18, 1997 · Messages: 55,626
AMD Radeon RX Vega 64 4K Video Card Review

Does the AMD Radeon RX Vega 64 play games well at 4K resolution? What game settings work best at 4K, and how does it compare to GeForce GTX 1080 and GeForce GTX 1080 Ti? Ten games are tested, new and old, DX11, DX12, and Vulkan at playable game settings and pushed to the max in this all out 4K brawl.
 
Seems like this card is at best on par with a 1080, and lagging behind in many cases.
 
Seems like this card is at best on par with a 1080, and lagging behind in many cases.

An alternate way to view it...
In DX12, the V64 matches the 1080 in pretty much every case (within 1%), and in Vulkan it surpasses it. In DX11, it's a crapshoot as to whether it can keep up.

In all cases, the 1080 Ti just crushes everything in sight and is where your $700 *should* have gone.
 
A lot of hard work there, thank you. I don't play many of those games, but for Doom at 4K I chose to use the Nightmare textures over any sort of AA and got fluid gameplay and I wonder how much such a choice might change matters. Indeed, can you confirm that at no time were any of the games VRAM-limited?

While the article was a pure performance test, could you comment on the playability and price/performance when pairing the Vega with a 4k Freesync monitor versus a 1080 with a 4K Gsync monitor?
 
An alternate way to view it...
In DX12, the V64 matches the 1080 in pretty much every case (within 1%), and in Vulkan it surpasses it. In DX11, it's a crapshoot as to whether it can keep up.

In all cases, the 1080 Ti just crushes everything in sight and is where your $700 *should* have gone.

I agree. If 4K is something you plan to spend any real time gaming at, even the cheapest, worst-cooled version of the 1080 Ti is going to be night and day at 4K max settings vs. the V64. For something you may use every single day of your life, it's not a bad investment :)
 
Thanks for the hard work, thorough tests, and review. I have to say it is refreshing that AMD's statement a while back about Vega comparing favorably to the 1080 is so very accurate.

For the purists, this is the card to get. Otherwise, I can see a good Ryzen build coupled with a Ti giving the best overall performance for the price on a new 4K build.
 
Thanks Kyle for this review. It really shows how these cards stack up.

Taking a quick look at NewEgg,
1080 - $500 and up,
Vega 64 - $620 and up, and
1080ti are $730 and up.

I don't see myself paying a $120 premium for pretty much the same performance, either at 1440p or 4K. I am pretty happy with my 1080 right now. Plus, I picked up an Oculus Rift in that last sale. The 1080 runs it really smoothly. Curious how the Vega cards are in VR.
 
Thanks Kyle for this review. It really shows how these cards stack up.

Taking a quick look at NewEgg,
1080 - $500 and up,
Vega 64 - $620 and up, and
1080ti are $730 and up.

I don't see myself paying a $120 premium for pretty much the same performance, either at 1440p or 4K. I am pretty happy with my 1080 right now. Plus, I picked up an Oculus Rift in that last sale. The 1080 runs it really smoothly. Curious how the Vega cards are in VR.
VR has NOT been a strong area for AMD thus far... and I highly doubt that's changed with Vega.
 
First I've heard that Ryzen sucked with VR. I thought their CPUs were competing halfway decently in gaming for the clock speeds they run at.

Not for high Hz; they've never been great. Threadripper has some serious issues with VR. Ryzen does OK in most games but runs much closer to the limit than Intel.
 
First I've heard that Ryzen sucked with VR. I thought their CPUs were competing halfway decently in gaming for the clock speeds they run at.

That's because it doesn't. I don't need a site for reviews. I have an R7, a Vive and a Fury X.

Overall, higher-end AMD has always been fine for VR. The 480 just skirted the low end. Even the reviews here on [H] primarily dinged the Fury X for being too costly for its performance (which was true at the time). If you managed to grab one during the selloff after Pascal launched, it works just fine. It sits comfortably between a 1060 and a 1070, though closer to the 1070.
 
Great review. Pretty much what I have experienced. Vega is not a 4K card, aside from a few heavily optimized outliers like DOOM and Forza.

However, Vega Crossfire can do 4K. Sadly still not at Ultra settings, but Very High settings are possible in a number of titles (assuming Crossfire support, which we all know is hit or miss).

For example, in Prey at 4K with maxed settings (aside from AA, which has problems) I can get around 70-80 fps. I get similar 80 fps performance in GTA V with Very High settings (except advanced).

Watch my video if you haven't seen it yet.

 
Not for high Hz; they've never been great. Threadripper has some serious issues with VR. Ryzen does OK in most games but runs much closer to the limit than Intel.

That is flat out false. I'd like to see *any* citation about Ryzen having an issue maintaining 90Hz where an Intel does easily. High-end GPU bottlenecking can occur, but it happens at lower resolutions than the headsets, and even when it does bottleneck, it's a difference of 110 fps vs 150 fps, both of which are indistinguishable from each other in VR.
 
If you can, please retest Forza 7 with the new NVIDIA driver, as it boosts GTX 1080 performance at all resolutions, especially 4K; it's now faster than Vega 64.

[image: computerbase.de Forza 7 benchmark chart]


https://www.computerbase.de/2017-09/forza-7-benchmark/2/
 
Outstanding job Brent did here and laid it out clearly. If you can afford a 1080 Ti, that is the most likely best gaming card for 4K.

Some of my experiences/opinions here with 4K, Vega 64 LC:
  • Frame rates over 61 fps (beyond my monitor's refresh rate) degrade the gaming experience with tearing, which adds jerkiness. Keeping it between a minimum of 45 and a maximum of 61 makes every game I play very smooth
    • Obviously you want a card that can pump out as many frames as possible at the highest quality settings; this review clearly shows how that falls out
    • My point is that I no longer buy the idea that more FPS means a better gaming experience. More FPS does indicate the better-performing card, which can potentially max out more games now and in the future, but I would keep the FPS within the monitor's optimal range; for my 4K 27" monitor that is 45-61fps (the Freesync range is 35-61)
      • So my oddball, maybe even trite, point: when I see a graph with one card in the 45-60fps range next to a card doing 80fps+, I immediately know the 45-60fps range is the better gaming experience for me. But I would still buy the faster card and limit the framerate to around 60 fps at 4K to maximize my gaming experience, if I only had the option to buy one card; in my case I could afford more than one
  • With Doom and Vulkan at 4K, all settings maxed out (Ultra plus all Nightmare settings, motion blur off), I am getting results way above the ones here. Different map, obviously, but much higher. I use Chill to keep the frame rate under 61fps for a better gaming experience in the end :wtf:
  • 8x MSAA at 4K is an utter waste for me; 2x MSAA is about perfect. I have a 160ppi monitor, and 2x gives over a simulated 300ppi for aliasing purposes, about as good as anyone's eye could distinguish. Bigger monitors will need more AA, maybe 4x MSAA, but not 8x. In games at 4K it is normally the textures that need AA more than the edges, which MSAA does not address; SMAA or 2x SSAA (with Vega 64, forget it) works great there
    • Not to knock Brent's 8x MSAA settings, because that was the right way to test and show the max settings for a good gaming experience


Other thoughts
  • My smoothest 60Hz gaming experience has been with the 1080 Ti using Adaptive Sync, which at 3440x1440 can maintain a virtually constant 60fps with maxed-out settings in any game; no G-Sync is needed. A constant or near-constant frame time, with no variation from one frame to the next at 60fps, beats out my Freesync experience for smoothness. Both are outstanding, but keeping a constant frame rate, like with VR and the Vive, is to me the best you can have for smoothness
  • RTG Enhanced Sync - I don't know what the hell it does; it seems to help at times but not others. It is nowhere near as effective as Nvidia's Adaptive Sync in my experience
  • Vega VR is much better than the Nano, slightly better than the 1070, but not nearly as good as the 1080 Ti. Ryzen does great in VR with the 1080 Ti; it maintains 90 fps with maxed-out settings in everything - period. Then again, I use tweaked, maxed-out DDR4 timings at 3500MHz, which can make a significant difference with Ryzen
  • This is probably the best, most meaningful review on the net for the gaming experience at 4K
Thanks Brent!
 
That is flat out false. I'd like to see *any* citation about Ryzen having an issue maintaining 90Hz where an Intel does easily. High-end GPU bottlenecking can occur, but it happens at lower resolutions than the headsets, and even when it does bottleneck, it's a difference of 110 fps vs 150 fps, both of which are indistinguishable from each other in VR.

Go to the VR forums here. Or the [H] review. Ryzen causes some reprojection with Arizona Sunshine.

You said it yourself: 110fps vs 150fps. What about the game where it's 90FPS vs 130? You'd be in reprojection hell. It's a known fact Ryzen does worse at high Hz. Besides its own CPU frame times, we know it drags down the video card's FPS. Matching Vega and Ryzen seems like a bad idea for VR.
 
Yeah! Team red wins in every benchmark!

Oh wait, red graph is Nvidia! :sneaky:

Awesome article.

Sad that my Vega is not doing better than the 1080, but I can handle the truth.
 
Go to the VR forums here. Or the [H] review. Ryzen causes some reprojection with Arizona Sunshine.

You said it yourself: 110fps vs 150fps. What about the game where it's 90FPS vs 130? You'd be in reprojection hell. It's a known fact Ryzen does worse at high Hz. Besides its own CPU frame times, we know it drags down the video card's FPS. Matching Vega and Ryzen seems like a bad idea for VR.

You mean this review?
https://www.hardocp.com/article/2017/03/02/amd_ryzen_1700x_cpu_review/5

Where it says this on the conclusion page?

Our VR gaming results showed a very different processor however! While the AMD Ryzen was not the "fastest" system in our VR gameplay, it did however show that it could fully deliver a top-shelf Virtual Reality gaming experience. In the arena of VR gaming, we are seeing that these newer gaming engines are very much thread aware in general, and of course when we have that, the more CPU cores and threads, the better. When looking back at our aging 2600K we see that while sometimes, it actually had a quicker average frame rendering time, than even the Ryzen, the variance in those numbers are much more narrow when we look at Ryzen and the newer Intel CPUs. I actually find these VR gaming benchmarks much more telling than our other scores in showing that the Ryzen scheduler and instruction prefetch abilities are surely where they need to be in order to be competitive. All that said however, clock is still king in the gaming arena and no matter how you look at it, the 7700K at 5GHz is a formidable opponent.
 
You mean this review?
https://www.hardocp.com/article/2017/03/02/amd_ryzen_1700x_cpu_review/5

Where it says this on the conclusion page?

Our VR gaming results showed a very different processor however! While the AMD Ryzen was not the "fastest" system in our VR gameplay, it did however show that it could fully deliver a top-shelf Virtual Reality gaming experience. In the arena of VR gaming, we are seeing that these newer gaming engines are very much thread aware in general, and of course when we have that, the more CPU cores and threads, the better. When looking back at our aging 2600K we see that while sometimes, it actually had a quicker average frame rendering time, than even the Ryzen, the variance in those numbers are much more narrow when we look at Ryzen and the newer Intel CPUs. I actually find these VR gaming benchmarks much more telling than our other scores in showing that the Ryzen scheduler and instruction prefetch abilities are surely where they need to be in order to be competitive. All that said however, clock is still king in the gaming arena and no matter how you look at it, the 7700K at 5GHz is a formidable opponent.

It still had more reprojection in the extremely limited number of games. If you want up to half the FPS with the Ryzen/Vega combo compared to a similarly priced 7700K/1080 Ti, then have at it.

It's a fact Vega trails similarly priced NVIDIA significantly. It's a fact Ryzen trails Intel, hurts GPU FPS, and creates reprojection itself.
 
It still had more reprojection in the extremely limited number of games. If you want up to half the FPS with the Ryzen/Vega combo compared to a similarly priced 7700K/1080 Ti, then have at it.

It's a fact Vega trails similarly priced NVIDIA significantly. It's a fact Ryzen trails Intel, hurts GPU FPS, and creates reprojection itself.

??

I wasn't touting Vega. I was saying Ryzen isn't an issue in VR, and it isn't. Did you even read what you cited?

I stated in my very first post in this thread that a 1080ti is what you should be getting if you're going to spend $700. The Vega still has driver bugs for VR, that much is obvious from the very link *you* posted (in no properly functioning app will a 1080 run 3x faster than any flavor of Vega). The Fury X does just fine these days. There are very few situations where a 1070 will demonstrably outperform a Fury X in VR. I imagine when Vega drivers are sorted, they'll probably compare fairly well with a 1080, but still not get a recommendation because it costs 20% more than a 1080. And that's the way it should be.

Like I said, I actually have these things. I use them. I don't need someone else to tell me they aren't working when I can walk into the other room and confirm they do.
 
??

I wasn't touting Vega. I was saying Ryzen isn't an issue in VR, and it isn't. Did you even read what you cited?

I stated in my very first post in this thread that a 1080ti is what you should be getting if you're going to spend $700. The Vega still has driver bugs for VR, that much is obvious from the very link *you* posted (in no properly functioning app will a 1080 run 3x faster than any flavor of Vega). The Fury X does just fine these days. There are very few situations where a 1070 will demonstrably outperform a Fury X in VR. I imagine when Vega drivers are sorted, they'll probably compare fairly well with a 1080, but still not get a recommendation because it costs 20% more than a 1080. And that's the way it should be.

Like I said, I actually have these things. I use them. I don't need someone else to tell me they aren't working when I can walk into the other room and confirm they do.

Sorry, Vega thread so I thought it was somehow related.

I'll keep this short and simple.

Ryzen has more reprojection.
Ryzen also reduces the FPS of the graphics card (it increases the frame times of the GPU itself).

Can you reduce settings and still technically use VR games? Sure.
 
Ryzen has more reprojection.
Ryzen also reduces the FPS of the graphics card (it increases the frame times of the GPU itself).

If you were already skating on the bare edge of playable with some low end (970, 390, 580, 1060) card with your 7700k, it is possible Ryzen could produce an issue. A Ryzen with a 1070+ or a FuryX will not have an issue with VR. You can't max everything everywhere with a Fury, but then a 1070 can't do it either.

You have to pair the Ryzen with a barely capable card in order to observe the difference, and even then it's not a universal truth. Documented, again, in the review you cited: in 5 of the 7 test cases using real play sessions, there is less than 1ms of render-time difference between a Ryzen and a 7700k. It's not a platform issue, it's an app issue.
 
If you can, please retest Forza 7 with the new NVIDIA driver, as it boosts GTX 1080 performance at all resolutions, especially 4K; it's now faster than Vega 64.

[image: computerbase.de Forza 7 benchmark chart]


https://www.computerbase.de/2017-09/forza-7-benchmark/2/

To save anyone else from having to fire up Google Translate: computerbase.de did a gameplay benchmark instead of using the in-game one. It's unlikely they benched the same gameplay that Brent did, but it does lend a bit of credence to the in-game benchmark being the outlier.

We really need someone with the game, both cards, and a lot of time on their hands to do 5 or 10 gameplay comparisons, ideally with at least one as similar to the in-game benchmark as possible.
 
To save anyone else from having to fire up Google Translate: computerbase.de did a gameplay benchmark instead of using the in-game one. It's unlikely they benched the same gameplay that Brent did, but it does lend a bit of credence to the in-game benchmark being the outlier.

We really need someone with the game, both cards, and a lot of time on their hands to do 5 or 10 gameplay comparisons, ideally with at least one as similar to the in-game benchmark as possible.

It's not the same test: 25 seconds of gameplay versus the 5-15 minutes Brent normally takes for the gaming test on each card.
 
I'm running a 6850K @ 4.2GHz and my 1080 Ti at +200MHz, and although it gets a little warmer, it makes everything super smooth.
 
I am wondering if the ability to max out Forza 7 is an artifact of the engine being so optimized to be able to run smoothly on the Xbox One. We have heard time and time again about engines and games being able to give us better image quality, but the option is not in release because of the console development focus.
 
[image: Forza 7 frame rate over time graph]


That's a rather curious-looking graph. Is the 1080ti even playable with those huge framerate swings? Then, a third of the way through, it suddenly becomes 30-40% faster. I wouldn't be surprised if they tuned for FPS over input lag just to improve benchmarks. Looking at those numbers, it only appears to apply to the 1080ti and not the 1080. It could just be a high driver-overhead issue, but for 2/3 of that game run I'm not sure the 1080ti is playable with the 60fps swings.
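One hedged way to put a number on "huge framerate swings" is to compare the average FPS with the 1%-low FPS from the frame-time trace. A quick Python sketch (illustrative only, not computerbase's or [H]'s actual method; `fps_summary` is a made-up name):

```python
# Hypothetical sketch: summarize a frame-time trace (milliseconds per frame)
# as (average FPS, 1%-low FPS). A big gap between the two numbers is what a
# spiky frame-rate graph looks like in summary form.

def fps_summary(frame_times_ms):
    n = len(frame_times_ms)
    avg_frame_ms = sum(frame_times_ms) / n
    slowest = sorted(frame_times_ms, reverse=True)   # worst frames first
    k = max(1, n // 100)                             # slowest 1%, at least 1 frame
    worst_avg_ms = sum(slowest[:k]) / k
    return 1000.0 / avg_frame_ms, 1000.0 / worst_avg_ms

# 99 smooth ~16.7 ms frames plus one 100 ms hitch: the average barely drops,
# but the 1%-low collapses, exposing the stutter a bare average would hide.
avg_fps, low_fps = fps_summary([1000.0 / 60] * 99 + [100.0])
print(round(avg_fps, 1), round(low_fps, 1))
```

A bare-average bar chart would call both the smooth and the spiky run "about 57 fps"; the 1%-low figure is what separates them.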
 
[image: Forza 7 frame rate over time graph]


That's a rather curious-looking graph. Is the 1080ti even playable with those huge framerate swings? Then, a third of the way through, it suddenly becomes 30-40% faster. I wouldn't be surprised if they tuned for FPS over input lag just to improve benchmarks. Looking at those numbers, it only appears to apply to the 1080ti and not the 1080. It could just be a high driver-overhead issue, but for 2/3 of that game run I'm not sure the 1080ti is playable with the 60fps swings.
Reading is fundamental.
 
[image: Forza 7 frame rate over time graph]


That's a rather curious-looking graph. Is the 1080ti even playable with those huge framerate swings? Then, a third of the way through, it suddenly becomes 30-40% faster. I wouldn't be surprised if they tuned for FPS over input lag just to improve benchmarks. Looking at those numbers, it only appears to apply to the 1080ti and not the 1080. It could just be a high driver-overhead issue, but for 2/3 of that game run I'm not sure the 1080ti is playable with the 60fps swings.


Most likely has to do with perfmon more than anything else.
 
This part?

I didn't see much commentary on what exactly was going on there, but I only had my phone when I initially read it. The above quote is all I saw regarding that test. The 1080 and Vega seem about right with consistent FPS, but the 1080ti result seems anomalous. It might just be your play-through, but it would seem you encountered something that drastically increased performance midway through, in addition to unstable framerates for most of the run. A CPU bottleneck creating the unstable FPS along with leading the pack, or something?
Had there been any issues with gameplay we would have certainly mentioned those... as we always do.
 
That's a rather curious looking graph. Is the 1080ti even playable with those huge framerate swings?

That graph threw me off for a second too, but then I realized: even with the high variability, the 1080Ti is still higher than the 1080 or Vega 64, which means it'll be at least as smooth as those in the measured playthrough.
 
Re: PresentMon, it samples framerate in fractions of a second, not once per second, so what you are really seeing is closer to frame time than framerate. It's confusing, but basically ignore those huge swings, because they are not actually experienced as framerate while gaming. PresentMon unfortunately does not have an option to sample only once per second; I wish it did. Unfortunately, for DX12 Windows 10 games it is the only application that will capture performance right now; Action! won't, because the games don't run in "Exclusive" Fullscreen.

That said, the average framerate is accurate, which is why we removed the min and max framerate tables at the bottom and now leave only the average framerate indicated with PresentMon.
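Since PresentMon logs one row per present, the spiky raw samples can be smoothed in post-processing. A minimal sketch of the idea (hypothetical, not part of [H]'s workflow; `per_second_fps` is an illustrative name) that bins per-frame times into one-second buckets and reports each bucket's average FPS:

```python
# Minimal sketch: bin per-frame times (ms), as logged per-present by a
# PresentMon-style trace, into one-second buckets and report each bucket's
# average FPS. This smooths out the "huge swings" visible in the raw samples.

def per_second_fps(frame_times_ms):
    """Return a list of average-FPS values, one per full second of the trace."""
    buckets = []
    elapsed_ms = 0.0   # time accumulated in the current bucket
    frames = 0         # frames counted in the current bucket
    for ft in frame_times_ms:
        elapsed_ms += ft
        frames += 1
        if elapsed_ms >= 1000.0:               # bucket full: record and reset
            buckets.append(frames * 1000.0 / elapsed_ms)
            elapsed_ms = 0.0
            frames = 0
    return buckets

# A steady ~60 fps trace with a single 50 ms hitch: the raw frame times swing
# wildly at that point, but the per-second averages barely move.
trace = [1000.0 / 60] * 120
trace[30] = 50.0
print([round(fps, 1) for fps in per_second_fps(trace)])
```

The same aggregation is why the average framerate stays trustworthy even when the raw per-present samples look chaotic.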
 