AMD Radeon VII 33-Game Benchmark: "It Makes the RTX 2080 Look Pretty Good"

But, but, but...what about 4k?

;)

Okay, I'm still waiting for Navi. The Radeon VII is an obvious stopgap. I say that as an AMD fan (check the sig: Vega 56 and R9 390, and some very old Nvidia cards).

Holiday Season 2019 will change everything. Feliz NAVI-dad...indeed.
Everything from AMD has been a stopgap since the 290X...
Guess we'll have to see what Navi does with GCN. If it's just minor tweaks and some new features it's going to be disappointing. Needs a major overhaul.
Navi is not based on GCN. AMD is, thankfully, finally putting GCN out to pasture with Navi. By the time Navi finally comes out, AMD will have been on the same microarchitecture for 10 years. Even NVIDIA only milked their highly successful Tesla microarchitecture for four years.
 
Isn't Navi the last generation to be still based on GCN?
Yes, GCN, as far as the wiki says. But GCN was/is modular, or some such. I mean, if the changes are significant enough, what difference does it make?
 
As he said in the video, AMD stated that the card is performing as expected. I would not expect a 3 - 8% increase anytime soon, if at all.

Nah, it is well known that GCN is very underutilized. The Vega 64 LC edition barely competed with the GTX 1080 at launch, and the air-cooled version consistently underperformed; now the air-cooled version gives the GTX 1080 a hard time, and the liquid-cooled edition matches or outperforms an overclocked GTX 1080 more often than not. It's just like the RX 480 being slower than the GTX 1060 at launch and being faster now. Newer drivers, along with more complex games, are able to exploit the parallelism of GCN, which suffers from underutilization, especially in DX11 titles where draw calls are single-threaded on AMD.
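
To illustrate that draw-call point with a toy model: here's a minimal Python sketch (not real graphics code - the call counts, per-call costs, and thread count are all invented for illustration) of why funneling every draw call through one driver thread, DX11-style, bottlenecks the CPU, while DX12/Vulkan-style parallel command-list recording doesn't:

[CODE]
# Toy model of draw-call submission. All numbers are made up for illustration.
import time
from concurrent.futures import ThreadPoolExecutor

DRAW_CALLS = 2000
COST_PER_CALL = 50e-6  # pretend each draw costs 50 us of CPU-side driver work

def record_draw(i):
    time.sleep(COST_PER_CALL)  # stand-in for per-draw validation/translation
    return i

# DX11-style on AMD at the time: every draw funnels through a single thread.
start = time.perf_counter()
for i in range(DRAW_CALLS):
    record_draw(i)
single = time.perf_counter() - start

# DX12/Vulkan-style: workers record command lists in parallel; sleep releases
# the GIL here, so the threads genuinely overlap, like driver worker threads.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=8) as pool:
    list(pool.map(record_draw, range(DRAW_CALLS)))
multi = time.perf_counter() - start

print(f"single-threaded recording: {single * 1000:.0f} ms of CPU time per frame")
print(f"parallel recording:        {multi * 1000:.0f} ms of CPU time per frame")
[/CODE]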
 
A lot of people here are touting the Radeon VII as a productivity card that can also game - which it might very well be, but AMD didn't market it as that. They marketed it as a gaming card and a direct competitor to the RTX 2080. During their reveal they showed benchmarks against the RTX 2080, demoed Devil May Cry 5, brought a The Division 2 dev up on stage, and to top it off bundled the GPU with three $60 games. I believe they had only one slide covering productivity performance, and that's it. While it's a decent gaming card, it overall fell short of the mark when it comes to gaming performance, power consumption, and noise levels compared to the equally priced RTX 2080 and even the GTX 1080 Ti.
 
As he said in the video, AMD stated that the card is performing as expected. I would not expect a 3 - 8% increase anytime soon, if at all.

That statement doesn’t mean there’s not room for improvement. It’s performing as expected upon release, but over time things ‘could’ get better. As I said in my last post, given time and in enough people’s hands, we’ll more than likely see what the card is truly capable of.
 
That statement doesn’t mean there’s not room for improvement. It’s performing as expected upon release, but over time things ‘could’ get better. As I said in my last post, given time and in enough people’s hands, we’ll more than likely see what the card is truly capable of.
AMD's statement implies it is doing as well as it ever will. But if you want to hold your breath waiting on a 3-8% improvement that even the manufacturer does not think is coming, so be it.
3 - 8 percent... so then it would only be 4% slower than a 2080. ;)
Card is dead to me as it probably is for everyone else except for the most diehard AMD fans.
 
AMD could easily eke out another 3-8% performance with driver updates, so all is not lost. The power consumption is worrisome, though; is their 7nm not quite there yet? Dunno.

Considering the inconsistency in some of these benchmarks, I think AMD may have released this card with half-baked drivers. Just look at the Battlefront II versus Battlefield V benchmarks - both running on the same engine. If I were AMD, I would be optimizing for BFV as a priority over something older and less played like Battlefront II. The difference is a spread between -10% vs the 2080 (unoptimized?) and +13% vs the 2080 in BFV.
 
AMD could easily eke out another 3-8% performance with driver updates, so all is not lost. The power consumption is worrisome, though; is their 7nm not quite there yet? Dunno.
From another review, they undervolted it and ended up with 284W @ 1800MHz on the GPU.

The 2080 is what, 350W?
 
Well, if I could get it for 699 I would be happy, but the fact is it sells for close to 899 here, and that pretty much makes it uninteresting; a 2080 sells for only a little more than that and is easier to justify given its place in the performance rankings.
If I could get it for 599 I would buy it tomorrow - or should I say when cards with a better cooler appear, because I would have to run it on air for a while.

I got mine for £649 shipped, which is about the same cost as the cheapest 2080 around me. But the VII comes with three games, all of which I was looking to buy - something I couldn't say for the 2080. All in all, it probably cost me £549 net (£649 minus the £100 I likely would have forked out for the games), so not hugely off what I'd consider the sweet spot.

Also, it does look like Sapphire is coyly hinting at a Nitro+ version, so it may still make sense to wait.
 
A $500 price point and 8GB of HBM would have sold a lot of Radeon VII gaming cards.

With 8GB of HBM2, it would have half the memory bandwidth and would perform worse.
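
The halving follows straight from the stack count: each HBM2 stack is a 1024-bit channel, and the Radeon VII runs four stacks at an effective 2.0 Gbps per pin. A quick back-of-the-envelope sketch using the published pin speed:

[CODE]
# Bandwidth = bus width (bits) x effective pin speed (Gbps) / 8 bits per byte
BITS_PER_STACK = 1024   # each HBM2 stack contributes a 1024-bit channel
PIN_SPEED_GBPS = 2.0    # Radeon VII's published effective pin speed

def bandwidth_gbs(stacks):
    return stacks * BITS_PER_STACK * PIN_SPEED_GBPS / 8

print(bandwidth_gbs(4))  # 16GB / 4 stacks -> 1024.0 GB/s (what shipped)
print(bandwidth_gbs(2))  # 8GB  / 2 stacks ->  512.0 GB/s, i.e. half
[/CODE]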

I mean ... I'm just so heart broken to find this news out ... heart broken I tell you ... lol

You guys do know and understand that AMD abandoned the PC GPU market to focus what few resources they had on the Xbox One and PS4, right? They are way, way behind. That story is all over the internet. They had little in the way of resources and money, and they had to pick one over the other.

This is completely false.

If HBM memory is that expensive and the prices haven't come down enough to be affordable, then AMD should look into using other memory. Nvidia doesn't make extensive use of HBM and they're doing fine. AMD should stop using HBM as an excuse for poor performance per dollar.

AMD had to use HBM2.

Vega (and GCN in general) is too inefficient, and, compared to the GDDR5/X available at the time of release, HBM2 offers the additional memory bandwidth and lower power consumption that Vega needs.

Well, at least it has super-expensive HBM2 that doesn't actually help gaming performance but somehow justifies it having the same ludicrous nosebleed prices as the 2080, right?

It clearly does; otherwise there would be less memory bandwidth, and gaming performance would be worse.
 
I don't know that I totally blame AMD for lagging (somewhat) behind NVIDIA, considering they seem to have been focusing on their CPU business to some extent. Their CPU business, prior to Ryzen, was dead for nearly all intents and purposes, and they likely see more growth coming out of that area, as Intel doesn't have a lot to hit Ryzen/Threadripper with at the moment. They can stave off NVIDIA for a while longer with rehashes of the existing Vega/Polaris architectures while they shore up their CPU position, and then come back later in 2019 and hit NVIDIA with Navi.

Additionally, from what I can tell, the Radeon VII looks like it may do well with the crypto miners that are still left standing.

It sucks because the market has been so broadly competition-starved that we'll take just about anything, but for gaming purposes I would definitely call the Radeon VII the lower end of 'just about anything.'
 
Seriously looking forward to Kyle's review. I would like to know what the clock speeds are in each game, because my guess is that anything that is underperforming is not fully utilizing the card and the clocks do not peak out.
 
A lot of people here are touting the Radeon VII as a productivity card that can also game - which it might very well be, but AMD didn't market it as that. They marketed it as a gaming card and a direct competitor to the RTX 2080. During their reveal they showed benchmarks against the RTX 2080, demoed Devil May Cry 5, brought a The Division 2 dev up on stage, and to top it off bundled the GPU with three $60 games. I believe they had only one slide covering productivity performance, and that's it. While it's a decent gaming card, it overall fell short of the mark when it comes to gaming performance, power consumption, and noise levels compared to the equally priced RTX 2080 and even the GTX 1080 Ti.
You are incorrect on a few points.
 
Well my observations were based on the reveal presentation and launch reviews from various tech websites. I'll be interested to see what HardOCP's review will reveal especially if you have access to updated/newer drivers.
What you stated is not true about AMD and its marketing of the part.
 
How so? I'm assuming we watched the same public release presentation. The majority of the time spent during that presentation on the reveal of the Radeon VII involved gaming elements. Even AMD's website lists the Radeon VII as "THE WORLD'S FIRST 7nm GAMING GPU". The focus has been on its gaming performance, not productivity. I'm not saying they're ignoring productivity.

One observation I'd like to point out about that sentence, "the world's first 7nm gaming GPU": "7nm" comes first and "gaming GPU" comes second. :) If that makes sense.
 
AMD isn't going to come out and say, "Look, a Radeon Pro WX9100 with a die-shrunk GPU for 700 bucks. 'Cause we know you all hated that our 16GB workstation workhorse was only $700 cheaper than the Nvidia alternative, so let's just make the VII 700 bucks." lol

They may not have talked about it as much on stage... but trust me, everyone watching thought, "Oh my, it's a 16GB workstation card for under a grand." I have been asked about it four or five times in the last week by clients who do video. That user base has taken notice. And now AMD has just announced that the Radeon VII can use their pro driver. Game changer for the workstation market. AMD just made every workstation card under $2500 obsolete.
 
From another review, they undervolted it and ended up with 284W @ 1800MHz on the GPU.

The 2080 is what, 350W?
The 2080 goes past 1.8 GHz out of the box and it averages 215W, peaking around 220-230W when gaming. I don't know where you got the 350W number from. Even the 2080 Ti doesn't go past 290W when gaming.
 
The 2080 goes past 1.8 GHz out of the box and it averages 215W, peaking around 220-230W when gaming. I don't know where you got the 350W number from. Even the 2080 Ti doesn't go past 290W when gaming.
System draw was the number I was going by; the 2080 looks like a power-hungry pig now.
 
Seriously looking forward to Kyle's review
I think we all are. It sucks that they probably finished the review using the broken press-kit pre-release drivers, and then the "fixed" public drivers dropped and they have to do some of it all over again.
I bet Kyle and the gang are pulling their hair out. :mad:


System draw was the number I was going by; the 2080 looks like a power-hungry pig now.
Is it a pig? Or is it damn near equivalent to the R7?
 
I think we all are. It sucks that they probably finished the review using the broken press-kit pre-release drivers, and then the "fixed" public drivers dropped and they have to do some of it all over again.
I bet Kyle and the gang are pulling their hair out. :mad:


Is it a pig? Or is it damn near equivalent to the R7?

And do they now test them with the Radeon Pro enterprise software as well? lol, joking
 
I think we all are. It sucks that they probably finished the review using the broken press-kit pre-release drivers, and then the "fixed" public drivers dropped and they have to do some of it all over again.
I bet Kyle and the gang are pulling their hair out. :mad:


Is it a pig? Or is it damn near equivalent to the R7?
Brent has actually not had any real issues with his RVII card. All the issues were on my end, and AMD did warn of known issues on X399 platforms.

 
Steve really is a benchmarking monster. Each game has a minimum frame rate listed, but nothing really stood out.

What happened to Ashes of the Singularity? I thought that was the #1 benchmark for AMD cards just a while back (Async Compute, anybody?)

Now it's not even on the list.

Kidding aside, I think the RVII is pointing things in the right direction for AMD. So it's not a grand slam, but it's got a lot of potential. Definitely turned a few heads in the DX12/4K stuff.
 
Really? They're both within 2-3W of each other stock. I'm sure the 2080 would show similar numbers if it was also undervolted.

Then they should test that. AMD supports undervolting directly in their software, so it's a simple feature every end user has access to - 50W less for the same performance. I'm not sure how the 2080 would perform at lower voltages.

Another positive is that the card is cheaper, has more RAM, and comes with a good three-game bundle. It isn't looking too bad for AMD this round.
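
Rough intuition on why undervolting pays off so well: dynamic power scales roughly with the square of voltage at a fixed clock, so even a small voltage drop cuts power quadratically. A sketch with illustrative voltages (the real stock and stable-undervolt values vary from card to card):

[CODE]
# Dynamic power ~ C * V^2 * f; with the clock (f) held at ~1800 MHz, power
# falls with the square of voltage. Voltages below are illustrative only.
STOCK_POWER_W = 300   # Radeon VII's rated board power
V_STOCK = 1.05        # assumed stock core voltage
V_UNDERVOLT = 0.96    # assumed stable undervolt at the same clock

scaled = STOCK_POWER_W * (V_UNDERVOLT / V_STOCK) ** 2
print(f"~{scaled:.0f} W after undervolt, ~{STOCK_POWER_W - scaled:.0f} W saved")
# Exact figures depend on static power and each card's real voltage curve.
[/CODE]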
 
Considering the inconsistency in some of these benchmarks, I think AMD may have released this card with half-baked drivers. Just look at the Battlefront II versus Battlefield V benchmarks - both running on the same engine. If I were AMD, I would be optimizing for BFV as a priority over something older and less played like Battlefront II. The difference is a spread between -10% vs the 2080 (unoptimized?) and +13% vs the 2080 in BFV.

The BFV optimization argument doesn't hold water, as the boost in BFV was 31% while in Battlefront II it was 32% when compared to the Vega 64. There is no reason why AMD wouldn't have tried to optimize Vega 64 for Battlefront II last year, as it was a big game then.
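
To make that comparison concrete, the uplift in question is just (Radeon VII FPS / Vega 64 FPS - 1). The FPS values below are placeholders chosen to reproduce the quoted percentages, not the review's actual data:

[CODE]
# Relative uplift over Vega 64. FPS numbers are placeholders, not review data.
def uplift(vega64_fps, radeon7_fps):
    return (radeon7_fps / vega64_fps - 1) * 100

# Near-identical uplift in both Frostbite titles means the gap vs the 2080
# reflects per-game vendor bias, not a missing BFV driver optimization.
print(f"BFV:           {uplift(72, 94):.0f}%")  # ~31%
print(f"Battlefront 2: {uplift(69, 91):.0f}%")  # ~32%
[/CODE]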
 
Then they should test that. AMD supports undervolting directly in their software, so it's a simple feature every end user has access to - 50W less for the same performance. I'm not sure how the 2080 would perform at lower voltages.

Another positive is that the card is cheaper, has more RAM, and comes with a good three-game bundle. It isn't looking too bad for AMD this round.

That's quite the strawman. To paraphrase: "Undervolting the R7 probably won't have problems, and undervolting the 2080 probably will. So the AMD card is 50 watts less! Definitely better."
 
I am not even sure I know what "strawman" means. Is it the same as scapegoat? I feel like "strawman" has been overused in the last six months.
 
I am not even sure I know what "strawman" means. Is it the same as scapegoat? I feel like "strawman" has been overused in the last six months.

"an intentionally misrepresented proposition that is set up because it is easier to defeat than an opponent's real argument."

In other words, his best argument was inventing a theory that the R7 could undervolt and save 50W where the 2080 could not, and that this therefore made the R7 better. It might be true, of course, but he had no evidence. He just made something up and used it to justify his position.
 
The Vega 56/64 bottleneck, from the reviews I read, was bandwidth.

The Radeon VII speeds up Vega by giving it more bandwidth, removing the bottleneck and eking out performance that was in Vega but couldn't be harnessed, in addition to it just being faster due to the smaller process.

The Achilles' heel is that it also has only 60 compute units rather than the 64 which is the max for the architecture (perhaps to improve yields?).

If this were a full 64 compute units, it would be much closer to, if not better than, the 2080.
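
For scale, here's roughly what those four missing CUs cost in raw shader throughput, assuming equal clocks (a sketch; it ignores bandwidth and real-world scaling entirely):

[CODE]
# GCN: 64 shaders per CU; peak FP32 = shaders * 2 ops/clock * clock (GHz)
SHADERS_PER_CU = 64
CLOCK_GHZ = 1.75  # roughly the Radeon VII's boost clock

def tflops(cus):
    return cus * SHADERS_PER_CU * 2 * CLOCK_GHZ / 1000

print(f"60 CU: {tflops(60):.1f} TFLOPS")   # ~13.4 (as shipped)
print(f"64 CU: {tflops(64):.1f} TFLOPS, a +{(64 / 60 - 1) * 100:.1f}% ceiling")
[/CODE]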

From the reviews, the drivers at launch are apparently god-awful as well. Clearly this was rushed out the door. Hopefully there will be some pickup with drivers, but how much is hard to say.

Nevertheless, I'll be waiting for Navi.
 
Again, this card was intended just to use up non-optimal chips from the Vega 20 line. It was never planned or designed to be a top-tier 'gaming' GPU.
It was also never intended to make a profit... just to reduce what would otherwise be a greater inventory loss on otherwise unsalable product.

As a scientific or production card it is exceptional - that is the true market for this card.

AMD was crawling, near bankruptcy, for several years; now their CPUs are letting them walk briskly again. Top-performance GPUs will come... perhaps with Navi, but only after they're prepared to start a real footrace with nVidia.

/an RTX-killing product means a price war... which AMD isn't ready for yet
//patience
 
Let me spare everyone the suspense.

The Radeon VII was a lucky happenstance for AMD, as they originally didn't have plans to release a "high end" card. They found that something they intended for content creators could get "close enough" in gaming, so they released this card.

Navi is NOT going to be the nVidia killer either - this is the Polaris replacement, although, if the rumors of it being better than a Vega 64 at $250 or cheaper end up being true, that in and of itself would make it a damn good steal at that price point.


If you want something to contend with nVidia, it is going to be the NextGen, which likely will not hit until late 2020 at the earliest - and, based on recent reports, likely not until 2021. It will also be the first post-GCN architecture, and developing a new architecture takes time.

That's the roadmap, folks. Best to keep your expectations in check. Although even if it's slower than a Radeon VII and an RTX 2080, if Navi can deliver on those rumors of better-than-Vega performance at less than $250, I'll finally have not only a reason but a means to convince my wife to let me finally upgrade my Radeon 390...
 
Nah, it is well known that GCN is very underutilized. The Vega 64 LC edition barely competed with the GTX 1080 at launch, and the air-cooled version consistently underperformed; now the air-cooled version gives the GTX 1080 a hard time, and the liquid-cooled edition matches or outperforms an overclocked GTX 1080 more often than not. It's just like the RX 480 being slower than the GTX 1060 at launch and being faster now. Newer drivers, along with more complex games, are able to exploit the parallelism of GCN, which suffers from underutilization, especially in DX11 titles where draw calls are single-threaded on AMD.

AMD/Raja/whoever very stupidly insisted that Fury's bus width be cut in half. This badly starved Vega 64 of memory bandwidth. Why else, with the cores being essentially the same between Fury and Vega 64, would a roughly 50% increase in core clock lead to only a roughly 25% boost in performance?
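
The spec-sheet numbers bear this out: halving the bus left Vega 64 with slightly less bandwidth than Fury X despite a much higher core clock (figures from the public specs; treat this as a sketch):

[CODE]
# Bandwidth = bus width (bits) * effective pin speed (Gbps) / 8
def bw_gbs(bus_bits, pin_gbps):
    return bus_bits * pin_gbps / 8

fury_x = bw_gbs(4096, 1.0)     # four HBM1 stacks -> 512 GB/s
vega64 = bw_gbs(2048, 1.89)    # two HBM2 stacks  -> ~484 GB/s

print(f"Fury X:  {fury_x:.0f} GB/s at ~1050 MHz core")
print(f"Vega 64: {vega64:.0f} GB/s at ~1546 MHz core (+47% clock, -5% bandwidth)")
[/CODE]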

The 480 was really only slower than the 1060 in Nvidia-biased games and/or when using the low-power settings. Overall the two cards were fairly equal at launch, with the 480 using a lot more electricity but being so much cheaper that you would need to game daily for multiple years before reaching the break-even point.
 
Navi is NOT going to be the nVidia killer either - this is the Polaris replacement, although, if the rumors of it being better than a Vega 64 at $250 or cheaper end up being true, that in and of itself would make it a damn good steal at that price point.

That's false.

Navi is intended to replace Vega.

Also, the only reason AMD would price it that low is if NVIDIA releases a next generation that offers GeForce RTX 2070 performance for $300.
 