AMD Radeon VII 33-Game Benchmark: "It Makes the RTX 2080 Look Pretty Good"

I'm more interested in minimum frames and frame times. What difference does it make if you can do 150+ fps if you have to endure stutter?

Disclaimer: These aren't cards I'll ever buy anyhow as I made a promise to myself years ago that I would never spend more than $500 CDN on any video card.
 
This makes no sense. AMD knew what the benchmarks would show. This card would have been universally celebrated at $500; it would have replaced my 980 Ti. Looks like I will be waiting another year.
 
I've heard they're sold out, which doesn't mean much of anything other than that some people probably need 16GB of VRAM. I think the reason AMD didn't price it lower is because it is essentially a workstation card that has been repackaged for gaming. If the price gets too low, people who need a workstation card will buy the cheaper Radeon VII instead. That may be the reason AMD doesn't seem to allow overclocking.

Obviously an act of desperation from AMD to make these cards, but that's been AMD's problem: they're afraid to price their products competitively, unlike Ryzen.
 
Ever since AMD bought out ATI, they have literally run that division into the ground. FFS
 
Rumor has it AMD is just breaking even on these cards at $700, which is why there are no plans from AIB partners. There's no money in it after paying $350-400 just for the HBM memory in it.
 
Well, we shall see how it shakes out in a month or so; give the drivers a chance to mature a little for the new card, and the overclocking tools time to be fixed for the new design. The card is definitely not a home run, but a solid offering to keep AMD in the high end. Imma stick with my V64: great performance when under water and overclocked, within ~5% of the stock-clocked VII, but I want to see what this card can do overclocked and under water.

I'm still not sure just how much maturing these drivers should need. It's not like this is a new architecture; only the process is new. Fury became Vega 64 after getting a significant base clock boost and having its memory bus width halved. Now VII is Vega 64 with a further base clock increase, roughly double the memory bandwidth (by returning to a 4096-bit bus), and 1/16 fewer cores.
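The bandwidth claim is easy to sanity-check. Here is a minimal sketch; the bus widths and per-pin rates are commonly cited launch specs I'm assuming from public spec sheets, not figures from this thread:

```python
# Peak memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps).
# Assumed launch specs: Fury X: 4096-bit HBM @ 1.0 Gbps;
# Vega 64: 2048-bit HBM2 @ 1.89 Gbps; Radeon VII: 4096-bit HBM2 @ 2.0 Gbps.
def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits / 8 * gbps_per_pin

fury_x = peak_bandwidth_gbs(4096, 1.0)      # 512.0 GB/s
vega_64 = peak_bandwidth_gbs(2048, 1.89)    # ~484 GB/s
radeon_vii = peak_bandwidth_gbs(4096, 2.0)  # 1024.0 GB/s
print(fury_x, vega_64, radeon_vii)
```

If those specs hold, the "double the bandwidth" part checks out: roughly 484 GB/s on Vega 64 versus 1024 GB/s on the VII.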
 
AMD should have just slapped a vapo-chill fridge unit on the thing and sold it for $2000. Why try to compete with nvidia on perf or price, when they were going to lose both? They knew they were "lipsticking a pig", so to speak.
 
These results... not inspiring.
Those performance deltas relative to the 2080/1080 Ti are close to overclocked 2070 territory. Then again, AMD has made it fairly clear that the Radeon VII is meant to be a content-creation GPU just as much as a gaming one (more than gaming, even, depending on how you read AMD's statements and the VII's relation to the identical-except-for-drivers Instinct MI50). So maybe we're the fools for expecting a high-end AMD gaming GPU when they have said repeatedly that they're not actively targeting that segment again until 2020.
 
This is a 16GB card at $700. Amazing for content creators who'd have to fork over $2k+ to go above 12GB. It is also a 3.5 TFLOPS FP64 card at $700. Amazing for double-precision FP applications. It just so happens it can also play current games almost as well as a 2080 (it might or might not play future games better). For gamers it is indeed rather "meh" or "overpriced"; for some other types of users it is groundbreaking and a "steal".
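The 3.5 TFLOPS FP64 figure can be reproduced from the card's shader count and clocks. A sketch, assuming the commonly cited specs (3840 stream processors, ~1.8 GHz boost, FP64 at 1/4 the FP32 rate, two FMA ops per clock per shader):

```python
# TFLOPS = shaders * 2 ops/clock (FMA) * clock (GHz) * precision ratio / 1000.
# Assumed specs: 3840 stream processors, ~1.8 GHz boost, FP64 at 1:4 of FP32.
def tflops(shaders: int, clock_ghz: float, ratio: float = 1.0) -> float:
    return shaders * 2 * clock_ghz * ratio / 1000

fp32 = tflops(3840, 1.8)              # ~13.8 TFLOPS single precision
fp64 = tflops(3840, 1.8, ratio=0.25)  # ~3.5 TFLOPS double precision
```

The 1:4 FP64 rate is the part that makes this card unusual at $700; consumer GeForce parts typically run FP64 at 1:32.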
This. Some PC gamers, with their narrow mindset, fail to grasp the reason behind the price and keep forgetting the Radeon VII is very much an all-round GPU compared to a dedicated gaming GPU like the 2080. In that case, let those PC gamers enjoy their games built on outdated APIs, purposely held back from properly evolving, while the future switches to modern APIs.
 


Steve really is a benchmarking monster. Each game has a min frame rate listed, but nothing really stood out.

Radeon 7 really takes a beating in esports titles, especially at 1080p. Really, though, the difference is 200 fps vs 150 fps, which makes the 25% gap not as damaging. It is sort of similar to the weaknesses Ryzen had.
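Translating those fps deltas into frame times helps explain why the gap feels smaller than 25%; a quick sketch using the round numbers above:

```python
# A 200 fps -> 150 fps drop is a 25% fps deficit, but in frame-time terms
# it is only ~1.7 ms per frame (6.67 ms vs 5.00 ms).
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

fps_deficit = (200 - 150) / 200                     # 0.25 -> 25% fewer fps
extra_ms = frame_time_ms(150) - frame_time_ms(200)  # ~1.67 ms longer per frame
```

The same percentage gap at low frame rates (say 40 vs 30 fps) costs over 8 ms per frame, which is why high-refresh esports deficits hurt less than the headline number suggests.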

Give credit where credit is due. Nvidia is now doing awesome in Vulkan, where AMD was already great.
 
I tried real hard to go all "red" .. but bang for the buck (I'm talking power bill buck as well) .. I have to go green .. for now (I'll hang on to my 1080[non Ti] for now)
 

The 1080 Ti would have to be right up there among the best GPUs ever made. It's either that, or almost everything else for the past couple of years has been very average by comparison.
 

Kinda funny how things go sometimes. I remember people being in an uproar over the price of the 1080 ti when it came out. Now it looks like a great deal.
 
It's either that or almost everything else for the past couple of years has been very average by comparison.
A little of both, perhaps? Look at the 780 Ti vs 970 and the 980 Ti vs 1070 on perf/$, and then the 1080 Ti vs 2070.
 

And yet it’s not that exciting as a compute card either.

Toms Hardware Radeon 7 review:
https://www.tomshardware.com/reviews/amd-radeon-vii-vega-20-7nm,5977-4.html




Too little too late for $700. With bad drivers to boot.

A $500 price point and 8GB of HBM would have sold a lot of Radeon 7 gaming cards.
 
I mean ... I'm just so heart broken to find this news out ... heart broken I tell you ... lol

You guys do know and understand that AMD abandoned the PC GPU market to focus what little resources they had on the Xbox One and PS4, right? They are way, way behind. That story is all over the internet. They had few resources and little money, and they had to pick one over the other.
 
Rumor has it AMD is just breaking even on these cards at $700.
You can do the math on the components... they are making money. Do you really think the shareholders are gonna be OK with making no money?
 
There's no money in it after paying $350-400 just for the HBM RAM in it.
If HBM memory is that expensive and the prices haven't gone down enough to be affordable then AMD should look into using other memory. Nvidia doesn't make extensive use of HBM and they're doing fine. AMD should stop using HBM as an excuse for poor performance per dollar.
 
And yet it’s not that exciting as a compute card either.
Radeon VII beat the 2080 as intended, notably on catia-05, energy-02, and sw-07. Overall, the two are evenly matched.

Interestingly enough, the difference on the bamboo scene is minimal, while the coffee scene suggests a lack of optimization on the driver side for AMD's OpenCL.
Part of the team is focusing on open-sourcing their OpenCL implementation via ROCm.

The same driver and implementation issues show up on the Blender Cycles side; a matter of fixes.
Interestingly, from the same link, Radeon VII bested both the Titan class and the RTX 2080 Ti on LuxBall HDR while still outperforming the RTX 2080 on the overall LuxMark 3.1 suite, suggesting more room for improvement.

Too little too late for $700. With bad drivers to boot.
Does that mean the similar RTX 2080, with a launch driver, came too little and too late for nearly $100 more?

$500 pricepoint and 8GB of HBM would have sold a lot of Radeon 7 gaming cards.
The 16GB Radeon VII already sold out with a launch driver, according to some news. Asking for less would lead to yet another complaint about a lack of juice compared to the RTX 2080.
What those users are asking for is practically unreasonable.

Note that the methodology from Tom's Hardware failed to mention the drivers used for the test. It seems the reviewer compared a more up-to-date Nvidia driver with the launch driver for the Radeon VII.
Overall, for the price/performance and the incoming driver improvements, the Radeon VII as an all-around GPU is a bargain.
 

Vega VII was essentially cannibalized from their workstation cards in order to have something to compete with nVidia at the mid-high end. The fact that it's sold out doesn't mean people are snapping them up in droves, which I highly doubt they are. It simply means that there is a lack of supply, which ties in to all the other rumors that Vega VII supply is extremely limited.

The RTX 2080 should have been $500. It offers essentially nothing over the GTX 1080 Ti, which launched at $700 two years ago. The fact that nVidia can get away with this is because AMD has nothing to compete with, and when AMD does put out something, it's this crap. The tech itself isn't junk, don't get me wrong, but the price/performance of this generation of graphics cards is beyond abysmal.
 
Good to hear the driver bugs seem to be remedied, though with some performance loss (not much).
  • He did not mention whether OCing now works, even though he didn't test it.
  • Looking at his charts, did he test all those video cards in the same system with current AMD drivers as well as Nvidia's, or did he use old results? That is not clear to me and really seems impossible for all those cards; does anyone know?
Man, I am dying to hear from Brent! Want to know if there are texture artifacts, glitches, rendering issues, how smooth actual gameplay is, abnormal fan variations, and the list goes on, things none of the reviews I've plowed through would even consider. Plus any real OCing results; the Nvidia tests were with an OC card, so being 5%-7% slower overall is most likely just a wash between the cards. Max OC on both would be nice to know.

It is very obvious that AMD once again put a subpar cooling solution on their card. At the least they could have offered a liquid-cooled edition for $100 more, or encouraged AIBs to build a better cooling solution. It may be just a token card to say they are in the race, with no real long-term commitment or large number of cards. If one has a Vega 64, 64 LC, or 1080, I don't think either the 2080 or Vega VII is a worthwhile upgrade. For a 1080 Ti owner looking for a worthy upgrade at the same price (the used-to-be norm of things), the Vega VII and 2080 would be pointless. The 2080 Ti, with its abysmal failure rate and pricing, is the only worthwhile performance upgrade, except maybe for a 1080 Ti owner. For me the 2080 Ti's pricing, lack of use of the hardware, and crap reliability are a bad joke; the Turing Titan is even more of a joke. 2019 in a nutshell just sucks for the enthusiast, with nothing obvious coming later.

As for the notion that Vega VII is for content creators, I would like to see case studies that even remotely show this to be valid. Anything CUDA-based, such as VRay and other programs, would prove Nvidia the better content card, while OpenCL programs favor AMD. AMD previously went that route with the Vega FE: a content-creator card, but gaming as well. The newest Pro drivers (19.Q1) for the Vega FE went backwards, with the gaming drivers stuck at 18.8.1. Thanks, AMD, for the support for those that support you, NOT! As a side note, one can go through some acrobatics and load the 19.2.1 drivers unofficially/unsupported, with, of course, issues to work around.

Looks like 2020, probably late 2020, for a real worthwhile upgrade, and maybe, though I doubt it, it might be from Intel.


What method are you using to load 19.2.1?
 
With the only plausible justification a consumer has for the price being 4K editing, why not market the card as part of their Radeon Pro series? Sell it as a content creator's card that can also game. Hell, they could probably raise the price. It just doesn't make any sense to me.
 
I'm more interested in minimum frames and frame times. What difference does it make if you can do 150 fps+ if you have to endure any stutter.
I was going to post the same, as I've been reading that the Vega VII delivers more stable frame rates.
 
I do believe people are snapping them up... they just aren't gamers. lol

The Radeon VII is basically a Radeon Pro WX 9100 with a refreshed 7nm GPU... for a third of the price. Ya, no shit they're sold out instantly... I just don't imagine many of them went to gamer-first folk.
 
You can do the math on the compnents....they are making money. Do you really think the shareholders are gonna be ok making no money?
Except the margins aren't something AMD (or any company) is really transparent about. So they hide the peas in the mashed potatoes and obfuscate their money losers with "hey, look at our CPU sales".

They've been break-even or worse on the chips supplied to the consoles for years, believing they'd make it up long-term through PR value or some other benefit from Sony and MS being reliant on them. But those companies only go with whoever gives them components cheapest, and Nvidia, with the faster chips, hasn't felt the need to give them away to MS/Sony.
 
Looks like 2020, probably late 2020 for a real worthwhile upgrade and maybe, I doubt it, it might be from Intel.
Make that 2021; it takes three years, and AMD started in 2018.

For people who do not understand what Radeon 7 is: it is a die shrink of 14nm Vega. You can do the math on the performance boost the process alone can deliver, but in no way, shape, or form was anything done to improve the design itself. 14nm had problems with power, and on 7nm you can see it still does, just to a lesser degree than the 14nm version.

It is sad that people in the business of reviewing products still do not understand that certain things do not happen just because you want them to, given the technical limitations of a die shrink. To claim to expect performance X, Y, or Z from a die-shrink product (without showing the math for why it should deliver that) and then project it onto the product is simply an unfair way of reviewing.

Way before any of the reviews came out, Fudzilla.com reported that Radeon 7 costs about $650 to make. That makes complaints about the price of this hardware simply misguided. You cannot sell a card for less than it costs to make. To project that AMD is just as greedy as Nvidia is also far from the truth: AMD sells you top-end hardware that Nvidia tends to reserve for the professional market.

People who tell you this card is a failure from a performance point of view because the RTX 2080 is faster are missing the point. You get something far more valuable than just a framerate win: hardware without promises that Nvidia does not intend to keep (ray tracing and DLSS). Nvidia is the champion of implementing software features that cripple competitors' cards, yet at the same time the promised features (which have to be implemented and optimized in software) are nowhere to be seen.
 
You can do the math on the compnents....they are making money. Do you really think the shareholders are gonna be ok making no money?

Component cost does not equal full product cost. Workers, packaging, shipping, R&D, any licensing fees involved with the tech, whatever TSMC charges them per wafer (a cost no one really knows at this time), and other misc costs all add up. AMD wasn't exactly making a ton of money off Vega either. Even if AMD isn't losing money on each VII, I rather doubt they're making more than a slim profit.

If HBM memory is that expensive and the prices haven't gone down enough to be affordable then AMD should look into using other memory.

Wouldn't be surprised if Navi is GDDR5X or GDDR6, but switching from HBM really wasn't an option they had here. It would have required respinning the chip in order to change the memory controller. 16GB of GDDR6 would have been a lot cheaper than HBM2, but it would not have offered the same memory bandwidth, which might have caused problems. Not to mention the sheer cost involved in respinning the chip and making major changes like that. For something that really just seems like a stop-gap card, spending that kind of time and money probably just isn't worth it.
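The bandwidth shortfall can be put in numbers. A sketch, assuming typical 2019 figures (14 Gbps GDDR6 and a 256-bit bus are my assumptions for a hypothetical GDDR6 variant; the 4096-bit HBM2 at 2 Gbps per pin is the commonly cited VII spec):

```python
# Peak bandwidth in GB/s = bus width (bits) * per-pin rate (Gbps) / 8.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

gddr6_256bit = bandwidth_gbs(256, 14.0)     # 448.0 GB/s
gddr6_384bit = bandwidth_gbs(384, 14.0)     # 672.0 GB/s
hbm2_radeon_vii = bandwidth_gbs(4096, 2.0)  # 1024.0 GB/s
```

Even a wide 384-bit GDDR6 bus would reach only about two-thirds of the HBM2 setup's bandwidth, which supports the point that swapping memory types was not a drop-in change for this chip.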
 
As for the notion that Vega VII is for content creators, I would like to see case studies that even remotely shows this to be valid.

Just want to point out that many content creators like GN (not sure which specifically) have already confirmed that with Adobe (for example) and video editing, there are many, many jobs that crash on cards with less VRAM than the Radeon VII, and the card doesn't show any real signs of scaling poorly anyway; crashing on a project output wastes far more time than any speed-up gained elsewhere. So yes, for these people it is a steal. But let's be real, this is a very small number of people working with either extremely complex 1080p or above-average 4K workflows. It's not a real big win, but to be honest, I think that kind of person is what they intended it for: a low-volume niche market. It seems they have not even made many, so that seems probable.
 
If HBM memory is that expensive and the prices haven't gone down enough to be affordable then AMD should look into using other memory.
They could if they wanted to. It would draw too much power and probably would cripple performance if they used GDDR5X or GDDR6.
 