AMD Radeon VII Benchmarks Leak Out

AlphaAtlas

Citing the Twitter dataminers APISAK and Komachi, Videocardz spotted some Radeon VII benchmarks that leaked out ahead of the card's official launch. The card appears to have achieved a graphics score of 6688 in Fire Strike's Ultra preset, which beats out a factory overclocked RTX 2080's score of 6430, and Videocardz says the Radeon VII appears to be faster in the Performance preset as well. Meanwhile, the VII appears to struggle with Final Fantasy XV's built-in benchmark. It only slightly edges out an RTX 2070 in the "standard quality" benchmark, while falling between the GTX 1070 Ti and the aging 980 Ti with the "high quality" preset.

AMD Radeon VII board partner models: There will be no custom models at launch. Our sources tell us to expect such designs no sooner than Computex (late May). So far we have a few reference-based models with their own branded packaging and sometimes even custom stickers.
 
I wasn't expecting custom models at all (figured just a filler product until high-end Navi iteration), so that's cool. Sapphire Nitro VII would get my attention.
 
I am surprised FF is used for the benchmark runs at all, considering how Nvidia-biased it is (BS tuning and the way the benchmark runs). That being said, if the Radeon VII got that close to the expected Nvidia GPU performance tier being used, it shows the R7 is actually quite the beast.

I can imagine that once AMD launches Navi, the drivers used for the R7 and Navi are likely to have their very own niche.
 
That's 1080 Ti level on Firestrike Ultra. I got >6800 with my Titan X out of the box. 1080 Ti performance at original 1080 Ti price 2 years later.
 
Friendly or not towards AMD, those scores hurt... Only a week and change till the real reviews start coming out, so we just have to be patient at this point. I have to assume that any "leaked" benchmarks are fodder and click-bait to drive up views as the release gets near.
 
The card appears to have achieved a graphics score of 6688 in Fire Strike's Ultra preset, which beats out a factory overclocked RTX 2080's score of 6430

Not that 3D Mark is relative to how it's going to perform in games, nor is the FF XV benchmark an indication since it's heavily nvidia biased; but it seems like it could be promising. We'll know pretty soon either way. I'm looking forward to it either way.
 
I thought the Final Fantasy XV benchmark was generally pooh-poohed on most websites? Or was that strictly from an RTX card perspective?
 
Friendly or not towards AMD, those scores hurt... Only a week and change till the real reviews start coming out, so we just have to be patient at this point. I have to assume that any "leaked" benchmarks are fodder and click-bait to drive up views as the release gets near.

Even for Nvidia cards, FF15 is a garbage benchmark. It really has zero consistency and no one that actually cares about accuracy should ever use it in their reviews.
 
Ask that again in 5-10 years when the majority of games use real ray tracing. Nvidia's hybrid approach exists in one game with an abysmal implementation.

...and is supported in every API and every major engine and is in the pipe for dozens of games. And in five to ten years, we're likely to still be running a 'hybrid' approach. Rasterization isn't going away.
 
Ask that again in 5-10 years when the majority of games use real ray tracing. Nvidia's hybrid approach exists in one game with an abysmal implementation.
It won't be that long. Most of the RTX features are already part of the DX12 spec or are being implemented in later DX12 revisions; I would say it will be three years at most until they are in both DX12 and Vulkan as open specs. And by that time we should have hardware on the market that can actually handle it without shitting the bed.
 
Ask that again in 5-10 years when the majority of games use real ray tracing. Nvidia's hybrid approach exists in one game with an abysmal implementation.

And by then AMD will have an open-source version... or it'll be like PhysX, baked into game code as software...

That's 1080 Ti level on Firestrike Ultra. I got >6800 with my Titan X out of the box. 1080 Ti performance at original 1080 Ti price 2 years later.

I have no idea why people keep saying this: "Oh look, an RTX 2080 has 1080 Ti performance at twice the price, but RTX will really matter by the time the card is up for replacement."
 
Cards lowering FPS to zero because they don't even have ray tracing are worth how much?

;)

I forgot about all the games that won't launch because of missing ray tracing on the video card. Time for AMD to pack up and go home.

Seriously though. Real-time ray tracing in its current form is literally worthless. While it is good that Nvidia got the wheels turning, and it's going to change the gaming industry (and other industries) when it's ready and working well, it's just not practical or useful currently unless devs can get the 1% and 0.1% frame drops and inconsistencies under control.
 
I have no idea why people keep saying this: "Oh look, an RTX 2080 has 1080 Ti performance at twice the price, but RTX will really matter by the time the card is up for replacement."

Uhh, you can get an RTX 2080 for $720 at the moment, which puts the Radeon VII in the same price category, and that's how much 1080 Tis were new...
 
And by then AMD will have an open-source version... or it'll be like PhysX, baked into game code as software...



I have no idea why people keep saying this: "Oh look, an RTX 2080 has 1080 Ti performance at twice the price, but RTX will really matter by the time the card is up for replacement."
Yeah don't get it either.
 
...and is supported in every API and every major engine and is in the pipe for dozens of games. And in five to ten years, we're likely to still be running a 'hybrid' approach. Rasterization isn't going away.

Doesn't matter how many engines support it unless developers are willing to implement it. Also, there are 11 games announced with RTX ray-tracing support. That isn't even a dozen, much less multiple dozens. Of those 11, one has actually followed through on that support. One game. In four months. I seriously doubt Metro is going to launch with ray-tracing. Shadow of the Tomb Raider may never add it. No sign of it yet in Assetto Corsa, but perhaps by the time it's out of early access. Anthem is at a "maybe we'll do it" state, likely heavily dependent on whether or not the game bombs.
 
Uhh you can get a RTX 2080 for $720 at the moment, which makes the Radeon VII in the same price category and that's how much 1080 Ti were new....
So it will be an excellent deal if it beats the 2080, even by a narrow margin.
The 1080 Ti seems super overpriced in my Google searches.
 
So it will be an excellent deal if it beats the 2080, even by a narrow margin.
The 1080 Ti seems super overpriced in my Google searches.

They are; used 1080 Tis are going for $550-700 at the moment. I would love 1080 Ti performance to come down to the $400-500 bracket!
 
I wanted JHH to be wrong, but for the price the performance is indeed lousy. XD
 
Doesn't matter how many engines support it unless developers are willing to implement it. Also, there are 11 games announced with RTX ray-tracing support. That isn't even a dozen, much less multiple dozens. Of those 11, one has actually followed through on that support. One game. In four months. I seriously doubt Metro is going to launch with ray-tracing. Shadow of the Tomb Raider may never add it. No sign of it yet in Assetto Corsa, but perhaps by the time it's out of early access. Anthem is at a "maybe we'll do it" state, likely heavily dependent on whether or not the game bombs.

I'm pretty sure I read that Metro Exodus will have Ray Tracing on the 25th of February. But the game launches on the 15th.
 
So it will be an excellent deal if it beats the 2080, even by a narrow margin.
The 1080 Ti seems super overpriced in my Google searches.
Let's not forget Radeon VII will come with 16GB of VRAM. This is where the RTX 2080 will struggle with only 8GB. I see 7-7.5GB of VRAM usage on my Vega 64 now in 4K. Upcoming titles will need more VRAM for 4K. The RTX 2080 is looking like Nvidia's Fury.
 
The only thing I am really interested in for this card is what the 16GB of HBM does for performance at 4K and VR versus the 2080. I think we all pretty much realize this card was released because Navi wasn't ready.
 
I'm pretty sure I read that Metro Exodus will have Ray Tracing on the 25th of February. But the game launches on the 15th.

Can't find anything saying that, but if that's true it's not too bad. It makes it weird that they didn't highlight Metro during their CES presentation, though. Metro was probably the most impressive thing they had to show from the launch announcement; weird that they wouldn't show it off again if the effects really were that close to release.
 
Those of you who have seen my past posts about NVIDIA know that I don't like NVIDIA very much, and for sure I believe that the RTX series is an overpriced clusterfuck, its only saving grace being that it's not worse than the previous generation, yet except for the 2080 Ti and TITAN RTX, not any better either. As such, I believe that right now the best value is the GTX 1080 Ti, especially when you can find one for $600 or less brand new. Or get two of them in SLI and sure as hell they will stomp an RTX 2080 Ti. I'm just laying this out to show how hopeless the Radeon VII is.

The Radeon VII is the Radeon Instinct MI50 on a slightly different PCB, with a different cooler and no PCIe 4.0 support. AMD either has an oversupply of MI50 chips or they are trying to pull a PR stunt to seem competitive. Either way, they are not even breaking even on a GPU that is not really competitive with NVIDIA's gaming GPUs. I hope I'm wrong. As much as I hate Jensen's behavior, I believe that he was laughing on the inside about how little effort AMD put into this new product.

I was excited for about 30 seconds until I saw what it really was. I'm not buying one, that's for sure.

AMD is making good progress with Ryzen; however, it's like they are letting their GPU division die a slow death. And it's ironic and sad, because AMD was in trouble for so long precisely because they bought ATI 13 years ago, and overpaid for it too. I don't think that overpriced GPUs like the ones from NVIDIA have a bright future with big, expensive-to-produce chips at their heart, so AMD's angle would be to offer a solid midrange that is priced competitively. The big monolithic designs will come to an end sooner or later, and AMD has the expertise now to create effective MCM designs. Maybe we'll see something similar to Threadripper with their future GPUs as well. Or maybe they will tank their GPU division for good. One can only hope for the better.
 
The last conventional wisdom I heard was that this was a way to use up "defective" leftover chips that couldn't be used in the $2k+ machine learning card, and that it was never really intended for gamers.
Also, the cost of that 16GB of HBM2 memory means that they're losing money on every card they sell at $700.

It's still probably ideal for machine learning or CAD/engineering/scientific apps; PC gaming, probably not so much.
 
Let's not forget Radeon VII will come with 16GB of VRAM. This is where the RTX 2080 will struggle with only 8GB. I see 7-7.5GB of VRAM usage on my Vega 64 now in 4K. Upcoming titles will need more VRAM for 4K. The RTX 2080 is looking like Nvidia's Fury.
Those bolded words are doing a lot of work.
 
AMD is making good progress with Ryzen; however, it's like they are letting their GPU division die a slow death. And it's ironic and sad, because AMD was in trouble for so long precisely because they bought ATI 13 years ago, and overpaid for it too. I don't think that overpriced GPUs like the ones from NVIDIA have a bright future with big, expensive-to-produce chips at their heart, so AMD's angle would be to offer a solid midrange that is priced competitively. The big monolithic designs will come to an end sooner or later, and AMD has the expertise now to create effective MCM designs. Maybe we'll see something similar to Threadripper with their future GPUs as well. Or maybe they will tank their GPU division for good. One can only hope for the better.

Give them and Navi (Zen 3) time...
I think everyone forgets how badly in debt AMD was 2-3 years ago, when they had to sell off their own manufacturing facilities in order to survive. Yes, their #1 objective since then has been to make a profit, which Zen 1 & 2 have done... at the cost of focusing their efforts on their higher-profit product lines (like machine learning). The success of Ryzen, Threadripper, and Epyc saved AMD from extinction, but I think at this point they are still paying off debt rather than dumping all of that fresh cash into R&D. High-end GPUs for the enthusiast market are probably not a critical focus for them; there's probably not enough market or potential profit there to be worth throwing all of their limited eggs in that basket and incurring a likely price war with Nvidia. Zen 3 + chiplets could allow AMD to cautiously set out to control the lower/mid GPU market later this year on performance/price first... then an RTX killer later in 2020.

AMD is playing the long game now, and I think it's already paying off.
/shrug
 