AMD Radeon VII Benchmarks Leak Out

My V64 Liquid, with HBM at 1180 MHz, will do 6500 in Fire Strike Ultra.

Thanks. Given that reference point, I find it very difficult to believe that they only eked out a 1.5% improvement from the 7nm Radeon VII with HBM2 relative to a 14nm Vega 64 LC with HBM2. I'm not in a place where I can do a 1-to-1 comparison of stock specs, but at a glance it just doesn't add up.
 
I wanted JHH to be wrong, but for the price the performance is indeed lousy. XD

If the numbers in this are legit, it's slightly faster than the 2080 for a lower price with double the memory. The only thing that will save the 2080 from the VII will be DLSS if that ever gets implemented in games. Can I have some of what you and JHH are smoking? Asking for a friend :p
 

I suspect this might be a case of the 2080 and VII trading blows depending on game.
 
I'll be grabbing a Radeon VII on release. I really detest Nvidia as a company (L...O....N....G history of anti-consumer / anti-competitive behavior) and only go team green when AMD is way behind the curve (as was the case in 2016 when I got my 1070). Also, the widespread quality-control issues with the RTX cards put the nail in the coffin this time.

I generally upgrade my video card roughly every 3 years, and I'll wager that the 16 GB of VRAM on the Radeon VII proves more useful overall than ray tracing and DLSS on Nvidia's RTX cards. I could wait until Navi this summer, but my son's 4 GB RX 480 is really struggling at 1440p. My 1070 will be a nice bump for him and the Radeon VII will be a good bump for me as well.
 
My 1070 will be a nice bump for him and the Radeon VII will be a good bump for me as well.

Why not upgrade your son to a Radeon VII and you keep your 1070? I'm of the opinion that unless it's a tool that I need for work, our kids deserve the best. They need the best things in life when they are young and have time to enjoy them, not when they are adults and faced with all of life's problems. Just think about it :)
 
I remember my cousin and my friend and I were always gaming it up every chance we had: lugging desktops, CRTs, keyboards, mice, and routers to each other's houses. Our hardware was always outdated; hand-me-downs or stuff we could afford with what little money we had. Often our computers barely ran the games. When we could afford new hardware, it was amazing seeing our favorite games go from barely playable to running smoothly (we would typically skip several generations of hardware).

IT WAS AWESOME.
Getting the greatest and best hardware for your kid is not gonna bring them happiness. It might make them spoiled, not sure :p. Start them off on some basic computer. Give them an allowance they have to work for and have them get their own hardware with your recommendations and help.
 
I suspect this might be a case of the 2080 and VII trading blows depending on game.
Lisa Su said as much: "we win some, we lose some." But overall, this product is going to be competitive.

Some people would only be happy if AMD makes a card that is 2x the performance of Nvidia's top card at half the price.
 
Getting the greatest and best hardware for your kid is not gonna bring them happiness. It might make them spoiled, not sure :p. Start them off on some basic computer. Give them an allowance they have to work for and have them get their own hardware with your recommendations and help.

This should be the way. My mother owned a small general-goods store when I was younger, and I never got anything free from her. If I wanted something from the store, I had to work for it. Made me appreciate life a lot more!
 
But overall, this product is going to be competitive.

Given AMD's alleged margins (being negative here), a US$50 rebate on the 2080 would end that quick.

Radeon VII (Vega+) will be marginally competitive. It won't make AMD any money, but it will give those that bleed AMD something to talk about until their next disappointment.
 
Cool to see something, but I’ll wait on real-world measurements before judging. Also curious how well AMD did this year for earnings; guess I’ll find out in a couple of days.
 
Given AMD's alleged margins (being negative here), a US$50 rebate on the 2080 would end that quick.

Radeon VII (Vega+) will be marginally competitive. It won't make AMD any money, but it will give those that bleed AMD something to talk about until their next disappointment.
Well, I am human, so when I'm cut, I do bleed red...

If your blood is green, I guess you're Invid.
 
Given AMD's alleged margins (being negative here), a US$50 rebate on the 2080 would end that quick.

Radeon VII (Vega+) will be marginally competitive. It won't make AMD any money, but it will give those that bleed AMD something to talk about until their next disappointment.

I've always been an ATI/AMD fan and had their stuff in my system whenever feasible; your second line made me literally lol at work despite my inner AMD fanboy. Good job sir!
 
Haven't bought an AMD card since the 7970. This has 1080 Ti-class performance two years later? Big bucket of 'meh'.

Good on AMD for trying to catch up to Nvidia despite falling behind. They aren't there yet, though.

I think they can get there. It used to be that ATI/AMD traded blows with Nvidia, for a long time in fact. These last few years have been an aberration. Hopefully we return to the norm soon. For now... my 1080 Ti is fine. Turing is too pricey for what you get, and AMD is just not quite there in performance.
 
I've always been an ATI/AMD fan and had their stuff in my system whenever feasible; your second line made me literally lol at work despite my inner AMD fanboy. Good job sir!

Being a fan of performance, I've been disappointed in AMD largely since they bought ATi, but I know a lot about disappointment, because I was a 3Dfx fan first ;)
 
Let's not forget Radeon VII will come with 16GB of VRAM. This is where the RTX 2080 will struggle with only 8GB. I see 7-7.5GB of VRAM usage on my Vega 64 now in 4K. Upcoming titles will need more VRAM for 4K. The RTX 2080 is looking like Nvidia's Fury.
They hate that the VII has more and faster VRAM than a $1200 card, lol, and will deny that it's an advantage for a year or so.
I see misinformed posts every week here now repeating FUD that the V64 is 1070 speed, and even that the VII is going to be 1080 speed, lol.

The score doesn't look good next to the 1180 MHz HBM Vega 64 bench mentioned earlier. Note that that's a pretty lucky V64 HBM overclock, but yes, they are memory-bandwidth limited and the VII needs it. Hope they OC well.

Comparing against a 1080 Ti, with its high new prices and less driver support left, instead of a 2080, both at half the VRAM, is really stretching. Wait for the reviews.
 
They hate that the VII has more and faster VRAM than a $1200 card, lol, and will deny that it's an advantage for a year or so.

Well, it's not an advantage. It's not an emotional reaction; AMD is losing their shirt on the Radeon VII. Beyond that, it's again going to be hungrier, hotter, and louder, and it's not going to perform any better. Because it's just Vega, again; compute- instead of gaming-focused, again. They've doubled the memory to get the memory bandwidth up, which is apparently necessary; which means that their compute architecture is significantly less efficient than the competition on yet another front.

The 'hate' is that AMD refuses to produce a top-end gaming part, when we all know that they could do it with their existing intellectual property.
 
Whether or not it's better than the best Nvidia can put out is a non-issue to me. All I care about is whether its open-source support is good. 90% of my purchasing decisions are based on the level of Linux support available or expected... which pretty much excludes Nvidia, since I'm not going to be hamstrung by whatever kernel they decide to support with their closed drivers. The other 10% is based on not liking Intel, so I've been on team AMD since their 486 clones.

My 480 looks like it will finally be retired soon, assuming these are even available to purchase at MSRP anywhere come launch time and aren't gobbled up by remaining miners and whatnot.
 
level of linux support available or expected... which pretty much excludes nvidia

And yet Nvidia's 'closed-source' drivers are usually better than AMD's...

AMD is trying to catch up to Nvidia's drivers in Linux just like they're trying to catch up to Nvidia's hardware performance ;)
 
Why anyone would judge the card on pre-release canned benchmarks for a game that was heavily optimized towards Nvidia is beyond me. And this is coming from someone who has 3 Nvidia GPUs and zero AMD right now.

It's probably going to trade blows with the 2080, maybe the 2070 in some titles, depending on the game, API, and what features are leveraged. Nothing new to see here yet.
 
Well, it's not an advantage. It's not an emotional reaction; AMD is losing their shirt on the Radeon VII. Beyond that, it's again going to be hungrier, hotter, and louder, and it's not going to perform any better. Because it's just Vega, again; compute- instead of gaming-focused, again. They've doubled the memory to get the memory bandwidth up, which is apparently necessary; which means that their compute architecture is significantly less efficient than the competition on yet another front.

The 'hate' is that AMD refuses to produce a top-end gaming part, when we all know that they could do it with their existing intellectual property.
Considering early VRAM indications in DX12, 16 GB is going to be an advantage, period. RE2 looks to suck it down like BF; of course no other games would do that, eh? And no, they don't 'need' to use that VRAM; that's why it uses it, eh? Poor programming can only be an excuse until we figure out that 8 GB is a limit for anything over 1440p; it happens eventually anyway. Remember Shadow of Mordor, the only title that was ever going to use more than 6 GB... yeah, look at how that turned out. Denying that VRAM utilisation increases with game development over time is a head-in-the-sand ostrich POV (thanks to whoever used that term recently here).

Furthermore, I have seen no proof that the BOM for 16 GB of HBM is $300 or more, which is what would be required to make it approach being unprofitable. Just conjecture. You really think a few small HBM chips cost more than half the card with a relatively small 7nm die? But the 2080 at 800 gorillion mm² is fine? The 2080 is the bottom of the bin just like the VII; anything that isn't a loss is good in this case, if you really want to boil it down.
The original Vega 8 GB BOM was approximately $160 a year or two ago, and people said the same thing, yet AMD kept making them. Surely it wasn't unprofitable.
BOM estimates are just that; you never know what they really pay. Obviously it wasn't that much, given they've been selling $400 V64s and $360 V56s continuously for the last 3 months or so.
I think 'Space Invaders' takes the hottest award, though... don't see Vegas going up in flames on various forums.

AMD was killing it in FP16 with Vega when it came out. They compete with 99% of the gaming market; almost no one has a 2080 Ti or Titan, certainly not at $1200+ and $3k, lol. What's the point? Then people would bitch about the price.
We'll see if they crack MCM with the generation after Navi, but first they've got to drop the growing tumour that is GCN.
 
And yet Nvidia's 'closed-source' drivers are usually better than AMD's...

AMD is trying to catch up to Nvidia's drivers in Linux just like they're trying to catch up to Nvidia's hardware performance ;)


There's no need to put 'closed source' in quotes, since that's what they are. Regardless of how you want to measure 'good' when it comes to a driver whose stability and performance vary depending on the card you're using: if it won't load on the kernel I'm running, then it doesn't matter. And I'm always on the latest, to within a couple of weeks at most. It's not like they've had decades of history behind these graphics drivers; that was ATI for most of the time Radeon and Nvidia have existed. Despite that, the radeon driver's stability and support have come a long way in just the time since fglrx was a requirement for 3D acceleration on Radeon cards. Maybe one day Nvidia's open-source drivers will get there too. But then you'd have to ignore their anti-competitive behavior when purchasing one. I'd prefer not to reward that.
 
You really think a few small hbm chips are worth more than half a card with a relatively small 7nm die?

Actually, from what I've read in the past, the main thing that makes it expensive isn't really the die itself, but the whole assembly. The parts themselves come at market cost, but getting them all put together successfully is what has kept HBM from being the low-cost solution it was widely expected to be. That's why Vega's memory inefficiency is an issue: if they could cut two stacks off, they'd have a significantly smaller overall assembly, and cost per unit would drop dramatically as a result.
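The stack trade-off above is easy to sketch: each HBM2 stack is widely documented as a 1024-bit interface, so cutting two stacks would also halve bandwidth. A back-of-the-envelope calculation, assuming the commonly reported effective data rates (not official spec-sheet values):

```python
# Back-of-the-envelope HBM2 bandwidth. Each stack exposes a 1024-bit
# interface, so total bandwidth scales linearly with stack count; the
# effective data rates below are the commonly reported figures.

def hbm2_bandwidth_gb_s(stacks: int, data_rate_gbps: float) -> float:
    # stacks * 1024-bit bus * data rate (Gbit/s per pin) / 8 bits per byte
    return stacks * 1024 * data_rate_gbps / 8

vega64 = hbm2_bandwidth_gb_s(2, 1.89)     # 2 stacks -> ~484 GB/s
radeon_vii = hbm2_bandwidth_gb_s(4, 2.0)  # 4 stacks -> 1024 GB/s
print(round(vega64), round(radeon_vii))   # 484 1024
```

Which is the bind: the doubled stack count is what buys the bandwidth, so a cheaper two-stack assembly isn't on the table without starving the GPU.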

Considering early vram indications in dx12, 16gb is going to be an advantage

Don't see it. Games still aren't loading up the assets, and resolution increases don't hit memory nearly as hard; remember, pixels are small, and even having over eight million of them for 4K is just a few dozen megabytes of memory. What actually eats up a bit of VRAM is the per-pixel processing; that's why ray tracing increases actual needed VRAM. But that's also not a problem here, since AMD neglected to implement that functionality ;)
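The framebuffer claim checks out arithmetically. A rough sketch, assuming a plain 32-bit RGBA render target (real engines keep several render targets, but the order of magnitude holds):

```python
# Raw framebuffer arithmetic for 4K: the pixel storage itself is tiny
# next to an 8 GB card; assets and per-pixel working data are what
# actually fill VRAM. Assumes a plain 32-bit RGBA target.

width, height = 3840, 2160
bytes_per_pixel = 4  # 32-bit RGBA

pixels = width * height                   # 8,294,400 pixels
framebuffer_mib = pixels * bytes_per_pixel / 2**20
print(pixels, round(framebuffer_mib, 1))  # 8294400 ~31.6 MiB per buffer
```

Even triple-buffered, that's under 100 MiB, a rounding error against 8 GB, let alone 16 GB.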

See if they crack MCM with next generation after Navi but first they got to drop the tumour that is growing as GCN.

My only hope is that when they do finally jettison GCN, they have their drivers in line and we don't have to wait for the 'fine wine' to age properly.
 
Even if AMD did this, people would still find an excuse to continue to buy nVidia.

Some people just like efficient, performance-focused products.

Seems like AMD fans have been complaining about us for over a decade... despite the fact that we were AMD fans first ;)
 
Given AMD's alleged margins (being negative here), a US$50 rebate on the 2080 would end that quick.

Radeon VII (Vega+) will be marginally competitive. It won't make AMD any money, but it will give those that bleed AMD something to talk about until their next disappointment.

You mean like people spouting off "but mAh RTX", and you saying you can find used 1080 Tis for $500?
 
In the worst possible benchmark.

Cards catching fire are not $1200 cards.
Cards lowering the graphics quality of ray tracing, then claiming they did it without changing graphics quality, are not $600, $800, or $1200 cards.
RTX is not a $200-$800 premium feature.

Yes, all of those too.
 
RTX can eat a dick for the time being if i'm being perfectly blunt. And their pricing can choke on said dick.

Isn't this card basically competing with the 2080 which is roughly the same price?
 
De facto it would be competing against the 2080... but until we see real benchmarks and actual market prices, everyone is just guessing where it will land in terms of competition.
 
That's 1080 Ti level on Fire Strike Ultra. I got >6800 with my Titan X out of the box. 1080 Ti performance at the original 1080 Ti price, 2 years later.

Did you not see this?

The card appears to have achieved a graphics score of 6688 in Fire Strike's Ultra preset, which beats out a factory-overclocked RTX 2080's score of 6430.
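For reference, the percentage gaps between the scores quoted in this thread work out as follows (leaked and user-overclocked numbers, so indicative only):

```python
# Percentage gaps between the Fire Strike Ultra graphics scores quoted
# in this thread: 6688 (leaked Radeon VII), 6430 (factory-OC RTX 2080),
# 6500 (overclocked Vega 64 LC).

def pct_faster(a: float, b: float) -> float:
    # how much faster score a is than score b, in percent
    return (a / b - 1) * 100

print(round(pct_faster(6688, 6430), 1))  # VII vs the OC 2080: ~4.0%
print(round(pct_faster(6688, 6500), 1))  # VII vs the OC Vega 64 LC: ~2.9%
```

So against the heavily overclocked Vega 64 LC the leaked lead is small single digits, while against a stock card the gap would presumably be wider.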
 
According to my own benchmark, my highly OCed AIB 2070 runs the Warhammer 2 benchmark just as fast as the Radeon VII. Who knows yet, though? I don't have any confirmation on what level of AA they were using. Perhaps my RTX 2070 is faster in that benchmark?

In any event, I don't care very much about ray tracing right now. But I do very much care about DLSS. And right now DLSS is the biggest fake launch I've bought into in half a decade. Nvidia promised me a whole bunch of titles would get DLSS. They are still promising this on their own web page right now.

Where is it? Nvidia, where is the DLSS that you promised me before I spent my money on your video card instead of a Vega 64?
 
Isn't this card basically competing with the 2080 which is roughly the same price?

I'm referring to the Ti pricing of the RTX range. It's realistically the only one fast enough to use the RTX features.
 