AMD Radeon RX Vega 56 Leaked Benchmarks: GTX 1070 Killer

The 1070 is easily found for $430 or less, and 1080s have been $465-500. Coming a year later is one thing, but coming with 2-3x the power draw for the same performance? That's where it falls flat for me. I personally don't care about the consumption part (though many others would), but the heat/noise that comes with it, I do.
Also, Nvidia cards hold their value better than AMD cards do for resale.

As for mining, these cards won't be that appealing because they use HBM2 memory, which is slower for mining, and they have huge power draw. Any miner considering a $400 card will take a 1070 hands down. And "most" gamers will take the 1070 even if it's $30 more, just because it's Nvidia, which has fewer driver issues.


It's weird times when AMD is competitive in the CPU department and not the GPU o_O
 
Let's say Vega 56 trades blows with the 1070 at $399; that's still cheaper than 1070 prices now. So from the 14th of August, gamers looking to spend around $399 for a GPU will have the option of Vega 56 or paying more for the 1070. These gamers benefit as a result.

Anyone trying to belittle competition is clearly anti-consumer.
Except they won't be $399. AMD not producing decent quantities for the first few months + fanboys paying gouge prices for the psychological contentment of a Vega = nobody is getting one anywhere near MSRP. And if they have any mining ability, gamer plebs are really SOL.
 
I get it, it uses more power. But some of you sound no better than AMD fanboys. What the fuck is up with this 2-3x power crap? Vega 56 is rated at 210W. Seriously tired of people justifying Vega, and tired of people overblowing the power figures.
 
Rated is one thing; actual is different. 480s drew more than they were rated for, and Nvidia has done that in the past as well. We have been hearing anywhere from 200-300W for Vega, so that is somewhere around 2-3x the power of the 1070, which is around 100-120W.
 
Remember, Ryzen didn't beat Intel either. 7700K still supreme.

The 1080 Ti will still be supreme, but once Vega is released, gamers will have more choices instead of auto-defaulting to the 1070 and 1080. Heck, they may even consider going with a 1440p FreeSync monitor instead of a GTX + G-Sync build.
You could argue that Ryzen performed better than expected, given the state of AMD CPUs before Ryzen.

Vega performed about as expected, so the reception is a bit colder.
 
To all you people going on about nobody wanting to buy this to replace their 1070 (or 1080): there are more than a few of us out there who are still a generation or two behind and ready to upgrade. Having more to choose from than a single card in a particular performance bracket, with variations in cooling solutions, game bundles, and LED designs, is a good thing.

Why would I want to upgrade my 2-year-old GTX 980 for a card that will give me the same performance I could have gotten a year and a half ago for the same price?
I don't get paid by how much FPS my GPU pushes out, so I'd rather wait a few more months and see what Nvidia has in store.
 
The 1070 is 120W? Man, you have to be shitting me!

http://www.guru3d.com/articles-pages/nvidia-geforce-gtx-1070-review,8.html

The 1060 is the 120W card. It amazes me how often people speak without having their facts right, both for Nvidia and AMD. And Vega power consumption is supposed to be between 200-300W? WTF, you gave yourself a great range there to support your argument. Fact is, no one has tested Vega's gaming power draw, especially Vega 56's! To speak on Vega 56 power consumption without anything to back it up is absurd. Let's deal with facts. I am actually excited about what you will get wrong next.
 
I'd say "performed about as expected" is true, if you take into account AMD's ever-hyperbolic pre-launch marketing. Expected ~1080, and it rates right around the 1080. Just sucks more power.
 
I don't know what your problem is; I posted one time in this thread.

Guru3D measures at the wall. 160W at the wall is around 130-140W at the card, depending on the efficiency of the PSU. They also state that is peak, not average. So maybe I was a little under, but not by much. My 1070 with an underclock mines at 80W, and at stock is around 120W. This will vary from card to card, and changes with boost clock ranges as well.
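
Rough math, with a typical ~85% efficient PSU assumed on my end (your unit may differ):

card draw ≈ wall draw × PSU efficiency = 160W × 0.85 ≈ 136W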

And exactly, there are no reviews, and all we have is speculation. I have seen numbers anywhere from 200-300W for Vega as a whole. I never said Vega 56 specifically.


My whole point is, Vega is a year later for the same performance and price at considerably higher power draw. That's it.
 
Can I link Tom's Hardware to you? They test with badass equipment. It's around 150W. Come on, man! I am mining too, but now you are bringing mining power consumption into this. Mining is memory intensive. Check out any review site: the GTX 1070 is around 150W average for gaming. I am not bullshitting you. Yeah, we can undervolt and tweak stuff, but gamers don't lower the power limit, which lowers the boost clock. Let's keep mining out of it. 150W is average for the 1070; plenty here will agree with me. 150W is amazing in itself for that card. At 100% power, a 1070 will boost to stay at 150W. I understand your point about Vega, it's shit! But you can't distort facts. Doesn't make it right.
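
Just to put numbers on the "2-3x" claim, taking both rated figures at face value:

Vega 56 / GTX 1070 = 210W / 150W ≈ 1.4x

1.4x, not 2-3x.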
 
Indeed, 150 watts typical usage is pretty good for near-980 Ti performance while consuming about 60-70 watts less than a 980 Ti.
 
I'm not trying to distort facts. I used to spend a lot of time on forums fighting misinformation. I am not biased toward any company or product. I have no qualms with you either.
I also wasn't trying to bring mining into this; I was just saying that when mining I get 80W. At full stock I get around 120W, and I should have added that was using Unigine's new benchmark, not mining. I can see where that would be read as mining at stock.

Custom board designs with more VRMs and other components, added RGB lighting, and different fans, yeah, they can go higher for sure. But pure reference cards should be in the neighborhood of 120W average power draw, stock, in games. That's what I get, and the Guru3D link you provided pretty much shows that (they use peak vs. average). We have no real numbers for Vega, but anything we do get I'm going to assume is for the reference design unless it specifically states otherwise.

So this isn't a one-person-is-right, another-is-wrong thing; it's just looking at it differently, and both can be right.
 
Understandable. But even on Tom's, the GTX 1070 FE's base power target is 150W. It will boost higher in games that are not that taxing, and it will also boost less to stay within that TDP in demanding games. The 150W mark is spot on in every review of the reference model; that's its power target. I will let anyone else here confirm that; the reviews don't say any different either. Yes, you can get it to 120W by lowering the power limit, but gaming it will target 150W.

And there is no shame in that. It kicks ass for the performance it gives with that wattage, no question!
 
Yeah, it's definitely one of the best cards I've ever had (not by performance rating, but overall balance). It runs super cool and quiet with a blower cooler, and is like the 670 was, IMO.

I honestly haven't seen the Tom's review; not a fan of their site layout. Guru3D is my typical go-to for 1:1 bar graphs, and here for opinion reviews. I'll take your word for it that it's a 150W target, which still puts Vega far behind in consumption. Basically means it's about what a 980 Ti was. Overall it'll be a "strong" card in its own right, but a letdown nonetheless.
 
And DX12 might have the slightest shred of meaningfulness if game developers gave two shits about it within this decade. They don't, and they won't.

There is also DX11.3, so the feature-support argument applies to both sides.
 
I don't get it. If I want something better than a GTX 1070, I'll buy a GTX 1080. For what reason would I want an RX Vega 56 instead? I'm stumped.

Is the gap between the GTX 1070 and GTX 1080 so large that there is actually room for RX Vega 56 to come in between? Really?
 
If it ends up outperforming the 1070 at around 210W, it might not be that bad. It will just highlight how inefficient GCN has become despite having more cores and higher clocks. And it's a much better buy than RX Vega 64. I'll personally be looking out for reviews of Vega 56.
 
I believe there's about a 30% gap between the 1070 and 1080, so yeah, it has a shot at landing right in between. The bigger Vega will show the limits of GCN despite having more cores and higher clocks.
 
The Nvidia 10 series has been out an entire year, and now AMD is releasing the answer to it. However, those who want performance have generally gone Nvidia. The only ones left are fanboys reluctant to accept that Nvidia has had an uncontested dominating spree for a year.

I don't foresee people selling their current cards for Vega at this point, considering Volta is around the corner and will dominate AMD another year. Everyone knows this. Those who haven't upgraded yet won't be waiting for the almighty Vega GPUs, considering rumor has it they don't surpass the Ti. If they don't surpass the Ti, then at best they're going to either squeak out a win over, or merely match, competition that has been out for a year.

The only reason I went with two 1080 Tis was that I was nearing the end of my Step-Up, and said, well, fuck it: the next best thing is coming out, but for $269 x2 I can get the current best, which will last me until 4K monitors drop to prices I am WILLING to pay. Till then, I'll have enough horsepower. Fair enough.

AMD has shot themselves in the foot, and if they can't bring the fight, just because it's cheaper doesn't mean it's gonna be successful.
 
How well does it do in games that are killing it on the PC, like PUBG? While BF1 may have a lot of attention, it's pretty much a failure. Doom is irrelevant, as most people will have played it already. I always ignore website reviews that focus on a narrow set of games that nobody plays.
 
Wait a minute, Volta is HBM2 and not GDDR5? ...
Anyhow, there is hope yet for Vega, at least based on the leaks.
 
Nvidia doesn’t care what AMD does anymore, it’s been mentioned multiple times that they release on their own schedule. AMD isn’t competitive except in the eyes of their fanboys.

Is that why they enabled Quadro performance features on the Titan Xp? Because they aren't worried about AMD?

While Nvidia is setting their own pace, only a fool ignores competition no matter how small.

Either way, I will wait for official reviews before I pass judgment. But IF this is true, it represents true value and will force Nvidia to respond in kind by adjusting prices.

Competition is a good thing. Be grateful for these numbers if true.
 
Wouldn't be an issue if they released a miner-centric card. It doesn't need display connections and could be PCIe x1 with appropriate power adapters, so people don't need their injecting adapters; or better yet, bundle it with one. I still have a bunch of the USB3-to-PCIe adapters (just using the cable to pass the signals). Design it for low-power operation. Price accordingly.

Could double as a compute unit for those doing AI and/or non-mining calculations.

I think there are some in the making, at least RX 580s or such.

But:

They do come with no display connections and only PCIe x1 or so, and they were a whopping 20 bucks cheaper or so; there's just no big margin from those minor components. But the resale value would be about zero, no?

Who would buy an old mining card that you can't use for anything else? Other miners are unlikely; if it were still profitable, the miner wouldn't sell it in the first place.

So as long as mining cards aren't like 50% off, no one would really buy them, methinks.
 
Yeah, well, I paid $365 for my 1070, which is technically "under $400". But that was a good deal at Microcenter and can't be replicated today.

Over a 2-month period this year I found "returned" 1070s, and I picked 2 up @ $270-$275 ea.

Right now I have them in SLI in my Windows box at 1440p... I think I'm going to sit tight for a good, long while.
 
Remember, Ryzen didn't beat Intel either. 7700K still supreme.

The 1080 Ti will still be supreme, but once Vega is released, gamers will have more choices instead of auto-defaulting to the 1070 and 1080. Heck, they may even consider going with a 1440p FreeSync monitor instead of a GTX + G-Sync build.
Assuming miners don't eat them all up...
Better hope they suck at mining...
 
The only real advantage it would have is likely drawing less power, and that difference won't be enough to justify $120 less. It has better DX12 support. CUDA is not for gamers, and there is always OpenCL. You're reaching. Most times when people say AMD should charge less for their competitive hardware, it's only because they are AMD. There is no reason a newer architecture should cost less when it holds more value with its newer features; definitely not $120 less.
The problem is that we're assuming it's 'on par' with the 1080. Many leaks have it between the 1070 and 1080 (i.e., a bit shy of the 1080 in performance). Furthermore, I have seen deals on 1080s in the $460-$480 range. Recognizing that reviews aren't out yet but that the info we have seems fairly solid, that makes RX Vega 64:

1) Slower than a 1080 (much of the time)
2) Drawing more power than a 1080
3) More expensive than a 1080

Given the above, there's no compelling reason for a new buyer to pay $499 for an RX Vega 64. At $399, you have a good argument, I think. And it's not just because it's AMD (quite frankly, I think AMD is making a mistake by *under*charging for ThreadRipper).

And keep in mind you're talking to a guy with an RX 480 in his main system, a 390 in my backup system, who also owns three A10 HTPCs connected to my TVs, so I'm no AMD hater by any means. Just calling it like I see it.
 
Considering the GTX 1070 has a way smaller die, Nvidia can afford to chop the price a few bucks and immediately make Vega 56 look terrible. This might be the worst launch of cards since Fermi.
 
Lowballing with ThreadRipper is how they are going to steal market share from Intel. Also, their margins are reportedly VERY good on ThreadRipper. If they priced their products at parity with Intel, Intel could drown them in marketing or undercut them more easily.
 
Margins actually decreased for AMD in Q2.

Also, TR consists of 4 dies: 2 dead dies and 2 working (or with disabled cores). It was better to sell a single Zen die at $500 than it is to sell 4 Zen dies, working or not, for $1000. Neither Intel nor AMD is going to make any money worth writing about on their HEDT platforms. And with CF and SLI dead, they only keep losing value.

And now RTG has to fight Nvidia using 500mm2 dies with stock or overclocked HBM2, heavy VRMs, and water cooling, against 300mm2 dies with GDDR5X.

But for the vast majority of its life, Vega won't fight GP104. It will fight GV106.
 
Understood. I think they could charge $50-$100 more for ThreadRipper and still have a lights-out offering. I was only using that particular example in response to a previous post. But this discussion is off-topic for this thread.
 
I don't think so. Wait for actual reviews... there will likely be some caveats with how they did this with TR. I doubt it'll scale nearly as well as all the cores on one die. In TechSpot's preview, the $999 Intel equivalent OC'd actually looks way better to me. You get the same productivity performance and wayyyy better gaming. But wait and see...

Vega... it sucks we have to wait longer. I wanted to see who was right about it! I think we have a fair bit of certainty about what Vega is. Basically, it's only worth looking at if you've locked yourself into FreeSync.
 
Speaking of which, what happened to the good old days of feuds/fights between HardOCP and other websites/firms? Man, just remembering [H] vs. Tom's (and defending Tweaktown at one point if I recall, funnily enough) and the Phantom saga brings a smile to my face. Did you guys mellow with age? :)
That was actually PCPer that I defended against THG.
 
Maybe Kyle's getting mellow in his advancing years?
 
IF, and I say IF, they can do the same thing to the 1080 Ti, I will buy one.

Like was said a long time ago: if you take a LONG time to let people down, they don't feel so bad.
 
Over a 2-month period this year I found "returned" 1070s, and I picked 2 up @ $270-$275 ea.

Right now I have them in SLI in my Windows box at 1440p... I think I'm going to sit tight for a good, long while.

You hang onto those! If everything does go HBM2 and mining standards don't change, these older cards could be worth a lot.
 