Vega Rumors

nah they won't

vega-64-hashrate-per-watt.png


http://www.legitreviews.com/amd-radeon-rx-vega-64-vega-56-ethereum-mining-performance_197049


The difference is too great. A 40% difference in profits means even small-time miners won't do it, because it just eats into your bottom line. It's better to get a 1060: cheaper, with better returns.
 
The only question is, are there any Vega-specific tweaks for mining coming? It seems a bit low considering what 480/580 cards are able to hash out.
 
The only question is, are there any Vega-specific tweaks for mining coming? It seems a bit low considering what 480/580 cards are able to hash out.


ETH mining depends on memory and its bandwidth. How many calculations the GPU can do doesn't really matter, as long as there's enough bandwidth, and that is the problem with Vega: it doesn't have much more than current cards, and that is what's holding it back. The reason Polaris does so well against the 1060 is its bandwidth; that's also why the 1070 and Polaris go head to head. The 1080 falls down because of the latency of GDDR5X (the increased latency on those cards drops the communication speed between the RAM and the GPU). The 1080 Ti also uses GDDR5X, but the timings are tighter, so the problem isn't seen as much. With Pascal there is no BIOS editor to change the timings; with AMD Polaris there is. We don't know about Vega yet, but there may be some room to play here. Although with Fiji they were never able to get much more out of it, so... HBM is a different beast; we'll just have to wait and see.
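To put rough numbers on the bandwidth argument: Ethash reads about 8 KiB of DAG per hash (64 random 128-byte fetches), so spec-sheet memory bandwidth puts a hard ceiling on hashrate no matter how many shaders a card has. A minimal sketch, using stock spec-sheet bandwidth figures:

```python
# Back-of-the-envelope check that Ethash is bandwidth-bound.
# Each hash does 64 random 128-byte DAG reads, i.e. ~8 KiB of memory
# traffic per hash, so memory bandwidth caps the hashrate.

BYTES_PER_HASH = 64 * 128  # 8192 bytes of DAG traffic per hash

cards_gbs = {           # stock memory bandwidth, GB/s (spec-sheet values)
    "GTX 1060": 192,
    "RX 580":   256,
    "GTX 1070": 256,
    "GTX 1080": 320,    # GDDR5X: more bandwidth, looser latency
    "Vega 64":  484,    # HBM2
}

for card, bw in cards_gbs.items():
    ceiling_mhs = bw * 1e9 / BYTES_PER_HASH / 1e6
    print(f"{card:8s} {bw:3d} GB/s -> {ceiling_mhs:5.1f} MH/s ceiling")
```

The RX 580's ~31 MH/s ceiling lines up with what tuned cards actually hit, while the 1080 and Vega land far below theirs, which is exactly the latency/timings point being made here.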
 

Quoting the white paper:
In a typical scene, around half of the geometry will be discarded through various techniques such as frustum culling, back-face culling, and small-primitive culling. The faster these primitives are discarded, the faster the GPU can start rendering the visible geometry. Furthermore, traditional geometry pipelines discard primitives after vertex processing is completed, which can waste computing resources and create bottlenecks when storing a large batch of unnecessary attributes. Primitive shaders enable early culling to save those resources. The “Vega” 10 GPU includes four geometry engines which would normally be limited to a maximum throughput of four primitives per clock, but this limit increases to more than 17 primitives per clock when primitive shaders are employed.

Apparently, the madmen at RTG hard launched RX Vega with this feature still disabled in drivers. Anyone have a reasonable estimate of how much benefit we could expect once it's activated?
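One rough way to frame it: treat the front end as 4 primitives/clock for everything today, versus culled primitives being discarded at 17+/clock with primitive shaders. A toy model under my own assumptions (only the two throughput figures come from the white paper), and it only matters where the frame is actually geometry-bound:

```python
# Toy model (my assumptions, not AMD's): culled primitives get fast-pathed
# at discard_rate instead of crawling through the pipeline at base_rate.

def geometry_speedup(cull_fraction, base_rate=4.0, discard_rate=17.0):
    baseline = 1.0 / base_rate                      # all prims at 4/clk
    with_ps = (cull_fraction / discard_rate         # culled prims, fast path
               + (1 - cull_fraction) / base_rate)   # visible prims, unchanged
    return baseline / with_ps

for cull in (0.25, 0.50, 0.75):
    print(f"{cull:.0%} culled -> {geometry_speedup(cull):.2f}x geometry throughput")
```

With the white paper's "around half" culling figure that's roughly a 1.6x bump in geometry throughput, which would only show up as FPS in geometry-limited scenes.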
 
Quoting the white paper:


Apparently, the madmen at RTG hard launched RX Vega with this feature still disabled in drivers. Anyone have a reasonable estimate of how much benefit we could expect once it's activated?


They aren't disabled; they just aren't accessible to programmers. AMD said they will handle it in drivers, but that pretty much limits it to whatever the devs have already done in their engines. So for now, don't expect anything more than what Polaris can do.
 
nah they won't

vega-64-hashrate-per-watt.png


http://www.legitreviews.com/amd-radeon-rx-vega-64-vega-56-ethereum-mining-performance_197049


The difference is too great. A 40% difference in profits means even small-time miners won't do it, because it just eats into your bottom line. It's better to get a 1060: cheaper, with better returns.

Doesn't seem like they know how to tune mining for AMD. Hint: you don't lower the power limit; you raise the RAM clock, lower the GPU clock, and even undervolt. They increased the memory OC for the 1070 but not for Vega. Trying to improve mining on Polaris by just adjusting the power limit is complete sheet, so I would expect the same for Vega.
 
Doesn't seem like they know how to tune mining for AMD. Hint: you don't lower the power limit; you raise the RAM clock, lower the GPU clock, and even undervolt. They increased the memory OC for the 1070 but not for Vega. Trying to improve mining on Polaris by just adjusting the power limit is complete sheet, so I would expect the same for Vega.


Can't increase the frequencies of HBM2 too much. Don't you remember the throttling that caused on Vega FE? Yeah, raising the HBM2 frequency caused the entire GPU to throttle.

Also, the GPU power % has nothing to do with power delivery to the RAM, so dropping power consumption by limiting power to the GPU does nothing to the VRAM frequencies.

And yes, we do power limit the GPU when mining. How else do you think I can get my 1070s to do 2000 MHz at, oh, 940 mV at a total power consumption of 100 W?

I've undervolted and power limited all my RX 580s too: -20 on the volts and -20% power limit. They still hit their max GPU frequency, which is set at 1340 MHz. If I don't change the power limit, the GPU will still draw 150 W+.

V ≠ W. V and amps have an inverse relationship: if you drop V and don't limit the power draw, you end up eating more amps but the wattage stays the same. But the GPU BIOS/drivers limit the wattage, so if you drop V while the wattage is limited, it can make the GPU unstable, because the GPU is made to run with a certain amount of V and is specced to do so. This is why, when undervolting, some GPUs can undervolt more; there is a range.
 
Well, today we crown two new performance kings. What a spectacular launch:

The undisputed performance king:
gallery-1.png


The ultimate consumer gaming card:


Hmm, this feels like déjà vu. Haven't we already been here for months?


Hahahhahahhahahhaha

I think it's time to close this thread and move on to the Navi thread... if it were my thread, I honestly would.
 
Can't increase the frequencies of HBM2 too much. Don't you remember the throttling that caused on Vega FE? Yeah, raising the HBM2 frequency caused the entire GPU to throttle.

Also, the GPU power % has nothing to do with power delivery to the RAM, so dropping power consumption by limiting power to the GPU does nothing to the VRAM frequencies.

And yes, we do power limit the GPU when mining. How else do you think I can get my 1070s to do 2000 MHz at, oh, 940 mV at a total power consumption of 100 W?

I've undervolted and power limited all my RX 580s too: -20 on the volts and -20% power limit. They still hit their max GPU frequency, which is set at 1340 MHz. If I don't change the power limit, the GPU will still draw 150 W+.

V ≠ W. V and watts have an inverse relationship: if you drop V and don't limit the power draw, you end up eating more watts. But the GPU BIOS/drivers limit the wattage, so if you drop V while the wattage is limited, it can make the GPU unstable, because the GPU is made to run with a certain amount of V and is specced to do so. This is why, when undervolting, some GPUs can undervolt more; there is a range.

You mean you end up eating more amps? Amps * volts = watts
 
Vega 56 seems like the best card AMD has; Vega 64 is pretty meh. Thankfully this is the last incarnation of GCN, and Navi is already done, so Vega should only have a little over a year to exist.
 
Can't increase the frequencies of HBM2 too much. Don't you remember the throttling that caused on Vega FE? Yeah, raising the HBM2 frequency caused the entire GPU to throttle.

Also, the GPU power % has nothing to do with power delivery to the RAM, so dropping power consumption by limiting power to the GPU does nothing to the VRAM frequencies.

And yes, we do power limit the GPU when mining. How else do you think I can get my 1070s to do 2000 MHz at, oh, 940 mV at a total power consumption of 100 W?

I've undervolted and power limited all my RX 580s too: -20 on the volts and -20% power limit. They still hit their max GPU frequency, which is set at 1340 MHz. If I don't change the power limit, the GPU will still draw 150 W+.

V ≠ W. V and watts have an inverse relationship: if you drop V and don't limit the power draw, you end up eating more watts. But the GPU BIOS/drivers limit the wattage, so if you drop V while the wattage is limited, it can make the GPU unstable, because the GPU is made to run with a certain amount of V and is specced to do so. This is why, when undervolting, some GPUs can undervolt more; there is a range.

But you can overclock the Vega VRAM frequencies, and what you can get away with on RX Vega would be a nice thing to read in, you know, a review.

In my experience with my RX 480, I only use the power limit to cap overall temperature and make sure the card never burns up. Trying to rely on the power limit to adjust GPU frequency and voltage is garbage. I limit the GPU clock to 1125 MHz, increase the memory frequency to 2100 MHz after a BIOS mod, and undervolt the GPU by -96 mV. I lower the temperature limit to 72°C, which lowers the power limit to -12%. This card mines ETH at 30 MH/s @ 75 W, or dual mines @ 100 W.

Seems like they used NV-card overclocking methods for better mining on Vega, except with no memory overclock, which frankly looks biased to me. Not sure whether getting near 40 MH/s would make Vega viable for mining given its power requirements, but again, that would have been nice to find out in a review.

BTW, it's not an inverse relationship between V and wattage, it's linear... same with frequency and wattage.
 
But you can overclock the Vega VRAM frequencies, and what you can get away with on RX Vega would be a nice thing to read in, you know, a review.

In my experience with my RX 480, I only use the power limit to cap overall temperature and make sure the card never burns up. Trying to rely on the power limit to adjust GPU frequency and voltage is garbage. I limit the GPU clock to 1125 MHz, increase the memory frequency to 2100 MHz after a BIOS mod, and undervolt the GPU by -96 mV. I lower the temperature limit to 72°C, which lowers the power limit to -12%. This card mines ETH at 30 MH/s @ 75 W, or dual mines @ 100 W.

Seems like they used NV-card overclocking methods for better mining on Vega, except with no memory overclock, which frankly looks biased to me. Not sure whether getting near 40 MH/s would make Vega viable for mining given its power requirements, but again, that would have been nice to find out in a review.

BTW, it's not an inverse relationship between V and wattage, it's linear... same with frequency and wattage.


You can't overclock HBM2 much at all. 100 MHz at most, a 10% increase; you can't do shit with that.

My RX 580s and 1070s are looking at 50-75% increases in bandwidth over stock. My core clocks are as I stated before; mem clocks are 2150 MHz for the 580s and 4400 MHz for the 1070s. At those memory frequencies the clock speed increase makes a difference. As I stated, you need clock speed once you can get the bandwidth up past a certain point. It's all about feeding the GPU as fast as possible when the GPU is capable of outputting the result.

An RX 480 will draw less power than an RX 580, but you are probably just reading what Afterburner (or whatever program you use to undervolt, overclock, etc.) reports. That is ONLY GPU power usage! Add another 30-50 watts to that for the rest of the board.
 
You can't overclock HBM2 much at all. 100 MHz at most, a 10% increase; you can't do shit with that.

My RX 580s and 1070s are looking at 50-75% increases in bandwidth over stock. My core clocks are as I stated before; mem clocks are 2150 MHz for the 580s and 4400 MHz for the 1070s. At those memory frequencies the clock speed increase makes a difference. As I stated, you need clock speed once you can get the bandwidth up past a certain point. It's all about feeding the GPU as fast as possible when the GPU is capable of outputting the result.

I get a ~10% increase in hash rate with a 10% increase in mem overclock. That isn't shit.

If you want to properly set up your AMD card for mining, you first mod the BIOS to reduce memory latency, then increase the memory frequency. Once you find your max mem overclock, start reducing the GPU clock (leave it at stock at the beginning), because you don't need GPU as much as memory bandwidth (I know you know this but seem to forget today). Stop reducing the GPU clock when you notice your hash rate decreasing, then start undervolting.

Now some of these steps you can't do on NV.

They should have at least OC'd the memory by 10%. I've read you can do more on Vega FE, up to 1125 MHz.
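The procedure above, written out as runnable pseudocode. The helpers are hypothetical stand-ins for whatever tool you actually drive (WattMan, OverdriveNTool, a BIOS editor); here they just simulate a card so the sketch executes end to end:

```python
def stable_at(mem=2000, gpu=1340, mv=1150):
    # Stand-in for "run the miner and watch for errors/crashes";
    # pretends this card tops out at 2150 MHz mem and 940 mV.
    return mem <= 2150 and mv >= 940

def hashrate(mem, gpu):
    # Toy model: hashrate is memory-capped until the core can't keep up.
    return min(mem / 70.0, gpu / 36.0)

def tune(mem=2000, gpu=1340, mv=1150, step=25):
    # 1. Memory first: it sets the hashrate, so push until unstable.
    while stable_at(mem=mem + step, gpu=gpu, mv=mv):
        mem += step
    # 2. Core next: lower it while the hashrate holds, to save power.
    best = hashrate(mem, gpu)
    while hashrate(mem, gpu - step) >= 0.99 * best:
        gpu -= step
    # 3. Undervolt last, at the final clocks, until it stops being stable.
    while stable_at(mem=mem, gpu=gpu, mv=mv - 10):
        mv -= 10
    return mem, gpu, mv

print("mem %d MHz, gpu %d MHz, %d mV" % tune())
```

The order matters: memory first because it sets the hashrate, core second because it only has to keep up, voltage last at the final clocks.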
 
I get a ~10% increase in hash rate with a 10% increase in mem overclock. That isn't shit.

If you want to properly set up your card for mining, you first mod the BIOS to reduce memory latency, then increase the memory frequency. Once you find your max mem overclock, start reducing the GPU clock (leave it at stock at the beginning), because you don't need GPU as much as memory bandwidth (I know you know this but seem to forget today). Stop reducing the GPU clock when you notice your hash rate decreasing, then start undervolting.

Now some of these steps you can't do on NV.

They should have at least OC'd the memory by 10%. I've read you can do more on Vega FE, up to 1125 MHz.


Yes, I know. I have 26 rigs going right now, half of them AMD RX 580s and the other half 1070s. I also have modded software that I'm triple mining with, which I have been developing on my test rig. So far it looks pretty good, but it's not very stable yet on all my rigs, so when I know it's more stable I will release it ;). The other downside is that the ETH hashrate drops 25%; this is not a side effect, since the third coin needs that GPU power but doesn't need bandwidth. And it gets 40% of the original hashrate. So if I can get this software stable, that will be great! But it's going to be hard, because of the way I branched the program I can't currently find where the conflicts are. Suspecting drivers... Out of the 10 systems I've tested on, 3 of them crash the third coin's mining; the other 7 have been running flawlessly for the past 7 days.

This is what you don't get: Claymore is making more than $30 million in coins a year with his 2% cut. You tell me what miner would ever pay that much to a programmer? They can't even mine that many coins a year without having 300 rigs up and going; with upkeep, more like 500 rigs.
 
Can't increase the frequencies of HBM2 too much. Don't you remember the throttling that caused on Vega FE? Yeah, raising the HBM2 frequency caused the entire GPU to throttle.

Also, the GPU power % has nothing to do with power delivery to the RAM, so dropping power consumption by limiting power to the GPU does nothing to the VRAM frequencies.

And yes, we do power limit the GPU when mining. How else do you think I can get my 1070s to do 2000 MHz at, oh, 940 mV at a total power consumption of 100 W?

I've undervolted and power limited all my RX 580s too: -20 on the volts and -20% power limit. They still hit their max GPU frequency, which is set at 1340 MHz. If I don't change the power limit, the GPU will still draw 150 W+.

V ≠ W. V and amps have an inverse relationship: if you drop V and don't limit the power draw, you end up eating more amps but the wattage stays the same. But the GPU BIOS/drivers limit the wattage, so if you drop V while the wattage is limited, it can make the GPU unstable, because the GPU is made to run with a certain amount of V and is specced to do so. This is why, when undervolting, some GPUs can undervolt more; there is a range.

For normal silicon, I believe wattage scales with the square of voltage and linearly with frequency. It works out almost perfectly with my 5960X.

From stock to 4.8 GHz at 1.45 V I calculated 365 W; actual reported was 388 W. My everyday setting is 1.35 V at 4.5 GHz though... because $1,000 chip and all.
 
For normal silicon, I believe wattage scales with the square of voltage and linearly with frequency. It works out almost perfectly with my 5960X.

From stock to 4.8 GHz at 1.45 V I calculated 365 W; actual reported was 388 W. My everyday setting is 1.35 V at 4.5 GHz though... because $1,000 chip and all.


Yeah, V is to the power of 2, and the rest is exactly what you stated!
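Putting that scaling into a quick sketch: dynamic power goes roughly as P ~ f * V^2. The stock baseline for the 5960X below is my assumption (about 140 W at ~3.0 GHz and ~1.10 V), since neither post states one:

```python
# P_new = P_old * (f_new / f_old) * (V_new / V_old)^2
# Stock values below are assumed, not taken from the posts.

def scale_power(p0_w, f0_ghz, v0, f1_ghz, v1):
    return p0_w * (f1_ghz / f0_ghz) * (v1 / v0) ** 2

STOCK_W, STOCK_GHZ, STOCK_V = 140.0, 3.0, 1.10  # assumed 5960X baseline

print(f"{scale_power(STOCK_W, STOCK_GHZ, STOCK_V, 4.8, 1.45):.0f} W at 4.8 GHz / 1.45 V")
print(f"{scale_power(STOCK_W, STOCK_GHZ, STOCK_V, 4.5, 1.35):.0f} W at 4.5 GHz / 1.35 V")
```

With that assumed baseline the first line comes out around 389 W, in the same ballpark as the 388 W reported above.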
 
Yes, I know. I have 26 rigs going right now, half of them AMD RX 580s and the other half 1070s. I also have modded software that I'm triple mining with, which I have been developing on my test rig. So far it looks pretty good, but it's not very stable yet on all my rigs, so when I know it's more stable I will release it ;). The other downside is that the ETH hashrate drops 25%; this is not a side effect, since the third coin needs that GPU power but doesn't need bandwidth. And it gets 40% of the original hashrate. So if I can get this software stable, that will be great! But it's going to be hard, because of the way I branched the program I can't currently find where the conflicts are. Suspecting drivers... Out of the 10 systems I've tested on, 3 of them crash the third coin's mining; the other 7 have been running flawlessly for the past 7 days.

This is what you don't get: Claymore is making more than $30 million in coins a year with his 2% cut. You tell me what miner would ever pay that much to a programmer? They can't even mine that many coins a year without having 300 rigs up and going; with upkeep, more like 500 rigs.

Not sure how much of that relates to the post you're replying to, but whatevs.

You think of Claymore as a person; I always thought it was a dev team. And for all you know, it's the dev team of a large mining firm, and they released the Claymore miner to capture some of the hash they compete against. I also don't count 200 or 500 rigs as a large mining firm.
 
Not sure how much of that relates to the post you're replying to, but whatevs.

You think of Claymore as a person; I always thought it was a dev team. And for all you know, it's the dev team of a large mining firm, and they released the Claymore miner to capture some of the hash they compete against. I also don't count 200 or 500 rigs as a large mining firm.


Claymore being one dev or many people doesn't matter; it goes to one entity. This is why no miner with 1000 or even 2000 rigs can ever pay those guys. They won't. It just makes no damn sense to invest that much money into hardware and people; just to do that, you already need multiple millions of bucks, man. Their day JOB, which doesn't have all the damn fluctuations the crypto market has, pays a hell of a lot better, lol.

Just add up the bill for 1000 rigs... that is 3 million bucks, and the average coin you can only mine for, oh, one year? After that year the hardware is obsolete. See where it's all going?

Mining doesn't make that much money unless your hardware and electricity can easily be covered by your day job. Then add staff to look after the software and hardware, and forget about it; there is no money left.
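The arithmetic behind those two claims, using the thread's own round numbers (none of these are verified figures):

```python
# "$30 million a year from a 2% cut" implies this much mined through Claymore:
dev_fee = 0.02
claymore_cut_per_year = 30e6
print(f"${claymore_cut_per_year / dev_fee / 1e9:.1f}B/yr mined through Claymore")

# "1000 rigs ... 3 million bucks" implies this much per rig:
print(f"${3e6 / 1000:,.0f} per rig, before power, space, and upkeep")
```

If the $30M figure were right, that's $1.5B a year flowing through the miner and $3,000 a rig up front, which is the point: the dev fee scales with everyone's hashpower while a farm only scales with its own.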
 
Claymore being one dev or many people doesn't matter; it goes to one entity. This is why no miner with 1000 or even 2000 rigs can ever pay those guys. They won't. It just makes no damn sense to invest that much money into hardware and people; just to do that, you already need multiple millions of bucks, man. Their day JOB, which doesn't have all the damn fluctuations the crypto market has, pays a hell of a lot better, lol.

Just add up the bill for 1000 rigs... that is 3 million bucks, and the average coin you can only mine for, oh, one year? After that year the hardware is obsolete. See where it's all going?

Mining doesn't make that much money unless your hardware and electricity can easily be covered by your day job. Then add staff to look after the software and hardware, and forget about it; there is no money left.

If my rig can mine just a little and my electric bill isn't too much, how would Claymore look for a casual miner?

I have an i3 that I'll put an RX 470 in soon; it doesn't eat too much juice, and I can leave it running all day.
 
Not entirely. The PACKs were priced at MSRP/SEP. They never guaranteed the non-PACK pricing.

Does it matter if we can't even buy the things? This launch is a joke. AMD delayed the release this long just to have cards go out of stock in 10 minutes?
 
If my rig can mine just a little and my electric bill isn't too much, how would Claymore look for a casual miner?

I have an i3 that I'll put an RX 470 in soon; it doesn't eat too much juice, and I can leave it running all day.


Claymore works well; the second coin pays for the electricity for the most part at Pascal's current price, and when Pascal was at a buck it also took care of the dev fee. For me it's more about system stability, so losing 2% is not a big deal. With CCminer I have had issues with some of my overclocks and undervolts on some systems, so I've stuck with Claymore for the most part.

Just punch it into a coin profitability calculator; the one I use is CryptoCompare, and it's pretty close to actuals. Taking -5% or so off the figures you get should give you a very close estimate of your returns.
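For the casual-miner question above, those calculators boil down to something like this. Every input is a placeholder to swap for current numbers from a site like CryptoCompare:

```python
# Minimal profitability estimate: your share of the network hashrate times
# daily coin issuance, minus electricity, with the post's -5% haircut.

def daily_profit(my_mhs, net_ghs, coins_per_day, coin_price,
                 watts, kwh_price, haircut=0.05):
    share = (my_mhs / 1000.0) / net_ghs          # fraction of network hashrate
    revenue = share * coins_per_day * coin_price
    power_cost = watts / 1000.0 * 24 * kwh_price
    return revenue * (1 - haircut) - power_cost

# e.g. an RX 470 doing ~25 MH/s at ~120 W wall power (placeholder numbers):
print(f"${daily_profit(25, 80_000, 21_000, 300, 120, 0.12):.2f}/day")
```

Whether that beats the electric bill is exactly what decides if leaving an i3 + RX 470 running all day is worth it.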
 
Does it matter if we can't even buy the things? This launch is a joke. AMD delayed the release this long just to have cards go out of stock in 10 minutes?
The $499 Vega 64 sold out in less than a minute on Amazon. :) The $599 version lasted a few seconds longer.
 
Does it matter if we can't even buy the things? This launch is a joke. AMD delayed the release this long just to have cards go out of stock in 10 minutes?
I bought mine fine @ 5:40 PM EST. I bought the pack with the water-cooled card. Can't speak to the individual cards.
 
I get a ~10% increase in hash rate with a 10% increase in mem overclock. That isn't shit.

If you want to properly set up your AMD card for mining, you first mod the BIOS to reduce memory latency, then increase the memory frequency. Once you find your max mem overclock, start reducing the GPU clock (leave it at stock at the beginning), because you don't need GPU as much as memory bandwidth (I know you know this but seem to forget today). Stop reducing the GPU clock when you notice your hash rate decreasing, then start undervolting.

Now some of these steps you can't do on NV.

They should have at least OC'd the memory by 10%. I've read you can do more on Vega FE, up to 1125 MHz.

I think the BIOS has an integrity check now, so no flashing of user-created BIOSes allowed. Blame DRM.

Watch first 3 mins.

 
Seems to suffer really badly in DX11:
90083.png

I don't believe those numbers. I have run GTA V at full 4K with max settings (except advanced) and I get a solid 60+ FPS with 2x RX 480 (and a 480 should be worse than a 580). Even if you assume the 580 and 480 are equal and 100% CF scaling, the chart's number only reaches 36 FPS with two cards, which isn't right (unless something else is terribly wrong with the test rig).
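The sanity check in that post, spelled out: two cards at perfect CrossFire scaling can at best double a single-card result, so the chart's implied single-580 figure can't be squared with the observed numbers:

```python
# 18 fps is the single-card figure implied by the post's "only reaches 36 FPS".
chart_single_fps = 18
observed_cf_fps = 60      # what the poster reports with 2x RX 480
print(f"best case from the chart: {2 * chart_single_fps} fps "
      f"vs {observed_cf_fps}+ observed")
```

Either the review settings differ (see the MSAA point below) or something was wrong with the rig.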
 
I saw this over at a certain forum (which has tons of AMD fans but some are changing their tune after today) and someone made an interesting point about AMD lying:

slides-raja-39.jpg

slides-raja-46_small.jpg


Someone also aptly pointed out that Vega is still bottlenecked by its 4 compute engines (like Fiji) as illustrated in this game:
civ6_2560_1440.png


Anandtech asked AMD about this limitation, and they essentially said they could have added more but didn't feel like it because it's more work... lol!

I'm glad [H] called them out on their b.s. by clearly labeling it a failure. The Vega 56 seems alright but I'd still pick an AIB 1070 over it any day.
 
I don't believe those numbers. I have run GTA V at full 4K with max settings (except advanced) and I get a solid 60+ FPS with 2x RX 480 (and a 480 should be worse than a 580). Even if you assume the 580 and 480 are equal and 100% CF scaling, the chart's number only reaches 36 FPS with two cards, which isn't right (unless something else is terribly wrong with the test rig).

Almost every review shows Vega getting its ass kicked in GTA V. Are they all lying?
 
I don't believe those numbers. I have run GTA V at full 4K with max settings (except advanced) and I get a solid 60+ FPS with 2x RX 480 (and a 480 should be worse than a 580). Even if you assume the 580 and 480 are equal and 100% CF scaling, the chart's number only reaches 36 FPS with two cards, which isn't right (unless something else is terribly wrong with the test rig).

It matches other reviews that use ultra + 2x MSAA. Without MSAA the 480 gets around 31 FPS.

There's no conspiracy. Vega is just terrible.
 
I don't believe those numbers. I have run GTA V at full 4K with max settings (except advanced) and I get a solid 60+ FPS with 2x RX 480 (and a 480 should be worse than a 580). Even if you assume the 580 and 480 are equal and 100% CF scaling, the chart's number only reaches 36 FPS with two cards, which isn't right (unless something else is terribly wrong with the test rig).
Nearly all reviews were showing abysmal GTA V numbers. Vega seems to suffer in most other DX11 games too, but not nearly as bad as in GTA V.
 