NVIDIA’s Miner vs. AMD’s Whiner - CES 2022

Yep. Complaining about 4GB on this GPU is silly.

The only way to exceed 4GB of VRAM use is to try to render at a resolution and/or settings that this GPU is incapable of handling anyway.
I am just hoping they both work off the PCIe slot and don't require external power. If not, price difference be damned, I am gonna snag me an A2000.
 
It's fascinating to see the amount of vitriol AMD is getting for this video card in some places, including the AMD subreddit.

I am curious to see the performance, price development, and how intel's alternative is going to compare.
AMD in this case brought it upon themselves, but it is the reaction we were all expecting and I chuckle at it.
Intel's cards are going to be fine; price and availability are up for debate. I expect them to cost less than the AMD/Nvidia cards in their performance brackets, but depending on how well they mine and how Intel launches them, I expect them to be botted up almost instantly and sold at price parity with AMD/Nvidia on eBay and Amazon all the same.
 
People have been clamoring for Navi 2 to make its way down the product stack. Now that it is, people are bitching. Oh, well. 4GB is sufficient for 1080p, and a deterrent for miners.
 
Yep. Complaining about 4GB on this GPU is silly.

The only way to exceed 4GB of VRAM use is to try to render at a resolution and/or settings that this GPU is incapable of handling anyway.

Part of the problem is the 64-bit memory bus on the AMD card, which I doubt the Infinity Cache will completely fix. Having a 96-bit 6GB card would have been the best solution, IMHO. After all, it was done on the GTX 1050 3GB.
 
People will complain about anything. You could give them a briefcase with $25,000 USD in cash, they would find a way to bitch.

This is amazing news for AMD and for budget gamers. A $199 card that is actually decent hasn't happened in a long time. Bravo.
 
People will complain about anything. You could give them a briefcase with $25,000 USD in cash, they would find a way to bitch.

This is amazing news for AMD and for budget gamers. A $199 card that is actually decent hasn't happened in a long time. Bravo.
You can get excited about this 'booster' if you want, but the fact remains we are getting essentially the same card as the 1650 Super from over 2 years ago, or the RX 480 from 5? years ago. Both companies have enough resources and foresight to eliminate most of the whiners and miners.
 
People will complain about anything. You could give them a briefcase with $25,000 USD in cash, they would find a way to bitch.

This is amazing news for AMD and for budget gamers. A $199 card that is actually decent hasn't happened in a long time. Bravo.
I mean the taxes on that.... are you crazy!
That said, the wife's machine running an old i5 and a 1060 can hopefully get put out to pasture sometime soon. She now has my old 1440p monitor; it looks great but is only really good at 60Hz. It claims to work at 144, but it flickers like mad, so it's just a good way to get a headache fast. Either way, I'm looking for something that can do 1440p at 60 that I can build into something small and cute, and this looks to be the year to make it happen. I love my Nvidia cards, but I am more than happy to put some AMD love up in there.
 
Yep. Complaining about 4GB on this GPU is silly.

The only way to exceed 4GB of VRAM use is to try to render at a resolution and/or settings that this GPU is incapable of handling anyway.

In theory. Some games will put that to the test, though. I think 6GB and/or more bandwidth would help.

FM wrote:
The proof of the pudding for my theory here will be how the actual retail prices settle out. I am guessing that we will see the 6500XT stick a lot closer to its MSRP than the 3050 will.

No contest there. I figure the AIB models will push $250 for AMD and $300+ for Nvidia.
 
Wouldn't be surprised to see real pricing for the RTX 3050 hit $399 or more; 6500XT will never sniff $199's jock.. $299+
 
FM wrote:


No contest there. I figure the AIB models will push $250 for AMD and $300+ for Nvidia.


I think it also depends on availability. Some people are just fanboys of one brand or another, and the fact that AMD has had such limited production volumes this generation has created a supply/demand mismatch that has driven the pricing of their GPUs beyond what is warranted by the price and capability of equivalent Nvidia models.
 
Agreed on the 4GB; for the 1080p segment, 4GB is perfectly fine... and a nice way to shave a little off the price (assuming that actually happens at retail).
Also agreed the 3050s are all just going to end up in coin farms.
The AMD card... I mean, wait for benches, but it could well be AMD's next RX 570. The 570 was always a nice sweet spot over the 580; at one time you could get the 4GB 570s for a song compared to everything else around, and they performed within a whisker of the 580.
If the 6500 stays at least close to its MSRP and can manage 75fps or better in most titles at 1080p, it will probably sell decently if supply exists. The 6600 seems to average 75-120 depending on the game at 1080p... the 6500 will have to get close enough to be a good value. It's basically the I-give-up card: this rig will never game beyond 1080p, and I can't bring myself to play second-hand GPU roulette. lol
 
Wouldn't be surprised to see real pricing for the RTX 3050 hit $399 or more; 6500XT will never sniff $199's jock.. $299+
That wouldn't be the worst. My suppliers can get me a 6GB A2000 right now for $561.99 CAD, which is around $440 USD; given the two cards should perform about the same, I am not unhappy about those price points. And given its lower memory capacity, it may even be easier to actually get than the 3050. Though looking at it, I am now seeing a 4GB SKU of the A2000, which they can supposedly get me for $219 CAD. I am wondering what that would game like at 1080p?
 
I would be surprised if the 8GB 3050 is actually all that good for mining. I'm thinking ~30-35 MH/s, which is $1.50/day; it probably performs like a GTX 1080, I would think. Plus, who knows if they put their LHR nonsense in the card. Then you'd be looking at 22 MH/s at probably 80W, which is $1/day.
 
I would be surprised if the 8GB 3050 is actually all that good for mining. I'm thinking ~30-35 MH/s which is $1.50/day. Probably perform like a GTX1080 I would think. Plus who knows if they put their LHR nonsense in the card either. Then you'd be looking at 22 MH/s at probably 80W which is $1/day.
The 6GB A2000 offers a hash rate of 41 MH/s; the 3050 looks to be the same chip with more memory, so I would expect it to at least match that, possibly more depending on clock speeds.

Gaming-wise, that same 6GB A2000 performs at about 80% of an RTX 3060, so you can work your various comparisons from there.
 
You can get excited about this 'booster' if you want, but fact remains we are getting essentially the same card as the 1650 S from over 2 years ago or the RX 480 from 5? years ago.
The most popular card on Steam is still the GTX 1060. This AMD card would be a decent bump up from that for $199, assuming 1080p gaming.
 
The 6GB A2000 offers a hash rate of 41 MH/s; the 3050 looks to be the same chip with more memory, so I would expect it to at least match that, possibly more depending on clock speeds.

Gaming-wise, that same 6GB A2000 performs at about 80% of an RTX 3060, so you can work your various comparisons from there.

An unlocked 3060 gets 48MH/s so 80% is 38.5MH/s. Plus, if they include their LHR nonsense, which nets you ~72% of that = 28MH/s.

Either way, if it's $250+, the ROI isn't great for mining: you might not break even by the time ETH goes PoS.
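For anyone curious how those break-even numbers shake out, here's a rough sketch. The revenue rate and electricity price are assumptions pulled from the figures in this thread (~$0.045 per MH/s per day, i.e. 33 MH/s ≈ $1.50/day), not live market data:

```python
# Rough mining break-even sketch. All inputs are assumptions for illustration;
# revenue per MH/s per day moves with ETH price and network difficulty.
def payback_days(card_price, hash_mhs, watts,
                 usd_per_mhs_day=0.045, usd_per_kwh=0.10):
    revenue = hash_mhs * usd_per_mhs_day        # gross $/day
    power = watts / 1000 * 24 * usd_per_kwh     # electricity $/day
    profit = revenue - power
    return card_price / profit if profit > 0 else float("inf")

# Hypothetical $250 street-price 3050, LHR-limited to ~28 MH/s at 80W:
print(round(payback_days(250, 28, 80)))  # ~234 days to break even
```

At a $350-$400 scalped price the payback stretches toward a year, which is the point being made about PoS landing first.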
 
I would be surprised if the 8GB 3050 is actually all that good for mining. I'm thinking ~30-35 MH/s which is $1.50/day. Probably perform like a GTX1080 I would think. Plus who knows if they put their LHR nonsense in the card either. Then you'd be looking at 22 MH/s at probably 80W which is $1/day.
Probably not great, but the price is low and a mining farm could be buying hundreds of them, which would offset the low hash rate.
 
An unlocked 3060 gets 48MH/s so 80% is 38.5MH/s. Plus, if they include their LHR nonsense, which nets you ~72% of that = 28MH/s.

Either way, if it's $250+, the ROI isn't great for mining as in you might not break even by the time ETH goes PoS.
The RTX A2000 is capable of 41 MH/s in Ethereum mining with a power draw of 66W after some tuning. The core was overclocked by +100 MHz and the memory by +1,500, rather impressive for a PNY workstation card. The power slider was reduced to 95% and the fan speed was increased to 100%. The following results were observed:
  • NVIDIA RTX A2000 (Tuned): 41 MH/s @ 66W (0.62 PPW)
https://www.hardwaretimes.com/nvidi...ter-ether-mining-efficiency-than-the-rx-6600/

But yeah, the 6GB isn't a super attractive miner card; supposedly the 12GB one is, though, and that one is expensive unobtanium. But depending on the 3050's actual availability and actual price, paying the premium for an A-series workstation card that is available, versus potentially paying a scalper more for the 3050, makes it a super easy choice for me in favor of the A2000.
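That 0.62 PPW figure in the quote is just hash rate divided by board power. A quick sketch (the LHR-limited 3050 numbers are my guess from earlier in the thread, not measurements):

```python
# "PPW" (performance per watt) here = MH/s divided by board power in watts.
def ppw(mhs, watts):
    return mhs / watts

print(round(ppw(41, 66), 2))  # tuned A2000 from the article -> 0.62
print(round(ppw(28, 80), 2))  # hypothetical LHR-limited 3050 -> 0.35
```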
 
The RTX A2000 is capable of 41 MH/s in Ethereum mining with a power draw of 66W after some tuning. The core was overclocked by +100 MHz and the memory by +1,500, rather impressive for a PNY workstation card. The power slider was reduced to 95% and the fan speed was increased to 100%. The following results were observed:
  • NVIDIA RTX A2000 (Tuned): 41 MH/s @ 66W (0.62 PPW)
https://www.hardwaretimes.com/nvidi...ter-ether-mining-efficiency-than-the-rx-6600/

But yeah, the 6GB isn't a super attractive miner card; supposedly the 12GB one is, though, and that one is expensive unobtanium. But depending on the 3050's actual availability and actual price, paying the premium for an A-series workstation card that is available, versus potentially paying a scalper more for the 3050, makes it a super easy choice for me in favor of the A2000.

That is a 192-bit bus. I'm guessing that the 8 GB 3050 card is more like 128-bit given Nvidia's history. That reduction in overall bandwidth doesn't look like it's going to be good for mining. Those memory modules are going to have to be flying to hit 41. I would expect a similar power draw though.

I haven't seen any raw bandwidth numbers though.
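Bus width matters here because Ethash is memory-bandwidth-bound. A back-of-the-envelope peak-bandwidth comparison (the 14 Gbps GDDR6 data rate is an assumption, since no official spec was out):

```python
# Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps
def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

print(bandwidth_gbs(192, 14))  # A2000-style 192-bit bus   -> 336.0 GB/s
print(bandwidth_gbs(128, 14))  # possible 128-bit 3050 bus -> 224.0 GB/s
```

A third less bandwidth on the same chip is exactly why those modules "would have to be flying" to hit 41 MH/s.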
 
Lower power helps also.

So yeah... these will be purchased by the pallet.
Depends. If they stick with a 128-bit bus like on the mobile parts, then the mining rate will be trash, 8GB or not; if they go with 192-bit as they have for the other parts, then yes, for sure. But why pay $350 for something that would mine in the low 20s when you can pay $400 for something known to pull 41? If they gimp it with a 128-bit bus, miners won't touch it as long as the A2000 exists.
 
At least AMD isn't marketing the 6500XT as a 4K card, like the Fury.
 
People have been clamoring for Navi 2 to make its way down the product stack. Now that it is, people are bitching. Oh, well. 4GB is sufficient for 1080p, and a deterrent for miners.

if supply exists.

Hope the 6nm process allows AMD to make tons of these chips. Everything depends on availability now. So far, the Xbox and PS5 have placed a huge constraint on the availability of AMD GPUs.
 
Also, remember that not everyone is as old as us. There are teenagers today that may just be building their first PC. So the fact that there was similar performance 5 years ago doesn't matter cause those kids would have been 10.
I have kids ask me every year how to build them; one of our high schools offers an after-school program on how to assemble them and such, and there are always a few kids in there. These past years have been tough, though; the parts just don't exist at prices the kids can afford.
 
People will complain about anything. You could give them a briefcase with $25,000 USD in cash, they would find a way to bitch.

This is amazing news for AMD and for budget gamers. A $199 card that is actually decent hasn't happened in a long time. Bravo.

You mean I got to pay taxes on that bonus???
/s

Until it is on the shelves, it is an alleged $199 GPU. ;)
 
I think there will have to be at least one card at that price, but you probably won't be able to get it.

But even $250-$300 is not bad, assuming you could buy it.
 
Nothing there to disagree with. When is Ethereum scheduled to go proof of stake again? That date can't come soon enough.
Similar to how fusion power is always 20-30 years in the future, Ethereum's move from Proof of Pollution to Proof of Plutocracy is always 6-12 months from now.

Call me overly cynical, but I suspect that if it ever happens, it'll be either in the middle of a crash deep and long enough that the miners are dumping cards on eBay, or, much less likely, at a point when a boom is large enough that the other GPU coins can at least keep the miners in the black.
 
I really wanted rx480 perf in a half height card...oh well. Back to dreaming, I guess.
 
I really wanted rx480 perf in a half height card...oh well. Back to dreaming, I guess.
I'm hoping the 3050 is half-height; if not, I have a bunch of A2000s in the shopping cart. I have a bunch of Dell mini-towers that need GPU upgrades, and I can get the A2000s more easily than the P620s for only slightly more.
 