NVIDIA’s Miner vs. AMD’s Whiner - CES 2022

Heck, if 4GB really works to deter miners, the college kids next door to me would probably buy a 6700 with 4GB of RAM. They're only on 1080p monitors anyway, and right now one of them is running a 980 Ti and the other a 1050 Ti.
How would you run 4GB in a 192-bit card? A 4GB 6600 or 3050 might help.
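(A rough sketch of the bus-width arithmetic behind that question, assuming one standard 32-bit-wide GDDR6 chip per channel in 1GB or 2GB densities; that's a simplification for illustration, ignoring clamshell and mixed-density configs.)

```python
# Sketch: why VRAM capacity is tied to the memory bus width.
# Assumes one GDDR6 chip per 32-bit channel, in 1GB or 2GB densities
# (an assumption for illustration; clamshell/mixed configs are ignored).

CHIP_WIDTH_BITS = 32
CHIP_DENSITIES_GB = (1, 2)

def possible_capacities_gb(bus_width_bits: int) -> list[int]:
    """VRAM sizes you get by filling every 32-bit channel with one chip."""
    channels = bus_width_bits // CHIP_WIDTH_BITS
    return [channels * density for density in CHIP_DENSITIES_GB]

for bus in (64, 128, 192, 256):
    print(f"{bus:>3}-bit bus -> {possible_capacities_gb(bus)} GB")

# 64-bit  -> [2, 4] GB   (where a 4GB 6500-class card lands)
# 128-bit -> [4, 8] GB   (why a hypothetical 4GB 6600 or 3050 is straightforward)
# 192-bit -> [6, 12] GB  (4GB doesn't divide evenly across six 32-bit channels)
# 256-bit -> [8, 16] GB
```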
Ah, yeah, you're right. Still 60W more than 107, though.
Yeah, but total system power is like 190W vs. 250W, and this thing has half the VRAM. That is rather sorry progress in 5 years.
 
How would you run 4GB in a 192-bit card? A 4GB 6600 or 3050 might help.

Yeah, but total system power is like 190W vs. 250W, and this thing has half the VRAM. That is rather sorry progress in 5 years.
I agree, but that's about where we are with current processes and materials. Until they change those or come up with a new way to do graphics processing, it'll probably be almost all side-grades for a while.
 
Oh god, you're one of those 'new normal' people now, aren't you?
Not sure who you are, Mr. 40 posts, but yes.

Things in the world are accelerating rapidly with the pandemic, with the economy, with the global production and distribution infrastructure, with cryptocurrency.

If you want to pretend that things are the same as 2016, do whatever you wish, but I live in reality.
 
Not sure who you are, Mr. 40 posts, but yes.

Things in the world are accelerating rapidly with the pandemic, with the economy, with the global production and distribution infrastructure, with cryptocurrency.

If you want to pretend that things are the same as 2016, do whatever you wish, but I live in reality.
Crypto is literally too big to fail for the time being; too many people have too much money invested. It could be a fad, but it's going to take a decade or so for it to die, and it's going to be kicking and screaming the whole way down. GPUs are incredibly good at straight-up number crunching; it's their one job and they do it. Industry uses for them have vastly outpaced manufacturing capacity, and until the two reach a degree of parity, paying more for what could be argued to be less is going to be the new norm. But that means that developers will be aiming at those brackets for years. And that's where it's at.

Back in the day we were all butthurt because the consoles and their hardware were holding back most game development, thanks to the older standards they were keeping afloat. Now we'll get to spend the next 5 years or so in a similar situation, because the average hardware used by PC gamers can't advance due to a lack of product.

I'm sure developers will still find awesome ways to make the best of this situation. As much as this sucks, I look forward to what it creates for back-end engine development.
 
Industry uses for them have vastly outpaced manufacturing capacity, and until the two reach a degree of parity, paying more for what could be argued to be less is going to be the new norm. But that means that developers will be aiming at those brackets for years. And that's where it's at.
Is industry demand that much greater than it was 5+ years ago? To me the crunch seems like a one-two punch of supply chain + crypto. I have to imagine they're shipping & selling more gaming GPUs today than ever before. But to your point, I could see them setting aside more high-end silicon for pro equipment.

Agree that I hope the challenges lead to clever solutions that lead to more efficiency and better prices down the line.
 
Is industry demand that much greater than it was 5+ years ago? To me the crunch seems like a one-two punch of supply chain + crypto. I have to imagine they're shipping & selling more gaming GPUs today than ever before. But to your point, I could see them setting aside more high-end silicon for pro equipment.

Agree that I hope the challenges lead to clever solutions that lead to more efficiency and better prices down the line.

The demand specifically for GPUs? No. It's higher, yes, but not exponentially so (well, as long as we ignore mining for the moment). However, the demand for microprocessors has increased by orders of magnitude even over the last few years. Because of this, every single company capable of producing at a large scale (and there aren't very many) is completely maxed out on capacity. AMD and Nvidia have to compete for line usage with companies far larger than themselves. Neither is able to scale up to meet the new demand brought by crypto because there simply isn't any space available to expand into. On its own, crypto wouldn't have had this kind of effect over this long a period. Both companies would have been able to scale production to get more products on the market. The crypto boom happened right as everything else was taking a nosedive. It's part of the problem, but there are a lot of different factors contributing. COVID is also a pretty big factor as companies try to recover and also respond to new variants.

The above is a pretty simple overview of the problem, but it’s mostly how I understand it.
 
Reading that the poor 6500 has a 64-bit memory interface. The more I read, the less impressed I am. I want to appreciate what AMD is doing, and yeah, Nvidia's low-end cards are going to be impossible to find when the miners buy them off the boat.

Still, I'm starting to feel pretty damn sad about getting a card that seems more and more unlikely to best a 5-year-old 580/570. I look forward to benchmarks just to see how sad it is. 5-year-old performance for that brand-new-card price seems like a terrible proposition. I mean, they will be in stock... and I guess they will sell to people looking to build low-end gaming machines that are at least a minor step up from an iGPU. (Then again, AMD also has new RDNA2 iGPUs coming.)
 
Also, $199 in 2016 dollars is worth $230 in 2022 USD, so this card is technically cheaper.
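(Roughly the math, assuming cumulative US CPI inflation of about 15.5% from 2016 to early 2022; that percentage is my ballpark figure for illustration, not an official number.)

```python
# Back-of-the-envelope inflation adjustment for the $199 launch price.
# The ~15.5% cumulative CPI change for 2016 -> early 2022 is an assumed
# illustrative figure; use official CPI data for a precise number.

msrp_2016 = 199.00
cumulative_inflation = 0.155  # assumed 2016 -> 2022 US CPI change

msrp_in_2022_dollars = msrp_2016 * (1 + cumulative_inflation)
print(f"$199 in 2016 is roughly ${msrp_in_2022_dollars:.0f} in 2022 dollars")
# -> $199 in 2016 is roughly $230 in 2022 dollars
```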
 
Reading that the poor 6500 has a 64-bit memory interface. The more I read, the less impressed I am. I want to appreciate what AMD is doing, and yeah, Nvidia's low-end cards are going to be impossible to find when the miners buy them off the boat.

Still, I'm starting to feel pretty damn sad about getting a card that seems more and more unlikely to best a 5-year-old 580/570. I look forward to benchmarks just to see how sad it is. 5-year-old performance for that brand-new-card price seems like a terrible proposition. I mean, they will be in stock... and I guess they will sell to people looking to build low-end gaming machines that are at least a minor step up from an iGPU. (Then again, AMD also has new RDNA2 iGPUs coming.)
It gets spicier when you think about the fact that the cards will be PCIe x4, and will be fine if run at PCIe 4.0, but how well is that card really going to do when socketed into a PCIe 3.0 board? Every aspect of these cards is a straight downgrade except for power consumption, and I'm sorry, but 5-6 years to save 60W is just sad. The fact that anyone is defending this is mind-boggling.
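(Rough numbers on why the x4 link matters; a sketch using the nominal per-lane PCIe throughput figures, which are approximations, and the real-world hit depends on how often the 4GB card has to spill into system RAM.)

```python
# Nominal one-direction bandwidth of a PCIe x4 link by generation.
# Per-lane figures are the usual headline numbers (~0.985 GB/s for 3.0,
# ~1.969 GB/s for 4.0 after encoding overhead); treat them as approximations.

PER_LANE_GBPS = {"PCIe 3.0": 0.985, "PCIe 4.0": 1.969}
LANES = 4

for gen, per_lane in PER_LANE_GBPS.items():
    print(f"{gen} x{LANES}: ~{per_lane * LANES:.1f} GB/s")

# PCIe 3.0 x4: ~3.9 GB/s  -- what the card gets in an older board
# PCIe 4.0 x4: ~7.9 GB/s  -- what it was designed around
# With only 4GB of VRAM, anything that overflows into system RAM has to cross
# this link, which is why the halved bandwidth on 3.0 boards can hurt.
```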
 
It gets spicier when you think about the fact that the cards will be PCIe x4, and will be fine if run at PCIe 4.0, but how well is that card really going to do when socketed into a PCIe 3.0 board? Every aspect of these cards is a straight downgrade except for power consumption, and I'm sorry, but 5-6 years to save 60W is just sad. The fact that anyone is defending this is mind-boggling.
I can understand why they are being defended. Which might be sadder still. lol
 
Is industry demand that much greater than it was 5+ years ago? To me the crunch seems like a one-two punch of supply chain + crypto. I have to imagine they're shipping & selling more gaming GPUs today than ever before. But to your point, I could see them setting aside more high-end silicon for pro equipment.

Agree that I hope the challenges lead to clever solutions that lead to more efficiency and better prices down the line.
Microcontrollers and the like are in more and more things: light switches, electrical sockets, heating ducts. Hell, based on the Palo Alto and HPE Aruba presentations I saw today, my new firewall is likely to be powered by a pair of A100s, and my L3 & L2 environment will also have some sort of AI-based software controller that will likely be using an Nvidia chip as well. "Smart" functionality is in just about everything now and it all needs some sort of controller, and since the cost of raw silicon has doubled in the last 5 years, if you aren't using the newer, smaller nodes then you aren't going to be making a profit, because you need to get as much out of that wafer as you can.
 
Kind of tacking on, but a big use is also going to be in provider networks. While a lot of these things currently use FPGAs for acceleration and cryptography, eCPRI setups in the 5G/Open RAN world are increasingly using GPUs. The cable and optical space will probably stick with FPGAs, but as more open systems come out (kind of like the firewalls mentioned above) and we move from appliance-based gear to general-purpose compute boxes basically running anything, you're going to see that side of things increase hugely.

-Edit: another large increase is the huge amount of compute being put at the edge as well. Traditional datacenter use isn't going down, but GPU, AI, and cryptographic acceleration are being implemented at the edge in increasing numbers for latency improvements and security.
 
Hm, makes me wonder if enterprise surplus will end up filling a niche on the consumer side eventually. Like, when all these miners are done with A2000s, will they run Crysis?

Or put another way, is what we're witnessing the industry pushing consumer side pricing up to match enterprise? I know it's not that simple but it feels like that's what's going on the more I think about it.
 
Or put another way, is what we're witnessing the industry pushing consumer side pricing up to match enterprise? I know it's not that simple but it feels like that's what's going on the more I think about it.
More greedy crypto miners constraining the consumer market with industrial applications. The more LHR and similar crap there is to stratify the market, the better for gamers. This crap wouldn't exist if they had to buy $3k+ cards like everyone else making money.
 
I think AMD's choice of 4GB is a good one. Obviously it's not high end, but this is a $199 1080p card, so I think it's good enough.

But it doesn't scale. If that is the only way to block miners, what about people on 1440p/4K monitors or high-end rigs?

I agree with Frank that these software locks will be cracked in 5 minutes. But maybe there is some other way to do it.

Like maybe before buying a GPU, you have to play a CS:GO match, with only one chance, and get a certain number of kills. Then you get a ticket to buy the card, lol.

Or, they could do verification through Steam. Like maybe you will need to have an account for at least 1 year, with at least 25 games in your library, and 100 hours of total play time, in order to be allowed to buy anything.

Or, simply require a government ID or passport. You can only buy one video card every 6 months and it is tracked via your ID across all websites, so no one can scalp and miners can only buy 1 card.

But honestly the only people getting screwed are us, so I doubt anyone is going to do any of this.
 
I think AMD's choice of 4GB is a good one. Obviously it's not high end, but this is a $199 1080p card, so I think it's good enough.

But it doesn't scale. If that is the only way to block miners, what about people on 1440p/4K monitors or high-end rigs?

For 4K, why not just get a 6900XT?

They are pretty available, and I often see them at Microcenter for $1,599(ish). Sure, it's not AMD MSRP, but it's still AIB MSRP and not that much over the AIB price when compared to, say, a 3080.
 
Why is the focus of mining always on ETH? It's only 1 piece of the 1,000-piece puzzle of mining. And it's not the most profitable coin to mine right now.

Those 4 GB AMD cards will get bought up by miners just the same to be used for other coins.
 
For 4K, why not just get a 6900XT?
Well I have a 6800XT that I'm happy with (though not happy I paid $1,600 to a scalper on eBay). It's fine and I don't plan on upgrading anything for several years.

I figured I'd do one full rebuild (my last fully new computer was 8 years ago), so now that everything is modern, I can hold onto this for at least like 4 years.
 
Why is the focus of mining always on ETH? It's only 1 piece of the 1,000-piece puzzle of mining. And it's not the most profitable coin to mine right now.

Those 4 GB AMD cards will get bought up by miners just the same to be used for other coins.
The old Nvidia 670 only had 2GB of RAM, and I used it on my 1920x1200 monitor in 2012. If 4GB is too friendly to miners, perhaps 2GB would be a better size?

Edit: Anandtech reviewed the 670 at 2560x1600, so maybe 2GB would be more than fine at 1080p.
 
I can't help but think -- especially in AMD's case -- that these are laptop parts being pushed to desktop.

Both companies need products to deal with Intel's upcoming mid-tier Arc, and these would do. AMD in particular, since the clocks are so high. An undervolted version seems like it would be great as a discrete mobile GPU. Also, AMD keeps hinting at 8GB versions; those would also make sense for laptops. Not too many miners are mining on mobile.
 
Same performance, but probably cheaper to produce. Maybe when AMD uses their chiplet design on the next-generation cards there will be better stock.
 
I can't help but think -- especially in AMD's case -- that these are laptop parts being pushed to desktop.
Well it's the same work either way. Laptop GPUs these days aren't vastly different from desktop.

So maybe they started with the mobile idea and realized it could scale to desktop.

But I would believe they wanted a $199 desktop card to begin with, as it does fill a huge hole in the market.
 
I think it's a bit too narrow to assert that the NV card is only for miners and that the AMD card is "just fine" with only 4GB, even for gamers. Yes, lots of people will play at 1080p, but if they want to push higher framerates to go along with those high-refresh-rate monitors, and/or intend to play more demanding titles than multiplayer shooters from 5+ years ago at minimal settings, 4GB can be limiting. If you wanted to play some sort of single-player title at 60+ FPS, even something like a modded Bethesda (Elder Scrolls, Fallout, etc.), CDPR (Witcher), Mass Effect, or a ton of other games on something like Nexus, the low VRAM will mean lower settings and limited mod compatibility. That's to say nothing of modern AA and AAA titles that, even at 1080p, may be very limited without 8GB of VRAM for nicer textures and the like. Emulation is another issue: more recent or demanding consoles, up to and including the Wii, Wii U, PS3, X360, and Switch, have been shown to benefit depending on circumstances. Furthermore, frankly, even if it's just "the possibility of mining", AMD doesn't need to give people any more reasons to buy its competitor instead. Gaming/general-use buyers seem to be choosing Nvidia significantly over AMD, propelled by everything from the possibility of mining well in their downtime to Nvidia's push for "DLSS + RTX is the only thing that matters and we have it", in the wake of, for the first time in years, not being able to dangle the performance crown heavily over AMD (AMD's mid and higher end are at parity with the NV higher end in regular rasterized features, so of course NV makes it about raytracing and DLSS now, because that's an area they still hold over AMD, at least at card launches).

So ultimately AMD is still seemingly on the back burner for desirability despite a great showing and value in certain segments (admittedly, the lack of anything but tariff-heavy AIBs makes it difficult in many areas), so they don't need to give gamers another reason to just spend the additional few bucks on Nvidia instead; they're already coming from behind. The additional cost of just another 4GB of RAM would help a number of gaming-related cases (even if both cards with the same amount of RAM would still favor NV for ETH mining because of architectural changes), and it would still likely mean that prices wouldn't be jacked quite as high for AMD as they will be for NV. 4GB for a "media center" GPU or "non-intensive, retro-style use" is one thing, but for gaming in 2022 there are enough cases where 8GB is useful that, for a general 1080p gaming card, I think they could have gone that direction, not to mention that they don't need to give people any more reasons to doubt buying AMD and instead seek out NV.
 
I think it's a bit too narrow to assert that the NV card is only for miners and that the AMD card is "just fine" with only 4GB, even for gamers.
I think the only thing that was asserted is that both cards are firmly focused on 1080p gaming and 4GB would be fine for most people gaming at that resolution. Nothing more, nothing less.

Also, if you have been keeping up with VRAM costs over the last couple of years, it would be nearly impossible, and certainly not a good business decision, to do 8GB at $199 MSRP.

So yes, I agree with you, this is not the card for everyone, nor did I ever say that, nor did anyone else here either. I could churn out another few paragraphs of what it would not be good for just like you.
So ultimately AMD is still seemingly on the back burner for desirability despite a great showing and value in certain segments
AMD still behind? Yes. AMD selling every single die it can get into packaging? Yes.
 
Well Frank said 8GB cards are coming later in the pipeline. I think this was just to create an initial flood and (hopefully) get the cards into the hands of gamers.

4GB is not great, but I think it's okay for 1080p. You could still probably get to 144Hz on this card with mid/low settings. The framerate is mostly about the clock speed of the card, not the amount of RAM, and this card is very fast at 2.6GHz.

And you could also use FSR/RSR. Although it doesn't look great at lower resolutions, I think 900p would still look okay and would let you run higher settings (maybe max) and still get 60 fps.

There are also a whole lot of kids that only play games like Fortnite, and I think this card would probably be enough. So I don't know, for $199 it is looking pretty good.
 
With all these cards that are coming out, I am glad that my RX 5700 is working well. If it fails and I cannot afford a new one or cannot justify a new one, maybe I will then just pick up a 5700G or whatever the equivalent is in the future. Essentially, even if the 3050 and 6500XT were at MSRP, they would not be worth it, at least to me.
 
Or, they could do verification through Steam. Like maybe you will need to have an account for at least 1 year, with at least 25 games in your library, and 100 hours of total play time, in order to be allowed to buy anything.

Or, simply require a government ID or passport. You can only buy one video card every 6 months and it is tracked via your ID across all websites, so no one can scalp and miners can only buy 1 card.

But honestly the only people getting screwed are us, so I doubt anyone is going to do any of this.
I pray all that was just horribly played sarcasm
o_O
 
I pray all that was just horribly played sarcasm
o_O
It would just be easier for the government to regulate online stores and demand they do a better job at security and at interfacing with the backend. It's poor security and design that lets the scalper bots do what they do. Mandating better security and design practices would be a more effective process, and better for everybody in the end.
 
The government can't even do a bunch of stuff, for Soapbox reasons.
The US/Canadian government can't even soapbox a soapbox; still more likely to happen than a soapbox soapboxing a soapbox. I expect the EU will pass some regulations which will force the majority of online retailers to shape up, followed shortly by California changing a few arbitrary names, rearranging a few clauses, and putting out their own version.
 
Why is the focus of mining always on ETH? It's only 1 piece of the 1,000-piece puzzle of mining. And it's not the most profitable coin to mine right now.

Those 4 GB AMD cards will get bought up by miners just the same to be used for other coins.

ETH gets the attention because, the last time I looked, it was worth about 90% of the dollar value of newly GPU-mined coins.
 
With all these cards that are coming out, I am glad that my RX 5700 is working well. If it fails and I cannot afford a new one or cannot justify a new one, maybe I will then just pick up a 5700G or whatever the equivalent is in the future. Essentially, even if the 3050 and 6500XT were at MSRP, they would not be worth it, at least to me.
The wife and kids better hope my 5700 doesn't die.

If it did... the game would go like this:
How long till my wife realizes her machine is running on its iGPU?
If I installed a couple extra case fans she might not notice for weeks. Maybe... she might get suspicious on the third or fourth night I head her off before firing up Anno, to go watch Netflix with me instead. lol
 