AMD and Nvidia Must Do More To Stop Shortages and Gouging

I think miners have every right to buy up the cards for their uses, but damn does it suck that considering 4-5 year old used graphics cards off Craigslist for $200 counts as "budget" these days.

Agreed completely.

I think everyone's looking at this the wrong way round. What they need to do is release a version of their product that's better for mining than graphics cards. Tweak the memory and CPU to give better hash rates. Perhaps research models that can help streamline the process of buffering blockchains. Sell them with specific coolers and fans that last longer. Charge a modest premium and at the same time decrease the ROI time enough that the cards are more efficient. Add interesting features, like "profiles" for specific algos. There's a million ideas that could be used.

How about selling GPUs with PCIe x1 interfaces on them? Sufficient for mining and MUCH simpler to implement on bigger rigs.

Include a single display port. Then, when they're resold, cheap gaming GPUs are available.

At the moment they're charging an extortionate premium for their compute GPUs, even though they're the same cores. The supposed "mining cards" aren't available anywhere in any decent numbers, have no display port, so no resale value, and they're based on the entry-level models only.

Yes, there's a finite number of GPUs made, but at least then they would be two separate products and people would stop whinging like babies that they can't play games.

But for all this at the end of the day the will has to be there, and it's not. Nvidia and AMD don't give a crap about your gaming. They care about selling as many of their cards as they can. ROI.

Also completely agree. I've thought a lot about this too. I think this is a much better idea than trying to gimp normal cards. Companies tried making mining-only versions of cards, but they cost as much as the full-fledged cards. So given the choice, obviously, people will choose the ones they can resell.
 
Another half-cocked story by CNBC, I guess.
I am hoping to build my daughter a PC next holiday season, no way that is happening at current component prices (GPU, SSD, DRAM). If I was in the market as a 1st time PC builder I'd likely just opt for a console until this blows over.

I don't see the GPU makers doing anything about it, perhaps some slight increase in units but nothing significant.

I'll happily sell my GTX 1070 for $800 if any of you GPU-crazy miners want it.

You could easily get her a Ryzen-based system with an APU?
 
I was thinking a bit on what AMD and/or Nvidia could do to dissuade people from using their GPUs for mining... then it came to me... (especially right now)... call Intel!
 
I would like to call out CNBC to hire real journalists ;)

Tae Kim, Investing Journalist

At the very least, people who understand that this:
Perhaps AMD and Nvidia can work with their retailer partners and limit sales to one or two cards per household. The companies can also incorporate better mining features for their cryptocurrency-specific mining cards or even take the drastic step of altering gaming graphics card firmware to limit mining functionality.

is total nonsense unless you're stuck in a communist country where there are other laws regarding luxury products. Or does the first C in CNBC stand for Communist?

If this story were about a different subject, it would never have been written; it would have been tagged as simple supply and demand and never seen the light of day.
 
I was thinking a bit on what AMD and/or Nvidia could do to dissuade people from using their GPUs for mining... then it came to me... (especially right now)... call Intel!

Yeah, it would be a shame if someone wrote code that acted like Meltdown or Spectre, directed at mining programs, and destroyed hash rates and the GPUs used doing it...
 
I was thinking a bit on what AMD and/or Nvidia could do to dissuade people from using their GPUs for mining... then it came to me... (especially right now)... call Intel!
Why would Nvidia or AMD care what particular software people choose to run their GPUs with?
 
So let me get this straight...CNBC - with their frequent Activist Investor Guests like Bill Ackman and Carl Icahn who can't advocate laying off people fast enough in an effort to 'extract shareholder value' from companies - is now suddenly concerned about...the people???

I'm literally in tears because I'm laughing so hard.
 
TLDR, yes GPU shortage is a problem for builders. I would suggest that stupid insane DDR4 prices are also a big problem for builders.
 
I cannot wait until the government taxes the shite out of crypto-anything.

I hope they do, and I hope they throw the book at the ones who aren't paying taxes on it. I've heard some people in this forum say things like "It's OK not to report some of your income." No, it's not. It's tax evasion, a felony. It may not catch up with you now, but it will years down the road when you least expect it. The house (IRS) always wins.
 
To make matters a tad more annoying for card makers, and even more so for gamers: according to Steve from Gamers Nexus, the RAM they put on cards is seeing increases of $20-30 instead of the usual $5-or-so fluctuations, which of course will raise prices even more.

I was thinking about building an HTPC/backup PC; not so sure now. Maybe with an AMD APU, if they are somewhat decent.
 
It is a free market, people. Just think about that and half your comments will go away. Instead of complaining, why not just see the good in it?

If I was at AMD or Nvidia, I would jack the prices on all my chips up as high as the market would bear. Extract the absolute maximum amount of money out of the crypto craze that I can right now.

In reality most crypto coins serve no purpose and create no value in society. If AMD and Nvidia were to make a couple billion off of them, then we could actually look back and say: good, we just put a good amount of investment into those two companies that they can use to fund further development of their GPUs. For AMD especially this is important, as their financial situation has been bad for a long time. If they could flip this into some stability, it would be very good for gamers longer term.

I'm wondering when Matrox, SiS, or even Samsung will figure out there is a market for affordable GPUs that are good for modern games?
A third party stepping into the ring and saying "Fuck it, we'll aim for the gamer" is what the market needs. Some of the old GPU makers might be long in the tooth, but surely Samsung could pull something out of its ass and literally corner the market.

When BTC got big, what happened? ASICs, that's what happened. So why are we using GPUs? The free market of miners got all pissy that ASICs would make them obsolete, so they invented other coins that made more use of GPU power. If some other company steps in and becomes a major player, some other coin is going to figure out how to make heavy use of whatever features that GPU has. And even if Samsung stepped in, if there was demand Nvidia would just increase production and undercut them with their market lead. Whoever has the best GPU is going to gouge for it, no matter what, be that Samsung, Nvidia, AMD, or anyone.



I'm a pessimist... This crypto bubble is never going to burst and we're all screwed.

If the crypto market doesn't burst then GPU makers will up capacity and things will correct. There is no way that humans are going to just see upper mid range cards sit at $1000 forever, sooner or later the market forces will react.
 
If the crypto market doesn't burst then GPU makers will up capacity and things will correct. There is no way that humans are going to just see upper mid range cards sit at $1000 forever, sooner or later the market forces will react.

I don't think GPU makers give a damn. But they will when game companies start losing money because people can't afford to buy high end graphics to play the games that require that much horsepower. We'll see.
 
Then there is a contract that says how many chips will be run. AMD in the Bulldozer days didn't satisfy their foundry contract. I remember them having to work out a deal where they ended up paying the penalty over quite a few years. Paying off that penalty was one of the key deals that got them somewhat profitable today, so that they can invest in R&D again.

In short, ramping up production is the worst thing that they can do. If they want to cater to miners then what they should do is develop a custom mining solution that the average person can't run at home. Something that the people mining in warehouses would use over a bunch of haphazardly placed GPUs.


Watch the end of this video. I cued it to start at 6:36 and the 7:12 mark is a good place to pause it. Yes, it is about car audio, but then towards the end you see many stacks of EVGA GPUs. Even guys selling speakers on the internet are dual purposing their warehouses to mining!




Price is such a douche.
 
Why buy a G-Sync monitor now? How do you power VR?

New games will need to have system requirements reduced. In the 'recommended' block they'll need to have cards that are economical and available.
 
I'd like to upgrade my PC from a PNY LC GTX 580 to an Nvidia 1070 Ti FE, but the shortage makes the cards cost almost an arm and a leg in some cases; gold might be cheaper soon if this keeps up.

Or you could buy a used GTX 680, 780, 980, or even stoop to the lowly level of the 770 or 970; you can find lots of fun games to play for which even those cards are pure overkill. I just picked up a used ATI 5xxx-series card from eBay; it runs some 2011 and even newer games great. I know you can find a solution; heck, there are some good RPGs that only use software rendering and are not too bad on the eyes.

I feel for you, getting a snazzy new card is a good feeling though.
 
When you are playing in the realm of 4K, if the game doesn't support SLI or isn't half a decade old, there is no overkill; everything is currently underkill. There's "almost enough," but not quite.
 
The only good news seems to be that progress has been incremental on many fronts for quite some time. My i7-2600K is still quite serviceable, although the platform is showing its age.

Based on what we can observe, it seems likely that the middlemen are profiting the most from this. I suspect that much of the finished product is rolling out the back doors of the factories straight into the hands of the miners. If retailers were getting significant stock, you would expect the prices to be sky high, but there would be SOME stock available. They would keep adjusting prices to find what the market will bear.

I was tempted to jump on one of the recent Dell desktop deals, given that the price differentials among the video cards were reasonable. The Intel/AMD hybrid chips look like they may be my only route to a reasonable upgrade in the near term. I haven't seen pricing or any indication that they will be available from retailers outside of NUCs and such. But they would seem to be inconvenient enough for the miners to co-opt cost-effectively.

I have been reading about shortages of silicon wafers and fab capacity driven by the ASIC demand, so there may be limitations to what AMD and Nvidia can do on the supply end. They need to walk a tightrope of sorts. If they let this drag on long enough to kill the PC gaming market, it may be a loooong time before it recovers. If the bubble bursts before the PC gaming market dies, they will have a pipeline of product to sell that will have to compete against the flood of dirt cheap used cards of dubious longevity.

Conspiracy time: is all this behind Intel's renewed interest in discrete graphics products and their hiring of Raja away from AMD?
 
I had a thought.

If nVidia and AMD put more effort into mGPU, then we can plug in 2x 1050Ti to get performance on par with a 1070Ti.
(Assuming they can get mGPU performance up to 95% efficiency.)

Since 1050Ti is about 50% of a 1070Ti, if the mGPU scaling had less performance loss, it would be possible to match 1070 or 1070Ti performance levels with two 1050Ti.
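Back-of-the-envelope, the math in that idea checks out, using the post's assumed numbers (not real benchmarks):

```python
# Hypothetical figures from the post: a 1050 Ti at ~50% of a 1070 Ti,
# and mGPU scaling at 95% efficiency. Two cards would then land at:
single_1050ti = 0.50       # performance relative to a 1070 Ti (= 1.0)
mgpu_efficiency = 0.95     # fraction of ideal 2x scaling

pair = 2 * single_1050ti * mgpu_efficiency
print(f"2x 1050 Ti ~= {pair:.2f}x a 1070 Ti")  # 0.95x, just shy of it
```

That 95% figure is the load-bearing assumption, though; real-world SLI/Crossfire scaling is usually well below it.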
 
When you are playing in the realm of 4K, if the game doesn't support SLI or isn't half a decade old, there is no overkill; everything is currently underkill. There's "almost enough," but not quite.
I'm happy with 1080p, you are right for 4K. I will probably go 8K and skip 4K. I don't see a huge improvement for the cost. I've got a decent monitor and was happy to go widescreen from my previous 1600x1200 lcd.
 
I am hoping to build my daughter a PC next holiday season, no way that is happening at current component prices (GPU, SSD, DRAM). If I was in the market as a 1st time PC builder I'd likely just opt for a console until this blows over.

I don't see the GPU makers doing anything about it, perhaps some slight increase in units but nothing significant.

I'll happily sell my GTX 1070 for $800 if any of you GPU-crazy miners want it.


Cheaper to just get her a laptop.
 
I had a thought.

If nVidia and AMD put more effort into mGPU, then we can plug in 2x 1050Ti to get performance on par with a 1070Ti.
(Assuming they can get mGPU performance up to 95% efficiency.)

Since 1050Ti is about 50% of a 1070Ti, if the mGPU scaling had less performance loss, it would be possible to match 1070 or 1070Ti performance levels with two 1050Ti.

I was thinking of AMD and Nvidia designing a device that had 20 or more GPU cores on it and shared HBM2 memory, sold as a system-in-a-box for mining. So what you're saying applies 100%. ;)
 
I had a thought.

If nVidia and AMD put more effort into mGPU, then we can plug in 2x 1050Ti to get performance on par with a 1070Ti.
(Assuming they can get mGPU performance up to 95% efficiency.)

Since 1050Ti is about 50% of a 1070Ti, if the mGPU scaling had less performance loss, it would be possible to match 1070 or 1070Ti performance levels with two 1050Ti.


Ugh. This is an awful idea.

I've had enough experience with both Crossfire and SLI to know that it is never a good solution.

You get higher max fps when you need it the least and often lower min fps when you need it the most. And that's before we even start mentioning such issues as the massive increase in input lag, stutter issues, compatibility issues, etc.

mGPU always looks good on an average FPS benchmark chart, but in real life the experience is subpar.

They are essentially the Dyno Queens of the computer world.

Experience has taught me that it's "single big GPU or go home".

This wasn't always the case, but it has been in the modern era. The last decent mGPU implementation was, IMHO, the Voodoo 2. But that was a different approach altogether, alternating each row of pixels between the GPUs.

Many of the problems with mGPU today are due to the insistence on using alternating frames. AFR ruins everything, but it is always used because it "scales better" than other approaches, by which they mean you see a bigger increase on an average-FPS bar chart.

They do this because the average idiot doesn't seem to realize that average FPS means nothing. The only measure to look at is minimum FPS. That's when you feel the impact of framerate drops, and in this regard AFR modes usually do no better than a single GPU and often even do worse. At least this has been my experience.

So, more mGPU? No thanks.
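To illustrate the average-vs-minimum point with made-up numbers (not benchmark data), two runs can share the same average FPS while feeling completely different to play:

```python
# Two hypothetical frame-rate traces with identical averages.
smooth  = [60, 58, 62, 60, 60]   # single GPU: steady
stutter = [90, 90, 15, 90, 15]   # AFR-style: high peaks, nasty dips

for name, fps in (("single GPU", smooth), ("AFR", stutter)):
    print(f"{name}: avg={sum(fps) / len(fps):.0f} fps, min={min(fps)} fps")
# Both average 60 fps, but the AFR trace bottoms out at 15.
```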
 
I want a central, all-house, GPU. I envision a big box with thick power cables. A heat mirage shimmering off of it while a hum fills the air. All the computers have wires snaking into it, while the Big GPU (call it "Lubyanka"?) sends its output to every monitor.

Buy once, cry once. Sure, ten pounds of processed silicon may cost a bit, but it'd get me away from the hodgepodge of upgrades.

I can reduce my GPU footprint from 6-10 (depending on how many computers I'm running) to just one.

I think this is a great idea.

AMD, if you copy this, I claim patent rights. Or something.
 
On what grounds can someone call out a corporation for purposely not creating enough supply and driving up demand?
Isn't that what practically tons of other companies do? When's the last time I saw an article lambasting Nintendo for doing this with its previous three generations of consoles?
Does Nvidia/AMD have any obligation to meet the demands of consumers?

There aren't any grounds when there's a duopoly, unless there is collusion. Also, the companies do want to make more. Would you rather make $300 profit on 1,000 units or $100 profit on 6,000 units? I'm oversimplifying, but every company is "greedy" and tries to maximize its income, so to think they are selling less on purpose doesn't really make sense.
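The unit numbers in that comparison are hypothetical, but spelled out they make the point:

```python
# Which hypothetical strategy earns more total profit?
scarcity = 300 * 1_000   # $300/unit on 1,000 units
volume   = 100 * 6_000   # $100/unit on 6,000 units
print(scarcity, volume)  # 300000 600000 -> volume wins 2:1
```

So if they could cheaply make more, selling less on purpose would be leaving money on the table.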
 
I had a thought.

If nVidia and AMD put more effort into mGPU, then we can plug in 2x 1050Ti to get performance on par with a 1070Ti.
(Assuming they can get mGPU performance up to 95% efficiency.)

Since 1050Ti is about 50% of a 1070Ti, if the mGPU scaling had less performance loss, it would be possible to match 1070 or 1070Ti performance levels with two 1050Ti.

I don't think 1050 and 1050 Ti support SLI.
 
cheaper to just get her a laptop.

She is a gamer, so probably not. I have a 950 in an old Q6600 system, so she would have to keep using that until GPU prices come down. Other than cases and motherboards, most components are just priced higher, which really hurts entry-level gaming builds.

I did a build recently and probably ended up spending an extra $200 more than I needed to, which for me is no big deal, but when you are working at the low end every $30-$50 really adds up and blows up the budget.
 
She is a gamer, so probably not. I have a 950 in an old Q6600 system, so she would have to keep using that until GPU prices come down. Other than cases and motherboards, most components are just priced higher, which really hurts entry-level gaming builds.

I did a build recently and probably ended up spending an extra $200 more than I needed to, which for me is no big deal, but when you are working at the low end every $30-$50 really adds up and blows up the budget.

No, he's probably right. You can get gaming laptops far cheaper and with better graphics capability than building a comparable machine and paying these high prices for top end cards.

Best Buy has laptops with 1050 and 1050 Ti cards in them for under $900, 1070 cards for $1,600, etc.

Or spend your money on nice gear and short the machine with a low end gaming card like a 1050 and live with 1080P until such time as cards are available and reasonable and she's ready to step it up.
 
No, he's probably right. You can get gaming laptops far cheaper and with better graphics capability than building a comparable machine and paying these high prices for top end cards.

Best Buy has laptops with 1050 and 1050 Ti cards in them for under $900, 1070 cards for $1,600, etc.

Or spend your money on nice gear and short the machine with a low end gaming card like a 1050 and live with 1080P until such time as cards are available and reasonable and she's ready to step it up.

The laptop versions are usually gimped with lower stock clocks, different caps, and lower power envelopes (so they throttle quicker). Not apples to apples (in most cases). As always, deep dive into the specs (CUs, ROPs, SPs, stock clocks, TDP).
 
I had a thought.

If nVidia and AMD put more effort into mGPU, then we can plug in 2x 1050Ti to get performance on par with a 1070Ti.
(Assuming they can get mGPU performance up to 95% efficiency.)

Since 1050Ti is about 50% of a 1070Ti, if the mGPU scaling had less performance loss, it would be possible to match 1070 or 1070Ti performance levels with two 1050Ti.

That's what I think Navi is about, but without the overhead associated with SLI (special drivers written by AMD/Nvidia, special implementation by game devs, doubling of resources, and sending the information twice, once for each card).
 
Sure, Nvidia could produce more but they might have a higher marginal cost to really ramp up production.. and for what? They don't get any of the retailer profit.

Probably an unpopular opinion, but isn't this problem exacerbated by the fact that AMD Vega third-party cards are basically nonexistent? From my viewpoint there seems to be lots of Nvidia stock that is rapidly getting bought out, but not so much AMD stock that even exists.

They don't get any retailer profit, but the fact that prices are up 200% implies that demand is significantly higher than supply, which means they could sell a higher volume of cards at the same profit margin, which would mean larger profits for NVDA. How much more money would Nvidia make if they sold 200% more cards in order to get retail prices back to equilibrium?
 
It's going to suck, but I can see Nvidia and AMD raising their prices for the new GPUs. A 2080 at $1K MSRP for all! Then miners will jack it up to $2K!
 
An article from CNBC is calling out AMD and Nvidia for not doing something to combat the product shortages and price gouging occurring in the GPU market. Speaking about the shortages, Shrout Research's Ryan Shrout wrote in an email "Because of that, many enthusiasts and DIY builders are putting off system builds and upgrades completely, setting up an unfortunate situation for all other component vendors from processors to motherboards to storage."

I'm honestly shocked that CNBC is reporting on this; that is pretty bad press for AMD and Nvidia. I really like what Ryan Shrout had to say as well. I never thought about the crypto-craze harming other segments of the PC market, but it makes complete sense. It's nice that AMD made a statement when contacted by CNBC, but it feels like little more than lip service when Vega 64s are going for $1,300.

Nvidia declined to comment for this story.

AMD sent the following statement when asked about the graphics card shortages for PC gamers due to the cryptocurrency mining:

"The gaming market remains a priority for AMD and gamers are a primary focus across our GPUs, CPUs and semi-custom game console products."

AMD and Nvidia can talk all they want, but behind the scenes they love this... it is $$$ in their pockets, and the investors are loving it. Shrout can talk about this... but again, the GPU folks are making them as fast as they can... enjoy the wave... it sucks for gamers right now... but that's the breaks.
 
I wanted to build a new PC but will not do it until prices stabilize.
If this continues, I will never build another PC. That simple.
 
The laptop versions are usually gimped with lower stock clocks, different caps, and lower power envelopes (so they throttle quicker). Not apples to apples (in most cases). As always, deep dive into the specs (CUs, ROPs, SPs, stock clocks, TDP).


Yes, but some support G-Sync and that can make a real difference.
 
I wanted to build a new PC but will not do it until prices stabilize.
If this continues, I will never build another PC. That simple.

Could be worse. In 1990, my 20MHz 386DX with 1MB of RAM, a 40MB HD, 512K video, and a 13" VGA monitor was $2000. In today's terms, with prices doubling every 23.4 years (at an average of 3% inflation), that's a 1.03^28 ≈ 2.29 price factor, or about $4,576 in today's dollars.
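The compounding in that post works out; a quick check, assuming a flat 3% annual rate over the 28 years from 1990:

```python
# 3% compounded for 28 years: the price factor and the adjusted price.
price_1990 = 2000
factor = 1.03 ** 28
print(f"factor ~= {factor:.2f}")                    # ~= 2.29
print(f"adjusted ~= ${price_1990 * factor:,.0f}")   # ~= $4,576
```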
 
How much more money would Nvidia make if they sold 200% more cards in order to get retail prices back to equilibrium?

The question though is how much does it cost to double production? I doubt the foundries just have that kind of capacity lying idle.
 
They need a new card with three cores on it. The miners will love it, and VR can have one for the game, and one for each eye. :p.
But really I hope after this dies down the companies release a killer VR card.
 