AMD Is Losing Ground to Intel in This Key Chip Market (discrete video cards)

Status
Not open for further replies.

Armenius

Extremely [H]
Joined
Jan 28, 2014
Messages
41,724
Is this the competition you wanted? AMD's marketshare in discrete video cards fell from 17% to 8% as Intel captured 4% in the most recent quarter reported by Jon Peddie Research, while NVIDIA's marketshare grew from 83% to 88%. AMD's lost share was split between its two competitors, with NVIDIA benefiting the most. The JPR report has been floating around for a month or so, but I haven't seen any market analysis on it yet. We'll see what Q4'22 and Q1'23 bring now that both AMD and NVIDIA have a new generation of products on the market.

https://www.nasdaq.com/articles/amd-is-losing-ground-to-intel-in-this-key-chip-market
Advanced Micro Devices (NASDAQ: AMD) and Nvidia (NASDAQ: NVDA) hold a near-duopoly in discrete GPUs. Nvidia controls the majority of the market with its higher-end chips, while AMD remains the persistent underdog in the lower-end market. However, that duopoly could soon be threatened by Intel (NASDAQ: INTC), which recently returned to the discrete GPU market after an absence of more than two decades. According to JPR, Intel captured 4% of the discrete GPU market in the third quarter of 2022, compared to roughly 0% a year earlier. AMD's share dropped from 17% to 8%, while Nvidia's share rose from 83% to 88%.

It wasn't surprising to see AMD cede some of its market to Nvidia, but its losses to Intel are troubling because the latter has been targeting many of the same lower-end PC gamers as AMD.
 
Eh, it is what it is. The real question for me is, has AMD increased their market share in the server and professional CPU and GPU space? More specifically, I am going to assume the discrete market is significantly smaller than the professional market, but I could be wrong. (They need to increase their presence in that arena.)
 
Not too surprised (I'm sure AMD is mostly competitive in that lower/mainstream tier that Intel is currently targeting). I'm just hoping it's enough success to convince Intel to keep up with it.

I thought Arc 770 and 750 were pretty decent all things considered.

Edit - Actually, just re-read that first sentence; I didn't realize Nvidia market share grew, I thought that Intel just cut into AMD's market. That *is* surprising, I thought the 6000-series were very competitive for AMD (especially that 6600 - 6700 range).
 
Not too surprised (I'm sure AMD is mostly competitive in that lower/mainstream tier that Intel is currently targeting). I'm just hoping it's enough success to convince Intel to keep up with it.

I thought Arc 770 and 750 were pretty decent all things considered.

Edit - Actually, just re-read that first sentence; I didn't realize Nvidia market share grew, I thought that Intel just cut into AMD's market. That *is* surprising, I thought the 6000-series were very competitive for AMD (especially that 6600 - 6700 range).
Intel and NVidia grew market share at AMD’s expense. It’s not that the 6000 series wasn’t good; it’s just that AMD didn’t make a lot of them. AMD is focusing on margins; they have told their investors they plan on getting close to 60%, and that is hard to do in the consumer space and near impossible in the budget GPU space, where there is ample competition.
 
Intel and NVidia grew market share at AMD’s expense. It’s not that the 6000 series wasn’t good; it’s just that AMD didn’t make a lot of them. AMD is focusing on margins; they have told their investors they plan on getting close to 60%, and that is hard to do in the consumer space and near impossible in the budget GPU space, where there is ample competition.
If AMD gets too focused on margins, they will repeat the error that plagued Intel for so many years. Insisting on having set margins resulted in higher prices for stagnant products, and an inflexibility that caused Intel to lose out on the mobile market. It appears that Intel is more flexible with their required margins now, which is likely a large contributing factor to their current resurgence and innovation.

While it is good that AMD has become a profitable company, to capture markets they should still act like they are on the brink of disaster and aggressively pursue marketshare (and mindshare) for a few more tech generations. Their GPU marketshare has never climbed to comfortable levels, and their CPU marketshare is still too low to profitably withstand a resurgent Intel without being thrown back to the 2010 situation.
 
I didn’t check the article, but did the overall size of the market grow? AMD is selling everything they make right? So it doesn’t quite make sense that they’ve lost market share unless the overall market grew, which seems more likely.
 
I didn’t check the article, but did the overall size of the market grow? AMD is selling everything they make right? So it doesn’t quite make sense that they’ve lost market share unless the overall market grew, which seems more likely.
I see this argument a lot. Are you suggesting that because they sell all their stock they are somehow still moving forward?

If they make 1000 cards and sell them all and Nvidia makes 5000 and only sells 4000, AMD is still significantly behind.

If they are selling everything they are making, I would argue that they are trying to keep their prices artificially high. However, if they are selling them AS FAST as they can make them, then good on them.
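The arithmetic in this hypothetical is worth making explicit: market share depends only on relative unit volumes, so a vendor can sell out completely and still sit far behind. A minimal sketch using the made-up 1,000 vs. 5,000/4,000 numbers above:

```python
def market_share(units_sold: dict) -> dict:
    """Return each vendor's share of total units sold, as a percentage."""
    total = sum(units_sold.values())
    return {vendor: 100 * n / total for vendor, n in units_sold.items()}

# Hypothetical numbers from the post: AMD sells all 1,000 cards it made,
# while Nvidia makes 5,000 and sells only 4,000.
shares = market_share({"AMD": 1_000, "Nvidia": 4_000})
print(shares)  # AMD 20%, Nvidia 80%: selling out doesn't close the gap
```

Selling through your whole inventory only tells you demand exceeded your own supply; it says nothing about your position relative to a competitor shipping several times your volume.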
 
I didn’t check the article, but did the overall size of the market grow? AMD is selling everything they make right? So it doesn’t quite make sense that they’ve lost market share unless the overall market grew, which seems more likely.
I believe the market shrank in that quarter, but Nvidia suffered less than AMD; thus Nvidia managed to increase market share in a falling market.
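This point can be quantified with the JPR shares quoted in this thread. Given a vendor's share before and after, plus an assumed overall market contraction, the implied change in its unit shipments falls out directly. A rough sketch (the 42% contraction is the thread's quarter-to-quarter figure, so treating it as the contraction between the compared periods is only an approximation):

```python
def unit_change(share_old: float, share_new: float, market_shrink: float) -> float:
    """Relative change in a vendor's unit shipments, given its market share
    before and after, and the overall market contraction (0.42 = -42%)."""
    return share_new * (1 - market_shrink) / share_old - 1

# Shares from the JPR figures quoted in the thread: AMD 17% -> 8%, Nvidia 83% -> 88%.
# The 42% overall contraction is assumed here for illustration.
for vendor, old, new in (("AMD", 0.17, 0.08), ("Nvidia", 0.83, 0.88)):
    print(f"{vendor}: units {unit_change(old, new, 0.42):+.1%}")
```

Under these assumptions AMD's implied unit drop (roughly -73%) is far steeper than Nvidia's (roughly -39%), which is exactly how share can shift even when every vendor sells what it makes.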
 
Intel and NVidia grew market share at AMD’s expense. It’s not that the 6000 series wasn’t good; it’s just that AMD didn’t make a lot of them. AMD is focusing on margins; they have told their investors they plan on getting close to 60%, and that is hard to do in the consumer space and near impossible in the budget GPU space, where there is ample competition.
Don't worry, they'll sell every card they make. All 200,000 until next cycle, as far as we are aware at least. :cautious:
AMD is selling what they make. They have prioritized other product lines over consumer GPUs. It would be a problem if they were sitting on massive inventory that wasn't selling.
Getting pushed out of the market because you refuse to keep up isn't going to help them either.
I didn’t check the article, but did the overall size of the market grow? AMD is selling everything they make right? So it doesn’t quite make sense that they’ve lost market share unless the overall market grew, which seems more likely.
Or AMD is making significantly fewer cards than in years past. In any case, they might have sold everything they made, but they only made a figurative handful.
 
Someone posted a graph, and AMD shrinking after a mining bust seems to be something that happened the previous two times as well. Two possible reasons obviously come to mind:

1) Buyers who bought any GPU they could get their hands on, removing brand value.
2) Nvidia appealing more to less knowledgeable buyers and to OEM machines, and being less affected by the sizable second-hand market.

I am not sure Intel is much of a relevant variable.
 
I see this argument a lot. Are you suggesting that because they sell all their stock they are somehow still moving forward?
This question doesn't really make sense. In my reading, you're basically asking: "if AMD is selling all their stock, will they continue to be in business?" and my answer is yes. But that seems like an obvious question and an obvious answer.
If they make 1000 cards and sell them all and Nvidia makes 5000 and only sells 4000, AMD is still significantly behind.

If they are selling everything they are making, I would argue that they are trying to keep their prices artificially high. However, if they are selling them AS FAST as they can make them, then good on them.
Both nVidia and AMD are supply limited by TSMC for the process nodes they want to make these cards on. As for your other supposition, this is economics 101. But because we don't have access to information on how many cards they have manufactured and then sold, or the length of time it takes them to do so (or any other indicator of demand), it would be impossible for any of us to make a "simple" supply and demand curve, at least one with any level of accuracy.
Or AMD is making significantly fewer cards than in years past. In any case, they might have sold everything they made, but they only made a figurative handful.
Possible, but unlikely. No sane manufacturer "purposely" stepped down production during the pandemic and the height of the silicon shortage.
Although it might be a more interesting article if it talked about how the past few quarters were supply constrained, and who got fabrication priority for the best process nodes among Apple, AMD, and nVidia. Heck, it might've been Apple's fault that AMD/nVidia were limited in how many GPUs they could make. I skimmed the article and it seems to be missing the point: the silicon shortage surely was at least partially to blame for shifts in the GPU market, and it feels like the article is purposely trying to make it sound like Intel is somehow shifting all of the competition.

Additionally, the 7900 XTX just hit the market. I'm certain they will get an uptick with their new product hitting the shelves, which most would consider a decent sweet spot in terms of pricing vs. performance, even if it doesn't have "absolute" performance.
 
Possible, but unlikely. No sane manufacturer "purposely" stepped down production during the pandemic and the height of the silicon shortage.
If we're talking volume, it is almost certain that AMD has made fewer discrete GPUs ever since early 2018:

https://hardforum.com/attachments/361586/

The success of Intel iGPUs and other forces have diminished discrete GPU sales year after year since the 2008 economic crisis.

AMD can purposely make fewer GPUs in order to sell much more profitable Epyc CPUs when they have the choice between the two, and unpurposely when they have contractual obligations toward Sony/Microsoft, whenever those compete for the same capacity (at TSMC or otherwise).
 
If we're talking volume, it is almost certain that AMD has made fewer discrete GPUs ever since early 2018:

https://hardforum.com/attachments/361586/

The success of Intel iGPUs and other forces have diminished discrete GPU sales year after year since the 2008 economic crisis.

AMD can purposely make fewer GPUs in order to sell much more profitable Epyc CPUs when they have the choice between the two, and unpurposely when they have contractual obligations toward Sony/Microsoft, whenever those compete for the same capacity (at TSMC or otherwise).
That’s fair. Is AMD counting its iGPU as a discrete GPU? That does cut both ways then.

We all know the bottom end of the market is dying due to iGPUs being "good enough" for the office world.
 
That’s fair. Is AMD counting its iGPU as a discrete GPU? That does cut both ways then.

We all know the bottom end of the market is dying due to iGPUs being "good enough" for the office world.
Shit, I play Fortnite on a 3200G on the iGPU @ 120 fps. I've never had a GPU in the system and I see no reason to.
 
Most of those 200,000 were XTs.
[attached chart]
 
I heard you need a 4090 just to launch it since the UE5 update! /s
Well, it is harder on the system specs now, even with the graphics turned down, but I can't comment on the top end, although I have heard the cries.
 
An 88% market share isn't really good for any consumer, in my view, especially NVidia loyalists. Honestly, I would love nothing more than to see a return to the market upsets of the old ATI Radeon 9800 days in 2003 when ATI turned the tables on NVidia. Not only did we get great ATI cards but NVidia struck back with their Geforce 6800/6600 series which were some of the biggest year-on-year improvements that I have ever seen in a video card lineup.

The problem is that AMD really can't compete all that well technically at the enthusiast level right now (my understanding is that AMD's answer to DLSS and ray tracing is quite a ways behind NVidia, which is what most people buying GPUs in this market segment care about in 2022). That mainly leaves AMD with the midrange lineup, but I really don't see them trying to compete there either.

I know that I have said it again and again, but I was in the market for a 100W GPU this past summer and the best I could buy was a three-year-old 1650 Super. AMD had released the RX 6500 XT in 2022, and it was a lot newer tech (PCI-E 4.0, much smaller process), but it was no faster than the much older 1650 Super thanks to a gimped 64-bit memory bus (this discussion point matters because the 1650 is currently the most popular card on the market right now). Why would I go with the AMD GPU in this situation when I would not be getting a faster GPU but would be losing out on the ability to use Moonlight and on drivers that were, in my experience, more stable? Now, if the 6500 XT was 30% faster than the 1650 Super, the decision would have been a lot tougher and I possibly would have gone with AMD. Of course, in that circumstance, NVidia probably would have released something better than the 1650 Super to compete back, which helps everybody across the board. My main issue is that I don't really see AMD trying to compete; it's more like they are just treading water to not fall too far behind NVidia, and I can't imagine that working out well in the long term. Part of me wishes that AMD never bought ATI.
 
Intel sold graphics cards at or below $300. They're not good, but they are affordable. So far I've only seen new GPUs from AMD that are just slightly cheaper than Nvidia's insane prices.
Eh, it is what it is. The real question for me is, has AMD increased their market share in the server and professional CPU and GPU space? More specifically, I am going to assume the discrete market is significantly smaller than the professional market but, I could be wrong. (They need to increase their presence in that arena.)
I've heard shareholders say the same thing about Nvidia's investment into the server and professional markets. What do I care about AMD's investments into markets that don't matter to me?
 
An 88% market share isn't really good for any consumer, in my view, especially NVidia loyalists. Honestly, I would love nothing more than to see a return to the market upsets of the old ATI Radeon 9800 days in 2003 when ATI turned the tables on NVidia. Not only did we get great ATI cards but NVidia struck back with their Geforce 6800/6600 series which were some of the biggest year-on-year improvements that I have ever seen in a video card lineup.

The problem is that AMD really can't compete all that well technically at the enthusiast level right now (my understanding is that AMD's answer to DLSS and ray tracing is quite a ways behind NVidia, which is what most people buying GPUs in this market segment care about in 2022). That mainly leaves AMD with the midrange lineup, but I really don't see them trying to compete there either.

I know that I have said it again and again, but I was in the market for a 100W GPU this past summer and the best I could buy was a three-year-old 1650 Super. AMD had released the RX 6500 XT in 2022, and it was a lot newer tech (PCI-E 4.0, much smaller process), but it was no faster than the much older 1650 Super thanks to a gimped 64-bit memory bus (this discussion point matters because the 1650 is currently the most popular card on the market right now). Why would I go with the AMD GPU in this situation when I would not be getting a faster GPU but would be losing out on the ability to use Moonlight and on drivers that were, in my experience, more stable? Now, if the 6500 XT was 30% faster than the 1650 Super, the decision would have been a lot tougher and I possibly would have gone with AMD. Of course, in that circumstance, NVidia probably would have released something better than the 1650 Super to compete back, which helps everybody across the board. My main issue is that I don't really see AMD trying to compete; it's more like they are just treading water to not fall too far behind NVidia, and I can't imagine that working out well in the long term. Part of me wishes that AMD never bought ATI.

Because the RX 6500 XT supports ray tracing and the GTX 1650 Super does not.
 
Because the RX 6500 XT supports ray tracing and the GTX 1650 Super does not.
The market is completely skewed towards Nvidia.

The RX 6600 / 6600 XT have better ray tracing performance than the RTX 3050, but the street price of the Nvidia card was higher.

People are just not buying AMD cards. I don't think there were any driver/quality/efficiency complaints regarding RDNA2, but AMD still struggled to sell compared to Nvidia.
 
The market is completely skewed towards Nvidia.

The RX 6600 / 6600 XT have better ray tracing performance than the RTX 3050, but the street price of the Nvidia card was higher.

People are just not buying AMD cards. I don't think there were any driver/quality/efficiency complaints regarding RDNA2, but AMD still struggled to sell compared to Nvidia.
AMD was badly supply constrained previously. Zen 3, RDNA 2, and the Xbox and PlayStation chips are all on the exact same node. TSMC was at capacity on that node, and AMD wasn't the only company fabbing on it, so AMD had to decide which products took priority, and it was obvious the GPUs took last place. Those products are still fabbed on that node, but in the case of Zen 3 and RDNA 2 the volume that needs to be fabbed has dropped since RDNA 3 and Zen 4 don't use it.

It's not a surprise that AMD lost some marketshare despite selling most everything the company manufactured. I have no clue about the current state of yields or capacity with regards to RDNA 3, but I doubt we'll see the same supply constraints this time around, especially considering the chiplet approach AMD moved to for GPUs, which will likely alleviate many of its GPU supply issues. It would be surprising if AMD doesn't regain much of the lost marketshare.
 
I didn’t check the article, but did the overall size of the market grow? AMD is selling everything they make right? So it doesn’t quite make sense that they’ve lost market share unless the overall market grew, which seems more likely.
No, the market experienced one of its biggest quarter-to-quarter drops in shipments since 2009. Discrete GPU shipments shrank by 42%.
That’s fair. Is AMD counting its iGPU as a discrete GPU? That does cut both ways then.

We all know the bottom end of the market is dying due to iGPU’s being “good enough” for the office world.
They're not. If you include all GPUs, Intel holds a 70% stranglehold on the graphics market.

https://www.jonpeddie.com/news/q322-biggest-qtr-to-qtr-drop-since-the-2009-recession/
[attached chart]
 
No, the market experienced one of its biggest quarter-to-quarter drops in shipments since 2009. Discrete GPU shipments shrank by 42%.
Shrank unequally, obviously. Going back to the article, it again seems to be missing the point: the silicon shortage, iGPUs, and a host of other factors are significantly more responsible than Intel entering the discrete GPU market.
They're not. If you include all GPUs, Intel holds a 70% stranglehold on the graphics market.

https://www.jonpeddie.com/news/q322-biggest-qtr-to-qtr-drop-since-the-2009-recession/
[attached chart]
Again, makes sense, as Intel dominates the market in terms of CPU platform percentage. AMD also hides a lot of its marketshare in other forms, as we all know; they've survived by making APUs for consoles and semi-custom silicon for other vendors. When looking specifically at PCs, yeah, Intel is a giant comparatively.
 
Is this the competition you wanted? AMD's marketshare in discrete video cards fell from 17% to 8% as Intel captured 4% in the most recent quarter reported by Jon Peddie Research, while NVIDIA's marketshare grew from 83% to 88%.
Surprising to me, but ARC seems to be better overall.

The 6600 XT and ARC 750 seem to compete on a dollar basis. While the 6600 XT wins in most games at 1080p, the fps is so high it doesn't matter. $300 for a "budget" card is pretty steep, so it looks like these "budget" buyers are more likely to own 4K screens, where the ARC 750 generally wins. The fps there is going to matter more. It also makes things playable with ray tracing on.

AMD needs to get their budget 7XXX cards out fast.
 
The market is completely skewed towards nvidia.

The RX 6600 / 6600 XT have better ray tracing performance than RTX 3050 but the street price of the Nvidia card was more.

People are just not buying AMD cards. I don't think there were any driver/quality/efficiency complaints regarding RDNA2, but still AMD struggled to sell compared to nvidia

This paints the same picture. Funny that the RX 580 and 570 saw upticks in November. (Budget gamers buying miner scraps?)


[attached chart]
 
This is typical in recent post mining busts, not sure what the big deal is as the numbers are all over the place. Next quarter they'll gain a few % to Nv and it'll stabilise a bit.
Surprising to me, but ARC seems to be better overall.

The 6600 XT and ARC 750 seem to compete on a dollar basis. While the 6600 XT wins in most games at 1080p, the fps is so high it doesn't matter. $300 for a "budget" card is pretty steep, so it looks like these "budget" buyers are more likely to own 4K screens, where the ARC 750 generally wins. The fps there is going to matter more. It also makes things playable with ray tracing on.

AMD needs to get their budget 7XXX cards out fast.
Arc 750 at 4K with RT on? For budget gamers, what matters more is 1080p and, at a stretch, QHD screens, where the 6600 XT generally wins with much more mature drivers and lower power consumption, which plays into budget builds with weak PSUs.
 
Surprising to me, but ARC seems to be better overall.

The 6600 XT and ARC 750 seem to compete on a dollar basis. While the 6600 XT wins in most games at 1080p, the fps is so high it doesn't matter. $300 for a "budget" card is pretty steep, so it looks like these "budget" buyers are more likely to own 4K screens, where the ARC 750 generally wins. The fps there is going to matter more. It also makes things playable with ray tracing on.

AMD needs to get their budget 7XXX cards out fast.

Did you just compare budget cards and 4K? That's not how it works; you don't buy a budget card to play at 4K. That is a weird mindset. Intel is more likely pushing prebuilt computer makers to include the cards; that's the only way I can see it making sense and gaining market share. I highly doubt the average user is buying Arc given how terrible the drivers were (obviously improving, but the image wasn't really there for them to grab 4% market share at launch). So most likely they are just putting these in the prebuilts.
 
AMD was badly supply constrained previously. Zen 3, RDNA 2, and the Xbox and PlayStation chips are all on the exact same node. TSMC was at capacity on that node, and AMD wasn't the only company fabbing on it, so AMD had to decide which products took priority, and it was obvious the GPUs took last place. Those products are still fabbed on that node, but in the case of Zen 3 and RDNA 2 the volume that needs to be fabbed has dropped since RDNA 3 and Zen 4 don't use it.

It's not a surprise that AMD lost some marketshare despite selling most everything the company manufactured. I have no clue about the current state of yields or capacity with regards to RDNA 3, but I doubt we'll see the same supply constraints this time around, especially considering the chiplet approach AMD moved to for GPUs, which will likely alleviate many of its GPU supply issues. It would be surprising if AMD doesn't regain much of the lost marketshare.
RDNA2 cards have never been out of stock. The problem that AMD is having is one of mindshare, the fact that nVidia hasn't screwed up royally enough to warrant a change, and AMD's inability to slather the internet with enough marketing to claw market share back.
 
RDNA2 cards have never been out of stock. The problem that AMD is having is one of mindshare, the fact that nVidia hasn't screwed up royally enough to warrant a change, and AMD's inability to slather the internet with enough marketing to claw market share back.

Umm, were you not there when Crypto was a thing?
 
An 88% market share isn't really good for any consumer, in my view, especially NVidia loyalists. Honestly, I would love nothing more than to see a return to the market upsets of the old ATI Radeon 9800 days in 2003 when ATI turned the tables on NVidia. Not only did we get great ATI cards but NVidia struck back with their Geforce 6800/6600 series which were some of the biggest year-on-year improvements that I have ever seen in a video card lineup.

The problem is that AMD really can't compete all that well technically at the enthusiast level right now (my understanding is that AMD's answer to DLSS and ray tracing is quite a ways behind NVidia, which is what most people buying GPU's in this market segment care about in 2022). That mainly leaves AMD with the midrange linup but I really don't see them trying to compete there either.

I know that I have said it again and again but I was in the market for a 100W GPU this past summer and the best I could buy was a three year old 1650 Super. AMD had released a RX 6500XT in 2022 and it was a lot newer tech (PCI-E 4.0, much smaller process) but it was no faster than the much older 1650 Super thanks to a gimped 64-bit mem bus (this discussion point matters because the 1650 is currently the most popular card on the market right now). Why would I go with the AMD GPU in this situation when I would not be getting a faster gpu but would be losing out on the ability to use Moonlight and access to drivers that, were in my experience, more stable? Now, if the 6500XT was 30% faster than the 1650 Super the decision would have been a lot tougher and I possibly would have gone with AMD. Of course, in that circumstance, NVidia probably would have released something better than the 1650 Super to compete back which helps everybody across the board. My main issue is that I don't really see AMD trying to compete, it's more like they are just treading water to not fall too far behind NVidia but I can't imagine that working out well in the long-term. Part of me wishes that AMD never bought ATI.
AMD does well with what they have, but Nvidia literally wrote the book on ray tracing methods and denoising algorithms. Their work and patents leapfrogged everyone in the field, and that gives them a strong advantage there. AMD doesn’t need to compete at the top; that’s not where most people live. That's the home of the few, the ~5%; the remaining 95% of us live in the murky middle ground. AMD needs to offer up a product stack for the masses.

AMD is incredibly diverse, with lots of hats in a lot of rings, and while that gives them strength, it punishes them for being fabless. They can’t currently meet demand for their products in any of the spaces they operate in because TSMC can’t meet their demand. AMD is forced to triage its limited wafer allotment amongst all its product lineups in a way that keeps its existing customer base while appeasing investors. Problem is, investors want a bigger push into datacenter and enterprise because the margins are better and those markets offer long-term stability that looks good on investment reports.
 
Umm, were you not there when Crypto was a thing?
Yup, and AMD cards were available. Priced through the roof? Yes. Unavailable? Nope. I was forced to buy a card during this time, so yeah, there were many cards that were always available; the 6600 XT was one of them.
 
For the visual learners: not only is Nvidia taking the lion's share, but discrete GPUs seem to be targeted at the elite only, as all the numbers are going down while these companies maintain their bottom line by targeting the high end.

It's easier and cheaper to build, package, and ship 1 million cards at a $300 profit each than to do the same for 10 million cards at $30 profit each.

[attached charts]
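The margin point here can be checked with a quick back-of-the-envelope calculation using the post's illustrative numbers (not real financials): both strategies yield the same gross profit, so the vendor shipping fewer, pricier cards comes out ahead on every per-unit cost.

```python
# Illustrative numbers from the post above, not real financial figures.
high_end = {"units": 1_000_000, "profit_per_card": 300}
volume   = {"units": 10_000_000, "profit_per_card": 30}

for name, s in (("high-end", high_end), ("volume", volume)):
    gross = s["units"] * s["profit_per_card"]
    print(f"{name}: {s['units']:,} cards -> ${gross:,} gross profit")
# Both strategies come to $300,000,000 gross, but the high-end one needs a
# tenth of the wafers, packaging, shipping, and support to get there.
```

Equal gross profit at a tenth of the unit volume is exactly the incentive pushing both vendors upmarket.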
 
For the visual learners: not only is Nvidia taking the lion's share, but discrete GPUs seem to be targeted at the elite only, as all the numbers are going down while these companies maintain their bottom line by targeting the high end.

It's easier and cheaper to build, package, and ship 1 million cards at a $300 profit each than to do the same for 10 million cards at $30 profit each.

[attached charts]
That’s a rough picture. I suppose we can expect the middle range of cards to die. I guess Intel has a shot at selling midrange graphics cards that Nvidia and AMD will increasingly ignore, picking up market share that way while they figure out how to build a high-end card over the next 10 years.
 
Does it really make sense to even make a midrange, mainstream or budget card anymore? Just look at the last gen mainstream RTX 3050. The RTX 2060 midrange card from 2 generations ago is a better buy and likely cheaper to produce.

Would it make sense to design a budget 4030 when the 3050 will get even cheaper as well as the 2060?

Same goes as a 4050 against a 3060 or a 2070.

People just need to get over buying used or just 'old' hardware if they really want to game on the low to mid range.
 
Intel makes a good budget card for the people who need to plug in a shit tonne of monitors, but not do a lot of processing.
 
Does it really make sense to even make a midrange, mainstream or budget card anymore? Just look at the last gen mainstream RTX 3050. The RTX 2060 midrange card from 2 generations ago is a better buy and likely cheaper to produce.

Would it make sense to design a budget 4030 when the 3050 will get even cheaper as well as the 2060?

Same goes as a 4050 against a 3060 or a 2070.

People just need to get over buying used or just 'old' hardware if they really want to game on the low to mid range.
It would be nice if graphics cards worked like iPhones, where they would just keep making the "top-end cards" of previous generations and use those as the midrange and eventually bottom-end cards.
I'm sure people would continue buying $600 6900 XTs or $800 3090 Tis, and in another two years buy those same two cards for $300-$450 or whatever. It seems like it's not viable to do, though, just due to things like die size/transistor counts. Although those previous process nodes would continue to drop in price/value, and certainly the foundries like TSMC and Samsung would be happy to have older generations extended out to increase their ROI.

It would certainly have other benefits too: only having to design one top-end piece of silicon, or a few top-end pieces, and pouring all the resources into that, rather than having to design an entire product stack, which costs more and requires a lot more time to recoup ROI. (Although AMD's partial solution would be to create dies in chiplet form that they can scale up and down by simply having more or fewer chiplets... which might be the "half-measure".)

Will likely never happen, but we can dream.
 
Does it really make sense to even make a midrange, mainstream or budget card anymore? Just look at the last gen mainstream RTX 3050. The RTX 2060 midrange card from 2 generations ago is a better buy and likely cheaper to produce.

Would it make sense to design a budget 4030 when the 3050 will get even cheaper as well as the 2060?

Same goes as a 4050 against a 3060 or a 2070.

People just need to get over buying used or just 'old' hardware if they really want to game on the low to mid range.
Of course it makes sense to produce something other than the very top end of cards. That anyone would even think it's a good idea not to is proof of how skewed and screwed up a lot of people's thinking is. Apply that exact mindset to any other industry and see how it would work out. Hint: it would be a disaster. That's the equivalent of the auto industry no longer producing vehicles which cost less than $100k and expecting the used market to cover the use case of everyone who can't afford a $100k car.

I don't understand where this thought process comes from that the top end somehow rules the video card market. It doesn't and never has. It's the low end and the midrange which drive the market and push it along. That's the vast majority of card sales, not halo cards. The quickest way to destroy PC gaming and the PC gaming market is to believe that nothing but high end and halo cards are necessary. This is not an industry which can be supported solely on the top 1%-5% of purchasers.
 