I have no games that require anything faster than my RX580
I see this argument a lot. Are you suggesting that because they sell all their stock that they are somehow still moving forward?
I think it's a choice between supplying the consoles and building stand-alone GPUs. They have pushed ~35ish million APUs out the door for the current-gen consoles. Considering half the shit released these days are console ports, they're doing just fine. The PC market share reference is what we seem to care about here at [H], but yes, they're moving forward IMHO.
Of course it makes sense to produce something other than the very top end of cards. That anyone would even think it's a good idea not to is proof of how skewed and screwed up a lot of people's thinking is. Apply that exact mindset to any other industry and see how it would work out. Hint: it would be a disaster. That's the equivalent of the auto industry no longer producing vehicles which cost less than $100k and expecting the used market to cover the use case of everyone who can't afford a $100k car.
I don't understand where this thought process comes from that the top end somehow rules the video card market. It doesn't and never has. It's the low end and the midrange which drive the market and push it along. That's the vast majority of card sales, not halo cards. The quickest way to destroy PC gaming and the PC gaming market is to believe that nothing but high end and halo cards are necessary. This is not an industry which can be supported solely on the top 1%-5% of purchasers.
This is a very similar situation to why the discrete sound card market died. Mobo manufacturers started putting sound chips on motherboards and "good enough" was the killer of "better" or "great".
Something funny with those stats. AMD has a thorough product stack top to bottom. Product SKUs are mostly all available and are very competitive on price/performance against both Nvidia and Intel. Intel has a meek product stack with few trustworthy vendors. Well-known driver issues.
Are we just supposed to believe Intel is pushing substantial amounts of cards out the back door to end users, but sales are almost nonexistent? Go ask a popular retailer, or check a sales-chart ranking (Amazon/NewEgg), and see.
Something is off.
Those stats don't include the last quarter. As far as I can recall, AMD started the current aggressive pricing around the beginning of November, with a corresponding shift in recommendations at some tech sites and fora somewhat after that. If AMD's repositioning has had any effects, one would need more current numbers to see them.
What percentage of market share does a company need to be called a monopoly and broken up?
Hard to say.
Intel and Nvidia grew market share at AMD's expense. It's not that the 6000 series wasn't good; it's just that AMD didn't make a lot of them. AMD is focusing on margins: they have told their investors they plan on getting close to 60%, and that is hard to do in the consumer space and near impossible in the budget GPU space, where there is ample competition.
Switching to GPU chiplets for the 7000 series should get higher yields from the fab.
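As a rough illustration of why smaller chiplets help yields, here is a minimal sketch using a simple Poisson defect model; the defect density, die sizes, and the even four-chiplet split are made-up numbers for illustration, not actual Navi 31 or TSMC figures:

import math

# Toy defect-yield model: yield = exp(-defect_density * die_area).
# All figures below are assumptions for illustration only.
d0_per_mm2 = 0.002         # assumed defect density (0.2 defects per cm^2)
usable_wafer_mm2 = 60_000  # assumed usable area on a 300 mm wafer
mono_area = 520            # one big monolithic die, mm^2
chiplet_area = 130         # the same GPU split into 4 equal chiplets, mm^2

def good_dies_per_wafer(area_mm2):
    candidates = usable_wafer_mm2 // area_mm2
    return candidates * math.exp(-d0_per_mm2 * area_mm2)

mono_gpus = good_dies_per_wafer(mono_area)             # roughly 40 good GPUs per wafer
chiplet_gpus = good_dies_per_wafer(chiplet_area) / 4   # roughly 90 GPUs' worth of chiplets

print(f"monolithic: {mono_gpus:.0f} good GPUs per wafer")
print(f"chiplets:   {chiplet_gpus:.0f} GPUs' worth of good chiplets per wafer")
# A defect scraps one small chiplet instead of a whole 520 mm^2 die,
# so much more of each wafer ends up in sellable parts.

In practice the 7900-series split is one graphics die plus several much smaller memory/cache dies rather than four equal chiplets, so the real-world gain differs, but the direction is the same.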
Those stats don't include the last quarter. As far as I can recall, AMD started the current aggressive pricing around the beginning of November, with a corresponding shift in recommendations at some tech sites and fora somewhat after that. If AMD's repositioning has had any effects, one would need more current numbers to see them.
Sure, but Intel had a nothing-burger launch: no major brands pushing quantity, bad reviews, bad drivers, and no apparent volume shipping ANYWHERE! Yet you are concerned about AMD's late aggressive pricing. They were always priced competitively, now just more so. That doesn't explain the loss, and the idea that Intel took some of it up is fucking laughable. Like I said, something is wrong in the stats. Are they wrongly including Intel iGPUs by accident (like Steam does)? I could see that, as I believe Intel did have quite the CPU sales boost over the last half year: Zen 3 trickled down and Zen 4 had a slow launch, while Intel, with solid CPUs and DDR4/DDR5 support, deservedly gobbled up a lot of the market.
Something funny with those stats. AMD has a thorough product stack top to bottom. Product SKUs are mostly all available and are very competitive on price/performance against both Nvidia and Intel. Intel has a meek product stack with few trustworthy vendors. Well-known driver issues.
Are we just supposed to believe Intel is pushing substantial amounts of cards out the back door to end users, but sales are almost nonexistent? Go ask a popular retailer, or check a sales-chart ranking (Amazon/NewEgg), and see.
Something is off.
Why would something be off?
Switching to GPU chiplets for the 7000 series should get higher yields from the fab.
Yeah, but they are going to use that saved silicon to make EPYCs and MI200s.
This is a very similar situation to why the discrete sound card market died. Mobo manufacturers started putting sound chips on motherboards and "good enough" was the killer of "better" or "great".
Now APUs are essentially doing the same thing. The machines that used to take those cards, office boxes especially, no longer need one. And this is slowly making its way up the food chain: the better APUs get, the more of the bottom of the GPU market dies out.
Mid-level GPUs can only cost what they cost because of precisely what you say: scale. So if the pool of people buying what would normally be a $300 graphics card shrinks (for the sake of argument) to half of what it used to be, GPU manufacturers either have to raise the price of those cards to make up the margin or decide the product is no longer viable at its new price point. Holding profit on half the cards sold means roughly doubling the margin on each remaining card, and on top of that they lose the volume discounts that came with building twice as many (rough numbers below).
There is a very real world where the entire low and mid-range market is either taken up by APUs or Intel's new "mid-level" options, because AMD and Nvidia don't want to play in an area that may or may not be profitable.
So while sound cards never "stopped being a thing," the type that is left and the type of buyer that buys them is an entirely different class of customer. And it appears that same process is happening in the GPU market.
There are simply fewer low and mid-range cards being sold. If the trend continues, they will die out.
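A minimal sketch of the rough numbers referenced above; the $300 shelf price, $250 unit cost, and halved buyer pool are all assumed figures for illustration only:

# Illustrative only: what price keeps gross profit flat if the buyer pool halves?
old_price, unit_cost, old_volume = 300, 250, 1_000_000  # all assumed figures

old_profit = (old_price - unit_cost) * old_volume   # $50M gross profit
new_volume = old_volume // 2                        # half the buyers are gone

# price needed so (price - cost) * new_volume == old_profit
new_price = unit_cost + old_profit / new_volume

print(f"old margin per card: ${old_price - unit_cost}")      # $50
print(f"new margin per card: ${new_price - unit_cost:.0f}")  # $100, i.e. doubled
print(f"new shelf price:     ${new_price:.0f}")              # $350
# This still assumes the per-unit cost stays at $250; losing half the
# volume usually raises it, which pushes the required price even higher.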
That's a false equivalency, but I understand you are just trying to find an analogy.
If it's vehicles you want to use, it would be like Ford producing a 2018 F-150 Platinum with a 10" infotainment screen. Then in 2020, the Platinum gets a 12" screen while the Lariat owners are 'stuck' with the old 10" screen used on the Platinum 2 years ago. Finally, in 2022, Platinum owners get a 14" infotainment screen while the Lariat owners get the 12" screen from 2 years ago and the budget XLT owners get a 4-year-old 10" screen.
That's a far better analogy, as it is more relatable to tech and everyone can still buy a new truck.
That is not a false equivalency. It's the exact same situation you're championing but applied to a different market. You flat out stated nothing but high end GPUs should ever be made and everyone who can't afford to purchase one new should suck it up and wait until they can get one used.
Why would something be off?
No. The reasons I stated. What are your logic reasons?
Because it goes against your personal bias?
No. The reasons I stated. What are your logic reasons?
So let me get this straight.
Oh wait!
Are we just supposed to believe Intel is pushing substantial amounts of cards out the back door to end users, but sales are almost nonexistent?
Or simply OEM pre-builts, in markets where price matters a lot.
For the visual learners. Not only is Nvidia taking the lion's share, but discrete GPUs seem to be targeted at the elite only, as all the numbers are going down while these companies maintain their bottom line by making it up at the high end.
It's easier and cheaper to build, package, and ship 1 million cards at a $300 profit each than to do the same for 10 million cards at $30 profit each.
View attachment 537817
View attachment 537818
An important piece of context that is missing from this is that there was a glut of video cards left on the market after cryptocurrency crashed. NVIDIA stopped producing Ampere earlier than usual to try and clear the market before Lovelace released. It looks like AMD did the same. That would explain the shipments going off a cliff this year. I think NVIDIA has a bad habit of knee-jerk reactions to how the market is responding to various factors, and often does the wrong thing. In that way the market is going to constantly be in flux with supply and demand. AMD seems to weather market expectations better than NVIDIA, looking at the graph visualizing shipment volume, but unfortunately, being the underdog, they often have to follow the trends of NVIDIA as the market leader.
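Spelling out the 1-million-at-$300 versus 10-million-at-$30 comparison quoted above (figures taken straight from the quote): the totals are identical, so the difference is purely how many units have to be built, boxed, supported, and shipped.

# Figures taken from the quoted post; nothing else assumed.
high_end_units, high_end_profit_each = 1_000_000, 300
low_end_units, low_end_profit_each = 10_000_000, 30

print(f"high end total profit: ${high_end_units * high_end_profit_each:,}")  # $300,000,000
print(f"low end total profit:  ${low_end_units * low_end_profit_each:,}")    # $300,000,000
print(f"units handled ratio:   {low_end_units // high_end_units}x")          # 10x the cards to fab, box, and ship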
That is not a false equivalency. It's the exact same situation you're championing but applied to a different market. You flat out stated nothing but high end GPUs should ever be made and everyone who can't afford to purchase one new should suck it up and wait until they can get one used.
I didn't say they had to buy used. Had Nvidia just repackaged a 2060 as a 3050, it would have been a better card and still have been 'new'.
Show me a single APU that performs as well as a midrange discrete card. You can't do it because it doesn't exist. APUs are good for the very low end of the gaming spectrum and are badly outdated by the time they come out.
They don't need to; that was the point I was making. There wasn't an onboard solution that was as good as an X-Fi either (until relatively recently, though discrete still gives access to more hardware channels and inputs/outputs for people in the audio recording world). Good enough is good enough. Though I could point out that technically the PS5 and XSX are both powered by AMD APUs.
Also show me how that's even remotely financially viable in the long term. It would require a CPU upgrade to get any sort of performance upgrade on the GPU side. Either that or buy a $1000+ card. Those are mutually exclusive options for the vast majority of people. Even better is the need to upgrade the whole platform after a few years: a CPU, motherboard, and likely RAM upgrade just to increase the performance of the GPU.
This is a silly argument that enthusiasts try to make all the time. It's been shown time and time again that normal consumer behavior is to never open the sides of their computers, ever, for any reason. Here on the [H] it's taken for granted that we all know how to build a machine, but that isn't the other 99% out there. Making arguments about what "we" can do misses the point, just as much as recognizing that 99% of people don't buy a $1000+ video card either (which I'm sure we both would recognize is true).
In most cases it would require the purchase of a completely new computer because most people don't have the knowledge or ability to do partial hardware upgrades. That doesn't sound even remotely financially viable or even feasible.
Which is ironically what they do. Most of the computing space for normal consumers is moving towards laptops anyway: devices that, regardless of manufacturer, generally aren't upgradeable. MXM is dead. RAM in most normal machines is soldered in now, and CPUs are as a rule not upgradeable. You can upgrade your NVMe drive or 2.5" HDD if you want to.
Does it really make sense to even make a midrange, mainstream, or budget card anymore? Just look at the last-gen mainstream RTX 3050. The RTX 2060, a midrange card from two generations ago, is a better buy and likely cheaper to produce.
Would it make sense to design a budget 4030 when the 3050 will only get cheaper, as will the 2060?
Same goes for a 4050 against a 3060 or a 2070.
People just need to get over buying used or just 'old' hardware if they really want to game on the low to mid range.
If Nvidia sold a 3050, then they made money once. If that same GPU is sold on a second time, Nvidia makes nothing from it. Also, the bigger the gap between the low or midrange cards of any generation and Nvidia's newest GPUs, the less likely game developers are to include features that are only realistically usable on Nvidia's high end cards, like ray tracing. Nvidia likes the idea of having features that only their high end GPUs can support, but games like Elden Ring don't support ray tracing, and my Vega 56 in Linux plays Elden Ring at 60 fps at 1080p just fine. Lots of GPUs perform at a Vega 56's level or better, so there's no reason to spend money on a new current-generation GPU. The value just isn't there.