_mockingbird
[H]ard|Gawd
> All depends on the price.

If the product can't be sold at a profit, it's a bad product.
"There is no such thing as a bad video card just bad prices." Elmer Fudd
> I was discussing mullet's hypothetical scenario in which AMD launches the Radeon RX 9050 XT 8GB and how that would happen.

Yes, that's what I understood, and I have a list of reasons why I'm not sure it's a good idea (versus relaunching it as a GRE if they really have too many on their hands). Would a 9050 with a big "no FSR 4" warning be significantly more attractive than a 7600 GRE to be worth bringing confusion into their lineup? Then again, with AMD's GPU naming history, anything can happen.
> When you put something new in production, it costs a lot of money. AMD can cut the price of the Radeon RX 7600 because it has long been in production.

But if that's true, would it be the case across the whole lineup? Why not do the same with the 7700 XT and 7800 XT? Because it's cheaper to reach 7700 XT performance with TSMC N4 RDNA 4 technology: the 256-bit, cut-down ~360 mm² Navi 48 die in the 9070 competes well with the cut-down ~530 mm² chiplet mix of a 7900 XT and its 320-bit bus.
> If the product can't be sold at a profit, it's a bad product.

How do you know it can't make a profit?
> Yes, that's what I understood, and I have a list of reasons why I'm not sure it's a good idea (versus relaunching it as a GRE if they really have too many on their hands). Would a 9050 with a big "no FSR 4" warning be significantly more attractive than a 7600 GRE to be worth bringing confusion into their lineup? Then again, with AMD's GPU naming history, anything can happen.
>
> But if that's true, would it be the case across the whole lineup? Why not do the same with the 7700 XT and 7800 XT? Because it's cheaper to reach 7700 XT performance with TSMC N4 RDNA 4 technology: the 256-bit, cut-down ~360 mm² Navi 48 die in the 9070 competes well with the cut-down ~530 mm² chiplet mix of a 7900 XT and its 320-bit bus.

AMD had to narrow the feature gap (i.e. image upscaling, ray tracing) with NVIDIA. That is why AMD had to release RDNA4.
> How do you know it can't make a profit?

...the fact that AMD and NVIDIA haven't bothered to release new products at those prices.
> ...the fact that AMD and NVIDIA haven't bothered to release new products at those prices.

What prices?
> AMD had to narrow the feature gap (i.e. image upscaling, ray tracing) with NVIDIA. That is why AMD had to release RDNA4.

Yes, that's a good part of it, but they also had to cut costs, which they did.
> Yes, that's a good part of it, but they also had to cut costs, which they did.

Rebranding has the advantage of immediately making clear where the product stands.
At the low end, buyers certainly don't care much about RT, but FSR 4 is becoming a better selling point now that it works at 1080p and delivers better quality than typical TAA. As you say, if that group doesn't mind the feature gap as much, they wouldn't mind buying a 7600 GRE: you avoid confusing your lineup, and you avoid the "controversy" of giving an RDNA 3 card an RDNA 4 name and all the warnings about it in every review.
> I really wish they could rebrand the 7600 XT and somehow give it GDDR7. 16 GB of GDDR7 would heavily encourage me towards it, because that's the no. 1 flaw I find with the card. A 128-bit bus + GDDR6 doesn't work on that tier of card, especially as someone with a 1440p 180 Hz monitor.

The GeForce RTX 4060 Ti also has a 128-bit GDDR6 bus and it's a lot faster than the Radeon RX 7600 XT, so the memory configuration isn't the limiting factor.
> The GeForce RTX 4060 Ti also has a 128-bit GDDR6 bus and it's a lot faster than the Radeon RX 7600 XT, so the memory configuration isn't the limiting factor.

Yeah, but I remember higher-resolution tests held it back, especially above 1080p.
> Yeah, but I remember higher-resolution tests held it back, especially above 1080p.

The GeForce RTX 4060 Ti is still well ahead of the Radeon RX 7600 XT at 1440p and 4K.
> Based on experience, AMD is unlikely to stop supplying or cancel a product before it is launched. After all, AIB partners have already stocked and produced it. Therefore, in the early stages of sales, both the Radeon RX 9060 XT 16GB and the Radeon RX 9060 XT 8GB will be available on the market at the same time. As for the follow-up, AIB partners will of course make adjustments based on market sales conditions. Perhaps the 16GB version of the Radeon RX 9060 XT will be more likely to appear than the 8GB version.
> https://benchlife-info.translate.go...y-1/?_x_tr_sl=auto&_x_tr_tl=en&_x_tr_hl=en-GB

MLID's sources are probably his left hand and his right hand.
> XFX Radeon RX 9060 XT 16/8GB GPUs with 3.3 GHz clock listed by retailer, price starts at $449
> https://videocardz.com/newz/xfx-rad...-clock-listed-by-retailer-price-starts-at-449

8GB for $450? What person would buy this?
> 8GB for $450? What person would buy this?

Somebody desperate?
> Somebody desperate?

They could buy the 5060 Ti 16GB; they seem to be in stock online at $420 right now. And if the "a bit slower than a 7700 XT" rumours are true, the 5060 Ti will be the faster card, cheaper, with double the VRAM. These must be some form of placeholder pricing or special-edition cards.
> Somebody desperate?

That kind of gaslighting doesn't actually work, even for NVIDIA. Sales of the 4080 at $1,200 were very poor, and all of NVIDIA's desperate pickup-artist tactics didn't change that. If AMD releases a $450 card with 8GB and half the cores of the midrange 9070 XT, it will damage their brand. If they're so dysfunctional that they can't stop themselves from stepping in the same turd NVIDIA just stepped in with their 5060 Ti 8GB launch, then I'll take back all the nice things I said about the 9070 launch and RTG's leadership.
AMD still has a lot of older inventory out there, which sells for less than that and likely performs similarly. The pricing is an obvious move to make them seem like a better "value" to clear that inventory out.
AMD needs to release a new GPU in that space because the investors say they need a release, so make a small amount of them so they "exist", but use them to push people to clear out old stock.
But who knows, when the market is flooded by shit it doesn't take much to make dry land look like paradise.
> XFX Radeon RX 9060 XT 16/8GB GPUs with 3.3 GHz clock listed by retailer, price starts at $449
> https://videocardz.com/newz/xfx-rad...-clock-listed-by-retailer-price-starts-at-449

That's a complete disaster if true.
> That kind of gaslighting doesn't actually work, even for NVIDIA. Sales of the 4080 at $1,200 were very poor, and all of NVIDIA's desperate pickup-artist tactics didn't change that. If AMD releases a $450 card with 8GB and half the cores of the midrange 9070 XT, it will damage their brand. If they're so dysfunctional that they can't stop themselves from stepping in the same turd NVIDIA just stepped in with their 5060 Ti 8GB launch, then I'll take back all the nice things I said about the 9070 launch and RTG's leadership.

But they do it all the same, and it does work, because cards keep selling and the prices only go up.
> ...because cards keep selling

Depends on what we mean by "keep selling": AMD's gaming division revenue has been so low for a while that they stopped making their own graph for it...
> Depends on what we mean by "keep selling": AMD's gaming division revenue has been so low for a while that they stopped making their own graph for it...

I completely agree that it's stupid and goes completely against the whole "we're focusing on the entry and mid markets with fair prices to gain market share" mantra AMD was singing leading up to this launch. I have to assume these high-priced 8GB cards are going to be paper-launched: build a few so they can tell investors they exist, but only as a means to make the old stock and the 16GB variant look better in comparison.
They are down to a sub-graph within a combined client-and-gaming segment; gaming may even have been operating at a loss. Xilinx was bigger than AMD's whole gaming division in Q1.
I get the pricing strategy you describe, and it makes sense if you have limited stock, but there is always a limit. Selling the 9060 XT 8GB for more than the (by then easy to get) 5060 Ti 16GB is just outside that window and doesn't match the better 9070 GRE pricing strategy. I'm not certain, of course, but the entry point will be much lower; this is a direct competitor to the $300 5060 from Nvidia and clearly a tier below the 5060 Ti 16GB.
Maybe some fancier models' street price will carry a $450 tag for a couple of weeks, but no way that's the actual starting price, IMO. There's no room to price the 16GB below the 5060 Ti 16GB: https://www.bestbuy.com/site/gigaby...0-graphics-card-black/6629363.p?skuId=6629363, which can already be found in stock at some online resellers.
> Somebody desperate?

This is the same problem Nvidia ran into: the 9060 XT 8GB might perform better... until a game goes beyond 8GB. An older GPU with more VRAM may perform worse in games that stay below 8GB, but better in games that go beyond it. This is the situation with the RTX 4060 vs the RTX 3060: the 3060's 12GB lets it pull ahead of the 8GB 4060 in some games.
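The VRAM crossover described in this post can be sketched as a toy model (all numbers are made up for illustration; `spill_penalty` is an assumed slowdown factor, not a measured one):

```python
# Toy illustration of the VRAM crossover discussed above.
# Numbers are illustrative assumptions; the point is the shape, not the values.

def effective_fps(base_fps, vram_needed_gb, vram_gb, spill_penalty=0.5):
    """FPS collapses once a game's working set exceeds the card's VRAM
    and textures spill into system RAM over PCIe."""
    if vram_needed_gb <= vram_gb:
        return base_fps
    return base_fps * spill_penalty

# A faster 8GB card vs a slower 12GB card (the 4060-vs-3060 situation):
print(effective_fps(100, vram_needed_gb=7, vram_gb=8))    # 100: faster card wins below 8GB
print(effective_fps(80, vram_needed_gb=7, vram_gb=12))    # 80

print(effective_fps(100, vram_needed_gb=10, vram_gb=8))   # 50.0: past 8GB it falls behind
print(effective_fps(80, vram_needed_gb=10, vram_gb=12))   # 80
```

Under these assumed numbers, the nominally slower card with more VRAM comes out ahead as soon as the working set exceeds 8GB, matching the crossover described above.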
> 8GB for $450? What person would buy this?

It said in the article that the price is not final (AKA a placeholder), but why bother reading?
> It said in the article that the price is not final (AKA a placeholder), but why bother reading?

Given how the MSRPs of the 9070 and 9070 XT went, when AMD says the price isn't final I expect it to go up from there, not down.
> Given how the MSRPs of the 9070 and 9070 XT went, when AMD says the price isn't final I expect it to go up from there, not down.

AMD didn't say anything.
> AMD didn't say anything.

AMD knew the price by the time they signed the contract with TSMC to make the chips.
Prices are usually set at the last minute, so AMD probably doesn't know the prices.
> AMD knew the price by the time they signed the contract with TSMC to make the chips.

That's the cost of the die.
They just don't want to say, because they already know the prices are too high and they're going to catch shit for it.
> But they do it all the same, and it does work, because cards keep selling and the prices only go up.

I know that view has become gospel in tech circles; I just don't see the evidence for it. I get where the frustration comes from, though, believe me! I was frustrated when the market signaled that enough people were willing to pay a thousand bucks for an x80-class card, because I was only willing to pay $800, tops. But every time either NV or AMD exceeds the market's willingness to pay, sales are terrible. If what you say is true, why stop there? Why not charge 10 thousand dollars for a 5070 so that people will see what a great deal the 12-thousand-dollar 5090 is? Why not 10 million, or 10 billion? They only have to sell one at 10 billion for the strategy to make sense, right? I kid, of course, but only to show that there are hard limits to that kind of value manipulation.
They produce a few current-gen cards and sell them for as much as they can; the old stock goes away, then they trickle the new cards out, usually as a means to upsell a slightly better card with a better margin.
Then when the next generation rolls around, they can price it relative to that inflated MSRP from the previous year, and if they get called out for it they can pick any enemy they want: tariffs, inflation, supply woes, shipping disasters, manufacturing, take your pick, all while posting larger-than-ever profits.
Nvidia and AMD have long since moved away from cost-based product pricing to a value-based system: what is the perceived value of the item given previous and existing market conditions, not the cost of the objects themselves. The inflated MSRPs they use are their justification and smokescreen for it.
> That's the cost of the die.

The die, memory, and cooler make up something like 80% of the component costs. It's not AMD's first rodeo; they know the manufacturing costs, component costs, and shipping costs. The only unknown is how much extra the AIBs tack on for the OC RGB editions they sell themselves.
I don't know what point you are trying to make.
> The die, memory, and cooler make up something like 80% of the component costs. It's not AMD's first rodeo; they know the manufacturing costs, component costs, and shipping costs. The only unknown is how much extra the AIBs tack on for the OC RGB editions they sell themselves.

That's not what we are talking about. You are changing the subject.
AMD doesn't build a chip and throw it into a vacuum; they have a rough but accurate cost by the time the design phase is complete. Hell, PCB design software has component costs stored in it, so the designers know the relative costs of what they are building as they design the card.
AMD has been working with their partners for how many decades now? It’s not some new agreement they have no knowledge of. They know exactly what’s going on.
> I know that view has become gospel in tech circles; I just don't see the evidence for it. I get where the frustration comes from, though, believe me! I was frustrated when the market signaled that enough people were willing to pay a thousand bucks for an x80-class card, because I was only willing to pay $800, tops. But every time either NV or AMD exceeds the market's willingness to pay, sales are terrible. If what you say is true, why stop there? Why not charge 10 thousand dollars for a 5070 so that people will see what a great deal the 12-thousand-dollar 5090 is? Why not 10 million, or 10 billion? They only have to sell one at 10 billion for the strategy to make sense, right? I kid, of course, but only to show that there are hard limits to that kind of value manipulation.

There is always a breaking point, at any price.
These mind games might sway someone who was on the fence, but they don't seem to translate to big sales, and that will be even more true when the massive wafer supply shortage finally ends. That could come from Intel finally getting its shit together with 18A, the AI money train derailing, or some factor no one's even thinking about yet. It has to end eventually, though. All shortages end eventually. There's too much money on the table to let TSMC have the whole pie forever. Once NV and AMD can order as many wafers as they want, then gaming no longer cannibalizes professional cards. They can make all the money in compute, and then make all the money in gaming, too. That will drive sales quantity targets for gaming cards way up, which means lower prices. No amount of value manipulation is going to make the average casual gamer pay $600+ for a relative level of performance that used to go for $230 (5070 vs GTX 1660 Super). The thing that's changed is NV doesn't want to sell gaming cards because it eats scarce wafer space from more-profitable compute cards.
> That's not what we are talking about. You are changing the subject.

They won't sell a product at a loss, and their growth predictions demand a minimum 40% margin, so they will price accordingly to meet their investors' demands. If the cards don't sell, that's not their problem, because the AIBs already paid for the silicon. If the AIBs want to lower the price, they can; in extreme circumstances AMD might offer rebates of some sort on future silicon, but they don't do that frequently.
Market conditions set the prices.
AMD is looking at the competitor (NVIDIA) to see how the competitor's products are selling and setting its prices accordingly.
> They won't sell a product at a loss, and their growth predictions demand a minimum 40% margin, so they will price accordingly to meet their investors' demands. If the cards don't sell, that's not their problem, because the AIBs already paid for the silicon. If the AIBs want to lower the price, they can; in extreme circumstances AMD might offer rebates of some sort on future silicon, but they don't do that frequently.

AMD has been asking reviewers how its products should be priced.
You only need to look at the last three generations of GPUs to see exactly what's going on. A shift is unlikely any time soon; AMD, Nvidia, and their investors are more than pleased with the current pricing models.
If anything, given the turmoil in the gaming industry, the investors are telling them to decrease GPU production and raise prices further.
People keep saying market conditions set GPU prices, as if that means something. Meanwhile, prices keep going up to outcries of being too high, yet the cards still sell out.
> If they don't sell, that's not their problem, because the AIBs already paid for the silicon. If the AIBs want to lower the price, they can; in extreme circumstances AMD might offer rebates of some sort on future silicon, but they don't do that frequently.

That sounds like a bit of an oversimplification. Maybe it's not their problem that week or month, but it is obviously a big problem for them: they cannot sell the next batch already on its way (which takes months to make) at the wanted price, the next generation of GPUs will be weighed down by the previous overstock, and so on...
> investors are more than pleased with the current pricing models.

I don't know how much they care about the desktop GPU pricing model by now (every call, every quarter, of all the questions asked, try to find one about desktop discrete gaming GPUs...); maybe they don't mind this.
> AMD then said F-yeah, better margins for us, and rode with it.

AMD's gaming division margins are so low now that they are kept secret... I'm not sure they aren't trying to be quite aggressive price-wise.
> They can make all the money in compute, and then make all the money in gaming, too.

And keep CUDA on most college students' computers via regular gaming-PC purchases; that has value for what future enterprise employees and bosses are used to and know. It's like Microsoft's position in Windows gaming: a young gamer with the little "Activate Windows" warning in the background is zero direct money, but it can be future money.
> There is always a breaking point, at any price.

I do think there's a lot of truth in that. We didn't know how good we had it, and the companies didn't realize they were leaving money on the table. But I still think the main difference between then and now is that there's so little high-density wafer space to go around. $600 may be a profit-maximizing point given the supply constraint, but remove that constraint and it's no longer the case. Let's say NV said "f__k it, 5070 is $400 with 18 GB of VRAM!" Everyone who doesn't already have a faster card would scramble to buy one, but it would cannibalize scarce wafer space away from higher-profit cards, so it would be bad for business. Remove that scarcity, and you get a different situation. If it costs $300 to make, then you make $100 for every sale at $400 MSRP instead of $300 at $600 MSRP, but they might easily sell more than 3 times as many. Or maybe $500 ends up being the optimal price. Whatever the optimal price is, it's going to be lower without the wafer constraint.
Covid and the rise of the scalping market proved to investors that the cards were undervalued.
If a scalper can buy a card for $300 and sell it for $600, that only tells investors they should have been selling it for $600 to begin with.
When investors saw that, they turned to both AMD and Nvidia and said: if scalpers can sell it for $600, then you can too. Nvidia was already under heavy investor scrutiny due to the crypto crash and claims that they were hiding how much of their sales were propped up by it.
Not willing to risk another lawsuit, they raised prices to match what they felt scalpers could reasonably sell the cards for, and glossed everything over.
AMD then said F-yeah better margins for us and rode with it.
> I do think there's a lot of truth in that. We didn't know how good we had it, and the companies didn't realize they were leaving money on the table. But I still think the main difference between then and now is that there's so little high-density wafer space to go around. $600 may be a profit-maximizing point given the supply constraint, but remove that constraint and it's no longer the case. Let's say NV said "f__k it, 5070 is $400 with 18 GB of VRAM!" Everyone who doesn't already have a faster card would scramble to buy one, but it would cannibalize scarce wafer space away from higher-profit cards, so it would be bad for business. Remove that scarcity, and you get a different situation. If it costs $300 to make, then you make $100 for every sale at $400 MSRP instead of $300 at $600 MSRP, but they might easily sell more than 3 times as many. Or maybe $500 ends up being the optimal price. Whatever the optimal price is, it's going to be lower without the wafer constraint.

But TSMC can't supply 3 times as many chips, so that strategy doesn't work.
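The wafer-constraint trade-off in this exchange can be sketched with toy numbers (all figures here are made-up assumptions for illustration, not real AMD or Nvidia costs, prices, or volumes):

```python
# Toy model of supply-constrained GPU pricing, as discussed above.
# All numbers are illustrative assumptions, not real industry figures.

def profit(msrp, unit_cost, units_demanded, supply_cap):
    """Profit when sales are capped by available wafer supply."""
    units_sold = min(units_demanded, supply_cap)
    return (msrp - unit_cost) * units_sold

UNIT_COST = 300  # assumed manufacturing cost per card

# Supply-constrained: 1M units of capacity either way, so the
# higher price wins because volume cannot grow past the cap.
high = profit(600, UNIT_COST, units_demanded=1_000_000, supply_cap=1_000_000)
low = profit(400, UNIT_COST, units_demanded=3_500_000, supply_cap=1_000_000)
print(high, low)  # 300000000 100000000 -> high price wins

# Unconstrained: the 3.5x demand at $400 can actually be served,
# and the lower price now earns more in total.
low_free = profit(400, UNIT_COST, units_demanded=3_500_000, supply_cap=10_000_000)
print(low_free)  # 350000000 -> low price wins
```

Under these assumed numbers, the model reproduces both sides of the argument: with the cap in place the $600 price maximizes profit, and removing the cap flips the answer toward the lower price.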
You can all keep saying that things don't work this way, but time and time again AMD and Nvidia show us they do. And until people stop paying for the GPUs, it won't change. But people keep buying them, so they won't.