
[MLID] AMD wary of launching 8gb 9060xt

I was discussing mullet's hypothetical scenario in which AMD launches the Radeon RX 9050 XT 8GB and how that would happen.
Yes, that's what I understood, and I have a list of reasons why I'm not sure it's a good idea (versus relaunching it as a GRE if they really have too many on their hands). Would a 9050 carrying a big warning about no FSR 4 be enough more attractive than a 7600 GRE to be worth the confusion in their lineup? Then again, with AMD's GPU naming history, anything can happen.

When you put something new in production, it costs a lot of money.

AMD can cut the price of the Radeon RX 7600 because it has long been in production.
But if that is true, wouldn't it be the case across the whole lineup? Why not do the same with the 7700 XT and 7800 XT? Because it is cheaper to achieve 7700 XT performance with TSMC 4nm RDNA 4 technology: the 256-bit, cut-down ~360 mm² Navi 48 die in a 9070 competes well with the cut-down ~530 mm² chiplet package of a 7900 XT and its 320-bit bus.
 
AMD had to narrow the feature gap (e.g. image upscaling, ray tracing) with NVIDIA. That is why AMD had to release RDNA4.

On the other hand, at the lowest end of the market, where price trumps all other concerns, the end user is willing to make do without.
 
Yes, that's a good part of it, but they also had to cut costs down, which they did.

The low end doesn't care much about RT, for sure, but FSR 4 can become more of a selling point now that it works at 1080p and delivers better quality than the usual TAA. As you say, if that group doesn't mind the feature gap as much, they wouldn't mind buying the 7600 GRE: you don't confuse your lineup, and you don't bring the "controversy" of giving an RDNA 3 card an RDNA 4 name and all the warnings about it in every review.
 
Rebranding has the advantage of immediately making clear where the product stands.

Most consumers would assume that the Radeon RX 9050 XT is worse than the Radeon RX 9060 XT.

Radeon RX 7650 GRE? ...not so much
 
I really wish they could rebrand the 7600 XT and somehow give it GDDR7. 16GB of GDDR7 would heavily encourage me towards it, because that's the no. 1 flaw I find with the card: a 128-bit bus + GDDR6 doesn't work on that tier of card. Especially for someone who has a 1440p 180Hz monitor.
 
The GeForce RTX 4060 Ti also has 128-bit bus GDDR6 and it's a lot faster than the Radeon RX 7600 XT, so the memory configuration isn't the limiting factor.
 
yeah, but I remember higher resolution tests held it back, especially above 1080p
 
The GeForce RTX 4060 Ti is still well ahead of the Radeon RX 7600 XT at 1440p and 4K

[Attached: relative performance charts at 2560×1440 and 3840×2160]
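On the 128-bit + GDDR6 point being argued above: peak memory bandwidth is just bus width times per-pin data rate, so a GDDR7 swap and a wider bus are the only two levers. A quick sketch (the per-pin rates below are typical published figures, used here as assumptions rather than the exact clocks of any specific card):

```python
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits / 8 bits per byte) * per-pin rate."""
    return bus_bits / 8 * gbps_per_pin

# Illustrative data rates (assumptions):
print(bandwidth_gb_s(128, 18.0))  # 128-bit GDDR6 @ 18 Gbps -> 288.0 GB/s
print(bandwidth_gb_s(128, 28.0))  # 128-bit GDDR7 @ 28 Gbps -> 448.0 GB/s
print(bandwidth_gb_s(256, 18.0))  # doubling the bus instead -> 576.0 GB/s
```

So a hypothetical GDDR7 refresh on the same 128-bit bus would land roughly halfway between today's 128-bit and 256-bit GDDR6 cards in raw bandwidth.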
 
Based on experience, AMD is unlikely to stop supplying or cancel a product before it is launched. After all, AIB partners have already stocked and produced it. Therefore, in the early stages of sales, both Radeon RX 9060 XT 16GB and Radeon RX 9060 XT 8GB will be available on the market at the same time. As for the follow-up, AIB partners will of course make adjustments based on market sales conditions. Perhaps the 16GB version of Radeon RX 9060 XT will be more likely to appear than the 8GB version.

https://benchlife-info.translate.go...y-1/?_x_tr_sl=auto&_x_tr_tl=en&_x_tr_hl=en-GB
 
MLID's sources are probably his left hand and his right hand.
 
8GB for $450? What person would buy this?
Somebody desperate?
AMD still has a lot of older inventory out there, which sells for less than that and likely performs similarly. The pricing is an obvious move to make them seem like a better "value" to clear that inventory out.
AMD needs to release a new GPU in that space because the investors say they need a release, so make a small amount of them so they "exist", but use them to push people to clear out old stock.

But who knows, when the market is flooded by shit it doesn't take much to make dry land look like paradise.
 
They could buy the 5060 Ti 16GB; they seem to be in stock online at $420 right now. And if the "a bit slower than a 7700 XT" rumours are true, the 5060 Ti will be the faster card, and cheaper, with double the VRAM. That $450 must be some form of placeholder pricing, or a special-edition model.

As for AMD's large inventory, looking at their prices in the US:
https://www.newegg.com/p/pl?d=7700xt
https://www.newegg.com/p/pl?d=7600xt

nothing seems to be even at MSRP, let alone in "liquidation, we have too much of it" mode. The cheapest non-refurbished/open-box 7600 XTs on Newegg seem to be $350 and $375, and those are the only 2 SKUs left (the rest are over-$500, "nothing must be left" options).

The 4200 RMB 9070 GRE with its 12GB of VRAM would be around $485 USD pre-VAT. The 9060 XT 16GB entry point must be a bit lower than that, so ~$450 max, and the 9060 XT 8GB will be $375 at the very max if the 16GB is $450, would be my guess.

In reality, they probably want to match the 5060's $300 with the 9060 XT 8GB (the 9060 XT seems to have a good chance of being barely faster), and that's why it's taking so long to announce: to make that work... to see if both the performance and price of the 5060 are really "true".
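The pre-VAT back-calculation in the post above can be sketched like this. The 13% VAT rate and 7.2 RMB/USD exchange rate are assumptions, and the dollar figure moves with whichever rate you plug in (the post's ~$485 implies slightly different assumptions):

```python
def pre_vat_usd(price_rmb: float, vat: float = 0.13, rmb_per_usd: float = 7.2) -> float:
    """Strip an assumed VAT from a Chinese retail price, then convert to USD."""
    return price_rmb / (1 + vat) / rmb_per_usd

print(round(pre_vat_usd(4200), 2))  # ~$516 with these assumed rates
```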
 
That kind of gaslighting doesn’t actually work, even for NVIDIA. Sales of the 4080 at $1,200 were very poor, and all of NVIDIA’s desperate pickup artist tactics didn’t change that. If AMD releases a $450 card with 8GB and half the cores of the midrange 9070 XT, it will damage their brand. If they’re so dysfunctional that they can’t stop themselves from stepping in the same turd NVIDIA just stepped in with their 5060 Ti 8GB launch, then I’ll take back all the nice things I said about the 9070 launch and RTG’s leadership.
 
But they do it all the same, and it does work because cards keep selling, and the prices only go up.
They produce a few current-gen cards, sell them for as much as they can, the old stock goes away, then they trickle them out, usually as a means to upsell to a slightly better card with a better margin.
Then when the next generation rolls around they can price it relative to that inflated MSRP from the previous year, and if they get called out for it they can pick any enemy they want, tariffs, inflation, supply woes, shipping disasters, manufacturing, take your pick all while posting larger than ever profits.

Nvidia and AMD have long since moved away from actual product pricing, and instead moved to a value based system, what is the perceived value of the item based on previous and existing market conditions, not based on the objects themselves, and the inflated MSRP pricing they use is their justification and smokescreen for it.
 
because cards keep selling
Depends on what we mean by "keep selling"; AMD's gaming division revenue has been so low for a while that they stopped breaking it out in its own chart...

[Attached: chart of AMD revenue by segment]


They are down to a sub-line of a combined client-and-gaming segment; it may even have been operating at a loss. Xilinx was bigger than AMD's whole gaming division in Q1.

I get the pricing strategy you're talking about, and it makes sense if you have limited stock, but there is always a limit. Selling a 9060 XT 8GB for more than a by-then-easy-to-get 5060 Ti 16GB is just outside that window and doesn't match the better 9070 GRE pricing strategy. I am not certain, of course, but the entry point will be much lower: a direct competitor to Nvidia's $300 5060 and cleanly a tier below the 5060 Ti 16GB.

Maybe some fancier model, or a couple of weeks of street pricing, will see a $450 price tag, but no way is that the actual starting price imo; there's no room to price the 16GB anywhere but below the 5060 Ti 16GB: https://www.bestbuy.com/site/gigaby...0-graphics-card-black/6629363.p?skuId=6629363, which can already be found in stock at some online resellers.
 
I completely agree that it is stupid and goes completely against the whole "we're focusing on the entry and mid markets with fair prices to gain market share" mantra AMD was singing leading up to this launch. I have to assume that these high-priced 8GB cards are going to be paper-launched: build a few so they can tell investors they exist, but only as a means to make the old stock and the 16GB variant seem better in comparison.

Leading up to the 9070 launch, I was optimistic with all of AMD's "we're targeting the everyday average gamer this generation" song and dance, and that fell apart by day 3 of the launch, so now I'm back to my normal pessimistic self who is trying to figure out WTF AMD is thinking by trying to imagine myself as Scrooge working in their accounting office.
 
This is the same problem Nvidia experienced: the 9060 XT 8GB might perform better... until a game goes beyond 8GB. So an older GPU with more than 8GB of VRAM may perform worse in games that stay below 8GB, but will perform better in games that go beyond it. This is the situation with the RTX 4060 vs RTX 3060: the 3060's 12GB can let it pull ahead of the 4060 8GB in some games.
 
It said in the article that the price is not final (AKA placeholder), but why bother reading?
Given how the MSRP of the 9070 and 9070 XT went, when AMD says the price isn’t final I expect it to go up, not down from there.
 
AMD didn't say anything.

Prices are usually set at the last minute, so AMD probably doesn't know the prices.
 
AMD knew the price by the time they signed the contract with TSMC to make the chips.

They just don't want to say them, because they already know it's too high and they're going to catch shit for it.
 
That's the cost of the die.

I don't know what point you are trying to make.
 
I know that view has become gospel in tech circles, I just don't see the evidence for it. I get where the frustration comes from, though, believe me! I was frustrated when the market signaled that enough people were willing to pay a thousand bucks for a x80-class card, because I was only willing to pay $800, tops. But every time either NV or AMD exceeds the market's willingness to pay, sales are terrible. If what you say is true, why stop there? Why not charge 10 thousand dollars for a 5070 so that people will see what a great deal the 12 thousand dollar 5090 is? Why not 10 million, or 10 billion? They only have to sell one at 10 billion for the strategy to make sense, right? I kid, of course, but only to show that there are hard limits to that kind of value manipulation.

These mind games might sway someone who was on the fence, but they don't seem to translate to big sales, and that will be even more true when the massive wafer supply shortage finally ends. That could come from Intel finally getting its shit together with 18A, the AI money train derailing, or some factor no one's even thinking about yet. It has to end eventually, though. All shortages end eventually. There's too much money on the table to let TSMC have the whole pie forever. Once NV and AMD can order as many wafers as they want, then gaming no longer cannibalizes professional cards. They can make all the money in compute, and then make all the money in gaming, too. That will drive sales quantity targets for gaming cards way up, which means lower prices. No amount of value manipulation is going to make the average casual gamer pay $600+ for a relative level of performance that used to go for $230 (5070 vs GTX 1660 Super). The thing that's changed is NV doesn't want to sell gaming cards because it eats scarce wafer space from more-profitable compute cards.
 
The die, memory, and cooler make up something like 80% of the component costs. It's not AMD's first rodeo; they know the manufacturing costs, component costs, and shipping costs. The only unknown is how much extra the AIBs tack on for the OC RGB editions they sell for themselves.

AMD doesn't build a chip and throw it into a vacuum; they have a reasonably accurate cost estimate by the time the design phase is complete. Hell, PCB design software has component costs stored in it, so the designers know the relative cost of what they are building as they design the card.

AMD has been working with their partners for how many decades now? It’s not some new agreement they have no knowledge of. They know exactly what’s going on.
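The BOM argument above can be sketched with invented numbers; every dollar figure below is hypothetical, purely to illustrate the "die + memory + cooler dominate" share calculation:

```python
# Hypothetical bill of materials for a midrange card (all figures invented)
bom = {"die": 120, "memory": 60, "cooler": 40, "pcb": 25, "vrm": 20, "misc": 10}

total = sum(bom.values())                               # 275
big_three = bom["die"] + bom["memory"] + bom["cooler"]  # 220
print(f"die+memory+cooler: {big_three / total:.0%} of BOM")  # 80% of BOM
```

With numbers shaped like these, small swings in the remaining line items barely move the total, which is the point being made: the big three cost drivers are known well before launch.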
 
That's not what we are talking about. You are changing the subject.

Market conditions set the prices.

AMD is looking at the competitor (NVIDIA) to see how the competitor's products are selling and setting its prices accordingly.
 
There is always a breaking point with any price.
Covid and the rise of the scalping market proved to investors that the cards were undervalued.
If a scalper can buy a card for $300 and sell it for $600, that only tells investors they should have been selling it for $600 to begin with.
When investors saw that, they turned to both AMD and Nvidia and said: if scalpers can sell it for $600, then so can you. Nvidia was already under heavy investor scrutiny due to the crypto crash and claims that it was hiding how much of its sales were propped up by mining.
Not willing to risk another lawsuit, they raised prices to match what they felt scalpers could reasonably sell the cards for and glossed everything over.
AMD then said F-yeah, better margins for us, and rode with it.
 
They won’t sell a product at a loss and their growth predictions demand a minimum of a 40% margin. So they will price them accordingly to maintain their investor demands. If they don’t sell that’s not their problem, because the AIB’s already paid for the silicon, if they want to lower the price they can, in extreme circumstances AMD might offer rebates of some sort to the AIB’s on future silicon but they don’t do it frequently.

You only need to look at the last 3 generations of GPU’s to see exactly what’s going on. A shift is unlikely any time soon, AMD, Nvidia, and their investors are more than pleased with the current pricing models.

If anything given the turmoil in the gaming industry the investors are telling them to decrease GPU production and raise prices more.

People keep saying market conditions set the GPU prices, like that means something. Meanwhile the prices keep going up to outcries of them being too high yet the cards still sell out.
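If the 40% gross-margin floor claimed above holds, it pins a minimum price directly to cost; a quick sketch (the $270 all-in cost figure is hypothetical):

```python
def floor_price(cost: float, margin: float = 0.40) -> float:
    """Minimum price preserving a target gross margin: margin = (price - cost) / price."""
    return cost / (1 - margin)

print(floor_price(270))  # a hypothetical $270 all-in cost -> ~$450 minimum price
```

Note the divisor is (1 - margin), not (1 + margin): a 40% margin on price is a ~67% markup on cost, which is why a modest-sounding margin target pushes prices up so sharply.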
 
AMD has been asking reviewers how its products should be priced.

AMD is keen to price its products competitively or at least to get favorable reviews.
 
If they don’t sell that’s not their problem, because the AIB’s already paid for the silicon, if they want to lower the price they can, in extreme circumstances AMD might offer rebates of some sort to the AIB’s on future silicon but they don’t do it frequently.
That sounds like a bit of an oversimplification. Maybe not their problem that week or month, but it is obviously a big problem for them: they cannot sell the next batch that is on its way (and takes months to make) at the wanted price, the next gen of GPUs will be weighed down by the previous overstock, and so on...

investors are more than pleased with the current pricing models.
I do not know how much they care about the desktop GPU pricing model by now (every call, every quarter, of all the questions asked, try to find one about desktop discrete gaming GPUs...); maybe they do not mind this:
[attached chart]
or Nvidia gaming being down 11% in Q4 of the "2025" financial year versus the year before. They may understand the silicon demand trade-off, but I am not sure they are especially pleased (if they care at all).

AMD then said F-yeah better margins for us and rode with it.
AMD's gaming division margins are so low now that they are kept secret... I am not sure they aren't trying to be quite aggressive price-wise.

They can make all the money in compute, and then make all the money in gaming, too.
And keeping CUDA on most college students' computers via regular gaming PC purchases has value for what future enterprise employees and bosses are used to and know, much like Microsoft's position with Windows gaming: the young gamer with the little "Activate Windows" warning in the background is zero direct money, but it can be future money.
 
I do think there’s a lot of truth in that. We didn’t know how good we had it, and the companies didn’t realize they were leaving money on the table. But I still think the main difference between then and now is that there’s so little high-density wafer space to go around. $600 may be a profit-maximizing point given the supply constraint, but remove that constraint and it’s no longer the case. Let’s say NV said “f__k it, 5070 is $400 with 18 GB of VRAM!” Everyone who doesn’t already have a faster card would scramble to buy one, but it would cannibalize scarce wafer space away from higher-profit cards, so it would be bad for business. Remove that scarcity, and you get a different situation. If it costs $300 to make, then you make $100 for every sale at $400 MSRP instead of $300 at $600 MSRP, but they might easily sell more than 3 times as many. Or maybe $500 ends up being the optimal price. Whatever the optimal price is, it’s going to be lower without the wafer constraint.
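The arithmetic in the post above can be checked directly, using its own numbers (the $300 unit cost is the post's assumption, not a known figure):

```python
cost = 300                  # assumed unit cost from the post
profit_low = 400 - cost     # $100 per card at a $400 MSRP
profit_high = 600 - cost    # $300 per card at a $600 MSRP

# Volume multiple needed at the lower price to match total profit:
print(profit_high / profit_low)  # 3.0 -> need to sell 3x as many cards
```

So the whole pricing question reduces to whether removing the wafer constraint lets volume grow faster than the per-unit profit shrinks.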
 
But TSMC can’t supply 3 times as many chips so that strategy doesn’t work.

You can all keep saying that things don’t work this way, but time and time again AMD and Nvidia show us they do. And until people stop paying for the GPU’s it won’t change. But people keep buying them so they won’t.
 

I agree with most of what you're saying in this thread. People keep saying it's all "just supply and demand", while ignoring that they are the demand, too. Yet I'm the crazy one for pointing out that they are part of the very problem they're hiding behind.

I guess people just don't think of long-term consequences. No one stops and thinks through the second part of, "Yeah, I want this bad enough to buy it at this idiotic price, so I'm going to do it just because I want it and I can afford it... but wait, what is that going to do in the long term?" Welcome to the long term.

It's all downhill from here, as long as we as consumers don't stop and think.
 
TL;DR: RAM is cheap, AMD, so why do this? An extra $20 spent on manufacturing, tops.
 