Join us on November 3rd as we unveil AMD RDNA™ 3 to the world!

I think shifts in market share can happen pretty quickly. And 90+% of the market is non-flagship parts.
I think constraints on how quickly they can actually manufacture the cards limit how fast any shift can happen.

Imagine AMD selling the 7900 XTX at $599: would they gain market share, or simply be backordered everywhere? Maybe chiplets open that up a bit more than before, but in the current reality I can imagine that shift must by definition be quite gradual (reserving more capacity for next gen, designing for more sales, taking fewer orders for your CPUs and other products competing with yourself for capacity, building up AIB channels, and so on).
 
I think constraints on how quickly they can actually manufacture the cards limit how fast any shift can happen.

Imagine AMD selling the 7900 XTX at $599: would they gain market share, or simply be backordered everywhere? Maybe chiplets open that up a bit more than before, but in the current reality I can imagine that shift must by definition be quite gradual (reserving more capacity for next gen, designing for more sales, taking fewer orders for your CPUs and other products competing with yourself for capacity, building up AIB channels, and so on).
At that price the market share they gain would be done with such a small profit margin that the investors would shit themselves with rage and then throw it at Lisa, AMD's stock price would tank, Lisa would lose her bonuses, and likely her job too.
 
I think it's obvious they are on a path to beating Nvidia. Assuming they don't screw something up... OR Nvidia manages a switch to chiplets themselves. Reports are their next gen will be monolithic as well.
AMD has reduced their development costs by an order of magnitude, not just production costs. Not only are these cheaper to fab... it will be much, much cheaper to design new generations. Reusing the 6nm bits for 2-3 generations is viable. They did it with Zen. They only really need a full redesign every couple of gens. They could get even more savings fabbing the older-process bits all at once, ordering chips for two generations of GPUs and just warehousing them for a year.
IMHO, they're fine as long as they don't require 4+ slots and a kickstand. I still think the 4090 is ridiculous.

If RT becomes a gaming requirement (which means, it's expected to work well in gaming consoles), then it could be a future issue, but that's probably way out at this point.
 
At that price the market share they gain would be done with such a small profit margin that the investors would shit themselves with rage and then throw it at Lisa, AMD's stock price would tank, Lisa would lose her bonuses, and likely her job too.

Maybe not. Let's say they got it back to a 50/50 split; the investors would be fine with that, as it will lead to more profitable sales later. Only AMD really knows what the cost is for each GPU, but I would assume $750.00 would still be quite profitable for them on their halo card. I am just not convinced that AMD's goal is to take market share.
 
Maybe not. Let's say they got it back to a 50/50 split; the investors would be fine with that, as it will lead to more profitable sales later. Only AMD really knows what the cost is for each GPU, but I would assume $750.00 would still be quite profitable for them on their halo card. I am just not convinced that AMD's goal is to take market share.
It is a bit hard to know, because there's the actual margin/cost of the GPU itself versus everything else on a graphics card. From the way AIBs talked about the latest generation, their margins did not seem to be that high even at the very high price tags.

I am not sure why a massive cut of price would lead to more profitable sales later.
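To put the margin question in rough numbers: every figure below is invented for illustration (nobody outside AMD knows the real per-card cost), but a quick sketch shows how fast gross margin collapses as the price drops toward the $599 scenario above:

```python
# Back-of-envelope gross margin at different price points.
# The $450 all-in cost (die, memory, board, cooler) is a pure guess.
def gross_margin(price, cost):
    """Fraction of the selling price kept after direct costs."""
    return (price - cost) / price

ASSUMED_COST = 450.0

for price in (999, 750, 599):
    print(f"${price}: {gross_margin(price, ASSUMED_COST):.0%} gross margin")
```

At the guessed $450 cost, $999 keeps roughly a 55% gross margin, $750 about 40%, and $599 only about 25%, which is the whole investor-rage argument in one loop.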
 
You're not wrong about disrupting the market leader.
I think we should probably wait for benchmarks a bit though. We don't know by how much the 7900 will beat the 4080. It seems like it will, but we don't even know that for sure.
I mean, how crazy do things need to get before people bite on AMD? If 90% of the time a 7900 XTX is within single digits of a 4090... and beats a 4080 by double digits in basically everything, not counting a couple of RT titles, for $200 less, I think people start biting at that point if they are on NV 2000s and older.

I think shifts in market share can happen pretty quickly. And 90+% of the market is non-flagship parts. All the holding off on really mid-range next gen from both NV and AMD is annoying. What AMD really needs to gain market share is a 7800 and 7700. Come in with those cards at $700/500 and put the screws on the last gen. I wish AMD had done that and just said fine, we are going to have to blow out a bunch of 6000 cards and lose some money.

I agree with you, as much as AMD's tech is revolutionary. Alchemist sucking really sucks... hopefully BM does kick NV and AMD in the backside. I just hope Intel has learned enough lessons to do that.
If AMD really wanted market share, they'd crank out those $250 RX 6600s at all the capacity they could buy at TSMC.
 
At that price the market share they gain would be done with such a small profit margin that the investors would shit themselves with rage and then throw it at Lisa, AMD's stock price would tank, Lisa would lose her bonuses, and likely her job too.
She's got that nice ring to fall back on though. She would be fine. ;)
If AMD really wanted market share, they'd crank out those $250 RX 6600s at all the capacity they could buy at TSMC.
Not many are looking for an RX 6600. They can barely drum up enough excitement for their top-of-the-line, brand-spanking-new chiplet-design GPU. And that excitement is mostly to compare to what Nvidia has already released; it doesn't even stand on its own two legs. Their almost bottom-of-the-barrel offerings wouldn't be flying off the shelves; they would likely be collecting dust on those shelves.

Like it was said previously, AMD is content being second fiddle in the GPU marketplace. They know where they stand, and they are standing firmly there for a reason.
 
If AMD really wanted market share, they'd crank out those $250 RX 6600s at all the capacity they could buy at TSMC.
If they cranked up production of the RX 6600, then its price would drop to less than $200 due to oversupply.
 
If they cranked up production of the RX 6600, then its price would drop to less than $200 due to oversupply.
They're already regularly priced at $200; I almost bought one as a backup card for $219 a month or two ago. I saw a couple of deals on 6600 XTs in the low $200s.
 
It is a bit hard to know, because there's the actual margin/cost of the GPU itself versus everything else on a graphics card. From the way AIBs talked about the latest generation, their margins did not seem to be that high even at the very high price tags.

I am not sure why a massive cut of price would lead to more profitable sales later.

Sometimes you have to get a person to try your brand at a good deal before they willingly seek out your products. Personally I would like to see a 6800 XT-class card at a lower price for the masses. Also, AIB margins are not the same as the manufacturer's. If they could get something with that performance at $400 or less, then that would gain market share; it's just a matter of whether they want to and whether they have the capability to supply it. It would more or less seem they are good with having the consoles be their low-margin market share for gaming.
 
If amd really wanted market share, they'd crank out those $250 rx6600's at all the capacity they could buy at tsmc.
Or just release a 7600 for $250. Take the loss on the remaining 6600s they would have to sell for $150. lol
 
A little would probably go a long way, I imagine. I mean, all the cache that takes up tons of die space is on the chiplets. The GCD is all logic... even if they decided to go up to 350mm² or so, the amount of extra horsepower would be impressive. I don't know if they will... heck, this might be it for this gen. I think of it like early Zen. As groundbreaking as Zen 1 was, it was the follow-ups that took the idea and ran with it. I think this gen they were probably conservative. Make sure it works; they couldn't afford to have a generation fall on its face. Knowing how Su dealt with Zen though, I'm sure they already have the next gen well on the way... cause they probably are going to use the exact same cache chiplets. I wouldn't be shocked if they already had very early engineering bits for the next gen.
If they do have a refresh with a larger-die version in a 7950 XTX, they need to focus on whatever accelerates ray tracing. That's my biggest worry/disappointment with the latest AMD GPU cards so far: the gap this generation seems to have widened when AMD needed to narrow it.

When the benchmarks come out, if the 7900 XTX can't even beat a 3090 in ray tracing, I won't bother.
 
If they do have a refresh with a larger-die version in a 7950 XTX, they need to focus on whatever accelerates ray tracing. That's my biggest worry/disappointment with the latest AMD GPU cards so far: the gap this generation seems to have widened when AMD needed to narrow it.

When the benchmarks come out, if the 7900 XTX can't even beat a 3090 in ray tracing, I won't bother.
I am pretty sure that gap will actually have gotten much smaller. But we'll have to wait for benchmarks.
People need to consider that the 4090 is the freak part. The 4080 is the real comparison/competition here; based on the difference on paper, the 4080 is going to fall a lot short of the 4090 in RT. The question will be how the 7900 compares to the 4080 in RT. I suspect it will probably still be the 4080 on top, but it might be a much smaller lead than some might expect. I mean, the 4090 exists, it's real, but I somehow doubt NV keeps pumping stock of the 4090... real ones in the wild will continue to be sold way over MSRP. It's out there and Nvidia has the win... I don't really trust AMD's slide bars, but if we go by those, the RT comparisons vs the 4080 might be pretty interesting.

Anyway, third-party benchmarks are what we need.
 
Great interview on how AMD with RDNA3 plan to quote "take Nvidia and nearly drown them in the ocean, then like a powerful wave toss them onto the shore like they're nothing, with their pants pulled down and their ass in the breeze" (paraphrasing)

 
Great interview on how AMD with RDNA3 plan to quote "Take Nvidia by the shoulders, wheel them into a corner, and turn the lights out", figuratively speaking. And speaking figuratively, that they are "literally going to take Nvidia and nearly drown them in the ocean, then, like a powerful wave toss them onto the shore like they're nothing, with their pants pulled down and their ass in the breeze", figuratively, literally (I'm paraphrasing).


I already watched this bad boy. But in all seriousness, it does call into question AMD's profit scalability in comparison to nVidia.
As was noted, chiplet design is having a massive effect on cost vs running a monolithic design. It's possible that the 7900XTX is set up to be more profitable than the 4080 despite the $200 difference.
AMD may be able to make equal money despite a price difference, unlike the past where they had to undercut just to stay competitive.

The short term race is how soon AMD can claim the top spot while nVidia races to have chiplet designs.
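For anyone curious how that chiplet cost advantage could pencil out, here's a toy yield model. Every number is a guess (real wafer prices and defect densities are not public), and the exponential defect model is a textbook simplification, but the shape of the result is the point:

```python
import math

WAFER_AREA_MM2 = math.pi * (300 / 2) ** 2  # 300mm wafer, ignoring edge losses

def cost_per_good_die(wafer_cost, die_mm2, defects_per_cm2=0.1):
    """Crude cost per *working* die: exponential yield, no scribe lines."""
    yield_frac = math.exp(-defects_per_cm2 * die_mm2 / 100.0)
    dies_per_wafer = WAFER_AREA_MM2 / die_mm2
    return wafer_cost / (dies_per_wafer * yield_frac)

# Guessed wafer prices: $17k for the leading node, $9k for the mature one.
monolithic = cost_per_good_die(17000, 530)            # one big leading-node die
chiplet = (cost_per_good_die(17000, 300)              # smaller logic die (GCD)
           + 6 * cost_per_good_die(9000, 37))         # six cache dies (MCDs)
print(f"monolithic ~${monolithic:.0f} vs chiplet ~${chiplet:.0f} + packaging")
```

The exact dollar figures are invented; what matters is the shape: smaller dies yield dramatically better, and the cache moves to a cheaper mature node, so the silicon bill drops well before packaging costs claw part of it back.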
 
AMD now has its biggest opportunity thus far to make massive inroads into Nvidia's market.
What they do next will determine the percentages they take.
They had one other massive opportunity and blew it during the height of the pandemic when they neglected the gamer allocation in favor of Crypto.
Sure, it was good money, but it came at the expense of exposure to guys who would have run out and bought one... especially Nvidia faithful who at that point just wanted any card.
With fewer cards from this generation in the pipeline, lowering prices to move them is a good first step.
The sooner they're gone, the sooner they can get on with their plan for the new gen.
Nvidia and its AIB partners still want top dollar for last gen and believe they should bully folks into those cards by market manipulation.
It is up to people which practice they'll support.
 
I think this gen of AMD cards will be very good bang for buck, albeit not any better than Ampere in ray tracing. I think the next will challenge nVidia for the top spot. I just hope they don't price them equivalent to nVidia at that point and instead use the opportunity to earn some mind share.
 
I think this gen of AMD cards will be very good bang for buck, albeit not any better than Ampere in ray tracing. I think the next will challenge nVidia for the top spot. I just hope they don't price them equivalent to nVidia at that point and instead use the opportunity to earn some mind share.
You're still cheering for a $1,000 card in a performance segment that was around $700 not too long ago.
 
You're still cheering for a $1,000 card in a performance segment that was around $700 not too long ago.
He might be talking about the lower tier AMD cards, not necessarily the XTX. If he wasn't, well then I guess nvidia's pricing strategy is working its magic.
 
You're still cheering for a $1,000 card in a performance segment that was around $700 not too long ago.
AMD is playing a bit of the same game Nvidia is playing to move old stock. At least they have been getting their older product out at an actual discount. Having said that... I would expect the 7900 XT will probably not get restocked super heavily while the XTX will. Then in a few months AMD will drop a 7800 XT... that might drop a bit of RAM or something, perform pretty darn close, and come in at a proper 7800 price.

The 7900s are going to sell well... but for Nvidia the real oh-no moment will probably be in 6 months or so, when AMD very easily pushes out a mini refresh. As Sam was basically telling Steve, like Zen, AMD can just reuse all the memory controller and analog bits for multiple generations. Refreshes are much cheaper and much faster to market... for actual next-generation parts you can have the entire engineering team working on the logic; no one needs to spend months remapping controllers and cache designs to another process. Radeon 8000... I would bet money will use the exact same 6nm MCDs. (Heck, they might even survive longer than that... I'm not sure those parts would benefit at all from a shrink anytime soon.)
 
I haven't dug into the chip design apart from high-level architecture. Is 7900xtx the full fat chip or is there more fat to be had?
 
You're still cheering for a $1,000 card in a performance segment that was around $700 not too long ago.

He might be talking about the lower tier AMD cards, not necessarily the XTX. If he wasn't, well then I guess nvidia's pricing strategy is working its magic.
I'm only cheering for competition in hopes that it gets price/performance into a better place. The reality that we live in is that people are still buying up almost everything at ridiculous prices. It seems the 4080s are showing that people are a little more hesitant to buy them out in seconds, but they are still going to sell everything that's available in a day or two.

This $1000 7900 XTX card is a step in the right direction. If it matches or surpasses the 4080 in rasterization, then at least nVidia will be put on notice. If AMD uses that to try and unseat nVidia in the next gen of cards by going for market share, then maybe that performance level comes down to $899 or even $850, or something more reasonable. If you watch the GN video that came out overnight, you'll see that AMD should be seeing more and more competitive advantage in pricing from chiplets. I'm sure nVidia is going to go down the chiplet road, but AMD has a massive head start.

We didn't get into this mess overnight and we're not getting out of it overnight. The 1080 Ti was $700 and nVidia decided to never make that mistake again. It's been 3 generations since then. The 3080 had a somewhat OK MSRP, but it was unobtainium. Now, look at CPUs: after 4 generations of Ryzen, AMD and Intel are just now slitting each other's throats on pricing. I think there's more GPU pain ahead, but hopefully it will improve a generation at a time. And, as Furious_Styles said, there should still be some value in the 7800-7700-7600 range for the time being, but I still don't see them beating the used market on 3080s. Myself, just this week, I kind of sidegraded from a 3080 to a used 3090 because I saw a deal too good to pass up and I had a buyer for my 3080.
 
You're still cheering for a $1,000 card in a performance segment that was around $700 not too long ago.
Would I go to jail if I admitted part of me is cheering for the idea of a $1600 monster AMD card with dual exhausts, so that more chairs are thrown, more bottles are broken, and more guys are saying something unclean about some other guy's girlfriend?

Because the GPU market lately is embracing really the two things that make America great: gettin' mad and buying shit.
 
I haven't dug into the chip design apart from high-level architecture. Is 7900xtx the full fat chip or is there more fat to be had?
If you mean are there disabled bits on the 7900 XTX: no, it doesn't look like it. The logic chip is what it is. IMO they went conservative with it. I mean, it's an upgrade from the previous-gen logic... but they didn't swing too hard. I am sure they wanted to prove the concept and get a working product first. The MCM cache chips, though, are probably good to go for one more generation. I would think they will create a slightly bumped, still-RDNA 3 chip for a refresh using the exact same cache MCM chips. When they move to RDNA 4, most likely they will also use the same 6nm MCM chips.
Zen has had a pretty good, steady upgrade schedule for the same reasons. Chiplets allow them to focus on nothing but the logic bits in refreshes and every second gen. Think of it as a tick-tock development process... they design an MCM + logic in the first tick, then they follow that up with a tock gen where everyone works on the logic.
Anyway, ya, the 7900 XTX is not going to beat a 4090 by the looks of it... and AMD doesn't have a golden perfect die they can release right away. They probably are in a great position to tweak the logic-only chip with a very fast refresh. I'm sure they're in no rush... until they move all their old chips. If they do have a 7950 with a slightly upgraded logic part though, it's probably already taped out as engineering samples.
 
I am batting for AMD on this one, they talk a mean game but I am really hoping to see some follow-through, I want an actual race between AMD and Nvidia, not a leisurely jog.
 
I am batting for AMD on this one, they talk a mean game but I am really hoping to see some follow-through, I want an actual race between AMD and Nvidia, not a leisurely jog.

Well, I think AMD has more room to grow with the chiplet design, while Nvidia's back is pretty much against the wall with a monolithic design. Not really sure if they can crank the wick much more for the next generation, and if so, that will give AMD ample room to catch up. But who knows, Nvidia might have an ace up their sleeve. I am however more concerned by the pricing these days than the performance. If 80% of the market can't afford the cards, then it hurts the gaming market as a whole. This is also why I expect ray tracing to go nowhere, as it needs a user base for any developer to seriously utilize it.
 
If you mean are there disabled bits on the 7900 XTX: no, it doesn't look like it. The logic chip is what it is. IMO they went conservative with it. I mean, it's an upgrade from the previous-gen logic... but they didn't swing too hard. I am sure they wanted to prove the concept and get a working product first. The MCM cache chips, though, are probably good to go for one more generation. I would think they will create a slightly bumped, still-RDNA 3 chip for a refresh using the exact same cache MCM chips. When they move to RDNA 4, most likely they will also use the same 6nm MCM chips.
The thing about the MCM design, like you note, is that if they want to improve the logic or do a die shrink, it's more viable since they don't have to port the RAM or the controllers. AMD has never bothered to compete with "Titan", and the 4090 is basically nVidia's Titan-class part renamed. Any tech enthusiast will be more than happy if the 7900XTX has a clear win vs the 4080 in raster and if it's at least as good as the 3080Ti (or so) in ray tracing. Especially at a (at minimum) -$300 price point. Other than nVidia die-hards, I can't see anyone picking the 4080 over the 7900XTX for any other reason but that or availability. I have a feeling AMD is going to need to print as many chips as they can possibly squeeze out this gen, because this may be the next 9700 banger.

Well, I think AMD has more room to grow with the chiplet design, while Nvidia's back is pretty much against the wall with a monolithic design. Not really sure if they can crank the wick much more for the next generation, and if so, that will give AMD ample room to catch up. But who knows, Nvidia might have an ace up their sleeve. I am however more concerned by the pricing these days than the performance. If 80% of the market can't afford the cards, then it hurts the gaming market as a whole. This is also why I expect ray tracing to go nowhere, as it needs a user base for any developer to seriously utilize it.
I think the whole market is waiting to see the ~$300 parts. AMD is likely to sell everything they make. nVidia may not, just due to pricing and the fact that as the market leader they'll have way more parts in the channel as evidenced by the 4080.
 
Well, I think AMD has more room to grow with the chiplet design, while Nvidia's back is pretty much against the wall with a monolithic design. Not really sure if they can crank the wick much more for the next generation, and if so, that will give AMD ample room to catch up. But who knows, Nvidia might have an ace up their sleeve. I am however more concerned by the pricing these days than the performance. If 80% of the market can't afford the cards, then it hurts the gaming market as a whole. This is also why I expect ray tracing to go nowhere, as it needs a user base for any developer to seriously utilize it.
It's more a question of "does AMD do anything with it?" rather than "can they?", because it's obvious that they can.
Their chiplet design saves silicon and cuts costs, but will AMD choose to put that saved silicon into more GPUs, or will they take those silicon savings and put it towards Enterprise where they consistently fail to meet OEM demand for Threadrippers and EPYCs? So will they choose to try to take a larger share of the GPU market or will they take the safe bet and apply it to the Enterprise market where they are guaranteed sales and high margins?
I (and probably most of us here) want AMD to take the fight to Nvidia in the GPU space, but investors and shareholders will want them to go after Enterprise because it gets them better returns now and isn't seeing a reduction in demand.
So I don't at all question AMD's ability to deliver, I question their priorities.
 
will AMD choose to put that saved silicon into more GPUs, or will they take those silicon savings and put it towards Enterprise
Ding ding ding. Or towards enterprise GPUs. Sell the same piece of silicon for 4x-7x what it fetches in the consumer market; why wouldn't they?
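The 4x-7x incentive is easy to see with invented numbers. Nothing below is a real AMD figure; the die count and prices are made up purely to illustrate the allocation choice:

```python
# Hypothetical revenue per wafer for the same silicon sold into two markets.
def wafer_revenue(good_dies, price_per_die):
    """Revenue one wafer generates at a given selling price per good die."""
    return good_dies * price_per_die

GOOD_DIES = 600  # same small chiplet either way

consumer = wafer_revenue(GOOD_DIES, 100)    # die sold into a ~$250 consumer GPU
enterprise = wafer_revenue(GOOD_DIES, 500)  # same die at a 5x enterprise price
print(f"consumer ${consumer:,} vs enterprise ${enterprise:,} per wafer")
```

If the same wafer can return several times the revenue in Enterprise, feeding the consumer channel only makes sense for mind share and long-term market position, which is exactly the priorities question raised above.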
 
It's more a question of "does AMD do anything with it?" rather than "can they?", because it's obvious that they can.
Their chiplet design saves silicon and cuts costs, but will AMD choose to put that saved silicon into more GPUs, or will they take those silicon savings and put it towards Enterprise where they consistently fail to meet OEM demand for Threadrippers and EPYCs? So will they choose to try to take a larger share of the GPU market or will they take the safe bet and apply it to the Enterprise market where they are guaranteed sales and high margins?
I (and probably most of us here) want AMD to take the fight to Nvidia in the GPU space, but investors and shareholders will want them to go after Enterprise because it gets them better returns now and isn't seeing a reduction in demand.
So I don't at all question AMD's ability to deliver, I question their priorities.

Well, they are using different nodes this time for the CPU and part of the GPU, so that should allow them to dedicate more to the GPU side this time without it coming at the cost of CPU production. Also, since the new CPUs are not flying off the shelves, I think they will just dedicate more production to the server side instead of desktop. But the reality is you never want to have more than what the market demands, so being slightly under market demand is the goal to maintain maximum prices. So the answer is maybe? It just depends on how much more demand there is on the enterprise side and how much they actually want to supply that market.
 
Well, they are using different nodes this time for the CPU and part of the GPU, so that should allow them to dedicate more to the GPU side this time without it coming at the cost of CPU production. Also, since the new CPUs are not flying off the shelves, I think they will just dedicate more production to the server side instead of desktop. But the reality is you never want to have more than what the market demands, so being slightly under market demand is the goal to maintain maximum prices. So the answer is maybe? It just depends on how much more demand there is on the enterprise side and how much they actually want to supply that market.
Well, for the last 2 years AMD failed to meet both Dell's and Lenovo's demand for EPYCs and Threadrippers; both had to push customers to Xeons because AMD could not get them the chips, so there was certainly a failure to meet demand in those spaces.
 
PowerColor 7900 XTX Red Devil. 3.5+ slots.

We are truly in the era of monster truck GPU's.

 
Waiting for single slot with a preinstalled water block.
Same. Or a Heatkiller block for the reference GPU. I'm sure there will be a Liquid Devil, which are usually great, but I just don't like the "devil" on the front of the GPU. Either way I'm getting rid of the 3090 and going 7900XTX. Jensen can eat shit with the stupid pricing of the 4090 and what is IMO probably the worst-priced GPU ever in the 4080.
 
AMD marketing has everyone making the association of 7900XTX = 4080 competitor. So, they're effectively shifting their card stack by one tier, and the $999 7900XTX replaces the $649 6800XT. Then, the 7900XT replaces the non-XT 6800.

Not as egregious as what NVIDIA is doing with 4080 pricing, but to me, when AMD realized the 7900XTX wouldn't compete with a 4090, they saw an opportunity to avoid an unfavorable 7900XTX vs 4090 comparison while simultaneously increasing the asking price of their 4080-equivalent card by $350. Not badly done...
 
They should have called it the 7800, not 7900.

Makes them look impotent. I know it's just a name....

But people are already complaining about a 4080 costing 1200 but not a 4090 costing 1600....
 
They should have called it the 7800, not 7900.

Makes them look impotent. I know it's just a name....
I would lean harder into the "it's just a name" part.

If there is another card that they want to release, they still have the space to do so and can name it whatever they want.

I think it's more odd that the 3090 and 4090 exist in terms of "name". But they probably figured out that they could make more money by calling it a **90 rather than "Titan".
But people are already complaining about a 4080 costing 1200 but not a 4090 costing 1600....
Eh, I think it has just as much to do with paying more money for the generational scaling instead of getting it at an equal price, as happened with the 2 series and to some extent the 3 series. nVidia more or less took away all the multiplicative gains in cost vs performance that people usually get when moving up a generation. That's more or less what the GN video was about.

However, nVidia also needs to move old 3 series stock, and THEY want to be the ones making all the money instead of scalpers. I have no doubt that if the 4080 was $800, it would fly off the shelves and be heavily scalped (honestly it's also kind of annoying that the 7900XTX is $1000; we can thank nVidia for artificially moving the costs up). However, then at least they'd be selling cards... sooooo. Whatever, your move, nVidia.
 