Join us on November 3rd as we unveil AMD RDNA™ 3 to the world!

No, I'm not. If AMD can produce a faster card with feature parity, I'd buy that just as easily. I've had many AMD cards over the years as well as NVIDIA. I'm simply betting that it's more likely that AMD can make a card just fast enough to force NVIDIA to drop its pricing a bit. Had the mining demand not been so strong, the $500 cheaper MSRP of the 6900XT might have forced NVIDIA to drop the price on the RTX 3090 or the subsequent 3080 Ti. Not everyone cared about ray tracing performance, but it might have gone that way depending on how much weight individuals gave to it. That said, I think a lot of people think more like I do when shopping the top of the stack.
Yup. My plan was a 3080 and a 6800XT for my two main systems (since both are at 1440P). One would handle ray tracing, and given the performance being almost identical for raster workloads, I'd have the alternative if there was something that just ran better on AMD, or driver bugs kept one from working well. But I did grant the possibility of RT being important, so with them being equal, I did pivot to a 3080 for my main gaming box. I'd have been ok if I'd been forced to stick with AMD though - and I don't regret having the 6800XT in my workstation either.

In fact, the only time I've REALLY cared is for VR - there are still some micro-stutter issues with AMD (or were at the time) so that HAD to be Nvidia.
Again, at 4K in some games the small percentage of difference is more pronounced than it is at lower resolutions, where you've essentially got frames to spare. That said, I mostly agree with you. AMD didn't falter that much on the 6900XT given its much lower MSRP, but the poor ray tracing performance absolutely killed it on the high end for buyers like me. Like it or not, individual games often sell GPUs. For the time, that game was Cyberpunk 2077.
And for me it was Control - I wanted to play that with RT. REALLY wanted to play that with RT. But that's the one game I've played where it mattered - in the last two years.
That's why I was talking about AMD competing aggressively on price. When they do it right, AMD can sometimes offer more performance for the money in the more mainstream segments where the bulk of people buy their cards.
Also agreed.
 
AMD is on the same oversubscribed node that NVIDIA is on. Going the chiplet route may have saved them some money compared to NVIDIA's monolithic chips, but I can't imagine it being much.

Adored was saying that AMD's die size is 308mm² compared to Nvidia's 608mm², or thereabouts. And the cache/memory chips are like 37mm² each, basically free. The traces are going to cost more than the chips.

The reticle limit for a single die is just north of 800mm², so even if AMD stays on this node forever, they can still double their performance relative to RDNA 3. That's why Nvidia is shitting their pants right now.
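To put rough numbers behind the die-size argument above, here's a minimal sketch using the simple Poisson yield model (yield = e^(−D₀·A)). The defect densities below are made-up illustrative values, not published TSMC figures:

```python
import math

def die_yield(area_mm2: float, d0_per_cm2: float) -> float:
    """Poisson yield model: fraction of dies that come out defect-free."""
    return math.exp(-d0_per_cm2 * area_mm2 / 100.0)  # 100 mm^2 per cm^2

# Hypothetical defect densities (defects per cm^2) -- illustrative only
MATURE_D0 = 0.07   # older, tried-and-true node
LEADING_D0 = 0.15  # newer, leading-edge node

print(f"308 mm^2 GCD, leading node:   {die_yield(308, LEADING_D0):.1%}")
print(f"608 mm^2 monolith, same node: {die_yield(608, LEADING_D0):.1%}")
print(f" 37 mm^2 cache die, mature:   {die_yield(37, MATURE_D0):.1%}")
```

Even under identical defect density, a ~300mm² die yields far better than a ~600mm² one, and the tiny cache dies yield almost perfectly, which is the core of the chiplet cost argument.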
 
Only half the chip is on that node... that is the point.
They are not, though, are they? They're on TSMC 5 and 6, not 4 like Nvidia. Even before any yield savings, they're using a cheaper process.
 
They are not, though, are they? They're on TSMC 5 and 6, not 4 like Nvidia. Even before any yield savings, they're using a cheaper process.

Nvidia is on the newer node compared to AMD. Partially why Nvidia has higher costs to make their cards.
 
It was a really good card that still lost out to the 3080, which, all things considered, was the "cheaper" option. But last gen was a wash: nothing was really available, so the good card was the one you could actually get.
This may now include the card that actually fits (physically) and can be powered reasonably. I might be able to get 8 CPUs for an 8-socket system, but I might not have anything to plug them into.

And, of course, if the price isn't outrageous. Good price, works with what I have.... win!

Expensive, requires buying all new... not (necessarily) a win. I disagree that "great regardless of cost" equates to absolute greatness. Because, let's be honest, there is a higher class (one totally out of reach).
 
We'll get a better idea tomorrow. They should be able to undercut by quite a bit if they CHOOSE to, anyway.
Nvidia outsells AMD by something on the order of 20:1. This gap in scale makes it so that AMD would have trouble competing on price even if their GPU chip price went to zero. Keep in mind that the main hunk of silicon is just one portion of the COGS. There's also every other component on the board, the board itself, assembly, the cooler, and various sources of overhead like distribution. Many of these go down considerably when talking about a literal order of magnitude difference in volume.

FWIW, this is a big reason why there are only 2.5 players in the market. Even with a fully complete turnkey design, it is still unfathomably expensive to compete in this space. The only reason Intel has any hope is because they are gigantic enough to place multi billion dollar bets. Very very few other companies are in that position.
 
Nvidia outsells AMD by something on the order of 20:1. This gap in scale makes it so that AMD would have trouble competing on price even if their GPU chip price went to zero. Keep in mind that the main hunk of silicon is just one portion of the COGS. There's also every other component on the board, the board itself, assembly, the cooler, and various sources of overhead like distribution. Many of these go down considerably when talking about a literal order of magnitude difference in volume.

FWIW, this is a big reason why there are only 2.5 players in the market. Even with a fully complete turnkey design, it is still unfathomably expensive to compete in this space. The only reason Intel has any hope is because they are gigantic enough to place multi billion dollar bets. Very very few other companies are in that position.
Nvidia isn't getting volume deals much better than AMD's. No one gets massive discounts at volume on much of anything in the tech world.

Intel used to say the same thing... and Intel still sells more than AMD, to be fair. But over the last couple of generations they absolutely bled share to the company that was previously selling at 1:20, back when Bulldozer was its flagship. A good product changes things very fast in consumer tech.

Still, I don't think it likely that AMD does much different than what they normally do. I expect them to be a bit cheaper... but not burn down the house. Having said that, if they have a performer, the 20:1 Nvidia sales thing will evaporate fairly quickly. I don't think most consumers are really all that die-hard green or red. It was no different if we go way, way back to when AMD's Athlons were selling well... the moment they had a stinker product, Intel regained all the share it had lost. If reports are true, that is happening now, for that matter... Intel seems to be doing pretty well with 13th gen in terms of sales.
 
Nvidia isn't getting volume deals much better than AMD's. No one gets massive discounts at volume on much of anything in the tech world.

Intel used to say the same thing... and Intel still sells more than AMD, to be fair. But over the last couple of generations they absolutely bled share to the company that was previously selling at 1:20, back when Bulldozer was its flagship. A good product changes things very fast in consumer tech.

Still, I don't think it likely that AMD does much different than what they normally do. I expect them to be a bit cheaper... but not burn down the house. Having said that, if they have a performer, the 20:1 Nvidia sales thing will evaporate fairly quickly. I don't think most consumers are really all that die-hard green or red. It was no different if we go way, way back to when AMD's Athlons were selling well... the moment they had a stinker product, Intel regained all the share it had lost. If reports are true, that is happening now, for that matter... Intel seems to be doing pretty well with 13th gen in terms of sales.
The problem, in my eyes, is that AMD can only really produce products "just as good" as Nvidia's. Ryzen processors were "better in many ways" products versus Intel's 6000, 7000, 8000, 9000, 10000 & 11000 series processors. Yes, Intel was faster in some generations in gaming, but Ryzen was better in most generations in many ways: productivity, value, platform flexibility, platform cost, system responsiveness, etc.

With GPUs, people only care about one damn thing: FPS at X settings.

If Radeon offers "just as good" FPS at X settings, what's the incentive to switch brands and lose NVEnc, Shadowplay, DLSS, DXR, etc.?

I know AMD has competing features that are "just as good" at best, but that still means that with Nvidia, you get "the standard" and with AMD you get "mostly just as good as the standard". Not exactly a sexy marketing slogan.

What AMD needs is "Better in many ways" with Radeon. Better performance, better features.
 
How about a better price?
A GPU is expensive. Not motorsports expensive but expensive enough that "pay that little bit more and get a better experience" is a big factor.

Remember when the RX 570 was as fast as or faster than the 1050 Ti in every way, and cheaper? You didn't see that reflected in the sales numbers. When buying something that offers "an experience", something you may live with for years, price caps what you're willing to spend, but it isn't the determining factor between the products in that range.
 
And something to note as well:

People don't look at "is the 7900XTX better than the 4090"

They look at "is AMD better than NVidia"

That is what is important. Is AMD regularly faster than Nvidia? Are AMD's streaming/encoding features regularly better than Nvidia's? Does AMD create new technology, adopted by developers, that regularly offers a better experience than the same from Nvidia? One generation isn't enough. It needs to be RELIABLY TRUE. If the two trade blows one gen, but Nvidia is the de facto standard, then nothing changes. If AMD comes out with an amazing, fastest, and nicest range but then fumbles and Nvidia takes back the next generation, nothing changes. That's why Nvidia puts SO MUCH EFFORT into having the fastest cards and pays a lot of money making sure EVERYONE talks about them. They spend SO MUCH MONEY to introduce new technology that makes EVERYONE test it out and talk about it.

Say what you want about Nvidia, but THEY. DO. NOT. STOP.

AMD managed to trump Intel because Intel got fat and lazy. Nvidia is a different beast entirely.
 
AMD managed to trump Intel because Intel got fat and lazy. Nvidia is a different beast entirely.
I don’t know about lazy, but their 10nm fumbles set them back years. But AMD did get desperate, and that caused them to really think outside the box and it worked out unusually well for them.

Intel's 10nm (Intel 7) process was supposed to go live back in 2015. Had that actually happened, even the initial Zen 1 would have been hard pressed to save AMD. Intel at that time was still designing for their future nodes, not their active nodes, a practice they have obviously abandoned, as it forced Intel to throw out three generations' worth of designs that were built around the 10nm process that never materialized. It's not an exaggeration to say the delays in Intel's 10nm set them back 5 years of development.
 
I don’t know about lazy, but their 10nm fumbles set them back years. But AMD did get desperate, and that caused them to really think outside the box and it worked out unusually well for them.

Intel's 10nm (Intel 7) process was supposed to go live back in 2015. Had that actually happened, even the initial Zen 1 would have been hard pressed to save AMD. Intel at that time was still designing for their future nodes, not their active nodes, a practice they have obviously abandoned, as it forced Intel to throw out three generations' worth of designs that were built around the 10nm process that never materialized. It's not an exaggeration to say the delays in Intel's 10nm set them back 5 years of development.
Even so, their (Intel's) die sizes were getting smaller and smaller and their performance upgrades less and less, but prices were staying the same, if not going up. Even before any 10nm woes (and those could be attributed to their incompetence and laziness as well, since they were designing a process to keep shrinking die size and inflating margin, not to blow minds with performance), Intel was falling asleep on its cushy throne.
 
That is what is important. Is AMD regularly faster than Nvidia? Are AMD's streaming/encoding features regularly better than Nvidia's? Does AMD create new technology, adopted by developers, that regularly offers a better experience than the same from Nvidia? One generation isn't enough. It needs to be RELIABLY TRUE. If the two trade blows one gen, but Nvidia is the de facto standard, then nothing changes. If AMD comes out with an amazing, fastest, and nicest range but then fumbles and Nvidia takes back the next generation, nothing changes. That's why Nvidia puts SO MUCH EFFORT into having the fastest cards and pays a lot of money making sure EVERYONE talks about them. They spend SO MUCH MONEY to introduce new technology that makes EVERYONE test it out and talk about it.

This!! Excellent post. This is pretty much exactly why the market is the way it is. AMD, since taking over ATI, has been inconsistent at best and incompetent at worst. Lisa Su has turned things around. Their success in the CPU market is a guide to what they have to do in the GPU market, and that is: be consistent. Release good products every generation. And this year they have been granted a gift from the GPU gods: Nvidia has completely messed up their release. High price, connector problems, renaming cards, a slow rollout, only the 4090 having good performance gains, etc. If RDNA 3 is as good as the rumours suggest, then they should start to shift mindshare away from Nvidia.
 
This!! Excellent post. This is pretty much exactly why the market is the way it is. AMD, since taking over ATI, has been inconsistent at best and incompetent at worst. Lisa Su has turned things around. Their success in the CPU market is a guide to what they have to do in the GPU market, and that is: be consistent. Release good products every generation. And this year they have been granted a gift from the GPU gods: Nvidia has completely messed up their release. High price, connector problems, renaming cards, a slow rollout, only the 4090 having good performance gains, etc. If RDNA 3 is as good as the rumours suggest, then they should start to shift mindshare away from Nvidia.
Don't disagree with most of this. I do disagree with your comment regarding current mindshare. One generation isn't going to budge the needle one bit on mindshare. In this time of financial uncertainty, I don't think either company is going to lose or gain mindshare/market share.
 
Even so, their (Intel's) die sizes were getting smaller and smaller and their performance upgrades less and less, but prices were staying the same, if not going up. Even before any 10nm woes (and those could be attributed to their incompetence and laziness as well, since they were designing a process to keep shrinking die size and inflating margin, not to blow minds with performance), Intel was falling asleep on its cushy throne.
You can also blame AMD. Bulldozer set the industry back... it allowed Intel to coast. AMD made the same mistake with Bulldozer that Intel had made with the Pentium 4 years earlier... they had terrible leadership at the time. It's amazing that they watched Intel make the pipeline-length mistake, where Intel was forced to abandon the arch, yet still figured they could pull the same thing off a few years later. As much as people point to the CMT design as the failure, IMO it had to do with the massive pipeline length. As with the P4 before it, clock speed was not a fix. I wonder if it could work today with modern, improved prefetch units... it would probably still suck in single-core IPC.

As innovative as Zen was... it was actually a much more conventional core design than Bulldozer. AMD gave Intel an opportunity to get fat and lazy.

Hopefully, whatever AMD has cooking for GPUs this gen, they do take the lead. Nvidia needs to be kept on its toes as well. As much as people say, yeah, the 4090 is great... Nvidia skipped a gen with Tesla. They have also been stuffing their dies with AI-focused bits really aimed at the data center. Nvidia COULD turn out a raster monster of a GPU if they had someone pushing them to do so. Yes, the 4090 is a raster monster... it's just doing it with a VERY expensive, datacenter-level AI chip, which makes for stupidly expensive cards. Nvidia could provide the same level of raster performance in a chip half the size and cost if they toned down the insane number of Tensor cores that have very little consumer use. No one is pushing them to do so.
 
Don't disagree with most of this. I do disagree with your comment regarding current mindshare. One generation isn't going to budge the needle one bit on mindshare. In this time of financial uncertainty, I don't think either company is going to lose or gain mindshare/market share.

Regarding mindshare, I think it will, especially combined with the current Nvidia problems. And it won't be just one generation. If RDNA 3 is good, it will build on the reputation gained from RDNA 2, because the RDNA 2 product line was very good. It was a release with no major flaws; cards were priced well and were very competitive. I also think the current financial situation will hurt Nvidia more than AMD. I guess we shall see!! :) A lot depends on what AMD announces later today.
 
 
As Kyle already stated: every card that AMD can produce will sell. The fight for mindshare doesn't matter.
It might matter down the road. :) Maybe.
It's true, though: what does mindshare matter when everything you make sells? The console players are both already Team AMD...
Having said that, reports from most people in retail seem to say previous-gen AMD cards did sit on shelves for at least a little while, even when they were the clear best bang for the buck.
 
It might matter down the road. :) Maybe.
It's true, though: what does mindshare matter when everything you make sells? The console players are both already Team AMD...
Having said that, reports from most people in retail seem to say previous-gen AMD cards did sit on shelves for at least a little while, even when they were the clear best bang for the buck.
All the evidence is anecdotal. And even if cards are sitting on a shelf: they're already bought, sold, paid for. AMD to my knowledge doesn't sell cards based on commissions. In fact I don't know any computer hardware company that does. The cost of silicon, PCB, and components has gotten too high and the margins too thin.
From AMD's sales perspective (as well as the stockholders'): not their problem. While that particular distributor might choose not to re-up on AMD products, someone else will fill the void. Mindshare doesn't matter if it literally doesn't touch their bottom line.

I don't see a scenario where it does for a long time, barring catastrophic failure (as in, they can't produce a GPU with any application for 10-plus years) or every foundry with a reasonable process being destroyed (although that would equally affect nVidia).
 
Its true though what does mind share matter when everything you make sells. The console players are both already team AMD...
Poor Nintendo......

I'm not sure I'd go as far as "mindshare does not matter"; AMD cleanly beating NVIDIA three generations in a row would probably matter, and it's not as if AMD is trying to make as many cards as they could, at least if we go by the rumors:
https://www.tomshardware.com/news/amd-apple-nvidia-reportedly-reducing-5nm-tsmc-orders

All the evidence is anecdotal. And even if cards are sitting on a shelf: they're already bought, sold, paid for. AMD to my knowledge doesn't sell cards based on commissions. In fact I don't know any computer hardware company that does. The cost of silicon, PCB, and components has gotten too high and the margins too thin.

There's short term and then there's short term: cards still on the shelf can reduce next week's card orders, and that can affect AIBs' next decisions on buying chips, no?
 

This is why AMD can choose to be much cheaper. Only the ~308mm² graphics die sits on the expensive, latest-and-greatest process, and it's roughly half the size of Nvidia's monolith, so any given defect density takes out a much lower percentage of chips. The ~37mm² cache/memory dies come from a tried-and-true process with a low defect rate and yield almost perfectly.
Hopefully Navi 31 is fantastic and all... but this is super exciting going forward. (assuming Navi 31 isn't hot garbage at least) Nvidia really needs to go this way as well... imagine Nvidia being able to mix and match Tensor and Raster chiplets to provide Raster super chips for gaming, and 100% tensor chips for AI.
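To put rough dollar figures on the chiplet cost argument, here's a back-of-the-envelope sketch. The wafer prices and defect densities are made-up illustrative numbers (not actual TSMC pricing), and the die-count math ignores edge loss and scribe lines:

```python
import math

WAFER_AREA_MM2 = math.pi * 150**2  # 300 mm wafer, ignoring edge loss

def cost_per_good_die(die_mm2: float, wafer_cost: float, d0: float) -> float:
    """Rough cost per working die: wafer price / (gross dies * Poisson yield)."""
    gross = WAFER_AREA_MM2 // die_mm2               # crude gross die count
    good = gross * math.exp(-d0 * die_mm2 / 100.0)  # Poisson-model survivors
    return wafer_cost / good

# Hypothetical wafer prices ($) and defect densities (defects/cm^2)
print(f"308 mm^2 GCD, leading node:   ${cost_per_good_die(308, 17000, 0.15):,.0f}")
print(f"608 mm^2 monolith, same node: ${cost_per_good_die(608, 17000, 0.15):,.0f}")
print(f" 37 mm^2 MCD, mature node:    ${cost_per_good_die(37, 10000, 0.07):,.0f}")
```

Under these assumed numbers, the 608mm² monolith costs roughly three times as much per good die as the 308mm² GCD, and each cache die lands in single-digit dollars, which is why the earlier post could call them "basically free".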
 
There's short term and then there's short term: cards still on the shelf can reduce next week's card orders, and that can affect AIBs' next decisions on buying chips, no?
It doesn't affect the AIBs either: they aren't taking cards back. If cards are on shelves, it affects people on the POS end only. The way it would affect AMD or the AIBs "directly" is the situation nVidia finds themselves in: trying to push new product without having sold out of their old product. However, AMD isn't having issues anywhere close to that. Outliers at best. I don't hear of any such problems happening at Microcenter, or Amazon, or Newegg, as examples. All the major players are having zero problems pushing cards.
 
It doesn't affect the AIBs either
I mean, only if you take a short-term view of less than a month; it could affect their very next order, and direct opportunity cost is an effect.

However AMD isn't having issues anywhere close to that. Outliers at best. I don't hear of any such problems happening at Microcenter, or Amazon, or Newegg as an example.
I'm not sure what "hearing about it" would look like, or what problems pushing cards would look like, either.

The only clues we have would be price and how often cards go into backorder?

https://www.newegg.com/asrock-radeo..._6800xt-_-14-930-049-_-Product&quicklink=true

6800XTs are now down to $535 on Newegg in a world where the cheapest 3080 is $700 on Newegg. What would having problems pushing 6800XTs look like? $450?
 
I mean, only if you take a short-term view of less than a month; it could affect their very next order, and direct opportunity cost is an effect.


I'm not sure what "hearing about it" would look like, or what problems pushing cards would look like, either.

The only clues we have would be price and how often cards go into backorder?

https://www.newegg.com/asrock-radeon-rx-6800-xt-rx6800xt-pgd-16go/p/N82E16814930049?Description=RX 6800xt&cm_re=RX_6800xt-_-14-930-049-_-Product&quicklink=true

6800XTs are now down to $535 on Newegg in a world where the cheapest 3080 is $700 on Newegg. What would having problems pushing 6800XTs look like? $450?
Well, that and leaked information. The major reason nVidia's pricing is higher is that it's artificial. They wanted to reap more of the profit during the silicon shortage, jacking up prices so that they could enrich themselves and keep money out of scalpers' hands, or as much of it as possible anyway. Profit that has been revealed by AIB partners (such as EVGA), nVidia wants to hoard for themselves and not pass on to anyone else: not the AIBs, and not the retailers. And when the margin plummets, it's clear that nVidia is happy to let their AIBs twist in the wind, keeping the entire margin they already collected from them and simply letting the AIBs deal with it. Hence stacks of 3000-series cards sitting around at high prices, AIB partners being more than a bit shy about re-upping on new parts, and EVGA literally leaving the market.

AMD's prices are more reactive, yes, but this isn't evidence of a lack of sales, or profit for that matter. 6900XT pricing is commensurate with a new gen dropping as far as I can see, and looking at one outlier card doesn't tell me anything at all. There are no other 6900XT cards available anywhere close to that price. I suppose you disagree. AMD is going to launch new product today, and I'm sure it will sit at the top of their pricing stack.

Feel free to check-in with FrgMstr about it. I'm happy to take him at his word that all AMD cards will sell no matter what, but also more than that trust what I see.
 
Feel free to check-in with @FrgMstr about it. I'm happy to take him at his word that all AMD cards will sell no matter what, but also more than that trust what I see.
All the new cards from the cut-down production run they will make will sell, that much is certain (we live in a market where $2,000 4090s sold quickly and PS5s are still virtually impossible to buy online), but that doesn't mean the number of cards AMD sells over the next 5 years will be unaffected.

https://www.tomshardware.com/news/amd-apple-nvidia-reportedly-reducing-5nm-tsmc-orders
https://wccftech.com/tsmc-will-soldier-on-to-growth-despite-amd-nvidias-troubles-says-wedbush/

The same goes for the idea that mindshare does not matter: it affects the price point they will be able to reach.

With 6900XTs going out at $655, your $1,100 new video card will need to be quite something.
 
I know that some of what is being presented is not that exciting to many viewers, but…AMD really needs to tell their presenters to have a bit more enthusiasm.
 