Join us on November 3rd as we unveil AMD RDNA™ 3 to the world!

The last gen was a total shit show: there was no stable price/performance metric to measure against, so the only real test was "does it work for you at that price?" It's not like you couldn't buy the cards at any point; it was just a matter of what you were willing to pay at a given moment.
Regardless of price they competed well and had an answer to almost everything Nvidia offered. I've seen multiple people notice no real difference between 3090/6900xt.
 
Regardless of price they competed well and had an answer to almost everything Nvidia offered. I've seen multiple people notice no real difference between 3090/6900xt.
Once you get into the halo cards, unless you are trying to push 4K with ray tracing and all the eye candy, you aren't going to see much of a difference. For all of us normal people still at 1080p and 1440p they are all functional overkill past a point, and you really won't see that difference.
Let them both battle for 4K bragging rights, and that's great and all, but it's everything below that which matters, and they both have good offerings there that easily trade back and forth.
 
Regardless of price they competed well and had an answer to almost everything Nvidia offered. I've seen multiple people notice no real difference between 3090/6900xt.
The issue here is with Mindshare: It's a real thing.

I've worked out of a B&M computer shop for over ten years (only on the weekends now). These last few generations have been the worst for AMD cards. Even during the crypto craze, when we had 30+ Nvidia GeForce cards on the shelf flying out the door at exorbitant prices, we maybe had one AMD card, and it would sit there for weeks before we used it for a repair or something. Right now we have several different models from the 1650 all the way to the 4090, over 50 cards on the shelf. ONE of them is an RX 6600, and it's been there for 4 weeks.

Regardless of how well they compete, people (at least in my experience) don't want them.

There are tons of potential reasons why, ranging from the Ferrari effect, add-in feature integration, crowd-think, etc., but the truth is that people just buy GeForce cards.

To clarify, back during the Hawaii days of the 290X, it was still heavily Nvidia-biased, but we had plenty of AMD cards for sale.
 
What I gather from this video is that Navi 31 is going to be a show stopper when it comes to pure raster performance and upset a few Nvidia fans when it comes to availability.
At the same time, Lovelace's L2 has got to be faster than Navi 31's Infinity Fabric since it's on die and not strung out in chiplets.
Jim also seems to be acquiring his information from another site. I find it both amusing, and informative. YMMV.

 
Tbh a lot of people believed (and still believe) that the 5700 XT was supposed to compete with the 2080.
Which is funny, because they made it perfectly clear before launch that they weren't doing a high-end or halo card that gen; even if they hadn't, the model number and pricing made that obvious.

I think a safe bet is that the 7000 series will be similar to the 6000 series: likely competitive, but trailing Nvidia slightly in raster performance and more in RT, with prices reflecting that. It sounds like AMD's design is cheaper to produce, which means they could get more aggressive with their pricing if they wanted (and Nvidia left them a lot of room there too), but I doubt they'll have the supply to do that without just leaving money on the table, which no company is going to do. They will still likely be a relatively good value if you care more about raster performance than RT and DLSS.

Outside of a surprise (good or bad), I'm curious to see if they've improved their deficit at 4K, since we're finally getting to the point that high-end cards can reasonably handle it, at least until RT comes into the picture. I'm at 1440p currently, but I wouldn't mind bumping up to 4K if I had the GPU power to properly handle it.
 
He's pointing out how AMD GPUs are overhyped before release. Way too often the Radeon rumors over-promise and under-deliver. AMD themselves have over-hyped their next release just to fall flat on their faces; lest we forget the Radeon Rebellion.
Lest we forget the last two AMD GPU releases that did not do that, and there's no indicator that it's about to happen this round. If anything, there has been literal silence in that regard.
 
Outside of a surprise (good or bad), I'm curious to see if they've improved their deficit at 4K, since we're finally getting to the point that high-end cards can reasonably handle it, at least until RT comes into the picture. I'm at 1440p currently, but I wouldn't mind bumping up to 4K if I had the GPU power to properly handle it.
I am guessing they have made improvements in 4K performance. Last gen the high end was locked to a 256-bit bus; this time they have bumped it up. Also consider the evolution of the cache they pioneered last round. To me it seems obvious they have prioritized 4K+ performance, a last-gen weakness.
 
I could just be out of the loop, but I am not sure how AMD could have generated significantly less hype, pre-November 3rd at least. When they do talk about performance, I expect it will be with a list of actual titles that isn't cherry-picked to look good and without massaged numbers, as they seem to have been doing in recent times.
 
Which is funny, because they made it perfectly clear before launch that they weren't doing a high-end or halo card that gen; even if they hadn't, the model number and pricing made that obvious.

I think a safe bet is that the 7000 series will be similar to the 6000 series: likely competitive, but trailing Nvidia slightly in raster performance and more in RT, with prices reflecting that. It sounds like AMD's design is cheaper to produce, which means they could get more aggressive with their pricing if they wanted (and Nvidia left them a lot of room there too), but I doubt they'll have the supply to do that without just leaving money on the table, which no company is going to do. They will still likely be a relatively good value if you care more about raster performance than RT and DLSS.

Outside of a surprise (good or bad), I'm curious to see if they've improved their deficit at 4K, since we're finally getting to the point that high-end cards can reasonably handle it, at least until RT comes into the picture. I'm at 1440p currently, but I wouldn't mind bumping up to 4K if I had the GPU power to properly handle it.
Agreed on the 4K part. I have a G9 that mostly runs Linux - sometimes games - and I'd really like a 4K-capable AMD card in there.
 
AMD will deliver in all the ways it needs to; I am sure the raster performance will be awesome, and the power-to-performance curve will be great.
Very few people give a shit about power to performance ratios with desktop hardware. What matters is which card is the fastest. This is what drives perception of the brand and the product line.
It was a really good card that still lost out to the 3080, which, all things considered, was the "cheaper" option. But the last gen was a wash; nothing was really available, so the good card was the one you could actually get.
This is precisely why I think AMD once again over promised and under delivered. It's not that the 6900XT was a bad card by any means. Far from it. However, it often lost out to the cheaper RTX 3080. Granted, the RTX 3080 had poor availability. It seemed clear to me that NVIDIA concentrated on selling 3090's and created just enough 3080's to convince people they were a real thing. At the end of the day what you might have opted for came down to how far you were willing to get bent over for the card you wanted or luck of the draw.
 
Once you get into the halo cards, unless you are trying to push 4K with ray tracing and all the eye candy, you aren't going to see much of a difference. For all of us normal people still at 1080p and 1440p they are all functional overkill past a point, and you really won't see that difference.
Let them both battle for 4K bragging rights, and that's great and all, but it's everything below that which matters, and they both have good offerings there that easily trade back and forth.
This simply isn't true. The raster performance was slightly worse and at 4K, that matters. The ray tracing was considerably worse and the 6000 series launched with no answer to DLSS 2.0. For people who buy halo cards and do 4K gaming, the 6900XT wasn't nearly as attractive.
 
It seemed clear to me that NVIDIA concentrated on selling 3090's and created just enough 3080's to convince people they were a real thing. At the end of the day what you might have opted for came down to how far you were willing to get bent over for the card you wanted or luck of the draw.
They only sold 5.25 times more 3080s/3080 Tis than 3090s according to the Steam hardware survey (and I imagine more 3090s end up in non-Steam boxes), while the demand must have been more like 12:1 or something of the sort...

Wouldn't the 6800 XT be the much better comparison to a 3080? Neither side offered particularly interesting performance per dollar above that; the xx90s were arguably bad cards for the price for gaming, like top-of-the-line CPUs tend to be, but it has been the "norm" for the last few added fps to cost a lot.
 
The issue here is with Mindshare: It's a real thing.

I've worked out of a B&M computer shop for over ten years (only on the weekends now). These last few generations have been the worst for AMD cards. Even during the crypto craze, when we had 30+ Nvidia GeForce cards on the shelf flying out the door at exorbitant prices, we maybe had one AMD card, and it would sit there for weeks before we used it for a repair or something. Right now we have several different models from the 1650 all the way to the 4090, over 50 cards on the shelf. ONE of them is an RX 6600, and it's been there for 4 weeks.

Regardless of how well they compete, people (at least in my experience) don't want them.

There are tons of potential reasons why, ranging from the Ferrari effect, add-in feature integration, crowd-think, etc., but the truth is that people just buy GeForce cards.

To clarify, back during the Hawaii days of the 290X, it was still heavily Nvidia-biased, but we had plenty of AMD cards for sale.
Intel held this very same mind-wash. It took Zen's big core push and great prices to sway things to the point they're at now. I don't know if AMD is capable of that kind of sway with GPUs, as Nvidia has pushed heavily into RT and DLSS(*) to draw attention away from power usage and raster performance, which AMD, based on the specs, should clearly lead in this new gen.
 
Intel held this very same mind-wash. It took Zen's big core push and great prices to sway things to the point they're at now. I don't know if AMD is capable of that kind of sway with GPUs, as Nvidia has pushed heavily into RT and DLSS(*) to draw attention away from power usage and raster performance, which AMD, based on the specs, should clearly lead in this new gen.
The issue is that Nvidia, despite their flaws, have never gotten complacent.
 
This simply isn't true. The raster performance was slightly worse and at 4K, that matters. The ray tracing was considerably worse and the 6000 series launched with no answer to DLSS 2.0. For people who buy halo cards and do 4K gaming, the 6900XT wasn't nearly as attractive.
That's what I was saying: unless you are playing at 4K you won't see that difference. Either of those cards running 1440p or 1080p is going to crush it, and you aren't going to be GPU-limited there at all.
 
The issue here is with Mindshare: It's a real thing.

I've worked out of a B&M computer shop for over ten years (only on the weekends now). These last few generations have been the worst for AMD cards. Even during the crypto craze, when we had 30+ Nvidia GeForce cards on the shelf flying out the door at exorbitant prices, we maybe had one AMD card, and it would sit there for weeks before we used it for a repair or something. Right now we have several different models from the 1650 all the way to the 4090, over 50 cards on the shelf. ONE of them is an RX 6600, and it's been there for 4 weeks.

Regardless of how well they compete, people (at least in my experience) don't want them.
...

Your experience pretty much lines up with the Steam surveys. According to those, none of AMD's cards are increasing their already tiny share of the user base, and there isn't an AMD card to be seen in the top fifteen.

Source: https://store.steampowered.com/hwsurvey/videocard/?sort=chg
 
He's pointing out how AMD GPUs are overhyped before release. Way too often the Radeon rumors over-promise and under-deliver. AMD themselves have over-hyped their next release just to fall flat on their faces; lest we forget the Radeon Rebellion.

Last time I checked, the 6900XT came in where most people expected it to, or slightly better. Ray tracing is really the only area where AMD underperformed Nvidia by a margin people might have cared about last generation.
 
Very few people give a shit about power to performance ratios with desktop hardware. What matters is which card is the fastest. This is what drives perception of the brand and the product line.

This is precisely why I think AMD once again over promised and under delivered. It's not that the 6900XT was a bad card by any means. Far from it. However, it often lost out to the cheaper RTX 3080. Granted, the RTX 3080 had poor availability. It seemed clear to me that NVIDIA concentrated on selling 3090's and created just enough 3080's to convince people they were a real thing. At the end of the day what you might have opted for came down to how far you were willing to get bent over for the card you wanted or luck of the draw.
That "often lost out to" is benchmark brain. Gamers are a bad bunch for looking at things that don't apply to them. According to Steam, 65% of users are still at 1080p... 13.5% are at 1440p. 2% game at 4K. Gamers are great at repeating YouTube benchmark runners... "The 3080 is 5% faster at 4K!" Never mind that it's 5% slower at the resolution they are actually using. Of course most of us understand the last gen was a wash performance-wise. There has probably never been a generation as equally matched in terms of FPS for the dollar... at least in the upper mid-range and up, and not counting AMD's better discounts on old product now.

As far as power draw goes. I agree most Desktop gamers don't in general care. We'll have to see what AMD really has this gen... and if the rumors of sub 400 watts are real. I would say although most gamers don't care... if gamers are afraid of one brand setting their homes on fire they may actually care. That may be unlikely and a silly thing to worry about (maybe?) but in all seriousness if this gen... one brand is plug and go, and the other means also replacing your PSU. That is something most gamers are likely to consider.
 
That "often lost out to" is benchmark brain. Gamers are a bad bunch for looking at things that don't apply to them. According to Steam, 65% of users are still at 1080p... 13.5% are at 1440p. 2% game at 4K. Gamers are great at repeating YouTube benchmark runners... "The 3080 is 5% faster at 4K!" Never mind that it's 5% slower at the resolution they are actually using. Of course most of us understand the last gen was a wash performance-wise. There has probably never been a generation as equally matched in terms of FPS for the dollar... at least in the upper mid-range and up, and not counting AMD's better discounts on old product now.

As far as power draw goes. I agree most Desktop gamers don't in general care. We'll have to see what AMD really has this gen... and if the rumors of sub 400 watts are real. I would say although most gamers don't care... if gamers are afraid of one brand setting their homes on fire they may actually care. That may be unlikely and a silly thing to worry about (maybe?) but in all seriousness if this gen... one brand is plug and go, and the other means also replacing your PSU. That is something most gamers are likely to consider.
I'm certainly in the running for AMD this gen. 1440p, don't care about RT, don't want to pay nvidia premium.
 
Very few people give a shit about power to performance ratios with desktop hardware. What matters is which card is the fastest. This is what drives perception of the brand and the product line.

This is precisely why I think AMD once again over promised and under delivered. It's not that the 6900XT was a bad card by any means. Far from it. However, it often lost out to the cheaper RTX 3080. Granted, the RTX 3080 had poor availability. It seemed clear to me that NVIDIA concentrated on selling 3090's and created just enough 3080's to convince people they were a real thing. At the end of the day what you might have opted for came down to how far you were willing to get bent over for the card you wanted or luck of the draw.

I think whether the 6900XT won or lost depended on the settings used to benchmark it and the game. In some games it easily outperformed the 3080. The bigger problem was availability: AMD was capacity-constrained, and that was the only thing limiting their sales. Nvidia didn't have that issue.
 
I think whether the 6900XT won or lost depended on the settings used to benchmark it and the game. In some games it easily outperformed the 3080. The bigger problem was availability: AMD was capacity-constrained, and that was the only thing limiting their sales. Nvidia didn't have that issue.
Um... both were pretty damned constrained, especially with miners last cycle.
 
Um... both were pretty damned constrained, especially with miners last cycle.

Nah, AMD was far more constrained since their CPUs were on the same process; Nvidia, on the other hand, had everything Samsung could produce. AMD is a bit more free this coming generation for GPU production, but it still won't be on the same scale as Nvidia.
 
Nah, AMD was far more constrained since their CPUs were on the same process; Nvidia, on the other hand, had everything Samsung could produce. AMD is a bit more free this coming generation for GPU production, but it still won't be on the same scale as Nvidia.
Oh I don't disagree, but you said that Nvidia didn't have that issue - they did, it's just that it was a different foundry that couldn't produce more (whether the RAM or the die itself).
 
The last gen was a total shit show: there was no stable price/performance metric to measure against, so the only real test was "does it work for you at that price?" It's not like you couldn't buy the cards at any point; it was just a matter of what you were willing to pay at a given moment.

Why are you mixing this up with the mining boom and scalpers selling cards? The argument that RDNA2 was overhyped is insane. All the way up to launch there were leaks that it was only going to compete with the 3070, lmao. After Raja left, AMD has been pretty spot-on with their improvement claims. They under-promised and over-delivered with RDNA2. I am sure they will keep it the same with RDNA3.

Pricing was f'ed for both sides, not just AMD.
 
AMD isn't going to be able to undercut NVIDIA by much if they want to make any money.
We'll get a better idea tomorrow. They should be able to undercut by quite a bit if they CHOOSE to, anyway.
Nvidia this gen has a big fat monolithic die. AMD is using a much smaller chiplet design on the latest, greatest fab... while other parts are on cheaper, less defect-prone processes.

Analog bits like memory controllers... and cache bits get almost zero benefit from die shrinks. The difference between even 12nm and 6nm for memory controllers is basically zero. Logic is the only part of the chip that is really affected by a die shrink.

Not that it matters: even if AMD is paying 50% less for every completed chip... they may still just follow Nvidia anyway. They do have the room this gen, though, to put the boots to Nvidia on price if they choose to. IMO they should; it's probably the last real opportunity they will have to do that, as Nvidia will be going to a chiplet design next gen as well. Hit them in the pocket and get some market share while that is an option. If I was betting I would say 20/20/60... a 20% chance they follow Nvidia to excessive pricing, a 20% chance they go for their neck and try to make everything on the market look like a terrible deal, and then the most likely option: that they price just enough under that they can flash marketing slides about how great a deal they are vs Nvidia... while most people still say "ehh, the Nvidia tax is worth it." I hope they choose to go hard now while they can... but 20% is my odds on that.
 
As far as power draw goes. I agree most Desktop gamers don't in general care. We'll have to see what AMD really has this gen... and if the rumors of sub 400 watts are real. I would say although most gamers don't care... if gamers are afraid of one brand setting their homes on fire they may actually care. That may be unlikely and a silly thing to worry about (maybe?) but in all seriousness if this gen... one brand is plug and go, and the other means also replacing your PSU. That is something most gamers are likely to consider.
I would not expect particularly big wattage numbers from that 379mm² AD103 and the lower dies on the Nvidia side (current specs seem to be 320W maximum); it should be quite comparable to the previous generation.

I very much doubt the 4080 adapter will have the same issue. If we're talking about the niche 7900 XTX vs 4090, at those price points people seem ready to spend a lot to make things work, as they did for SLI or do now for giant cards; in the low to high range the issue would be bigger for a larger % of buyers.
 
We'll get a better idea tomorrow. They should be able to undercut by quite a bit if they CHOOSE to, anyway.
Nvidia this gen has a big fat monolithic die. AMD is using a much smaller chiplet design on the latest, greatest fab... while other parts are on cheaper, less defect-prone processes.

Analog bits like memory controllers... and cache bits get almost zero benefit from die shrinks. The difference between even 12nm and 6nm for memory controllers is basically zero. Logic is the only part of the chip that is really affected by a die shrink.

Not that it matters: even if AMD is paying 50% less for every completed chip... they may still just follow Nvidia anyway. They do have the room this gen, though, to put the boots to Nvidia on price if they choose to. IMO they should; it's probably the last real opportunity they will have to do that, as Nvidia will be going to a chiplet design next gen as well. Hit them in the pocket and get some market share while that is an option. If I was betting I would say 20/20/60... a 20% chance they follow Nvidia to excessive pricing, a 20% chance they go for their neck and try to make everything on the market look like a terrible deal, and then the most likely option: that they price just enough under that they can flash marketing slides about how great a deal they are vs Nvidia... while most people still say "ehh, the Nvidia tax is worth it." I hope they choose to go hard now while they can... but 20% is my odds on that.
AMD is on the same overpopulated node that NVIDIA is on. Going the chiplet route may have saved them some money compared to NVIDIA's monolithic chips, but I can't imagine it being much.
 
IMO they should; it's probably the last real opportunity they will have to do that, as Nvidia will be going to a chiplet design next gen as well.
I kind of doubt that's certain; with how much improvement they achieve and how much they seem to be able to charge for cards, they're probably going to TSMC 3.

If they can get away with making the GPU small enough and TSMC yields stay good enough, I could see a monolithic die again; it is probably very hard to pull off AMD's chiplet design, and we can assume all that experience, learning, and work on the CPU side contributed.
 
I think whether the 6900XT won or lost depended on the settings used to benchmark it and the game. In some games it easily outperformed the 3080. The bigger problem was availability: AMD was capacity-constrained, and that was the only thing limiting their sales. Nvidia didn't have that issue.
You are correct in that the 6900XT was often faster than the RTX 3080. It was even faster than the RTX 3090 in some rare cases, but it was never a 3080 killer, much less a threat to the RTX 3090. Availability of the 6900XT was often better than NVIDIA's offerings, through scalpers and even my local Microcenter, which tells me that there wasn't as much demand for the 6900XT. After availability started to improve, 6900XTs were everywhere while RTX 3090s and RTX 3080s still couldn't be found. Again, despite capacity differences, the 6900XT was the more available card.

Here is the thing: I am literally the target audience for this stuff. I game at 4K and I do not like turning settings down if it can technically be avoided by throwing more money at the problem. A 5-10% performance difference at 4K makes a bigger difference than it does at lower resolutions where you are getting 200FPS. Today, there are some games that push an RTX 3090 enough that it drops well below 100FPS. I used to buy Titans in pairs, or whatever Ti model was at the top of the stack. For me, cost has rarely been a significant factor when choosing a GPU. Now, if the performance difference is well under 10% and the cost difference is around a third less, then I might give the cheaper option serious consideration. Aside from that, the card that's slower doesn't make a sale with someone like me.

However, it wasn't quite that close a lot of the time and then there was essentially dismal ray tracing performance. Cyberpunk 2077 was a game I was playing a lot at the time and that was unacceptable. Given the availability problems and price gouging, I considered both but I really wanted to go with the RTX 3090 for its ray tracing performance. Yeah, a lot of people will tell you that not enough games implement it (even today) but I never want to be left with that "should have bought a V8" feeling if some game I want to play happens to have that feature. I always regret a purchase like that more than I would in a case where I may have over spent for a small performance increase over the slower option.

For normal consumers, price/performance matters. In that, AMD can compete with NVIDIA. AMD often has cards that may be faster than their NVIDIA counterparts, but it's often a product of aggressive pricing strategies rather than technical or performance superiority. That's something we've rarely ever seen from AMD over the years. Hell, back when we saw it with ATi's 9700 Pro it was a bit of a fluke; ATi had to buy another company in order to get the technology to make that happen.

I wish it weren't this way. What we need is feature and performance parity where you could flip a coin and be happy with either one. This would force AMD and NVIDIA to compete in price and units sold. That's the best scenario as it favors the consumer. For most people, NVIDIA pretty much sets the standard and you just have to hope AMD is competitive enough to force some price adjustments on NVIDIA's part which allows you to get the most for your money. At the top of the stack, it's virtually hopeless. AMD always claims superiority, parity, or whatever and under delivers. It's rare that AMD does anything that would force NVIDIA to rethink its pricing on flagship cards like the RTX 4090.
 
AMD is on the same overpopulated node that NVIDIA is on. Going the chiplet route may have saved them some money compared to NVIDIA's monolithic chips, but I can't imagine it being much.
From my understanding, NVIDIA is on TSMC 4nm; AMD is on a mix of a 5nm GCD and 6nm MCDs.
 
AMD is on the same overpopulated node that NVIDIA is on. Going the chiplet route may have saved them some money compared to NVIDIA's monolithic chips, but I can't imagine it being much.


Savings per wafer? Only however much of a discount they get for being TSMC's second-largest customer. Savings per die? A decent amount, likely, plus the decreased likelihood of a defect rendering a die useless.
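To put rough numbers on that per-die saving, here's a back-of-the-envelope sketch using the standard dies-per-wafer approximation and a simple Poisson yield model. The wafer prices, defect density, and die areas below are assumptions picked for illustration, not actual TSMC pricing or real AMD/Nvidia die sizes.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Standard approximation for usable dies on a circular wafer
    r = wafer_diameter_mm / 2
    return math.floor(math.pi * r ** 2 / die_area_mm2
                      - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_rate(die_area_mm2, defects_per_mm2):
    # Simple Poisson yield model: probability a die has zero killer defects
    return math.exp(-defects_per_mm2 * die_area_mm2)

def cost_per_good_die(die_area_mm2, wafer_cost, defects_per_mm2):
    good_dies = dies_per_wafer(die_area_mm2) * yield_rate(die_area_mm2, defects_per_mm2)
    return wafer_cost / good_dies

# All numbers below are assumptions for illustration, not real pricing or yields.
D0 = 0.0007        # ~0.07 defects per cm^2, expressed per mm^2
N5_WAFER = 17000   # assumed $ per leading-edge wafer
N6_WAFER = 10000   # assumed $ per older-node wafer

mono = cost_per_good_die(600, N5_WAFER, D0)      # one big monolithic die
gcd  = cost_per_good_die(300, N5_WAFER, D0)      # smaller graphics chiplet on the new node
mcds = 6 * cost_per_good_die(37, N6_WAFER, D0)   # six cache/memory chiplets on the older node
print(f"monolithic: ~${mono:.0f}   chiplet: ~${gcd + mcds:.0f} (before packaging)")
```

Even with made-up numbers, the shape of the result is the point: shrinking the leading-edge die and pushing the cache/memory dies to a cheaper node roughly halves the silicon cost per good part under these assumptions, which is exactly the "defect rendering a die useless" effect above.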
 
You are correct in that the 6900XT was often faster than the RTX 3080. It was even faster than the RTX 3090 in some rare cases, but it was never a 3080 killer, much less a threat to the RTX 3090. Availability of the 6900XT was often better than NVIDIA's offerings, through scalpers and even my local Microcenter, which tells me that there wasn't as much demand for the 6900XT. After availability started to improve, 6900XTs were everywhere while RTX 3090s and RTX 3080s still couldn't be found. Again, despite capacity differences, the 6900XT was the more available card.

Here is the thing: I am literally the target audience for this stuff. I game at 4K and I do not like turning settings down if it can technically be avoided by throwing more money at the problem. A 5-10% performance difference at 4K makes a bigger difference than it does at lower resolutions where you are getting 200FPS. Today, there are some games that push an RTX 3090 enough that it drops well below 100FPS. I used to buy Titans in pairs, or whatever Ti model was at the top of the stack. For me, cost has rarely been a significant factor when choosing a GPU. Now, if the performance difference is well under 10% and the cost difference is around a third less, then I might give the cheaper option serious consideration. Aside from that, the card that's slower doesn't make a sale with someone like me.

However, it wasn't quite that close a lot of the time and then there was essentially dismal ray tracing performance. Cyberpunk 2077 was a game I was playing a lot at the time and that was unacceptable. Given the availability problems and price gouging, I considered both but I really wanted to go with the RTX 3090 for its ray tracing performance. Yeah, a lot of people will tell you that not enough games implement it (even today) but I never want to be left with that "should have bought a V8" feeling if some game I want to play happens to have that feature. I always regret a purchase like that more than I would in a case where I may have over spent for a small performance increase over the slower option.

For normal consumers, price/performance matters. In that, AMD can compete with NVIDIA. AMD often has cards that may be faster than their NVIDIA counterparts, but it's often a product of aggressive pricing strategies rather than technical or performance superiority. That's something we've rarely ever seen from AMD over the years. Hell, back when we saw it with ATi's 9700 Pro it was a bit of a fluke; ATi had to buy another company in order to get the technology to make that happen.

I wish it weren't this way. What we need is feature and performance parity where you could flip a coin and be happy with either one. This would force AMD and NVIDIA to compete in price and units sold. That's the best scenario as it favors the consumer. For most people, NVIDIA pretty much sets the standard and you just have to hope AMD is competitive enough to force some price adjustments on NVIDIA's part which allows you to get the most for your money. At the top of the stack, it's virtually hopeless. AMD always claims superiority, parity, or whatever and under delivers. It's rare that AMD does anything that would force NVIDIA to rethink its pricing on flagship cards like the RTX 4090.

The problem I have with that final outlook is that you're hoping AMD can compete at the same level and feature set so you can get an Nvidia card cheaper. This is why AMD does not want to be in that situation, as it gains them no market share; it punishes AMD for competing on the same level. The only thing I felt AMD under-delivered on was ray tracing; past that, the performance difference was minor at the top of the stack. But this last generation the market was distorted by mining. The reality is that halo cards matter very little in the total number of cards sold; $300 and under is where 90% of the market is, and that area is being ignored by both companies at the moment. We'll find out more here shortly.
 
The problem I have with that final outlook is that you're hoping AMD can compete at the same level and feature set so you can get an Nvidia card cheaper.
No, I'm not. If AMD can produce a faster card with feature parity, I'd buy that just as easily. I've had many AMD cards over the years as well as NVIDIA. I'm simply betting that it's more likely that AMD can make a card just fast enough to force NVIDIA to drop its pricing a bit. Had the mining demand not been so strong, the $500 cheaper MSRP of the 6900XT might have forced NVIDIA to drop the price on the RTX 3090 or the subsequent 3080 Ti. Not everyone cared about ray tracing performance, but it might have gone that way depending on how much weight individuals gave to it. That said, I think a lot of people think more like I do when shopping at the top of the stack.
This is why AMD does not want to be in that situation, as it gains them no market share; it punishes AMD for competing on the same level. The only thing I felt AMD under-delivered on was ray tracing; past that, the performance difference was minor at the top of the stack.
Again, at 4K in some games the small percentage difference is more pronounced than it is at lower resolutions where you've essentially got frames to spare. That said, I mostly agree with you. AMD didn't falter that much on the 6900XT given its much lower MSRP, but the poor ray tracing performance absolutely killed it on the high end for buyers like me. Like it or not, individual games often sell GPUs. At the time, that game was Cyberpunk 2077.
But this last generation the market was distorted by mining. The reality is that halo cards matter very little in the total number of cards sold; $300 and under is where 90% of the market is, and that area is being ignored by both companies at the moment. We'll find out more here shortly.
That's why I was talking about AMD competing aggressively on price. When they do it right, AMD can sometimes offer more performance for the money in the more mainstream segments where the bulk of people buy their cards.
 
AMD is on the same overpopulated node that NVIDIA is on. Going the chiplet route may have saved them some money compared to NVIDIA's monolithic chips, but I can't imagine it being much.
Only half the chip is on that node... that is the point. The memory controllers (and perhaps other bits... hopefully they talk about it tomorrow) are on a cheaper, older node. Memory controllers on 12nm are going to be at most a few percentage points slower than on 4-5nm. I am not sure exactly what nodes the analog bits are on... but it's going to be much less populated and cheaper. Of the 80 billion transistors or so on a monolithic GPU, a good chunk are devoted to stuff that gets zero benefit from the shrink. That is the main advantage of splitting things up.
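A minimal sketch of that area-split argument follows. The 65/35 logic-to-analog split, the die size, and the per-mm² wafer costs are all assumptions for illustration, not figures for any real chip; the point is simply that only the logic has to be billed at leading-edge prices once the non-shrinking bits move to the older node.

```python
# Illustrative split of a hypothetical ~530 mm^2 GPU into logic vs analog/IO/cache area.
# The 65/35 split, die size, and per-mm^2 wafer costs are assumptions, not measured values.
N5_COST_PER_MM2 = 0.24   # assumed $ per usable mm^2 on the leading-edge node
N6_COST_PER_MM2 = 0.14   # assumed $ per usable mm^2 on the older, cheaper node

total_area  = 530.0
logic_area  = 0.65 * total_area   # scales well with the shrink
analog_area = 0.35 * total_area   # memory PHYs / cache: barely shrinks at all

monolithic = total_area * N5_COST_PER_MM2
split      = logic_area * N5_COST_PER_MM2 + analog_area * N6_COST_PER_MM2

print(f"all on the new node: ~${monolithic:.0f} of silicon")
print(f"analog moved out:    ~${split:.0f} of silicon (before packaging)")
```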
 