AMD's best Big Navi GPU may only be a match for Nvidia Ampere's second tier

erek

"Coreteks' AIB sources also suggest there will be a pair of Big Navi GPUs at launch, based on the same Sienna Cichlid silicon. That's not much of a stretch, as this is classic AMD graphics card practice: its very own rule of two. This typically sees a cut-down card launching alongside the full-fat option. Just think about the RX 5700 XT and RX 5700 last year, RX 580 and RX 570, R9 390X and R9 390… and on and on.

And traditionally we end up recommending the lower-spec one, as its lower price and generally only-very-slight spec cut make it the better-value card by comparison, often offering very similar performance.

It looks like those two Sienna Cichlid cards will be the only Big Navi GPUs we see this year too, if this report is to be believed. A second fish-based codename, Navy Flounder, has been tied to the new RDNA 2 architecture, and is reportedly a mid-range version of the Big Navi GPU slated for release in the first three months of 2021.

With only the high-end market set to be served at the tail-end of this year, it seems almost certain that most of us are going to have to wait until next year to see anything more affordable than the ultra-enthusiast, ultra-expensive GeForce and Radeon cards."


https://www.pcgamer.com/amd-big-navi-rdna-2-aimed-at-rtx-3080/
 
Hopes and dreams about Big Navi currently dashed, but I was hopeful

Big not so big Navi

So, do we have any actual evidence of this one way or the other? Obviously not, so what are you guys going on about with rumors? After all, wasn't there a rumor that Zen 3 wasn't going to be released until next year, which turned out to be complete bunk?
 
So, do we have any actual evidence of this one way or the other? Obviously not, so what are you guys going on about with rumors? After all, wasn't there a rumor that Zen 3 wasn't going to be released until next year, which turned out to be complete bunk?
Just made a post about the 3090 series as well, for your reference if interested.
 
So, do we have any actual evidence of this one way or the other? Obviously not, so what are you guys going on about with rumors? After all, wasn't there a rumor that Zen 3 wasn't going to be released until next year, which turned out to be complete bunk?

Up to 15% performance improvement seems a reasonable expectation. Fingers crossed that AMD can disrupt on price.

A 50% improvement over a 2080 Ti sounds fanciful given the die sizes AMD has chosen (and the power-efficiency challenges of their previous architectures).

But as you said, these are all rumors as of now...
 
Wait, another RTG product that's going to be completely medium?

im_shocked.gif
 
With only the high-end market set to be served at the tail-end of this year, it seems almost certain that most of us are going to have to wait until next year to see anything more affordable than the ultra-enthusiast, ultra-expensive GeForce and Radeon cards."
If people haven't realized it yet, AMD has no interest in competing in the PC GPU market. This is just to make investors happy that AMD is trying. AMD is in a unique situation in that anything they do in one market is a conflict of interest in another. That's why in 2013 AMD had the R9 290 and just recycled their previous-gen products by moving them one tier down. They did it again when they released the R9 390 along with the R9 Fury. It wasn't until the RX series that AMD actually released new mid- and low-range products that weren't rebrands. AMD being dominant in the console market means Sony and Microsoft won't be happy seeing AMD make better-priced GPUs for the PC market.

Nvidia, on the other hand, has no problem destroying the console market, like they did with the GTX 970. Once the GTX 970 was released it made the PS4 and XB1 look like toys by comparison. No doubt Nvidia is looking to recreate that difference with the RTX 3000 series. It wouldn't shock me if Nvidia has an RTX 3070 that performs like an RTX 2080 Ti for about $400, while AMD was planning to release RDNA2 for $700 that performs like an RTX 2080 Ti. If the rumors about RDNA2 are true, then the PS5's and Xbox Series X's TFLOPS may be a terrible metric for determining their performance, and they may actually be a lot slower than we anticipated. Look at Halo Infinite's graphics on the Xbox Series X. It's not looking too different from Halo 3.

 
If people haven't realized it yet, AMD has no interest in competing in the PC GPU market. This is just to make investors happy that AMD is trying. AMD is in a unique situation in that anything they do in one market is a conflict of interest in another. That's why in 2013 AMD had the R9 290 and just recycled their previous-gen products by moving them one tier down. They did it again when they released the R9 390 along with the R9 Fury. It wasn't until the RX series that AMD actually released new mid- and low-range products that weren't rebrands. AMD being dominant in the console market means Sony and Microsoft won't be happy seeing AMD make better-priced GPUs for the PC market.

Nvidia, on the other hand, has no problem destroying the console market, like they did with the GTX 970. Once the GTX 970 was released it made the PS4 and XB1 look like toys by comparison. No doubt Nvidia is looking to recreate that difference with the RTX 3000 series. It wouldn't shock me if Nvidia has an RTX 3070 that performs like an RTX 2080 Ti for about $400, while AMD was planning to release RDNA2 for $700 that performs like an RTX 2080 Ti. If the rumors about RDNA2 are true, then the PS5's and Xbox Series X's TFLOPS may be a terrible metric for determining their performance, and they may actually be a lot slower than we anticipated. Look at Halo Infinite's graphics on the Xbox Series X. It's not looking too different from Halo 3.


I don’t disagree with much of what you wrote, but it seems like the issues with Halo Infinite are not hardware-based. There is a lot of chatter coming out that 343i is totally fucked up and the source code/tech is a scrambled mess. Rich at ReviewTech USA did a nice little piece on it a day or two ago.
 
So, do we have any actual evidence of this one way or the other? Obviously not, so what are you guys going on about with rumors? After all, wasn't there a rumor that Zen 3 wasn't going to be released until next year, which turned out to be complete bunk?
Exactly. People are getting whipped into a frenzy faster than a social media post. Such a waste of time and energy. Wait for cards to drop and make your own conclusions. This is all rumor-mongering and hearsay until then.

If people haven't realized it yet, AMD has no interest in competing in the PC GPU market. This is just to make investors happy that AMD is trying. AMD is in a unique situation in that anything they do in one market is a conflict of interest in another. That's why in 2013 AMD had the R9 290 and just recycled their previous-gen products by moving them one tier down. They did it again when they released the R9 390 along with the R9 Fury. It wasn't until the RX series that AMD actually released new mid- and low-range products that weren't rebrands. AMD being dominant in the console market means Sony and Microsoft won't be happy seeing AMD make better-priced GPUs for the PC market.

Nvidia, on the other hand, has no problem destroying the console market, like they did with the GTX 970. Once the GTX 970 was released it made the PS4 and XB1 look like toys by comparison. No doubt Nvidia is looking to recreate that difference with the RTX 3000 series. It wouldn't shock me if Nvidia has an RTX 3070 that performs like an RTX 2080 Ti for about $400, while AMD was planning to release RDNA2 for $700 that performs like an RTX 2080 Ti. If the rumors about RDNA2 are true, then the PS5's and Xbox Series X's TFLOPS may be a terrible metric for determining their performance, and they may actually be a lot slower than we anticipated. Look at Halo Infinite's graphics on the Xbox Series X. It's not looking too different from Halo 3.

Consoles don't really make that much money for GPU manufacturers. It's just another steady source of income. Without even getting too far into it, which do you think makes them more: a console that sells for $500 or a video card that sells for $500? Considering that their APU is just a small part of the console and it's being sold by another company, their "component" in the console is a drop in the bucket in terms of money. If you want to get more into the math of it, on your own time check how many consoles were sold in total last gen versus how many discrete GPUs were sold. Do you think the PS4 made more or less money for AMD than the RX 480/580?

The professional and workstation market far exceeds consoles in every way (size, money, power). And so does the server market. AMD isn't stupid. If they took all the super computer money away from nVidia they'd have a much bigger piece of the pie. If it was possible they'd gladly give nVidia all of the console contracts in exchange. It's an obvious and easy trade. In order to do that they need to have a top end GPU part that is excellent in absolute performance and also perf per watt.

Apply logic here. AMD is a company; selling more is better than selling less. They don't care if consoles die, because their contract is always to supply X number of units (their job is to supply an ordered amount, all of it arranged ahead of time because it utilizes fab space), so it makes no difference whether a given console even sells, because they get paid either way. In other words, selling as many GPUs as possible on top of their console contracts is the best possible outcome for them. Again, that's not even mentioning the top-end server and workstation marketplace.
 
If people haven't realized it yet, AMD has no interest in competing in the PC GPU market. This is just to make investors happy that AMD is trying. AMD is in a unique situation in that anything they do in one market is a conflict of interest in another. That's why in 2013 AMD had the R9 290 and just recycled their previous-gen products by moving them one tier down. They did it again when they released the R9 390 along with the R9 Fury. It wasn't until the RX series that AMD actually released new mid- and low-range products that weren't rebrands. AMD being dominant in the console market means Sony and Microsoft won't be happy seeing AMD make better-priced GPUs for the PC market.

Nvidia, on the other hand, has no problem destroying the console market, like they did with the GTX 970. Once the GTX 970 was released it made the PS4 and XB1 look like toys by comparison. No doubt Nvidia is looking to recreate that difference with the RTX 3000 series. It wouldn't shock me if Nvidia has an RTX 3070 that performs like an RTX 2080 Ti for about $400, while AMD was planning to release RDNA2 for $700 that performs like an RTX 2080 Ti. If the rumors about RDNA2 are true, then the PS5's and Xbox Series X's TFLOPS may be a terrible metric for determining their performance, and they may actually be a lot slower than we anticipated. Look at Halo Infinite's graphics on the Xbox Series X. It's not looking too different from Halo 3.

Not quite sure about "no interest", but AMD has a four-year R&D gap to close against Nvidia, who has larger budgets, larger talent pools, and more cards in the field collecting usage data and metrics to optimize their designs against. Much of AMD's GPU advancement is designed in response to Nvidia's successful work, so in reality they are going to be a generation behind for a significant period. Granted, that isn't bad: not everybody needs, or can afford, top-tier equipment, and unless you are running 4K at 120+ FPS the cheaper cards are probably your better bet. And since AMD isn't investing as much into R&D, they maintain a lower overhead, which immediately takes a chunk off the top of their expenses and means they can offer a cheaper product; right now that may be far more important. 1080p is still the mainstream, and having an abundance of solid 144+ FPS options there matters, as that resolution makes up something like 60% of all the gaming monitors out there. AMD right now is all about returns: they have made fantastic strides paying back their loans, and the sooner those are done and gone, the better a position they can be in.

AMD also finds itself in a unique position: they can't manufacture enough parts to meet their demand, they have no means of increasing their manufacturing capacity, and their Microsoft, Sony, and Lenovo agreements have to come first, otherwise they get hit with very large lawsuits they can ill afford, which means consumers come second. Knowing they have limited manufacturing windows, a limited supply, and large demand, they have to focus on the largest markets where they can get the best returns, which means the upper mid tiers and below. Not producing a monolithic die to try to compete with the 3080 Ti and 3090 cards was a smart move, because they can't afford to waste silicon on difficult, low-yield parts.
Once Samsung gets their new fabs up and running, I could very likely see a situation where they use those fabs for the budget-oriented parts, as those facilities are going to be very close to the TSMC ones but likely cheaper. That would free up those valuable TSMC wafers for larger products, but that is two years out. Until then AMD can continue to design and research, and potentially build things on paper that could meet or beat Nvidia's top-tier offerings, but they will lack the capacity to reasonably build them.
 
If Big Navi is indeed up to 80 CUs, then we might be looking at RTX Titan level or above as far as TFLOPS (16-18 depending on clocks).
So assuming linear scaling (which isn't always the case), it'll be about 0-12.5% faster than the RTX Titan.
That doesn't take into account any IPC or other improvements.
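For anyone who wants to check those figures, here's the napkin math. This is only a sketch: the 80-CU count and clock range are rumored, and it assumes RDNA's 64 shaders per CU with 2 FMA ops per clock.

```python
# FP32 TFLOPS = CUs x 64 shaders/CU x 2 ops/clock (FMA) x clock in GHz / 1000.
# The 80-CU figure and clock range are rumors, not confirmed specs.
def tflops(cus, clock_ghz, shaders_per_cu=64, ops_per_clock=2):
    return cus * shaders_per_cu * ops_per_clock * clock_ghz / 1000

low  = tflops(80, 1.6)    # ~16.4 TFLOPS at a conservative clock
high = tflops(80, 1.8)    # ~18.4 TFLOPS at a Navi-10-like boost clock
# Titan RTX for comparison: 4608 shaders (72 "CUs" worth) at ~1.77 GHz boost
titan = tflops(72, 1.77)  # ~16.3 TFLOPS
```

At the bottom of the clock range you roughly match the Titan RTX on paper, and at the top you land about 13% above it, which is where a 0-12.5%-faster estimate comes from.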
 
Exactly. People are getting whipped into a frenzy faster than a social media post. Such a waste of time and energy. Wait for cards to drop and make your own conclusions. This is all rumor-mongering and hearsay until then.


Consoles don't really make that much money for GPU manufacturers. It's just another steady source of income. Without even getting too far into it, which do you think makes them more: a console that sells for $500 or a video card that sells for $500? Considering that their APU is just a small part of the console and it's being sold by another company, their "component" in the console is a drop in the bucket in terms of money. If you want to get more into the math of it, on your own time check how many consoles were sold in total last gen versus how many discrete GPUs were sold. Do you think the PS4 made more or less money for AMD than the RX 480/580?

The professional and workstation market far exceeds consoles in every way (size, money, power). And so does the server market. AMD isn't stupid. If they took all the super computer money away from nVidia they'd have a much bigger piece of the pie. If it was possible they'd gladly give nVidia all of the console contracts in exchange. It's an obvious and easy trade. In order to do that they need to have a top end GPU part that is excellent in absolute performance and also perf per watt.

Apply logic here. AMD is a company; selling more is better than selling less. They don't care if consoles die, because their contract is always to supply X number of units (their job is to supply an ordered amount, all of it arranged ahead of time because it utilizes fab space), so it makes no difference whether a given console even sells, because they get paid either way. In other words, selling as many GPUs as possible on top of their console contracts is the best possible outcome for them. Again, that's not even mentioning the top-end server and workstation marketplace.
A GPU may sell for $500, but the card's manufacturer gets the lion's share of that. AMD's actual GPU on that $500 card may only have been $100, and of that $100 only $30 is actual profit. So AMD could make $30 off a GPU or $25 off a console. The difference is that AMD knows exactly how many console chips they have to produce: it is a set order, a fixed amount over time, and when the run is done they are left with zero in stock and no excess inventory to write down. This is not true with those GPU chips; if they don't sell, AMD is ultimately left holding the bag, and those write-downs aren't small. So the profit difference between the two is much smaller than most assume.
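The inventory-risk point is easy to put into numbers. A rough sketch using the illustrative $30/$25 figures above (none of these are real AMD financials, and the sell-through rates are made up):

```python
# Expected per-chip profit on a GPU run with sell-through risk: unsold chips
# get written down, so you eat the unit cost on whatever doesn't sell.
def expected_gpu_profit(profit_per_unit, unit_cost, sell_through):
    return sell_through * profit_per_unit - (1 - sell_through) * unit_cost

console_profit = 25  # fixed order: every chip is pre-sold, nothing written down

gpu_all_sold = expected_gpu_profit(30, 70, 1.00)  # 30.0 -- beats the console
gpu_90_sold  = expected_gpu_profit(30, 70, 0.90)  # ~20.0 -- trails it if 10% go unsold
```

Even a modest miss on demand flips which business is more profitable per chip, which is the point about the console contract being the safer revenue.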
 
Another anonymous source. Big Navi reportedly has almost twice as many cores, plus improved IPC and higher clocks. Even taking the worst end of that, it will definitely be faster than a 2080 Ti; I would say 30% or so faster than a 2080 Ti, minimum. I mean, if Nvidia is pushing their cards to 350W stock as rumored, it's because they had to in order to stay ahead. That is probably more telling than anything.
 
A GPU may sell for $500, but the card's manufacturer gets the lion's share of that. AMD's actual GPU on that $500 card may only have been $100, and of that $100 only $30 is actual profit. So AMD could make $30 off a GPU or $25 off a console. The difference is that AMD knows exactly how many console chips they have to produce: it is a set order, a fixed amount over time, and when the run is done they are left with zero in stock and no excess inventory to write down. This is not true with those GPU chips; if they don't sell, AMD is ultimately left holding the bag, and those write-downs aren't small. So the profit difference between the two is much smaller than most assume.
AMD more or less does the same for video cards. AMD isn't a first-party manufacturer, meaning they only need to produce what their board partners order.
The only exceptions are that they did make their own reference Radeon VII, and they do make their own professional-level cards.
And this isn't even addressing how many more video cards are sold compared to consoles (feel free to compare the total number of consoles sold last gen versus graphics cards), your very-sketchy-at-best income figures for both consoles and GPU parts, or the fact that you didn't even bother to talk about market size and the professional market.

You're not addressing the most obvious statements in my post. In order for you to do that, just answer this one question: what makes more money, consoles or workstation/server GPUs? And that's even ignoring desktop parts which is multiplicatively larger.

To that end, Apple sells more iPhones in a year than the total sales of XB1/PS4/Switch combined. The PC desktop market is even bigger than that when looking at OEMs and the fact that every single machine contains graphics components. Making the argument that consoles are more lucrative than anything happening on the PC side is ludicrous. You could argue that low end GPU sales matter more than high end ones. But consoles are a drop in the bucket compared to the ocean and strata of desktop/workstation/server parts. In short: any argument saying that AMD doesn't care about what happens on desktop is coming from a place of ignorance (which was my response to DukeNukemX). My overall point in both this post and the previous post is that console sales matter far less to AMD than anything happening on server/workstation or hell even desktop.
 
A GPU may sell for $500, but the card's manufacturer gets the lion's share of that. AMD's actual GPU on that $500 card may only have been $100, and of that $100 only $30 is actual profit. So AMD could make $30 off a GPU or $25 off a console. The difference is that AMD knows exactly how many console chips they have to produce: it is a set order, a fixed amount over time, and when the run is done they are left with zero in stock and no excess inventory to write down. This is not true with those GPU chips; if they don't sell, AMD is ultimately left holding the bag, and those write-downs aren't small. So the profit difference between the two is much smaller than most assume.

Cards manufacturer as in AIB? Because I very much doubt that. They make around a 44% gross margin; there’s no way they would only make $30 on a $500 GPU, that’s ludicrous. Even taking R&D into account, manufacturing these wouldn’t cost them more than $75-100 on a $500 GPU. Then driver development and marketing would add maybe another $50-100. Add another $50 for the AIB, and worst case it would be $200-250. AMD and Nvidia would’ve gone broke long ago if they only made $30 net on a $500 card.
 
Cards manufacturer as in AIB? Because I very much doubt that. They make around a 44% gross margin, there’s no way they would only make $30 on a $500 gpu, that’s ludicrous.
The AIBs, yes. The cost of manufacturing the physical card is the bulk of the cost; the chip itself is a large part, but not most of it. Look at the 2080 Ti: Nvidia sells the physical GPU on it for ~$300 to the AIBs and, according to Nvidia's financials, makes about $45 profit on it. AMD is not selling the $500 GPU; the AIB is.
 
The AIBs, yes. The cost of manufacturing the physical card is the bulk of the cost; the chip itself is a large part, but not most of it. Look at the 2080 Ti: Nvidia sells the physical GPU on it for ~$300 to the AIBs and, according to Nvidia's financials, makes about $45 profit on it. AMD is not selling the $500 GPU; the AIB is.

Show me a source where Nvidia states they only make $45 on a 2080 ti die.
 
AMD more or less does the same for video cards. AMD isn't a first-party manufacturer, meaning they only need to produce what their board partners order.
The only exceptions are that they did make their own reference Radeon VII, and they do make their own professional-level cards.
And this isn't even addressing how many more video cards are sold compared to consoles (feel free to compare the total number of consoles sold last gen versus graphics cards), your very-sketchy-at-best income figures for both consoles and GPU parts, or the fact that you didn't even bother to talk about market size and the professional market.

You're not addressing the most obvious statements in my post. In order for you to do that, just answer this one question: what makes more money, consoles or workstation/server GPUs? And that's even ignoring desktop parts which is multiplicatively larger.

To that end, Apple sells more iPhones in a year than the total sales of XB1/PS4/Switch combined. The PC desktop market is even bigger than that when looking at OEMs and the fact that every single machine contains graphics components. Making the argument that consoles are more lucrative than anything happening on the PC side is ludicrous. You could argue that low end GPU sales matter more than high end ones. But consoles are a drop in the bucket compared to the ocean and strata of desktop/workstation/server parts. In short: any argument saying that AMD doesn't care about what happens on desktop is coming from a place of ignorance (which was my response to DukeNukemX). My overall point in both this post and the previous post is that console sales matter far less to AMD than anything happening on server/workstation or hell even desktop.
AMD does not have a good workstation or server GPU; their last ones were a joke. So consoles, by far. Nvidia's Quadro lineup beats AMD in every metric and is actually available through the OEMs.
 
"Big Navi is a halo product," Kumar goes on to say, "enthusiasts love to buy the best, and we are certainly working on giving them the best."

That means this Navi 2 card is going to be big, it's going to be expensive, and it should genuinely deliver some competitive gaming performance compared with the top Nvidia GPUs. But whether that's compared with the top graphics cards of the Turing or upcoming Nvidia Ampere generation will be a question for the future. But hopefully the relatively imminent future.

But the RDNA 2 architecture isn't just going to stick around at the high end of the GPU stack; the entire Radeon range will be shot through with the Big Navi genetics.

"The RDNA 2 architecture goes through the entire stack," says Kumar, "it will go from mainstream GPUs all the way up to the enthusiasts and then the architecture also goes into the game console products... as well as our integrated APU products.

"This allows us to leverage the larger ecosystem, accelerate the development of exciting features like ray tracing and more."

https://www.pcgamer.com/amd-big-navi-first-rdna-product/
 
AMD does not have a good workstation or server GPU; their last ones were a joke. So consoles, by far. Nvidia's Quadro lineup beats AMD in every metric and is actually available through the OEMs.
The professional and workstation market far exceeds consoles in every way (size, money, power). And so does the server market. AMD isn't stupid. If they took all the super computer money away from nVidia they'd have a much bigger piece of the pie. If it was possible they'd gladly give nVidia all of the console contracts in exchange. It's an obvious and easy trade. In order to do that they need to have a top end GPU part that is excellent in absolute performance and also perf per watt.
You're really bad at answering questions. Basically you're answering out of bias without actually answering the question as asked.
And your statement about units sold isn't even true, let alone the one about their quality. If AMD had to lose either consoles or the professional market, they'd lose consoles every time. You do realize they make GPUs for Apple and all of their professional workstations? How about high-end OEMs? How about their use in supercomputers? You clearly don't even know AMD's product line or their bottom line. This is why it's pointless to talk to people about any of this on the internet. Because it's straight-up ignorance.
 
Show me a source where Nvidia states they only make $45 on a 2080 ti die.
The costs to manufacture a 2080 Ti have been beaten to death, with Reddit already breaking down the entire card part by part and building up the costs based on bulk purchasing; they put the cost of the actual GPU on those cards at around $300 USD, give or take $30. Nvidia's financial reports show that for the last two years they have maintained a 33-45% profit margin on their GPU die sales. Assuming their higher-binned parts are on the upper end of those margins, that would actually put their profit on a 2080 Ti at $135, so yeah, my bad, I really fucked up my math there. I will go brew a pot of coffee and reflect accordingly.
 
The costs to manufacture a 2080 Ti have been beaten to death, with Reddit already breaking down the entire card part by part and building up the costs based on bulk purchasing; they put the cost of the actual GPU on those cards at around $300 USD, give or take $30. Nvidia's financial reports show that for the last two years they have maintained a 33-45% profit margin on their GPU die sales. Assuming their higher-binned parts are on the upper end of those margins, that would actually put their profit on a 2080 Ti at $135, so yeah, my bad, I really fucked up my math there. I will go brew a pot of coffee and reflect accordingly.

I asked for a source from nvidia, not reddit conjecture.
 
I asked for a source from nvidia, not reddit conjecture.
Nvidia does not give out those numbers, so conjecture from the Reddit addicts is about the best anybody can offer, but they shouldn't be too far off, and to date they have been pretty accurate for both NV's and AMD's cards.
 
Nvidia does not give out those numbers, so conjecture from the Reddit addicts is about the best anybody can offer, but they shouldn't be too far off, and to date they have been pretty accurate for both NV's and AMD's cards.
Do you not realize the absurdity of your own statements? Without the numbers from AMD and nVidia, then you don't have any idea how accurate the numbers from the past are. Let alone this set or any future set.
 
You're really bad at answering questions. Basically you're answering out of bias without actually answering the question as asked.
And that isn't even true. If AMD had to lose either consoles or the professional market, they'd lose consoles every time. You do realize they make GPU's for Apple and all of their professional workstations? How about high end OEM? How about their use in Supercomputers? You clearly don't even know AMD's product line or their bottom line. This is why it's pointless to talk to people about any of this on the internet. Because it's straight up ignorance.
AMD would be foolish to give up their workstation lineup, because it is still a multibillion-dollar market, but their console numbers aren't too far behind it, especially with Sony doubling their PS5 orders. Their workstation parts pay for R&D and drivers; the consoles keep the lights on. If either went, both would suffer heavily.
 
Do you not realize the absurdity of your own statements? Without the numbers from AMD and nVidia, then you don't have any idea how accurate the numbers from the past are. Let alone this set or any future set.
Well, until future cards are on the market and in people's hands they can't do a part breakdown. But if you know a card sells for $500, and you know the financial reports from the AIB and the GPU manufacturer, you know their rough profit margins; and if you know the cost of each and every part on the card itself, you can solve for X, where X is the actual GPU. Is it absurd? Yeah, I don't understand how those people have that much time to spend on that kind of number crunching, but to date they haven't been wrong. Until either party releases the actual numbers it's the best anybody can go on. If an AIB sells a card for $500, then unless they are selling at cost, it should have cost them between $335 and $390 to actually make and ship that card. If you know the board and components on it cost between, say, $150-$200 for all the capacitors and whatnot, that leaves you a pretty distinct range for what the GPU cost was. Take that GPU estimate range, compare it against the financial reports from either NV or AMD, and you dial it in further; not exact, but a pretty good ballpark.
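That "solve for X" step can be written down directly. A sketch with hypothetical numbers in the ranges discussed above (the margins and BOM figures are forum estimates, not anything either company has disclosed):

```python
# Back out the GPU package price from a card's MSRP: strip the AIB's gross
# margin to get their build-and-ship cost, then subtract the board-level BOM.
def implied_gpu_cost(card_msrp, aib_margin, board_bom):
    aib_cost = card_msrp * (1 - aib_margin)  # e.g. $335-$390 on a $500 card
    return aib_cost - board_bom

# Using the ranges above: 22-33% AIB margin, $150-$200 of board components.
hi = implied_gpu_cost(500, 0.22, 150)  # ~$240 upper bound on the GPU price
lo = implied_gpu_cost(500, 0.33, 200)  # ~$135 lower bound
```

Cross-checking that $135-$240 band against the chipmaker's reported gross margins is the "dial it in further" step; it narrows the band but never pins an exact figure.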
 
The Xbox Series X GPU is 52 CUs at 1.82 GHz and sits at RTX 2080 Ti power, and full-size Navi is only 15% more powerful than a GPU paired on the same die as an 8-core 3.8 GHz Zen 2 chiplet?

Anyone should doubt this rumor based on what we know now just from the consoles.
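The same napkin math makes the point. This assumes RDNA's 64 shaders per CU and scales the rumored 80-CU part linearly at the console's clock, which is purely illustrative:

```python
# Raw FP32 throughput: CUs x 64 shaders/CU x 2 ops/clock x GHz / 1000.
xsx_tflops      = 52 * 64 * 2 * 1.82 / 1000  # Series X: ~12.1 TFLOPS
big_navi_tflops = 80 * 64 * 2 * 1.82 / 1000  # 80 CUs at the same clock: ~18.6
uplift = big_navi_tflops / xsx_tflops - 1    # ~54% more raw throughput
```

If 52 console CUs already land near a 2080 Ti, then a part with ~54% more raw throughput ending up only 15% ahead would imply severe scaling losses, which is a good reason to be skeptical of the rumor.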
 
The Xbox Series X GPU is 52 CUs at 1.82 GHz and sits at RTX 2080 Ti power, and full-size Navi is only 15% more powerful than a GPU paired on the same die as an 8-core 3.8 GHz Zen 2 chiplet?

Anyone should doubt this rumor based on what we know now just from the consoles.
Where did you get that info from?
 