Big Navi is coming

So what is your real expectation for Big Navi? I mean, gaming-wise?

I hope for an RX6900 with 80 CU + 16GB HBM2e as the flagship for $699, an RX6800 with 62 CU + 16GB GDDR6 for $499, and an RX6700 with 40 CU + 12GB GDDR6 for $399. Maybe an RX6600 with 36 CU + 8GB GDDR6 for $299. Now that I think about it, it sounds too good to be true.
If these products are competitive with their Nvidia counterparts, I expect the prices to be at most $100 cheaper than Nvidia's. AMD is still a public company, out there to make money, not friends.
 
So many buy into the "AMD is our friend" fallacy.
At the moment they are more my friend than Intel or Nvidia. Nvidia is starting to redeem themselves a little, but Intel is still far from that. Of course AMD needs to make money, but I think they already make a good amount, even with the cheaper prices of the Ryzen CPUs. I mean, look at their market cap now compared to pre-2017. They didn't achieve that with high prices for their products. Also, Zen 3 and RDNA2 should have high yields, since they are on a proven manufacturing process, unlike Nvidia with their mess of an 8nm Samsung process.
Without AMD, the 3080 wouldn't cost $699. Nvidia knows that something is coming; why else would they offer a 102 GPU for $699? They could easily have charged $999 for it and people would still have bought it.
 
Look at AMD's history: the cut-down version is always only marginally slower. With news from the Galax leak that a 3070 Ti actually exists, we can predict that AMD will offer a 3080 competitor and something that outperforms the 3070, which Nvidia is preparing to counter with the Ti.

They will likely undercut Nvidia again: a $499 part outperforming the base 3070 and the $599 full-CU part competing with the 3080.

I predict a 3080 10GB price drop as well. Maybe $50.
 

I highly, highly doubt AMD will do a one-off card with HBM2. I think it's going to be a "no surprises" lineup with several cards with logically separated specs.
  • 6900 XT with 80CUs, 256-bus, 16GB GDDR6 (or perhaps 72CUs, but higher power and speeds than the non-XT) - $599
  • 6900 with 72CUs, 256-bus, 16GB - $499
  • 6800 with 64CUs, 192-bus, 12GB - $399
  • ...
Lower price and lower power draw than the nV competition, but slightly less performance (and no comparison in RT). My main concern with the above is that a 256-bus is really going to be deficient at high res/IQ vs nV's larger-bus cards. The 6900 XT and 6900 will compare admirably to the 3090 and 3080 (esp. for the price) up to 1440p, but 4K and higher will be owned by Ampere. I really hope the rumors are wrong on only up to 256-bus, but logically, keeping the bus small makes sense to save costs. I'm hoping there's a 512-bus champ coming.

Edit: The more I think about it, having a 256-bit bus on an 80CU Navi 2 card is going to be a significant mem bandwidth bottleneck at higher resolutions and IQ, even if mem frequencies are sky high (which they won't be). I think AMD may be doing Navi 2 a disservice with only 256-bus.
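
For rough numbers, here's the bandwidth math (a quick sketch; the Navi 2 memory speed is my assumption, not a confirmed spec):

```python
# Peak theoretical memory bandwidth: bus width (bits) / 8 * per-pin data rate (Gbps)
def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 16.0))   # rumored Navi 2: 256-bit GDDR6 @ ~16 Gbps -> 512 GB/s
print(bandwidth_gb_s(320, 19.0))   # RTX 3080: 320-bit GDDR6X @ 19 Gbps    -> 760 GB/s
print(bandwidth_gb_s(384, 19.5))   # RTX 3090: 384-bit GDDR6X @ 19.5 Gbps  -> 936 GB/s
```

Even at a generous 16 Gbps, 256-bit lands at roughly two-thirds of the 3080's bandwidth, which is the whole concern.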
 

Yes, the 256-bit bus is super concerning. Navi struggles at 4K vs 1440p and 1080p: at 1080p it's maybe 5% slower than a 2070 Super, but bump up to 4K and that stretches to 10%. Or maybe Navi's issues at 4K are not bandwidth related? Either way, I can't see how this is going to work in AMD's favor.
 
With the best bang-for-buck AMD cards still being the 5700 XT and the RX 570 (OK, a flashed 5700, but not everyone will do that), I feel AMD could have a nice run at the fattest part of the GPU market, which is still 1440p/1080p and under $400. At the very least I'm hoping for something competitive from Navi 2, and I'd have no issues running a 6900 XT, or whatever the $500-700 GPU is, at 1440p high refresh if it pans out.
 

Lisa has specifically said that these RDNA2 cards are for 4K gaming. I also think it's silly to assume AMD will intentionally gimp their cards with an inferior memory setup. If it is 256-bit, then there is likely truth to them using the Infinity Cache and cluster cache technologies to make up for it. The thing I keep coming back to is that AMD intended to have an Nvidia-killer. Two years ago, when they started the project, they would have expected the new-gen Nvidia cards to be around 30-40% faster than a 2080 Ti, and that is exactly where they landed. I expect the card to be more than 2x the 5700 XT and right at or just above 3080 speeds, though there have been reputable leaks that the highest end will compete with the 3090. Also, they are having a completely separate launch for RDNA2, which shows extreme confidence.
 
I think AMD will at least match the 3080. Nvidia seemed to push things way too far with the 3090 (power, three slots, etc.), which suggests they were worried about losing the crown.

So AMD might be more competitive than most people assume. If they can crush the 3090, or get in that range for much cheaper, it will be a different game.
 
Hey, I'm plenty hopeful, and the fact that Nvidia did swing for the fences with RTX 3000 may very well mean AMD has something good cooking. It would be nice to have my first all-Red rig since the 1090T and 6850. 👍
Yeah, that would be nice. My last AMD card was the 7970, which was actually pretty good. Hopes are up, but can they deliver enough stock? I guess the new Ryzen CPUs will show the direction: if they are available at launch, that could be a good indicator for RDNA2.
 
Yeah my 7950 card got me through the mining boom until I got a 980 Ti upgrade. I am looking forward to what AMD has up their sleeve.
 
Pretty underwhelming with a 256-bit memory bus. Maybe it'll just be a tad faster than the 3070.
 

Well, AMD could NEVER beat Intel, said the shills and the fanboys.

Zen 2 already smashed Intel into the hurt locker, and Zen 3, coming out in like two weeks, is more than likely going to absolutely smash anything Intel has.

RDNA 1 was like Zen+, and RDNA 2 is like Zen 2, maybe even Zen 3, if the pattern of Su's success continues.

Just my humble opinion.
 

With how little OC headroom the 3080 has, its power draw, and how expensive it is, I'm hoping AMD pulls off something similar to the 4850/4870, when NVIDIA tried to overprice the 280 series and AMD immediately embarrassed them.

"The GeForce GTX 280 price was cut by a whopping 62%, as it is now priced at $399. ... But as things stand today, the Radeon HD 4870 and GeForce GTX 260 are the only cards that match up in terms of pricing, while the GTX 280 is significantly more expensive, and the Radeon HD 4850 is a bit cheaper.Aug 11, 2008
TechSpot › review › 109-gefor...
GeForce GTX 260/280 versus Radeon HD 4850/4870 - TechSpot"
 

This is my hope as well. I recall a warm feeling of vindication for holding out on the GTX 280 and buying an HD 4870 that was 90% as fast for 50% of the cost. If memory serves, I was able to buy a second 4870 for CrossFire for nearly the cost of a single 280.

The issue with the 4870 was that the first cards only had 256MB, which was quickly becoming insufficient at the time of launch. It was only about a month after launch that 512MB versions started showing up. Ironically, nV may be in a somewhat similar situation now with the 3080 10GB, since we already know a 20GB version is inbound.
 

If you watch Linus, they did find a limit for the 10GB of VRAM: Wolfenstein at 8K uses 11GB of VRAM when running on a 3090... so don't buy a 3080 for 8K gaming. Problem solved...
 

If the 6900 XT is 80CU and 256-bit, it will still almost certainly be faster than the RTX 3070 at all resolutions and IQ, and will probably equal or come close to the RTX 3080 at 1080p and 1440p. At 4K, however, it will be game over vs the 3080/90.

Point taken on the Linus 8K test: nV is mostly chasing a bigger VRAM number here for optics/marketing more than utility. That said, I do subscribe to the view that we're going to see a major uptick in VRAM usage with the arrival of the new consoles, which both have a usable ~15GB of VRAM. Most of us, though, will be trading out or upgrading from any 10GB card before the need for more than 10GB arrives.
 

The 4850 and 4870 both had 512MB at launch, but your point remains. The 1GB 4870 was a bit nicer and lasted longer. That being said, I modded the hell out of my 4850. It was a monster value.
 

Right, the 4870 came with 512MB to start, then 1GB a month later. I was thinking of the 7800 GTX, for which I first bought the 256MB card, then the 512MB late in the cycle.
 
What if it's a 400W part that performs like a non-Ti 2080 and sounds like a Mustang GT and costs USD$650?
 
Considering they already have a 2080-class GPU that uses 300W, I don't think they'd have put in that much effort just to reproduce a card they already sell while increasing the power use even more ;).
 
Well, if you look at AMD's RDNA2 slides from earlier presentations, it's all about 4K performance, so I really highly doubt they will design a card that chokes at 4K. It's pretty much guaranteed they have something up their sleeves to overcome the bandwidth issue, even if it is a 256-bit memory interface. Or the top end is actually HBM.
 

Yeah, there's no way it's just 256-bit GDDR6. Either that rumor is wrong and it's actually wider, or they've really come up with some cache voodoo to reduce off-chip bandwidth requirements.
 
Either Big Navi isn't competing at the top end at all, or AMD is extremely confident in their cards to roll out a 256-bit bus. Whether this turns out to be another 290X or another Vega debacle, it will be interesting to see.
 
It is not the same AMD anymore. I think they are trying much harder these days. But in the end only time will tell.
 

The 290X was an absolute monster.
 

AMD has been burned enough by HBM2 memory prices. The Radeon VII was a disaster from a financial point of view, and I'm sure HBM2 limited their profit margins on the Vega cards as well.
All indications are that Big Navi will be entirely on GDDR6, not even GDDR6X for the flagship.

The 6000 series will likely have 16GB on a 256-bit bus for the flagship, with memory proportionate to bus width down the stack, i.e. 12GB for the 192-bit lower-end model.
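
That "proportionate memory" follows directly from how GDDR6 works: one chip per 32-bit channel. A quick sketch, assuming the common 16Gb (2GB) chips (8Gb parts would halve these numbers):

```python
# GDDR6 capacity scales with bus width: one memory chip per 32-bit channel.
CHIP_GB = 2        # per-chip capacity, assuming 16Gb (2GB) GDDR6 parts
CHANNEL_BITS = 32  # bus width served by each chip

for bus_bits in (256, 192, 128):
    chips = bus_bits // CHANNEL_BITS
    print(f"{bus_bits}-bit bus -> {chips} chips -> {chips * CHIP_GB} GB")

# 256-bit bus -> 8 chips -> 16 GB
# 192-bit bus -> 6 chips -> 12 GB
# 128-bit bus -> 4 chips -> 8 GB
```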
 
Right now AMD is winning by doing nothing. I find the new rumor that it's like 80CU, 40CU, and 30CU really fishy, because it's hard to see the positioning. Does the 80CU part handle the 3090 easily at like $1199-$1399, and is the new RDNA2 architecture so good that the 40CU version slots in between a 3070 and a 3080, but above a 2080 Ti, for like $499 with 12GB? That's the only way I could see those three tiers making sense. Certainly, if it can boost to 2.5GHz, the biggest Navi will be a monster.
 
Looking forward to more news on these cards. I thought AMD launching a month-plus later would hurt them, but in actuality, with NVIDIA's shortages and CTD issues, AMD looks like it's in a position to disrupt this gen. Hopefully pricing is competitive and their drivers have matured enough that the 5000 series issues will be a distant memory!
 

If you extrapolate from Navi 10 (the 5700 XT), which is 40 CUs, I'd venture that at 2.5GHz it would come within ~20% of a 2080 Ti. I'd imagine that with the architecture improvements from RDNA2 it will be comparable to a 2080 Ti. All a guessing game at this point.
 

You can do some pretty simple napkin math to see where the 40 CU part could end up. If we assume 2500 MHz is the max boost clock (no way it's the base clock), you end up with roughly (2500 MHz / 1905 MHz) × (170 W / 180 W) × 1.5 ≈ 85% better performance than a 5700 XT. That's certainly an upper bound, given that a lot of that 50% better perf/W goes out the window at 2500 MHz, but they only need a 15% improvement in perf/W to match the 2080 Ti (and the rumored 3070) with these specs, assuming no other bottlenecks.
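
Spelled out, with the same figures as above (rumored boost clock and assumed board power, plus AMD's claimed 1.5x perf/W for RDNA2; none of these are confirmed specs):

```python
# Napkin math from the post above -- all inputs are rumors/assumptions, not specs.
clock_ratio = 2500 / 1905  # rumored max boost vs 5700 XT boost clock (MHz)
power_ratio = 170 / 180    # assumed board power vs 5700 XT figure used above (W)
perf_per_watt = 1.5        # AMD's claimed RDNA2 perf/W uplift

scaling = clock_ratio * power_ratio * perf_per_watt
print(f"~{(scaling - 1) * 100:.0f}% better than a 5700 XT")  # ~86%, i.e. the ~85% above
```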
 
TweakTown gave some expert, in-depth analysis (sarcasm) saying Big Navi will get destroyed by the 3080 because of "less tflops".

Just wanted to throw that out there for those saying it's only YouTube that's destroying the quality of tech journalism.
 