Navi 21 XT to clock up to 2.4GHz

Today's news is that an engineering sample of the AMD RX 6800XT pretty much destroys the RTX 3080 in synthetic benchmarks. Remember, this is an engineering sample, and the released product will likely be a little faster. The 6800XT is on the Navi 21 die with 72 CUs; it is Navi 21's second-best SKU. The 6900XT is rumored to be equal to or better than the RTX 3090 for considerably less money. It will all be revealed on Wednesday, October 28th. All I can say is Jensen is shitting bricks now.

Don’t allow yourself to get overhyped on the back of rumors. It’s best to expect the worst and then be surprised if it ends up being amazing instead of expecting the best only to be disappointed if it can’t reach those lofty heights.
 
I don't think he was asking about overclocking, I think he was curious why the 24GB of VRAM in the 3090 doesn't show very much advantage at all over the 10GB in the 3080. It's a good question.

Waiting to see AMD's offering would be the smart move at this point. We can't say they'll have better thermals, overclocking, and availability until they actually launch though.
In gaming workloads it won't, and I doubt it will before it's a generation or two behind. For production or hobbyist workloads it's going to be a champ, especially if you are playing with anything that uses CUDA acceleration or playing with AI. Once the GPU-accelerated storage stuff starts taking off, we should see a decrease in what has to sit in active GPU memory, and the improvements to garbage collection in DX12, Vulkan, Unreal, and Unity will help bring allocated memory closer to actually used memory.
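On that allocated-vs-used point: on Windows you can watch the OS-reported VRAM budget against what a process has actually committed via DXGI. A minimal sketch of the idea (queries adapter 0 and the local/on-board segment only; error handling trimmed):

Code:
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    // Grab the first adapter and ask for IDXGIAdapter3,
    // which exposes the video memory budget query.
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;
    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;
    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    // Budget = what the OS will let this process use right now;
    // CurrentUsage = what it has actually committed.
    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);
    printf("VRAM budget: %.0f MB, in use: %.0f MB\n",
           info.Budget / 1048576.0, info.CurrentUsage / 1048576.0);
    return 0;
}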

One of the benefits of ray tracing is that it derives effects from the scene data on the fly, where previously much of that had to be baked into a texture, so calculating many of the effects at runtime should also reduce the amount of data that needs to be loaded up. So over the next year or two I expect most developers to use the added power and tools to decrease the amount they actively need to store in GPU memory, so they can work on larger maps, more terrain, more interactive environments, and things like that.
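As a toy illustration of that compute-for-memory trade (hypothetical, not any specific engine's technique): the same values can either be baked into a table up front, costing memory, or evaluated on demand, costing ALU time.

Code:
#include <cstdio>
#include <cmath>

const int N = 256;
float baked[N]; // the "texture": N precomputed samples living in memory

// The same quantity evaluated on the fly instead of stored.
float on_the_fly(float x) { return 0.5f + 0.5f * sinf(6.28318f * x); }

int main() {
    for (int i = 0; i < N; ++i)
        baked[i] = on_the_fly(i / float(N)); // the offline "bake" step
    // At runtime: read baked[i] (memory traffic) or recompute (ALU work).
    printf("baked: %.3f  computed: %.3f\n",
           baked[64], on_the_fly(64 / float(N)));
    return 0;
}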
 
I'm just surprised by the number of leaks lately. There had been so little up to now. But I guess there are cards in the channel now, so no one can stop them.

It would be really interesting to go back to the leaked specs for cards currently released and see how much was true. (I'm too lazy)
But it seems that most of the leaks turn out to be pretty close.

Or is that just the fanboy in me?
 
Today's news is that an engineering sample of the AMD RX 6800XT pretty much destroys the RTX 3080 in synthetic benchmarks. Remember, this is an engineering sample, and the released product will likely be a little faster. The 6800XT is on the Navi 21 die with 72 CUs; it is Navi 21's second-best SKU. The 6900XT is rumored to be equal to or better than the RTX 3090 for considerably less money. It will all be revealed on Wednesday, October 28th. All I can say is Jensen is shitting bricks now.
That news also states that it will be a limited product release, much like the Radeon VII, which AMD made fewer than 5,000 of.
 
My prediction based on the leaks: the 6900 will be faster than the 3080 (not in ray tracing) but slower than the 3090. Price may be around $650-$700, the best bang for your money. The 3090 is irrelevant for the mass market (a 5-7% gain), like the Titans were, unless you want the best ray-tracing card.
 
My prediction based on the leaks: the 6900 will be faster than the 3080 (not in ray tracing) but slower than the 3090.
That's where I was before the 3080 and 3090 were benched; as of now I think AMD has a part that will outperform the 3080 in general, whatever designation it gets, but probably not the 3090, with the rest slotting in below.

Initially, and I said this here, I figured 80 percent of the 3090. Now I'll be surprised if the gap isn't quite a bit narrower than that.

I also fully expect AMD to compete against Nvidia's features like DLSS and RTX with lower prices, even if they trade blows in performance and (this is a guess, but everyone's guessing it right now) offer better performance per watt. Strictly to claw back market share.
 
My prediction based on the leaks: the 6900 will be faster than the 3080 (not in ray tracing) but slower than the 3090. Price may be around $650-$700, the best bang for your money. The 3090 is irrelevant for the mass market (a 5-7% gain), like the Titans were, unless you want the best ray-tracing card.
There isn't exactly much of a gap between the 3080 and 3090.
 
Up to 2.4 means 2.2 is probably typical. Seems believable if there were any benefits to the 7nm process over the previous one. But I thought these were on the same process as the 5700? If they are on the same process, I expect them to top out around the same speeds the 5700s can get to. More transistors in the die means more heat, which usually means lower clocks.
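For napkin math on what those clocks would mean, peak FP32 for RDNA works out to CUs × 64 shaders × 2 FLOPs (FMA) × clock. Assuming the rumored CU counts and clocks:

Code:
#include <cstdio>

// Peak FP32 TFLOPS = CUs * 64 shaders/CU * 2 FLOPs (FMA) * clock (GHz) / 1000
double tflops(int cus, double ghz) { return cus * 64 * 2 * ghz / 1000.0; }

int main() {
    printf("72 CU @ 2.2 GHz: %.1f TFLOPS\n", tflops(72, 2.2)); // ~20.3
    printf("72 CU @ 2.4 GHz: %.1f TFLOPS\n", tflops(72, 2.4)); // ~22.1
    printf("80 CU @ 2.2 GHz: %.1f TFLOPS\n", tflops(80, 2.2)); // ~22.5
    return 0;
}

(Ampere's dual-FP32 TFLOPS figures aren't directly comparable, so treat this as an RDNA-to-RDNA yardstick, not a cross-vendor one.)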

So, grain of salt and all.
 
Hopefully supply is better than the 3xxx series. I'll buy Navi just because it's in stock. Those leaked benchmarks are very promising though.
There will be a ton more N21 stock in Q4. However, given how much TAM NVIDIA has surrendered through lack of supply, and if N21 is as good performance-wise as we are seeing, I would still expect N21 to sell out in Q4.
 
Up to 2.4 means 2.2 is probably typical. Seems believable if there were any benefits to the 7nm process over the previous one. But I thought these were on the same process as the 5700? If they are on the same process, I expect them to top out around the same speeds the 5700s can get to. More transistors in the die means more heat, which usually means lower clocks.

So, grain of salt and all.
Except that architecture changes can and do make differences in power consumption and heat dissipation. Changes that make the architecture more efficient also drop power requirements, reducing the brute force needed to attain the same level of performance. Navi2 is not just Navi1 with more transistors. Navi1 couldn't scale higher than the 5700XT because of power and heat issues, and scaling further required architectural changes. One of the biggest problems with people's expectations of Navi2 is that they think there haven't been any changes from Navi1.
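To put rough numbers on the clocks-vs-power point: dynamic power scales roughly with frequency times voltage squared, and higher clocks usually demand higher voltage, so brute-forcing clocks gets expensive fast. The operating points below are purely illustrative, not measured Navi values:

Code:
#include <cstdio>

int main() {
    // Hypothetical operating points: clock (GHz) and core voltage (V).
    double f0 = 1.9, v0 = 1.00; // baseline
    double f1 = 2.2, v1 = 1.10; // clock pushed ~16% with a voltage bump
    // Dynamic power ratio ~ (f1/f0) * (v1/v0)^2
    double ratio = (f1 / f0) * (v1 / v0) * (v1 / v0);
    printf("~%.0f%% more dynamic power for ~16%% more clock\n",
           (ratio - 1.0) * 100.0); // ~40%
    return 0;
}

Which is why efficiency gains from the architecture itself matter more than raw transistor count.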
 
Don’t allow yourself to get overhyped on the back of rumors. It’s best to expect the worst and then be surprised if it ends up being amazing instead of expecting the best only to be disappointed if it can’t reach those lofty heights.
I mean, it's only 3DMark, but this does bode well for the claims that it will compete on raster. Not only does it beat the 3080, it beats it by more than the 3090 did in benchmarks. That points to a solid lead over the 3080, if not keeping pace with the 3090. We all estimated near-2080 Ti levels of RT; at 4K it seems to be a few percent ahead, so it's good that it'll at least work. But if RT is more important to you than raster, you may want to wait and see how games perform as they are updated to work better with AMD. If you prefer raster, then it seems AMD will have some great competition.

https://overclock3d.net/gfx/articles/2020/10/23054955967l.jpg

Also this is the 6800XT... not the 6900XT, so if the 6800XT can keep pace with a 3090, well, there should be no reason a 6900XT won't. Again, early release benchmarks (not real games and we don't know system setup), but the indications are pretty good in general.
 
Also this is the 6800XT... not the 6900XT, so if the 6800XT can keep pace with a 3090, well, there should be no reason a 6900XT won't. Again, early release benchmarks (not real games and we don't know system setup), but the indications are pretty good in general.

benchmarks shared by CapFrameX, Wccftech and harukaze5719 (based on data from KittyYYuko)

KittyYYuko’s benchmarks involve a Navi 21 XT SKU with 80 Compute Units, which is likely red team’s flagship Radeon RX 6900 XT. The GPU actually bests NVIDIA’s GeForce RTX 3090 in the Fire Strike Ultra and Fire Strike Extreme tests.

https://www.thefpsreview.com/2020/1...ut-nvidia-geforce-rtx-3080-in-new-benchmarks/
 
benchmarks shared by CapFrameX, Wccftech and harukaze5719 (based on data from KittyYYuko)

KittyYYuko’s benchmarks involve a Navi 21 XT SKU with 80 Compute Units, which is likely red team’s flagship Radeon RX 6900 XT. The GPU actually bests NVIDIA’s GeForce RTX 3090 in the Fire Strike Ultra and Fire Strike Extreme tests.

https://www.thefpsreview.com/2020/1...ut-nvidia-geforce-rtx-3080-in-new-benchmarks/
These benchmark leaks aren’t making sense:

[image: dia-geforce-rtx-3080-rtx-2080-ti-3dmark-4k-tests-1.png]

This is the 6800XT which is supposed to be 72CU. Spanks the 3080 in Firestrike, edges out the 3080 in Time Spy and gets spanked by the 3080 in Port Royal.

And this is supposedly the 80CU Big Navi (6900XT):

[image: 1603505047834.jpeg]


So somehow now the 6900XT loses to the 3080 in Time Spy when the previous source showed the weaker 6800XT beating the 3080?

Something is not adding up here.
 
These benchmark leaks aren’t making sense:

[attachment 292090]

This is the 6800XT which is supposed to be 72CU. Spanks the 3080 in Firestrike, edges out the 3080 in Time Spy and gets spanked by the 3080 in Port Royal.

And this is supposedly the 80CU Big Navi (6900XT):

[attachment 292089]

So somehow now the 6900XT loses to the 3080 in Time Spy when the previous source showed the weaker 6800XT beating the 3080?

Something is not adding up here.
Yeah, whoever made this chart is being paid by Nvidia. Adds up to me now!
 
Yeah, whoever made this chart is being paid by Nvidia. Adds up to me now!
Don’t think so as the second source indicated Big Navi scores were actually higher in Fire Strike than what others were reporting.

But there appears to be clear confusion on which Big Navi card is actually being benchmarked in these leaks. Is it the 6800XT? Is it the 6900XT? Is the 6800XT the 80CU monster or is it the 6900XT?
 
Don’t think so as the second source indicated Big Navi scores were actually higher in Fire Strike than what others were reporting.

But there appears to be clear confusion on which Big Navi card is actually being benchmarked in these leaks. Is it the 6800XT? Is it the 6900XT? Is the 6800XT the 80CU monster or is it the 6900XT?
We'll know in 5 days.
 
Up to 2.4 means 2.2 is probably typical. Seems believable if there were any benefits to the 7nm process over the previous one. But I thought these were on the same process as the 5700? If they are on the same process, I expect them to top out around the same speeds the 5700s can get to. More transistors in the die means more heat, which usually means lower clocks.

So, grain of salt and all.

RDNA 2 is on a different 7nm process than RDNA 1.
 
Yeah, we should at least get some more info in a few days. It's hard to say which benchmarks are what right now; it seems there are more unanswered questions at this point. I'd always recommend waiting on third-party benchmarks from review sites before taking any leaks as proof of anything.
 
Don’t think so as the second source indicated Big Navi scores were actually higher in Fire Strike than what others were reporting.

But there appears to be clear confusion on which Big Navi card is actually being benchmarked in these leaks. Is it the 6800XT? Is it the 6900XT? Is the 6800XT the 80CU monster or is it the 6900XT?
If I remember correctly, the 80CU card is reserved for AMD only, so I wouldn't expect any leaks of its benchmarks.
 
Except that architecture changes can and do make differences in power consumption and heat dissipation. Changes that make the architecture more efficient also drop power requirements, reducing the brute force needed to attain the same level of performance. Navi2 is not just Navi1 with more transistors. Navi1 couldn't scale higher than the 5700XT because of power and heat issues, and scaling further required architectural changes. One of the biggest problems with people's expectations of Navi2 is that they think there haven't been any changes from Navi1.

I factored those into my guesstimate, and I may have been too generous. Process shrinks are not what they used to be.

It remains to be seen. In a month we will know for sure.
 
Don’t allow yourself to get overhyped on the back of rumors. It’s best to expect the worst and then be surprised if it ends up being amazing instead of expecting the best only to be disappointed if it can’t reach those lofty heights.
You are right, but the source for these benchmarks has been confirmed by at least 3 leakers who have a long record of accuracy.
 
They do. You need the right game. If you pick a game that has minimal FOV objects, then of course it doesn't matter as much. Having people say memory means absolutely nothing on this forum is quite ridiculous, but hey, feel free to parrot "memory doesn't matter." Obviously they add memory as a joke. 4GB? Why not just 1GB? I'm sure that's enough, right? ;)
Yep, and let's not forget about the consoles having 16GB. What happens when devs start to take full advantage of the hardware available to them? At that point you may be glad you have 16 vs 10? Who knows. But I know I don't want my PC spec'd lower than a console, that's for sure.
 
I'm sure there was a specific set of use cases where NVIDIA found the increased memory speeds made enough of a difference to warrant the design choice; the question is whether most gamers will encounter them. Alternatively, starting with GDDR6X out of the gate gives them the option later on to release cards with slower non-X memory at a cheaper price, or with more memory at the same price, should the need arise. And if there's minimal impact on actual gaming, then yay?
Or let's look at it this way: if NVIDIA is barely edging out and/or losing to AMD in real-world performance, then if they hadn't used GDDR6X they would be losing in all the benchmarks, right?
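For what it's worth, the bandwidth gap GDDR6X buys is easy to put a number on: bandwidth = (bus width in bits / 8) × per-pin data rate. The 3080 runs a 320-bit bus at 19 Gbps; the 14 Gbps figure below is just an assumed vanilla-GDDR6 speed for comparison:

Code:
#include <cstdio>

// Bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps)
double bandwidth(int bus_bits, double gbps) { return bus_bits / 8.0 * gbps; }

int main() {
    printf("320-bit GDDR6X @ 19 Gbps: %.0f GB/s\n", bandwidth(320, 19.0)); // 760
    printf("320-bit GDDR6  @ 14 Gbps: %.0f GB/s\n", bandwidth(320, 14.0)); // 560
    return 0;
}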
 
AMD cleared up some things they said during their financial analyst day this year: that future products wouldn't necessarily be using the base 7nm process, and that they would be using enhanced/7nm+/7nm on a case-by-case basis.

It seems Zen 3 is using the same 7nm process as Zen 2 even though it’s “better”. There has been a lot of confusion around 7nm enhancements. Enhanced or improved doesn’t necessarily mean it’s using different fab tech or equipment. It can simply mean the process has matured.
 
Yep, and let's not forget about the consoles having 16GB. What happens when devs start to take full advantage of the hardware available to them? At that point you may be glad you have 16 vs 10? Who knows. But I know I don't want my PC spec'd lower than a console, that's for sure.
Not quite 16: while the Series X does have 16GB of GDDR6, it has no separate system RAM, so 2.5GB of that is for the OS, 3.5GB is used in place of system RAM, and the remaining 10GB is for the GPU.
The Series S only has 10GB total, so only 6GB of that is for the GPU.
 
One feature that looks nice is what's called Rage Mode, a one-click overclocking tool that does enough to void your warranty, even if AMD puts it in their drivers.

Update: AMD has clarified that Rage Mode does not void the warranty. Sanity prevails.

https://www.semiaccurate.com/2020/10/28/amd-radeon-6000-series-takes-the-gpu-crown/

The performance gains from overclocking, in turn, will vary with chip and board quality. Some chips barely make spec, others fly past it; so overclocking is always subject to the silicon lottery. To that end, AMD isn't offering any specific performance claims with respect to Rage Mode, but they are publishing a slide showing a combination of gains in their testing from Rage Mode and Smart Access Memory.



https://www.anandtech.com/show/1620...-starts-at-the-highend-coming-november-18th/2
 

[attachment: RX6800_20_575px.jpg]