Confirmed: AMD's Big Navi launch to disrupt 4K gaming

We got to see 3080 reviews before they sold out too, but that didn't actually help most people bag a card.

But let's not worry too much; we still have no real indication of how good or bad these AMD cards will actually be, and my gut keeps telling me this will be another mid-tier release.

I got curious, since you only ever seem to pop up in threads to throw some shade towards AMD and then disappear. So let's put into perspective for everyone else how little your gut feeling actually means.

In the last 3.5 years, since the Ryzen 1000 launch, you've posted 193 times; 183 of those were in threads either directly or tangentially about AMD. Every single one of those posts has been negative.

This is what you say, but it's obvious to anyone who spends even a minute looking that you're just another fanboy who likes to act indignant when called out.

Let me dispel your ignorant misconception. I'm not a fan of either company, I'm a fan of technology. I couldn't give two shits who's better at any given point as long as they both remain competitive.

I'm also not dumb enough to think that Intel is completely down and out. They are a massive company with enough money and enough resources to come back swinging, which I fully expect them to do.

Don't make the stupid mistake of thinking that they're done when AMD has only just now caught up. We would all be wise to watch both companies closely for the next few years, as it should prove to be quite exciting for everyone.

PS: I'm getting sick and tired of your fanboi bullshit replies. Don't be surprised if I start completely ignoring your ass.
 
They also have Nvidia to consider; even if their cost was half of Nvidia's, they only have to price themselves relative to Nvidia's cards. If they were to undercut them and find themselves unable to meet 100% of the demand, and any fraction of their buyers over the course of a year were found buying the cards from scalpers at a higher price, their shareholders would riot and AMD would be smacked with a lawsuit they would lose.
Of course they can only sell what the market can support. They would be silly to price it less than what the market can support. They want to price it so their cards are just about sold out, but not MIA. I'm not too sure about the lawsuits, but I agree with everything else. I'm not sure an investor has a right to sue based on how a company decides to price their products (but I'm no lawyer either); they can choose to stop investing and move their money somewhere else, as long as AMD isn't misleading them with their financials (like claiming they are going to get a 60% profit margin, then pricing it at 20%, thereby misleading/lying to investors).
 
I got curious, since you only ever seem to pop up in threads to throw some shade towards AMD and then disappear. So let's put into perspective for everyone else how little your gut feeling actually means.

In the last 3.5 years, since the Ryzen 1000 launch, you've posted 193 times; 183 of those were in threads either directly or tangentially about AMD. Every single one of those posts has been negative.

This is what you say, but it's obvious to anyone who spends even a minute looking that you're just another fanboy who likes to act indignant when called out.
See, there were 10 posts (~5%) that weren't anti-AMD; proves he's not a fanboy, lol. That means only about 95% of his posts are anti-AMD... if he was really a fanboy it'd be closer to 99% ;). Anyways, that's a lot of effort you put in to find this information, but it's good to know some background when talking to people; that way I know how much or how little breath (typing) to waste with them.
 
Of course they can only sell what the market can support. They would be silly to price it less than what the market can support. They want to price it so their cards are just about sold out, but not MIA. I'm not too sure about the lawsuits, but I agree with everything else. I'm not sure an investor has a right to sue based on how a company decides to price their products (but I'm no lawyer either); they can choose to stop investing and move their money somewhere else, as long as AMD isn't misleading them with their financials (like claiming they are going to get a 60% profit margin, then pricing it at 20%, thereby misleading/lying to investors).
They got sued and lost for not charging enough, back when the crypto miners were buying everything and people were buying from scalpers at higher prices.
 
They got sued and lost for not charging enough, back when the crypto miners were buying everything and people were buying from scalpers at higher prices.
Like I said, I'm no lawyer :). My guess is there is more to the story than this, like they weren't honest about the % of sales going to miners vs. gamers and claimed that their profits were not just temporary during the mining rush, but I'm really not sure. If you have a link I'd love to get caught up on it, as I am curious exactly what it would have been for. Anyways, like I said, that's not really the part that concerns me, as they should (like any business) be trying to maximize profits (although, sometimes, to make headway when you're not in the lead position, you do have to run on a lower margin than the competition) without crapping on their own customers and creating fewer sales and less market share. It's a balancing act and you'll never make everyone happy.

Edit: I tried doing a Google search but didn't find anything to do with a lawsuit over the pricing of cards during the mining craze; it seems only the Piledriver settlement pops up (which I still think is a complete crock of shit, lol).
 
Like I said, I'm no lawyer :). My guess is there is more to the story than this, like they weren't honest about the % of sales going to miners vs. gamers and claimed that their profits were not just temporary during the mining rush, but I'm really not sure. If you have a link I'd love to get caught up on it, as I am curious exactly what it would have been for. Anyways, like I said, that's not really the part that concerns me, as they should (like any business) be trying to maximize profits (although, sometimes, to make headway when you're not in the lead position, you do have to run on a lower margin than the competition) without crapping on their own customers and creating fewer sales and less market share. It's a balancing act and you'll never make everyone happy.

Edit: I tried doing a Google search but didn't find anything to do with a lawsuit over the pricing of cards during the mining craze; it seems only the Piledriver settlement pops up (which I still think is a complete crock of shit, lol).
I can't find it either... I could swear it was a thing. Investors were pissed that AMD had "undervalued" their product in an attempt to gain market share, then failed to actually gain market share. Something like that.

The Bulldozer thing I get, though: an 8-core processor only capable of processing 4 threads... it's a little arguable.
 
See, there were 10 posts (~5%) that weren't anti-AMD; proves he's not a fanboy, lol. That means only about 95% of his posts are anti-AMD... if he was really a fanboy it'd be closer to 99% ;). Anyways, that's a lot of effort you put in to find this information, but it's good to know some background when talking to people; that way I know how much or how little breath (typing) to waste with them.
I would have never noticed had you not replied to him, I ignored him for being an AMD shill a long long time ago.

In any case, you can see he focuses more on what negative comments are directed toward AMD than actual facts. I see he still hasn't changed one iota.


Well, with 80 CUs, I can't imagine they are going to be horrible. The 5700 XT is 40 CUs, so even with 80% scaling (and ZERO architectural improvements), it'll be 3080 level. The only thing that I'm not sold on yet is the narrow bus width with the Infinity Cache system. It's the biggest unknown at the moment. We can extrapolate (to an extent, as there are a lot of unknowns) what kind of core performance doubling the CU count will give, but the change in memory architecture is something we really haven't seen yet. My guess is the cache system will do better at lower resolutions where memory usage is lower; as memory usage goes up, so do cache misses. They are employing this in the Xbox/PS5 though, so I'm guessing it at least works decently. Time will tell, I guess :).
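For anyone who wants to poke at that napkin math, here's a minimal sketch. The 80% scaling factor is the assumption above, and the "3080 is roughly 1.8-2x a 5700 XT at 4K" readout is an eyeball of launch reviews, not an official number:

```python
# Napkin math for CU scaling; every input here is an assumption.
BASE_CUS = 40        # Radeon RX 5700 XT
BIG_NAVI_CUS = 80    # rumored Navi 21
SCALING = 0.80       # assumed efficiency of doubling CUs, zero arch gains

cu_ratio = BIG_NAVI_CUS / BASE_CUS
est_perf = 1 + (cu_ratio - 1) * SCALING  # relative to a 5700 XT
print(f"Estimate: {est_perf:.2f}x a 5700 XT")  # -> 1.80x
```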

How much power did they have to pump into 40 CUs just to get mid-tier performance? And now suddenly people are expecting some miracle from the company that keeps delivering middle-of-the-road performance over and over again?

All I'm saying is, I'm not expecting much. This way I can either be pleasantly surprised or not surprised at all. What I can't be is disappointed yet again, as I'm sure a lot of AMD fans feel way too often.
 
Well, with 80 CUs, I can't imagine they are going to be horrible. The 5700 XT is 40 CUs, so even with 80% scaling (and ZERO architectural improvements), it'll be 3080 level. The only thing that I'm not sold on yet is the narrow bus width with the Infinity Cache system. It's the biggest unknown at the moment. We can extrapolate (to an extent, as there are a lot of unknowns) what kind of core performance doubling the CU count will give, but the change in memory architecture is something we really haven't seen yet. My guess is the cache system will do better at lower resolutions where memory usage is lower; as memory usage goes up, so do cache misses. They are employing this in the Xbox/PS5 though, so I'm guessing it at least works decently. Time will tell, I guess :).

Pretty much. It's been evident for a while that RDNA 2 will perform at the very least like a scaled-up RDNA 1. An 80 CU chip will be at least 15-20% faster than a 2080 Ti in raw rendering; better if there are IPC improvements.

Unfortunately, there's no memory architecture magic that will get around a 256-bit bus. As you say, caching could help some at lower resolution and IQ (which is where it will help consoles notably), but a smaller memory bus will impede throughput at higher resolutions.
 
I would have never noticed had you not replied to him, I ignored him for being an AMD shill a long long time ago.

In any case, you can see he focuses more on what negative comments are directed toward AMD than actual facts. I see he still hasn't changed one iota.




How much power did they have to pump into 40 CUs just to get mid-tier performance? And now suddenly people are expecting some miracle from the company that keeps delivering middle-of-the-road performance over and over again?

All I'm saying is, I'm not expecting much. This way I can either be pleasantly surprised or not surprised at all. What I can't be is disappointed yet again, as I'm sure a lot of AMD fans feel way too often.

Goodie, I made his block list, but it's still more lies. He blocked me because he doesn't know how calendars work, and assumed that nobody else can read one either.

Typical post, all opinions that he wants treated as facts.

https://hardforum.com/threads/intel...han-the-ryzen-9-3900x.1995105/post-1044555906

Again, no opinions worth listening to; he's the worst kind of shill, a bad one.
At least idiotincharge did a decent job before he got banned.
 
I can't find it either... I could swear it was a thing. Investors were pissed that AMD had "undervalued" their product in an attempt to gain market share, then failed to actually gain market share. Something like that.

The Bulldozer thing I get, though: an 8-core processor only capable of processing 4 threads... it's a little arguable.
Pissed, possibly. A lawsuit, though, I can't find anything to back up. I had a long response to the Bulldozer thing, but realized how far off topic it was getting, so I'll PM you instead ;).
 
Looking at the subject of this thread, I've got to say... I'm not ready to try for 4K gaming.

I've got a great LG OLED TV that's 4K, hooked up to my worst rig. Yeah, one day I'll upgrade that HTPC to drive 4K, but not now. Heck, I remember when 1080HD was just around the corner and was gonna be the best eva! ;) I've decided the bleeding edge costs too much.

My gaming preference these days is 1440p. Yeah, it'd be great if AMD disrupts the 4K gaming space. Really. But for this release I'm hoping for killer 1440p performance (using rasterization: I think RT is a bit too soon).

Maybe another 3-5 years before ray tracing at 4K and 60-120 fps becomes a consumer reality.

That's my .02...
 
I would have never noticed had you not replied to him, I ignored him for being an AMD shill a long long time ago.

In any case, you can see he focuses more on what negative comments are directed toward AMD than actual facts. I see he still hasn't changed one iota.
Maybe he is or isn't, but those are some very interesting stats he found... me personally, I haven't ignored a single person on any forum, ever. I just think people get passionate and caught up in things, and once in a while I might even agree, even if we don't always see eye to eye. I don't care that 95% of your posts are directed towards AMD (if that's actually true; I haven't verified, nor am I going to waste my time doing so), as long as the conversation stays civil :).

How much power did they have to pump into 40 CUs just to get mid-tier performance? And now suddenly people are expecting some miracle from the company that keeps delivering middle-of-the-road performance over and over again?

All I'm saying is, I'm not expecting much. This way I can either be pleasantly surprised or not surprised at all. What I can't be is disappointed yet again, as I'm sure a lot of AMD fans feel way too often.
The 40 CU card was 225 watts. But that argument is just plain silly, considering it drew less power than the card that it competed against for the same performance. If you use this logic, then the 3080 doesn't exist either ;). But we know that it does, so comparing 5700 XT/2070 performance per watt and trying to say that it's impossible to get 3080-level performance because of their perf/watt just doesn't hold up.

I don't expect a miracle; I just look at the numbers (which aren't confirmed yet, so salt shaker and all) and what we do have available (Xbox and PS5 performance, CU counts, and frequencies) and come up with what I believe is reasonable and, of course, not guaranteed in any way, shape, or form. But from viewing the numbers that we have available, it seems more likely than not that IF the rumor of 80 CUs is accurate, it will be able to keep pace with a 3080 easily, as long as this memory bus width/cache decision doesn't come back to bite them. Again, I'm not delusional, and AMD has disappointed before, but at the same time I can look at what data points we have and come up with (what I feel, and you're free to disagree, is) a reasonable expectation. I won't buy it until I see benchmarks and can verify what it actually does, just like I wouldn't buy a 3080 (or 3070) until independent benchmarks come out and I can see where the performance actually is.
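If anyone wants to check that console math, here's a quick sketch. An RDNA CU has 64 stream processors, each doing 2 FLOPs per clock (FMA); the console figures (52 active CUs at 1.825 GHz for Series X, 36 CUs at up to 2.23 GHz for PS5) are the published specs, while the 2.0 GHz Big Navi clock is pure speculation:

```python
# FLOPS from CU count and clock:
# TFLOPS = CUs * 64 SPs * 2 FLOPs/cycle * clock (GHz) / 1000
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"Xbox Series X, 52 CU @ 1.825 GHz: {tflops(52, 1.825):.1f} TFLOPS")  # ~12.1
print(f"PS5, 36 CU @ 2.23 GHz:            {tflops(36, 2.23):.1f} TFLOPS")   # ~10.3
print(f"Big Navi, 80 CU @ 2.0 GHz (guess): {tflops(80, 2.0):.1f} TFLOPS")   # ~20.5
```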
 
Pretty much. It's been evident for a while that RDNA 2 will perform at the very least like a scaled up RDNA 1. An 80CU chip will at least be 15-20% faster than 2080ti in raw rendering - better if there are IPC improvements.

Unfortunately, there's no memory architecture magic that will get around a 256-bit bus. As you say, caching could help some at lower resolution and IQ (which is where it will help consoles notably), but a smaller memory bus will impede throughput at higher resolutions.

Well, there is some memory architecture magic that they use, and it works very well in RDNA 1. They improved it for RDNA 2, so hopefully it doesn't completely castrate their card. Funny enough, if you downclock the memory on a 5700, you only see a 20% drop in performance on average (some games better, some worse, just on average). The caching system of RDNA 1 does work more often than not, and RDNA 2 is getting a major improvement/overhaul of this system. That said, this is the only part that concerns me, as I believe the CUs will scale well; I am curious what workloads the cache system may fall apart on. Benchmarks will reveal whether this was a great decision or a mistake.

If you feel like a read about their original RDNA cache hierarchy, the whitepaper is below. I remember seeing a comparison at one point, but I'm not sure where it is at the moment.
https://www.amd.com/system/files/documents/rdna-whitepaper.pdf
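A toy model of why that downclock experiment behaves the way it does: only cache misses touch DRAM, so slower DRAM hurts less than the raw downclock suggests. The hit rates below are made-up illustrative numbers, not anything AMD has published:

```python
# With hit rate h, only (1 - h) of requests reach DRAM, so the bandwidth
# the shaders effectively see is dram_bw / (1 - h).
def effective_bw(dram_gbs: float, hit_rate: float) -> float:
    return dram_gbs / (1 - hit_rate)

for h in (0.0, 0.25, 0.50, 0.75):
    print(f"hit rate {h:.0%}: ~{effective_bw(448, h):.0f} GB/s effective")
# A 5700 XT has 448 GB/s of raw GDDR6 bandwidth; at an (assumed) 50% hit
# rate the shaders effectively see ~896 GB/s, which is why downclocking
# the DRAM costs less performance than you'd naively expect.
```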
 
They're already sold out, my friend. Reviews won't matter.

I don't know that's going to be the case.

1). More people (like yourself) have an affinity for Nvidia and wouldn't buy Big Navi anyway. At any given point, AMD market share is in the 15-20% range for discrete cards, so fewer people are trying to buy them.
2). Mining isn't particularly profitable right now. I mean, yes, you will make money if you mine with your cards, but the ROI is pitiful. If the ROI was 90 days or so, these would be gone.
3). AMD seems to be taking their time to get a product that has stock available and drivers that aren't like the Navi launch drivers. As long as they have a November launch for holiday sales, they will be fine. Nvidia has already revealed their cards.
 
No way he's banned?

Yeah, read some of his posts.
Apparently he's ACTUALLY a shill.

Not just a fanboy, but a person who's been vetted and coached by Nvidia to type realistic-looking messages that get people to buy Nvidia products.

He's an illegitimate person who's just posting charismatic advertisements that look like forum posts. It's uncanny, because looking at just ONE of his posts reveals nothing, but in aggregate, the truth becomes obvious.

He delivers a logical-appearing chain of posts that seem grounded in reasonableness and common sense, but only enough to be convincing and friendly. Almost a real human, but no. He gets paid and does this as a job.

At least according to the fact of the ban.
 
Yeah, read some of his posts.
Apparently he's ACTUALLY a shill.

Not just a fanboy, but a person who's been vetted and coached by Nvidia to type realistic-looking messages that get people to buy Nvidia products.

He's an illegitimate person who's just posting charismatic advertisements that look like forum posts. It's uncanny, because looking at just ONE of his posts reveals nothing, but in aggregate, the truth becomes obvious.

He delivers a logical-appearing chain of posts that seem grounded in reasonableness and common sense, but only enough to be convincing and friendly. Almost a real human, but no. He gets paid and does this as a job.

At least according to the fact of the ban.
Always had suspicions but I don't have a way to vet anything, which is probably why we didn't always agree and only crossed paths a few times.
 
Yeah, read some of his posts.
Apparently he's ACTUALLY a shill.

Not just a fanboy, but a person who's been vetted and coached by Nvidia to type realistic-looking messages that get people to buy Nvidia products.

He's an illegitimate person who's just posting charismatic advertisements that look like forum posts. It's uncanny, because looking at just ONE of his posts reveals nothing, but in aggregate, the truth becomes obvious.

He delivers a logical-appearing chain of posts that seem grounded in reasonableness and common sense, but only enough to be convincing and friendly. Almost a real human, but no. He gets paid and does this as a job.

At least according to the fact of the ban.

Just checked and it's true. Everyone who's ever exchanged with him should know what a colossal waste of time it was reading and responding to his posts. Not sure he was the only one, though, but one down!
 
Where can we see those facts? He was definitely pro-Nvidia, but if he's a paid shill then lots of ppl on this forum must be shills too, on both sides.

The fact I'm referring to is the fact that he was banned and his account title says "Nvidia Shill."
 
He called out the owner of this forum. That never works well... he paid the price.

Oh?

Well, I'm not a fan of mods or admins taking their position personally.
If he's not a shill, then it would be wrong for the mods to ban him over political opinions.

It's a very dark philosophical mindset to have the will or desire to silence those who speak against those with power.
 
Oh?

Well, I'm not a fan of mods or admins taking their position personally.
If he's not a shill, then it would be wrong for the mods to ban him over political opinions.

It's a very dark philosophical mindset to have the will or desire to silence those who speak against those with power.

He was very pro-NV. I don't know if he was a paid shill or not. I do know he called out Kyle after going back and forth for a bit, and Kyle banned him. It's his site, so his rules.
 
He was very pro-NV. I don't know if he was a paid shill or not. I do know he called out Kyle after going back and forth for a bit, and Kyle banned him. It's his site, so his rules.

His rules, huh?
I dunno if I agree with that concept, considering "platformism" is an idea that's popping up lately.

But let's end this and stay on topic.

AMD's new GPU is gonna disrupt 4K gaming.
 
He was very pro-NV. I don't know if he was a paid shill or not. I do know he called out Kyle after going back and forth for a bit, and Kyle banned him. It's his site, so his rules.

He did ban him, but Kyle also let him re-register under a new name and he was back to posting; then he did something again and his new account was banned also.
 
The 40 CU card was 225 watts. But that argument is just plain silly, considering it drew less power than the card that it competed against for the same performance. If you use this logic, then the 3080 doesn't exist either ;). But we know that it does, so comparing 5700 XT/2070 performance per watt and trying to say that it's impossible to get 3080-level performance because of their perf/watt just doesn't hold up.

I don't expect a miracle; I just look at the numbers (which aren't confirmed yet, so salt shaker and all) and what we do have available (Xbox and PS5 performance, CU counts, and frequencies) and come up with what I believe is reasonable and, of course, not guaranteed in any way, shape, or form. But from viewing the numbers that we have available, it seems more likely than not that IF the rumor of 80 CUs is accurate, it will be able to keep pace with a 3080 easily, as long as this memory bus width/cache decision doesn't come back to bite them. Again, I'm not delusional, and AMD has disappointed before, but at the same time I can look at what data points we have and come up with (what I feel, and you're free to disagree, is) a reasonable expectation. I won't buy it until I see benchmarks and can verify what it actually does, just like I wouldn't buy a 3080 (or 3070) until independent benchmarks come out and I can see where the performance actually is.

Are you misremembering? The 5700 XT was released last year, and it couldn't even compete in performance per watt with a 1080 Ti, a card released 3 years ago.

[TechPowerUp performance-per-watt charts: 1920x1080, 2560x1440, 3840x2160]


Knowing this trend, we can expect that AMD's next release will likely hit 20-series performance per watt.
 
The shocking thing will be when it is the Navi 22 competing with the 3080 and 3090, not the Navi 21. Half joking, or am I?
 
Are you misremembering? The 5700 XT was released last year, and it couldn't even compete in performance per watt with a 1080 Ti, a card released 3 years ago.

*Snipped*

Knowing this trend, we can expect that AMD's next release will likely hit 20-series performance per watt.
I must be misremembering, because I thought it was a more efficient card than it is. My apologies, that graph tells the story :).

I don't know about the trend comment; the trend was Vega 64 -> Radeon VII -> 5700 XT... ~20% from Vega 64 -> Radeon VII, then another ~20% to the 5700 XT... if this is another ~20%, it puts them right on target for the 3x00 series. If you look at the graphs, this is the AMD trend. Their leaks have suggested much more, but I have a hard time believing their 50% more efficient claim (except in maybe one specific benchmark, or comparing a mobile part vs. a high-end part, or something silly), though I wouldn't doubt a 20% increase, as they've done that the last couple of generations. Just another data point to think about, but really nothing we can predict or guess is going to be spot on.

Their cache system could be a real winner and net extra performance, or it could be a complete letdown. We are pretty confident in the leaks suggesting 80 CUs. We don't know the final clocks, but I would guess they'll be somewhere between the Xbox's and PS5's (PS5 clocks are a bit higher than I would think an 80 CU beast could hold), so possibly at or just under 2 GHz (they can hold 1.8+ GHz in the Xbox with crappy thermals and 56 CUs). The 5700 XT boosted to 1.9 GHz, with a game clock near 1.75 GHz, so anything that isn't less than this should still be OK. Some leaks suggesting 2.1 GHz have come out, but if that's true it's probably a max boost clock and only for very short periods or something, but who knows; it's all just speculation. Also, the screenshots they've teased have shown dual 8-pin connectors, so the absolute max it could pull is 375 watts, but they won't play that game again being so close to PCIe spec, so 350 W is probably the realistic maximum.
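Two quick sanity checks on the numbers above; the ~20% per-generation figure is my rough reading of those charts, not an official number, while the 150 W per 8-pin and 75 W slot limits are the PCIe spec:

```python
# 1) Compounding the rough ~20% per-generation perf/watt gains:
gen_gain = 1.20
# Vega 64 -> Radeon VII -> 5700 XT -> RDNA 2, ~20% each step:
print(f"Cumulative gain over Vega 64: {gen_gain ** 3:.2f}x")  # ~1.73x

# 2) Power ceiling implied by the teased dual 8-pin connectors:
#    each 8-pin is rated for 150 W, plus 75 W from the PCIe slot.
print(f"Max board power: {2 * 150 + 75} W")  # 375 W
```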
 
r/AMD's stblr has shared the alleged number of Compute Units, core clocks, and other key specifications for the red team's next-generation RDNA 2 GPUs.

Here's a brief breakdown (via VideoCardz and Wccftech):

(Sienna Cichlid): 4 versions available, per RedGamingTech's video
  • 80 Compute Units
  • 5,120 Stream Processors
  • Boost clocks of up to 2,200 MHz
  • 22.5 TFLOPS of compute
(Navy Flounder): above 3070?
  • 40 Compute Units
  • 2,560 Stream Processors
  • Boost clocks of up to 2,500 MHz !!??
  • 12.8 TFLOPS of compute
(Dimgrey Cavefish): below 3070??
  • 32 Compute Units
  • 2,048 Stream Processors
https://www.thefpsreview.com/2020/0...s-double-the-amount-of-the-radeon-rx-5700-xt/
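The leaked TFLOPS figures at least check out against the SP counts and clocks: a shader ALU does 2 FLOPs per cycle (FMA), so TFLOPS is just SPs x 2 x clock. A quick sketch, taking the rumored numbers at face value:

```python
# TFLOPS = stream processors * 2 FLOPs/cycle * clock (GHz) / 1000
leaks = {
    "Sienna Cichlid": (5120, 2.200),  # rumored SPs, boost clock (GHz)
    "Navy Flounder":  (2560, 2.500),
}
for name, (sps, ghz) in leaks.items():
    print(f"{name}: {sps * 2 * ghz / 1000:.1f} TFLOPS")
# -> 22.5 and 12.8 TFLOPS, matching the leak, so the figures are at
#    least internally consistent.
```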
 
r/AMD's stblr has shared the alleged number of Compute Units, core clocks, and other key specifications for the red team's next-generation RDNA 2 GPUs.

Here's a brief breakdown (via VideoCardz and Wccftech):

(Sienna Cichlid): 4 versions available, per RedGamingTech's video
  • 80 Compute Units
  • 5,120 Stream Processors
  • Boost clocks of up to 2,200 MHz
  • 22.5 TFLOPS of compute
(Navy Flounder): above 3070?
  • 40 Compute Units
  • 2,560 Stream Processors
  • Boost clocks of up to 2,500 MHz !!??
  • 12.8 TFLOPS of compute
(Dimgrey Cavefish): below 3070??
  • 32 Compute Units
  • 2,048 Stream Processors
https://www.thefpsreview.com/2020/0...s-double-the-amount-of-the-radeon-rx-5700-xt/

Back-of-the-envelope maths... (using the above for Navy Flounder).
The 5700 XT has 40 Compute Units, 2,560 Stream Processors, and a 256-bit memory bus, which makes it a good comparison. (See: https://www.amd.com/en/products/graphics/amd-radeon-rx-5700-xt )

If there's a 15% IPC improvement, then that, plus the boost clock (2,500 MHz, as listed in the quote above), leads to this:

Assume the 5700 XT is 100%. Navy Flounder could then be: 100% * 1.15 (IPC improvement) * (2,500/1,920) (the boost clock ratio) ~= 150%.

Obviously, the majority of that 50% boost in performance comes from the clock speed. (Looking at TFLOPS, 12.8/9.75 = 1.31, a 31% improvement.)

Navy Flounder/Navi 22 should be a pretty big boost over the 5700 XT.

Again, that's a rough calculation based on some rumint. Add salt to flavor.

I guess we'll find out for sure in late October or early November.
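The same estimate in code form, for anyone who wants to tweak the assumptions (the 15% IPC gain is assumed, the 2,500 MHz clock is rumint, and 1,920 MHz is the 5700 XT boost clock used above):

```python
IPC_GAIN   = 1.15   # assumed RDNA 2 IPC improvement
BOOST_NEW  = 2500   # rumored Navy Flounder boost clock (MHz)
BOOST_5700 = 1920   # 5700 XT boost clock, as used above (MHz)

estimate = IPC_GAIN * (BOOST_NEW / BOOST_5700)
print(f"Navy Flounder vs 5700 XT: ~{estimate:.0%}")  # ~150%

tflops_ratio = 12.8 / 9.75  # leaked TFLOPS vs the 5700 XT's 9.75
print(f"TFLOPS ratio: {tflops_ratio:.2f}x")          # ~1.31x
```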
 
On the Justia Trademarks website, we can find the first references to AMD Infinity Cache. The name echoes the AMD Zen technology called Infinity Fabric, AMD's proprietary interconnect architecture for CPU and GPU cores. Infinity Cache could be a new technology coming to AMD Radeon graphics cards.

It is rumored that AMD Big Navi has a relatively small 256-bit memory bus, and it is currently uncertain how the special cache works.

The alleged AMD Radeon RX 6900 graphics card would therefore feature 16GB of GDDR6 memory on a 256-bit bus, paired with Infinity Cache.

The first rumors about Infinity Cache appeared on the RedGamingTech YouTube channel.

https://videocardz.com/newz/amd-infinity-cache-coming-to-big-navi#disqus_thread
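For scale, here's the raw bandwidth math that makes a big cache necessary in the first place. The 16 Gbps GDDR6 speed for Big Navi is an assumption on my part; the 3080's 320-bit bus at 19 Gbps GDDR6X is its published spec:

```python
# Raw memory bandwidth in GB/s = (bus width in bits / 8) * data rate (Gbps)
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(f"Rumored RX 6900, 256-bit @ 16 Gbps: {bandwidth_gbs(256, 16):.0f} GB/s")  # 512
print(f"RTX 3080, 320-bit @ 19 Gbps:        {bandwidth_gbs(320, 19):.0f} GB/s")  # 760
# That raw deficit is the gap Infinity Cache would have to cover.
```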
 
On the Justia Trademarks website, we can find the first references to AMD Infinity Cache. The name echoes the AMD Zen technology called Infinity Fabric, AMD's proprietary interconnect architecture for CPU and GPU cores. Infinity Cache could be a new technology coming to AMD Radeon graphics cards.

It is rumored that AMD Big Navi has a relatively small 256-bit memory bus, and it is currently uncertain how the special cache works.

The alleged AMD Radeon RX 6900 graphics card would therefore feature 16GB of GDDR6 memory on a 256-bit bus, paired with Infinity Cache.

The first rumors about Infinity Cache appeared on the RedGamingTech YouTube channel.

https://videocardz.com/newz/amd-infinity-cache-coming-to-big-navi#disqus_thread

Is the paper below, on a new L1 cache mechanism, related to "Infinity Cache"?


Adwait Jog (@adwaitjog) Tweeted:
@massemibrahim's video on our PACT 2020 paper is now up:
https://twitter.com/adwaitjog/status/1313100678978588672?s=20

Adwait Jog (@adwaitjog) Tweeted:
A new paper from our group got accepted at PACT 2020: we proposed a new L1 cache design for GPUs that reduces cache wastage and improves bandwidth! Kudos to Mohamed (@massemibrahim) for all the hard work! Joint work with AMD (cc: @onurkayiran, Yasuko, @RunBaconHockey) #PACT2020
https://twitter.com/adwaitjog/status/1283795376638746627?s=20
 
There also seems to be a patent filing (courtesy of an overclockers.co.uk forum post):

https://www.overclockers.co.uk/forums/threads/rdna-2-128mb-infinity-cache-rumours.18899110/

There's some rumours going around about a 128MB Infinity Cache.

I'd say it's very likely to be correct; a patent was published today from AMD called 'ADAPTIVE CACHE RECONFIGURATION VIA CLUSTERING', link here:
https://www.freepatentsonline.com/y2020/0293445.html

Basically, what I think it will allow is greater L1 cache capacity for GPUs. It does this by dynamically creating clusters from 'a plurality of compute units'. The dynamically created clusters allow a 'backup' pool of shared L1 cache to be accessed if the L1 cache for a CU (Compute Unit) isn't sufficient.

L1 cache is the fastest (lowest latency) type of GPU cache, so the more L1 capacity there is, the less the slower L2 and L3 caches are used.


ADAPTIVE CACHE RECONFIGURATION VIA CLUSTERING
Document Type and Number: United States Patent Application 20200293445 (Kind Code: A1)

Abstract:
A method of dynamic cache configuration includes determining, for a first clustering configuration, whether a current cache miss rate exceeds a miss rate threshold. The first clustering configuration includes a plurality of graphics processing unit (GPU) compute units clustered into a first plurality of compute unit clusters. The method further includes clustering, based on the current cache miss rate exceeding the miss rate threshold, the plurality of GPU compute units into a second clustering configuration having a second plurality of compute unit clusters fewer than the first plurality of compute unit clusters.

Inventors: Ibrahim, Mohamed Assem (Santa Clara, CA, US); Kayiran, Onur (Fairport, NY, US); Eckert, Yasuko (Bellevue, WA, US); Loh, Gabriel H. (Bellevue, WA, US)
Application Number: 16/355168
Publication Date: 09/17/2020
Filing Date: 03/15/2019
Assignee: ADVANCED MICRO DEVICES, INC. (Santa Clara, CA, US)
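Reading the abstract, the mechanism sounds simple enough to sketch: watch the miss rate, and when it crosses a threshold, recluster the CUs into fewer, larger clusters so more L1 is shared. This is just my illustrative reading of the abstract; the names, numbers, and threshold below are made up, not from the patent:

```python
def next_cluster_count(clusters: int, miss_rate: float,
                       threshold: float = 0.5) -> int:
    """Halve the cluster count (doubling the shared L1 per cluster)
    whenever the observed miss rate exceeds the threshold."""
    if miss_rate > threshold and clusters > 1:
        return clusters // 2
    return clusters

clusters = 16  # e.g. 64 CUs grouped into clusters of 4
for miss_rate in (0.7, 0.6, 0.4):
    clusters = next_cluster_count(clusters, miss_rate)
    print(f"miss rate {miss_rate:.0%} -> {clusters} clusters")
# -> 8, then 4, then it stays at 4 once the miss rate drops below 50%
```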
 