AMD's vision in one patent: the Chiplet!

JustReason

http://www.bitsandchips.it/english/52-english-news/8604-amd-vision-in-one-patent-the-chiplet

Zen has a modular (LEGO-like) design. AMD created Zen in order to have a malleable uArch: customers can add or remove SIMDs and instruction sets, customize the different levels of cache, etc.
But this is only the beginning of AMD's vision. We know that Apple, like other companies, is using SiP (System-in-Package) technology to produce some of its own products (e.g. the Apple S1), but AMD went further.


[Image: chiplet.png]


AMD's Chiplet seems to be an evolution of Palo Alto Research Center (PARC) studies.

The Chiplet is an evolution of today's interposer usage: the interposer has a pattern of sites for mounting chiplets.

As we have seen so far, companies put a single GPU, CPU, or FPGA on an interposer together with some memory chips (HMC or HBM). Thanks to the Chiplet, we can mix different kinds of chips on the same interposer: GPU, ASIC, CPU, FPGA, LTE modem, Ethernet CTRL, different memories (DDR4 or HBM), etc.

The Chiplet could be a revolution because we could build a whole mainboard (Ethernet CTRL, USB CTRL, etc.) inside a single CPU package!

Good news is that's the entire article. Can't say I am surprised, especially after Zen and TR/EPYC. It does have some seemingly limitless possibilities.
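
To make the LEGO-like idea a bit more concrete, here is a minimal sketch of how a chiplet-style package could be described as data: an interposer with a fixed number of mounting sites, populated by a mix of dies built on whatever process suits each one. Everything below (class names, fields, areas) is my own hypothetical illustration, not anything taken from the patent or the article.

```python
# Hypothetical sketch of a chiplet-style package: an interposer exposes
# mounting sites, and a product is a mix-and-match list of dies.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Chiplet:
    kind: str        # e.g. "CPU cores", "GPU compute units", "Ethernet CTRL"
    process: str     # each die can use whichever process node suits it best
    area_mm2: float


@dataclass
class Package:
    interposer_sites: int
    chiplets: List[Chiplet] = field(default_factory=list)

    def add(self, chiplet: Chiplet) -> None:
        if len(self.chiplets) >= self.interposer_sites:
            raise ValueError("no free mounting sites left on the interposer")
        self.chiplets.append(chiplet)


# "A whole mainboard in a single CPU package": CPU, GPU, memory and I/O
# dies sharing one interposer (illustrative areas only).
pkg = Package(interposer_sites=6)
pkg.add(Chiplet("CPU cores + cache", "7nm", 70.0))
pkg.add(Chiplet("GPU compute units", "7nm", 120.0))
pkg.add(Chiplet("HBM stack", "DRAM", 92.0))
pkg.add(Chiplet("Ethernet CTRL", "14nm", 25.0))
pkg.add(Chiplet("USB CTRL", "14nm", 15.0))
print(f"{len(pkg.chiplets)} chiplets, "
      f"{sum(c.area_mm2 for c in pkg.chiplets):.0f} mm^2 of silicon on one interposer")
```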
 
Well Master_shake_, you will have to wait for the 3 professionals on this forum who are apparently the only people qualified to pass judgment on AMD's designs, strategies, patents, and products.
Thankfully, they are all 3 fully aware of their purpose and are sure to respond quickly. I would say inside of a day, we should know their pontifications upon what a terrible implementation and design idea AMD is operating from.

As for me, I'm clearly not qualified to recognize the implications myself, so I shall just sit here in my supposed ignorance and think that AMD has come up with a stunningly good solution and design implementation in their recent technology and product offerings. When we hear from the 3 qualified, I am certain we will all collectively say "Oh, that's why AMD sucks still. It was so obvious all along, but we just couldn't see it without them sharing their enlightened thoughts."
 
How is this different from SoC designs, other than using the word "chiplet"?
 
For the love of God, B&C, link us the fucking patent. This is also old news.

Overclock3d

TheNextPlatform's articles, 2015, 2017.

And finally, this article from HPCWire, which I personally feel is the best one to read up on AMD's efforts.

Please note that they link the fucking papers they talk about, B&C.
 
How is this different from SoC designs, other than using the word "chiplet"?

By way of HPCWire, from an AMD paper titled "Design and Analysis of an APU for Exascale Computing":

The paper includes a fair amount of discussion around choices made. For example, “Rather than build a single, monolithic system on chip (SOC), we propose to leverage advanced die-stacking technologies to decompose the EHP into smaller components consisting of active interposers and chiplets. Each chiplet houses either multiple GPU compute units or CPU cores. The chiplet approach differs from conventional multi-chip module (MCM) designs in that each individual chiplet is not a complete SOC. For example, the CPU chiplet contains CPU cores and caches, but lacks memory interfaces and external I/O.”

Chiplet benefits, according to AMD, include die yield, process optimization, and re-usability. On the latter point, AMD reported, “The decomposition of the EHP into smaller pieces enables silicon-level reuse. A single, large HPC-optimized APU would be great for HPC markets, but may be less appropriate for others. For example, one or more of the CPU clusters could be packaged together to create a conventional CPU-only server processor.”
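
The die-yield point is easy to see with some back-of-the-envelope math. Under a simple Poisson defect model (standard textbook yield math, not something from the AMD paper), yield falls off exponentially with die area, so splitting a big SOC into smaller chiplets means a much larger fraction of the silicon comes out good. The defect density and die areas below are made-up numbers, purely for illustration.

```python
import math


def die_yield(area_mm2: float, defect_density_per_mm2: float) -> float:
    """Poisson yield model: probability that a die has zero killer defects."""
    return math.exp(-area_mm2 * defect_density_per_mm2)


D0 = 0.002           # assumed killer defects per mm^2 (illustrative only)
monolithic = 600.0   # one big monolithic SOC, in mm^2
chiplet = 150.0      # the same silicon split into four 150 mm^2 chiplets

print(f"600 mm^2 monolithic die yield: {die_yield(monolithic, D0):.1%}")  # ~30.1%
print(f"150 mm^2 chiplet yield:        {die_yield(chiplet, D0):.1%}")     # ~74.1%

# Each chiplet is tested and binned on its own, so roughly 74% of the wafer
# area ends up usable instead of roughly 30% for the monolith (ignoring the
# extra packaging/bonding yield loss, which works against chiplets).
```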
 
By way of HPCWire, from an AMD paper titled "Design and Analysis of an APU for Exascale Computing":

Interesting quote. It seems on the surface Intel may have miscalculated again when we count up the pros and cons of monolithic vs. chip-on-chip. I suppose when one has the best fab in existence, going monolithic seems like the natural order or logical step. I will be curious to see if Intel continues their massive monolithic designs for the next gen.
 
Interesting quote. It seems on the surface Intel may have miscalculated again when we count up the pros and cons of monolithic vs. chip-on-chip. I suppose when one has the best fab in existence, going monolithic seems like the natural order or logical step. I will be curious to see if Intel continues their massive monolithic designs for the next gen.

It's called EMIB (Embedded Multi-die Interconnect Bridge); that's Intel's version of stitching smaller chips together, etc.
 
Intel must be sniffing the stuff. I always use fiber in my epoxy; in AMD's case that would be Infinity Fabric. It seems the argument against monolithic is that the concentration of heat in a monolithic die would limit frequency scaling, reduce yields as errors arise, and lead to a shorter lifespan of chips as increased heat promotes electron tunneling. MCM, on the other hand, increases yield on the smaller dies, and heat generation is spread out over a larger area. Theoretically this would produce better scaling, cooler running, and extended chip life.
 
This would be great for NUC-style machines: super small footprints, potentially battery powered. Though I couldn't imagine AMD's tech being THAT energy efficient, there are tons of possibilities for future consoles being the size of a deck of cards.
 
Well Master_shake_, you will have to wait for the 3 professionals on this forum who are apparently the only people qualified to pass judgment on AMD's designs, strategies, patents, and products.
Thankfully, they are all 3 fully aware of their purpose and are sure to respond quickly. I would say inside of a day, we should know their pontifications upon what a terrible implementation and design idea AMD is operating from.

As for me, I'm clearly not qualified to recognize the implications myself, so I shall just sit here in my supposed ignorance and think that AMD has come up with a stunningly good solution and design implementation in their recent technology and product offerings. When we hear from the 3 qualified, I am certain we will all collectively say "Oh, that's why AMD sucks still. It was so obvious all along, but we just couldn't see it without them sharing their enlightened thoughts."

I like this post enough to quote it and say that I like it.
 