GLOBALFOUNDRIES Throws in the Towel on 7nm FinFET Process Development

cageymaru

GLOBALFOUNDRIES (GF) has placed an indefinite hold on development of its 7nm FinFET program and is instead focusing on 14/12nm FinFET. GF is also reducing its workforce by 5%. This leaves only Intel, TSMC, and Samsung developing the most cutting-edge processes.

"Lifting the burden of investing at the leading edge will allow GF to make more targeted investments in technologies that really matter to the majority of chip designers in fast-growing markets such as RF, IoT, 5G, industrial and automotive," said Samuel Wang, research vice president at Gartner. "While the leading edge gets most of the headlines, fewer customers can afford the transition to 7nm and finer geometries. 14nm and above technologies will continue to be the important demand driver for the foundry business for many years to come. There is significant room for innovation on these nodes to fuel the next wave of technology."
 
With the limited supply of 7nm, the products using that node will be priced high, and with GF throwing in the towel that makes it even worse.
 
With the limited supply of 7nm, the products using that node will be priced high, and with GF throwing in the towel that makes it even worse.

Says the person who knows nothing of TSMC's actual production capability at 7nm, nor how much GF was expected to produce....
 
I am thinking GlobalFoundries has gone "smart" and fired exactly the 5% who actually can and will give a damn, and kept all the brown-nosers that don't give a damn.
 
AMD is using TSMC for 7nm.

Whatever that means. How comparable are TSMC's, Samsung's, and Intel's next-gen-geometry processes?
Or rather, how comparable will they be when they are all actually shipping volume product?

Who knows?

We may be heading for a boring period where process improvements are rare and minuscule. It may take an unpredicted breakthrough in one or more of physics, chemistry, optics, and material science to provide the kind of improvement we used to regularly see in every other generation of Intel's CPUs. And that's a little sad.
 
"Lifting the burden of investing at the leading edge will allow GF to make more targeted investments in technologies that really matter to the majority of chip designers in fast-growing markets such as RF, IoT, 5G, industrial and automotive," said Samuel Wang, research vice president at Gartner. "While the leading edge gets most of the headlines, fewer customers can afford the transition to 7nm and finer geometries. 14nm and above technologies will continue to be the important demand driver for the foundry business for many years to come. There is significant room for innovation on these nodes to fuel the next wave of technology."
That was a nice way of saying "we failed".
 
Fake news, GF will launch 7nm in Q2 2019 and beat Intel and Nvidia to death; both companies will be bankrupt by 2020, and all AMD chips will destroy everything.
/s

What is with tech news? This back-and-forth shit, with half of it clearly just made up, is getting old.
Not saying this story isn't true, but seriously, I've heard so much random shit and so many quotes from "industry sources" that end up all wrong.

Maybe I'm just getting old and should read the local livestock prices instead.

*edit: and then I read the actual article, and it's GF's own website, LOL. But still, tech news.
Shoulda bought TSMC stock?
 
Fake news, GF will launch 7nm in Q2 2019 and beat Intel and Nvidia to death; both companies will be bankrupt by 2020, and all AMD chips will destroy everything.
/s

What is with tech news? This back-and-forth shit, with half of it clearly just made up, is getting old.
Not saying this story isn't true, but seriously, I've heard so much random shit and so many quotes from "industry sources" that end up all wrong.

Maybe I'm just getting old and should read the local livestock prices instead.

*edit: and then I read the actual article, and it's GF's own website, LOL. But still, tech news.
Shoulda bought TSMC stock?

Probably won't affect TSMC stock price much since AMD already committed to them for 7nm production.
 
Whatever that means. How comparable are TSMC's, Samsung's, and Intel's next-gen-geometry processes?
Or rather, how comparable will they be when they are all actually shipping volume product?

Who knows?

We may be heading for a boring period where process improvements are rare and minuscule. It may take an unpredicted breakthrough in one or more of physics, chemistry, optics, and material science to provide the kind of improvement we used to regularly see in every other generation of Intel's CPUs. And that's a little sad.

Silicon will be dead within 5 to 6 years for leading-edge CPU, GPU, and memory products. Other materials are in development and will be ready in under 5 years to advance computing into its next golden age.
 
Not a big surprise. We all smelled the blood in the water around GloFo's 7nm. But this will leave them in a bad spot in 2-3 years.
 
We may be heading for a boring period where process improvements are rare and minuscule. It may take an unpredicted breakthrough in one or more of physics, chemistry, optics, and material science to provide the kind of improvement we used to regularly see in every other generation of Intel's CPUs. And that's a little sad.

We've known that for a while; experts seem to think we can do 4nm on existing tech (which was in doubt for a while), but below that physics itself (transistor leakage) likely becomes an unsolvable problem. There's POSSIBLY one more die shrink; after that, silicon really doesn't have any "free" performance improvements left in it.
 
Silicon will be dead within 5 to 6 years for leading-edge CPU, GPU, and memory products. Other materials are in development and will be ready in under 5 years to advance computing into its next golden age.

I read this same exact statement 5 years ago. And 5 years before that.

Granted, finally reaching the end of die shrinks will help speed the process up a bit, but you're still looking at a totally new architecture/construction. It's going to be a decade or so until you see anything that's ready to replace silicon in the consumer space.

In short: Get ready for a performance plateau.
 
In short: Get ready for a performance plateau.
For single-thread, certainly.
For multi-threaded apps, I expect CPU and GPU vendors to throw down more and more cores.

And don't be surprised if SMT/HyperThreading goes the way of the dodo: when performance is limited by the watts per square millimeter you can pull off the die, it may not make sense to execute two threads at once on the same tiny area of silicon. Pulling out all the SMT/HT support from the cores will make them smaller and more energy efficient-- and therefore faster-- and probably less vulnerable to side-channel attacks too.
 
14/12nm is fine for most everyday electronics, and those get ordered in bulk. So I can see them trying to position themselves for that market; they can then transition to smaller processes once the process has been refined by other companies and it's less of a guessing game.
 
Silicon will be dead within 5 to 6 years for leading-edge CPU, GPU, and memory products. Other materials are in development and will be ready in under 5 years to advance computing into its next golden age.
Hooray for phase change
 
For single-thread, certainly.
For multi-threaded apps, I expect CPU and GPU vendors to throw down more and more cores.

And don't be surprised if SMT/HyperThreading goes the way of the dodo: when performance is limited by the watts per square millimeter you can pull off the die, it may not make sense to execute two threads at once on the same tiny area of silicon. Pulling out all the SMT/HT support from the cores will make them smaller and more energy efficient-- and therefore faster-- and probably less vulnerable to side-channel attacks too.

Disagree on both counts.

You can't just keep adding cores. For one, you're limited by die space and yields. Secondly, there are major OS scheduling concerns as core count ramps up; you saw that in the 2990WX benchmarks, where scaling between 16 and 32 cores was significantly less than you would expect.

As for SMT, the reason for SMT's existence is that you can get close to an additional core of performance [for certain workloads] for just a fraction of the cost/die space. For example, HTT adds just 10% to the CPU die in exchange for a ~30% boost in performance.
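The trade-off described above can be sketched with some back-of-the-envelope arithmetic. The +10% area / +30% throughput figures are the ones quoted in the post, not measured values, and the ideal two-core scaling is an illustrative assumption:

```python
# Sketch of the SMT trade-off: how much throughput you get per unit
# of die area by enabling SMT versus simply adding a second core.

base_area, base_perf = 1.00, 1.00   # one core, SMT off (baseline)
smt_area, smt_perf   = 1.10, 1.30   # same core with SMT: +10% area, +30% perf
dual_area, dual_perf = 2.00, 2.00   # two SMT-less cores, assuming ideal scaling

def perf_per_area(perf, area):
    """Throughput delivered per unit of die area."""
    return perf / area

print(round(perf_per_area(smt_perf, smt_area), 2))    # ~1.18x per mm^2
print(round(perf_per_area(dual_perf, dual_area), 2))  # 1.0x, same as baseline
```

On these (quoted, hypothetical) numbers, SMT yields about 1.18x the throughput per unit of die area, while doubling cores only matches the baseline's 1.0x.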
 
I think we will see chips the size of my face before silicon is "dead". I know big-die chips and massively parallel processing don't make sense now, but they will in 10 years or so, when these investments are legacy and silicon drops in price tremendously.
 
I am thinking GlobalFoundries has gone "smart" and fired exactly the 5% who actually can and will give a damn, and kept all the brown-nosers that don't give a damn.
I guess I've seen this too, huh? It's disgraceful. I guess the problem is that "outspoken (with logic and merit)" often comes along with "competent", so instead of paying attention to WHAT the person is saying, it's only treated as an irritant.
 
This company has never been as good as the other fabs. But now that they've publicly stated it, the only companies that will use them going forward will be the VIAs of the tech world.
 
As for SMT, the reason for SMT's existence is that you can get close to an additional core of performance [for certain workloads] for just a fraction of the cost/die space. For example, HTT adds just 10% to the CPU die in exchange for a ~30% boost in performance.
You're assuming perf per square mm of silicon is a relevant metric. I contend it won't be anymore, especially as processes mature and yields rise.
Perf per watt will be, and not just in portable devices. It's a physics-driven limit: ICs can only run so hot (especially since leakage and electromigration go up with temperature), and there's a limit to how fast you can extract heat from the die without going to expensive cooling technologies (which may be completely inappropriate for portable devices and dense servers).
 
Wasn't die stacking supposed to help with this issue? But so far I've only seen it used in memory chips and flash. Unless it can be used on CPUs and GPUs, we may have an issue. Problem is the die size of a GPU is really large, so it's much harder to get good yields for GPUs.
 
Based on Juanrga's theory of German buying preferences, we should be seeing a large drop in German sales now that TSMC will be making almost all of AMD's parts.
 
Silicon will be dead within 5 to 6 years for leading-edge CPU, GPU, and memory products. Other materials are in development and will be ready in under 5 years to advance computing into its next golden age.

In the best case, new materials would eliminate the current silicon frequency wall and allow for a new GHz race. But new materials aren't going to eliminate the ILP wall, nor the limits imposed by things such as Amdahl's law. So a new golden age is not coming from going beyond silicon.
 
For single-thread, certainly.
For multi-threaded apps, I expect CPU and GPU vendors to throw down more and more cores.

Apart from the limits Amdahl's law imposes on multithreading (there is a point beyond which adding more cores doesn't speed things up), more cores require more space on the die, so we hit the performance plateau again once we reach silicon's shrink limits, as we are doing. And better we don't talk about costs.
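For reference, the diminishing returns Amdahl's law describes are easy to see with a small sketch. The 95% parallel fraction here is an illustrative assumption, not a measured figure for any real workload:

```python
# Amdahl's law: overall speedup on n cores when a fraction p of the
# work is parallelizable and the rest (1 - p) stays serial.

def amdahl_speedup(p, n):
    """Speedup for parallel fraction p (0..1) on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

# Even at 95% parallel, the ceiling is 1 / (1 - 0.95) = 20x,
# no matter how many cores you throw at it.
for n in (16, 32, 64, 1024):
    print(n, round(amdahl_speedup(0.95, n), 2))
# 16 -> 9.14, 32 -> 12.55, 64 -> 15.42, 1024 -> 19.64
```

Going from 16 to 32 cores only takes the (assumed) workload from ~9x to ~12.5x, which matches the kind of sub-linear scaling mentioned for the 32-core Threadripper benchmarks above.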

 
Zen 2 on TSMC?

Zen 2 was always TSMC. AMD taped out Zen 2 at TSMC last year.

Based on Juanrga's theory of German buying preferences, we should be seeing a large drop in German sales now that TSMC will be making almost all of AMD's parts.

Incorrect conclusion. Didn't you read what I wrote about brand fidelity?

And a question to you: which Tomshardware sub has officially criticized the recent "buy Nvidia" scandal at central Toms? Tomshardware France? Tomshardware Germany? Other?
 
Wasn't die stacking supposed to help with this issue?
Die stacking is used for memory products because heat dissipation is not a major problem there.
For CPUs, heat dissipation is a primary limiter of performance, and stacking die makes it worse.
 
Zen 2 was always TSMC. AMD taped out Zen 2 at TSMC last year.



Incorrect conclusion. Didn't you read what I wrote about brand fidelity?

And a question to you: which Tomshardware sub has officially criticized the recent "buy Nvidia" scandal at central Toms? Tomshardware France? Tomshardware Germany? Other?


I've seen you prattle on about a number of things. I think you covered your position pretty well in this thread.

https://hardforum.com/threads/intels-desktop-marketshare-in-mindfactory-de.1963497/

Your claim is that Germany is a country biased towards buying AMD not only because they built and owned fabs there before 2009, but that the appreciation of the German people carries over to today. It's the same nonsense as when you tried to use the etymology of the word "fan" in a tweet by Lisa Su to prove, conspiracy-theorist style, that even AMD was aware of the fanatical way they are viewed in Germany.

Ignoring that all 12/14nm production was in NY, this should put the final nail into your fucking nonsense about German sales.
 
Wouldn't be surprised if GF had problems with 7nm, but the R&D cost argument they make is pretty substantial, frankly. As Anandtech's article points out, GF's owner, Abu Dhabi's sovereign wealth fund Mubadala, has spent over $20 billion on GF without the company ever turning a profit. Pursuing 7nm would mean, as Anandtech points out: one, maintaining future R&D into 5nm and 3nm to remain competitive with TSMC and Intel; two, building a new fab or upgrading an existing one; and three, risking current revenue-producing volumes. That's a lot of freaking money, like another $20 billion, to invest in a company that already struggles to find customers. So cutting 7nm and just focusing on making money at 14nm and above sounds like a good idea for GF, tbh.
 
I've seen you prattle on about a number of things. I think you covered your position pretty well in this thread.

https://hardforum.com/threads/intels-desktop-marketshare-in-mindfactory-de.1963497/

Your claim is that Germany is a country biased towards buying AMD not only because they built and owned fabs there before 2009, but that the appreciation of the German people carries over to today. It's the same nonsense as when you tried to use the etymology of the word "fan" in a tweet by Lisa Su to prove, conspiracy-theorist style, that even AMD was aware of the fanatical way they are viewed in Germany.

Ignoring that all 12/14nm production was in NY, this should put the final nail into your fucking nonsense about German sales.

So you ignored my point and my arguments once again, and now you've also ignored the question I asked you. The answer was "Tomshardware Germany".
 
So you ignored my point and my arguments once again, and now you've also ignored the question I asked you. The answer was "Tomshardware Germany".

Your argument is nonsense; you've covered it ad nauseam, and it still rings with strong notes of simple thoughts tinged with bias. Does every point you make need to be reargued until everyone is tired of listening?

As far as TH and buy Nvidia,


Steve at GN also STRONGLY criticized the "buy Nvidia"..... scandal (strong word). Is he part of the brainless slobbering masses of AMD fans also, or perhaps critical thinking exists all over the world?
 
Your argument is nonsense; you've covered it ad nauseam, and it still rings with strong notes of simple thoughts tinged with bias. Does every point you make need to be reargued until everyone is tired of listening?

As far as TH and buy Nvidia,


Steve at GN also STRONGLY criticized the buy Nvidia..... Scandal (strong word), is he part of the brainless slobbering masses of AMD fans also, or perhaps critcial thimkith exists all over the world.

YES!
critcial thimkith exists. LOL

Funny how that made the argument less powerful when it was the punch line. LOL
 