Intel 2018 Architecture Day @ [H]

FrgMstr

Intel 2018 Architecture Day

Intel invited some of the press out to California to talk about where it is headed in terms of its overall business in the very near future. If news in the CPU world has excited you recently, you will surely want to be up to date with how Intel sees its future in the desktop market.

If you like our content, please support HardOCP on Patreon.
 
"One of the things you will see more amplified is schedule predictability." Along with this, Murthy proclaims that Intel now has in place "aggressive risks with clear contingencies." This marks a huge shift in how Intel does business. The days of process node, architecture, and other IP being tied together as an end point are over. He explains that Intel's failure to meet its previous 10nm roadmaps has been a teaching tool for how it needs to move forward.

My translation:
Intel = in panic mode
 
Thanks for the writeup, Kyle. It was interesting that they tried to play down Spectre as much as possible, based on your experience. All it would take is a few high-profile Spectre attacks and their stock tanks in historic fashion. Hardware is a hard sell if it's not secure.

It's reassuring to see the Intel chiplet designs making their way to consumer products in the near future. We should be able to expect a more competitive product in both price and performance without a mammoth die and the harvesting of parts that would otherwise be scrapped.

The one API to rule them all is more or less a way to lock developers into their ecosystem, and we've seen it before, from various sources. It typically only pays off if you're the monopoly.

The lack of a fixed roadmap (that I could see) for node size and architecture was really telling in how far Intel has drifted from efficiency. Throwing a dart and going wherever it lands doesn't scream market leader to me.
 
Perhaps a more articulate and focused panic mode? :D

7-10nm has gotta come, whether it's Intel or AMD first, and in my opinion it seems like Intel is scurrying like crazy to still be first to market with a major node shrink across its entire CPU lineup.
If Intel fails at another "aggressive risk" trying to get 10nm out the door quickly, then my question would be: will their "contingency" be more of the same (just add more cores and maybe even force another socket and/or chipset change)?

We shall see.

All that being said, I'm rather excited for both AMD's 7nm and Intel's 10nm, whenever they make their appearances, because that is when I will be looking to upgrade my trusty/rusty ole 3770K, and my wallet will be treated to the best bang for the buck.
Did you actually read the entire article? I don't think you did.
 
And things just continue to get more interesting in the hardware market. Thank you, AMD, for finally making something competitive to light a fire under Intel's butt. Hopefully AMD has something up their sleeve to keep this going for a while longer.
 
The lack of a fixed roadmap (that I could see) for node size and architecture was really telling in how far Intel has drifted from efficiency. Throwing a dart and going wherever it lands doesn't scream market leader to me.
I thought the same thing at first. There are a lot of changes happening behind the scenes right now, and I think there are a few more bumps in the road until we see a public roadmap with that detail. But now that architecture is no longer tied directly to node, I am wondering just how public Intel will be with long-term planning.
 
Buy AMD..... lol Otherwise Intel can rest on their laurels for another decade.... 5% IPC gain/frequency gain per new product was awful.
 
Buy AMD..... lol Otherwise Intel can rest on their laurels for another decade.... 5% IPC gain/frequency gain per new product was awful.

The problem is, or was, Intel management; I'm sure folks get this by now. They have now hired enough people to get going again, it just takes a while for them to get back on track.
Intel did not only screw us over on IPC, but also by sitting on 4c8t: good multi-core software scaling is hard to find now. If the ecosystem had had more cores back then, we would be seeing far better benefits in software today...
 
It seems to me that Intel is getting hit from all sides, and they're rushing around like headless chickens trying to figure out how to defend their business on four fronts. They're like the Juggernaut, and just weren't expecting to have to go in so many directions at once, even though almost everyone else saw it coming. Anyone here not see ARM CPUs taking a bite out of Intel back in the early smartphone days, especially after Intel rolled out its mobile processors? Anyone here not think that AMD would eventually catch up if they got a chance? How about server CPUs: is it much of a surprise that Amazon is doing this? It wouldn't surprise me that they're changing things up internally; I think they need to if they want to stop people taking bites out of their income. It's a good thing, as it might help jumpstart newer and better things, and Intel has a lot of smart people who can do it.
 
Jim Keller has it made, doesn't he? Gets paid a king's ransom to design Ryzen and put AMD back on track, then gets hired by Intel to kinda do the same thing.

Not begrudging him, dude is basically a genius and when you're 1 of only 2 or 3 people in the world that can deliver a service, you get to name your price.
 
oh, GAWD, architecture porn....BUFFERS!!!! GV MEH MOAR BUFFERS!!!!
 
I have some humble pie of my own to eat. In the other Intel thread (about 10nm) I pointed out that I didn't think Intel would jump on the chiplet bandwagon (as Kyle said, 3D or not, it is still very chiplet-like in nature). So... there it is. I was wrong and am surprised, but pleasantly so :)

That's not all though... we know I'm not a fan of Intel as a company, but I do still recognize their superiority in CPU horsepower. With that 'bias' in mind, this is a great moment because it shows that my love for AMD is not blind fanboyism... because honestly, this got me excited for future Intel products!! This emotion catches me off guard waaayy more than Intel deciding to dabble in chiplets.

Next year will surely be interesting for Intel and AMD fans! :)

Thanks Kyle :D
 
Buy AMD..... lol Otherwise Intel can rest on their laurels for another decade.... 5% IPC gain/frequency gain per new product was awful.
This. I love technology moving forward, but the truth is that when Intel or ngreedia succeed, we lose bang for buck.
 
Please stay on topic and go to PM if you need to correct each other on the minutiae. Continuing the bickering in my review thread will get you a vacation. This is your only warning.
 
Yes. What Intel failed to address is that node shrinks with smaller-TDP packages are king for its revenue via massive volume on the enterprise side, since they reduce overall operating costs for data centers and large personal computing asset sites.
Last quarter earnings beg to differ.
 
I have some humble pie of my own to eat. In the other Intel thread (about 10nm) I pointed out that I didn't think Intel would jump on the chiplet bandwagon (as Kyle said, 3D or not, it is still very chiplet-like in nature). So... there it is. I was wrong and am surprised, but pleasantly so :)

That's not all though... we know I'm not a fan of Intel as a company, but I do still recognize their superiority in CPU horsepower. With that 'bias' in mind, this is a great moment because it shows that my love for AMD is not blind fanboyism... because honestly, this got me excited for future Intel products!! This emotion catches me off guard waaayy more than Intel deciding to dabble in chiplets.

Next year will surely be interesting for Intel and AMD fans! :)

Thanks Kyle :D

Me too. I think it's just the general feeling of seeing something new from Intel rather than a rehash of Skylake.
 
It's a good step that they now have 2 ports (1 and 5) that can shuffle. I hope compilers finally default to permute rather than insert. Leave the past behind.
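For anyone wondering what that permute-vs-insert choice means, here's a toy C sketch of the semantics. The function names are mine and plain C stands in for the SIMD instructions; the instruction names in the comments are the x86 ones:

```c
#include <stdint.h>

/* Semantics of a single 4-lane permute (what x86 PSHUFD / VPERMILPS do):
 * every output lane picks any input lane in ONE operation. Per the post
 * above, this kind of shuffle can now issue on port 1 or port 5. */
static void permute4(const uint32_t in[4], const uint8_t sel[4],
                     uint32_t out[4]) {
    for (int i = 0; i < 4; i++)
        out[i] = in[sel[i] & 3];   /* each lane selected independently */
}

/* The insert-based alternative (PINSRD and friends): the vector is
 * built one lane at a time, so every step depends on the previous
 * result, forming a serial dependency chain. */
static void insert_lane(uint32_t vec[4], int lane, uint32_t value) {
    vec[lane & 3] = value;
}
```

Reversing a vector, for example, is one `permute4` with `sel = {3,2,1,0}` but four dependent `insert_lane` steps, which is roughly why you'd want the compiler to prefer the permute.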
 
My translation:
Intel = in panic mode
No, a bureaucracy never panics. Intel is just going into "get off your butt and earn your paycheck" mode.

Intel's been resting on the twin cheeks of volume and market share for over a decade. But now cell phones have created foundry players with similar revenue streams, who can and have made the investments in fab technology that Intel used to use to stay ahead of the game. Then AMD put out a good architecture on a good foundry process, and pulled the chair out from under Intel.

So sitting back, tweaking a 2-decade old architecture and porting it to new processes doesn't cut it anymore.
And the big risk is that the culture that took over Intel in the early 2000s made rapid innovation impossible in the face of entrenched bureaucratic interests.

Hopefully, that culture will change. We'll see.
 
Intel did not only screw us over on IPC, but also by sitting on 4c8t: good multi-core software scaling is hard to find now. If the ecosystem had had more cores back then, we would be seeing far better benefits in software today...
I think Intel did this for strategic reasons. With such a lead in single-core performance, a migration to many cores would only allow their competitors to look better. Yes, that didn't help software evolve at the rate it should have and it stagnated the industry as a whole, but it helped Intel maintain its stronghold. Even the weak Bulldozer would have looked so much better if software had been more evolved at the time of its release.
 
Hhmm, Foveros, 3D-stacked CPU? Interesting indeed. Also yet more info that Intel will be busting out chiplet designs. Yeah, 2019 will be a very interesting year. Can't wait to see what happens with Zen 2. CPU wars are really gonna heat up next year.
 
God, I hope AMD can keep up with Intel dumping money on R&D again. If Intel hits the R&D mother lode early and we sit through another decade of bleh, I'll scream.
 
Is one API actually an efficient approach or is this some nonsensical thing higher mgmt thought of?
 
While it sounds great as to getting the best interconnected, densest/fastest architecture possible crammed onto a die using this new 3D stacking approach... I'm a bit concerned as to how they are going to effectively cool that same die. Seems that this is going to create some interesting problems with getting heat out of that tightly stacked space... which AMD's more traditional 2D chiplet approach won't have.

I'm sure the eggheads at Intel wouldn't be taking this path unless they already have a workable plan thought through. Definitely going to hold out for 10nm Sunny Cove now - hopefully arriving by early Fall 2019. The 9900K is going to have a rather short time in the spotlight it seems.
 
I have noticed that not many people are talking about the active interposer. That's a pretty fucking cool idea TBH. They said they basically put the entire chipset into the interposer, so it's not just an interposer but it's the chipset too.
 
Maybe I am missing something, but did Intel finally get 10nm working (reasonable yield)?

You would think Intel would spend half the presentation saying how it finally got 10nm working.
 
Thanks for the article/review Kyle.

Exciting time for Intel and consumers. It's been a long time since we could see them working in so many different directions.
 
Maybe I am missing something, but did Intel finally get 10nm working (reasonable yield)?

You would think Intel would spend half the presentation saying how it finally got 10nm working.
They have 7nm with EUV on plan for more than a year out; 10nm they're expecting to be out in 2019.
 
Maybe I am missing something, but did Intel finally get 10nm working (reasonable yield)?

You would think Intel would spend half the presentation saying how it finally got 10nm working.

Based on what was said in the article I'm guessing that the 10nm process is going to be used for only certain things. Basically, the process is crap compared to what it was supposed to do and instead of tossing any more money at it to get it working like it was supposed to they are only going to use the process for the few things which work on it with decent yields; at least at this point. All of that wasn't stated outright but they did say that what works on 10nm will be on 10nm and everything else won't be 10nm.
 
Gonna get my crystal ball out.

It’ll be 5-8% faster, require a new motherboard because of a couple of needless pin changes (and so double the effective cost), and it will make a negligible difference in gaming performance.
 
The question about Spectre/Meltdown would have been the first one I asked as well. Disappointing that a firm answer was not provided. And what about all the new Spectre and Meltdown bugs...
 
The question about Spectre/Meltdown would have been the first one I asked as well. Disappointing that a firm answer was not provided. And what about all the new Spectre and Meltdown bugs...

The old stuff is v2, the new stuff is v3; that was mentioned. The software fixes are for v1. So it sounds like 10nm should have the v2 fixes while 7nm should have the v3 fixes, or at least that's what it sounds like with the limited information that was given.
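For reference, v1 (bounds check bypass) is the variant that stays a software fix. A minimal sketch of the usual mitigation, branchless index masking in the style of the Linux kernel's array_index_nospec(); the function and array names here are mine, not from the article:

```c
#include <stddef.h>
#include <stdint.h>

static uint8_t table[256];

/* Returns idx if idx < size, else 0, WITHOUT a conditional branch,
 * so a mispredicting CPU cannot speculatively use an out-of-bounds
 * index. Mirrors Linux's array_index_mask_nospec(); assumes the
 * usual arithmetic right shift on signed values. */
static size_t index_nospec(size_t idx, size_t size) {
    size_t mask = (size_t)(~(intptr_t)(idx | (size - 1 - idx))
                           >> (sizeof(size_t) * 8 - 1));
    return idx & mask;   /* all-ones mask in bounds, all-zeros otherwise */
}

static uint8_t read_checked(size_t idx) {
    if (idx < sizeof(table))            /* architectural bounds check */
        return table[index_nospec(idx, sizeof(table))];
    return 0;
}
```

The vulnerable pattern is the same code with a plain `table[idx]` after the `if`: the CPU may speculate past the check and leak `table[idx]` through the cache. The mask closes that window without the cost of a full fence.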
 
Based on what was said in the article I'm guessing that the 10nm process is going to be used for only certain things. Basically, the process is crap compared to what it was supposed to do and instead of tossing any more money at it to get it working like it was supposed to they are only going to use the process for the few things which work on it with decent yields; at least at this point. All of that wasn't stated outright but they did say that what works on 10nm will be on 10nm and everything else won't be 10nm.

As stated at the Architecture Day, 10nm will be used for everything: mobile, desktop, server, dGPU, and custom foundry customers.
 
Is Intel using the fixes Google came up with or have they come up with something not-stupid-broken in-house?
 
As they say, the proof will be in the pudding. Intel needs to be panicked, otherwise it will be more of the same. They need to know that things have to change and embrace the challenge.
Thus far I don't see much changing. They have become so complacent that they may have no will left to produce compelling products or leadership to make a change. I've always gone with Intel and way back in the day I used some AMD, but I put my money where the most bang for the buck is and I'm loving that AMD keeps marching forward. It's almost like Intel has forgotten how to respond to market pressures and therefore we see these types of dog and pony shows.
Let's not forget that Amazon is also looking to replace Intel with their AWS offerings. That is pretty big news in itself.
 
Is Intel using the fixes Google came up with or have they come up with something not-stupid-broken in-house?

They have to come up with their own hardware fixes. Google cannot show Intel how to fix its IP. They may have provided some guidance in the firmware space, but I doubt Google has staff with enough knowledge of Intel's architecture to provide fixes at the hardware level.
 
They have to come up with their own hardware fixes. Google cannot show Intel how to fix its IP. They may have provided some guidance in the firmware space, but I doubt Google has staff with enough knowledge of Intel's architecture to provide fixes at the hardware level.
Retpoline.
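For anyone who hasn't run into it: retpoline is the open software mitigation Google published for variant 2 (branch target injection), and GCC and Clang picked it up as compiler flags, so no Intel-internal knowledge was needed. A rough sketch of the idea; the canonical thunk is in the comment, and the C below is just an ordinary indirect call that those flags would rewrite:

```c
/* Retpoline, as published by Google: an indirect branch such as
 *     jmp *%rax
 * is rewritten by the compiler (gcc -mindirect-branch=thunk,
 * clang -mretpoline) into a return trampoline:
 *
 *       call set_up_target
 *   capture_spec:
 *       pause                ; speculation is trapped here,
 *       lfence               ; spinning harmlessly
 *       jmp capture_spec
 *   set_up_target:
 *       mov %rax, (%rsp)     ; overwrite the return address
 *       ret                  ; with the real branch target
 *
 * The RET is predicted from the return stack buffer, which an
 * attacker cannot poison the way they can the indirect branch
 * predictor. */

static int double_it(int x) { return 2 * x; }

/* An ordinary indirect call; built with the flags above, the
 * compiler emits the thunk for it instead of a raw call through
 * the register. */
static int call_indirect(int (*fn)(int), int arg) {
    return fn(arg);
}
```

The hardware fixes discussed at the event would make this detour unnecessary, but until then this is how the software side plugs v2.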
 