Apple ARM-Based MacBooks and iMacs to come in 2021

What about "bigger" Navi, aka Navi 31?


I haven't got a clue. But AMD has been known to increment their high-end product number by 10 when they give it a die shrink, so maybe they have a 5nm shrink already in the works?

That still means nothing newer than RDNA2 will grace the platform.
 
Well, now we have it from the horse's mouth.
So I was right, and Big Navi will likely be the last major third-party GPU released on Mac.

I wouldn't take that slide to say Apple ARM Macs will never house a third-party GPU again. Apple tends to be very short on details of future products; this slide may just indicate the status of the first-generation ARM Macs releasing this year, which are going to be more mainstream and will skip discrete GPUs. But Power Macs use big GPUs a lot for compute, and Power Macs sell in minuscule numbers.

Is Apple going to build a huge discrete GPU just for that tiny market? Or are Power Macs going to just have much weaker GPU compute?
 
That still means nothing newer than RDNA2 will grace the platform.
Given that AMD typically uses a GPU uArch for... four or five years?... it's quite likely that Apple will have moved on completely before AMD has something new that they'd even need to consider supporting.
 
But Power Macs use big GPUs a lot for compute, and Power Macs sell in minuscule numbers.

Is Apple going to build a huge discrete GPU just for that tiny market? Or are Power Macs going to just have much weaker GPU compute?
There's very little reason that Apple will need to depend on another company for GPU compute if their own uArch is solid. Quite likely they'd want to go their own way regardless, as GPUs from AMD and Nvidia are architected for a broader set of use cases than perhaps Apple intends to target. For example, Apple could go heavy on dedicated logic for their content-creation customers, perhaps making it extensible to some degree for newer codecs or a wider variety of formats, etc., bringing more flexibility for those workloads at the expense of others.
 
There's very little reason that Apple will need to depend on another company for GPU compute if their own uArch is solid. Quite likely they'd want to go their own way regardless, as GPUs from AMD and Nvidia are architected for a broader set of use cases than perhaps Apple intends to target. For example, Apple could go heavy on dedicated logic for their content-creation customers, perhaps making it extensible to some degree for newer codecs or a wider variety of formats, etc., bringing more flexibility for those workloads at the expense of others.


It's not a little reason, it's a big reason. ;)

Unfavorable economics of doing a VERY low volume, big chip.

It could happen. I just wouldn't say that slide from the WWDC presentation is some kind of proof that it's a done deal.
 
The major question is gaming and how important that will be on ARM Mac.
A lot of people suspect that one of the reasons to move to ARM, amongst other strategic reasons, is to get more game development and more games.
Obviously they have full access to every iOS title right out the gate. And it's obvious they want to extend Apple Arcade to the desktop. But for future game development, especially cross platform development, they might care to have better cross compatibility between ARM and future PC graphics card architectures.
Apple has, as an example, had ray-tracing development in macOS through an SDK since 2018, and VR support around the same time. If they want to continue in any real capacity on either or both of those, they'll have to either make a GPU themselves that can compete in that space or buy one.
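For reference, the 2018 SDK mentioned here is presumably the ray intersector that shipped in Metal Performance Shaders with macOS Mojave. A minimal sketch of its shape (the API names are real MPS calls; the geometry setup is elided), showing that it runs in compute on any Metal GPU, no dedicated RT hardware required:

```swift
import Metal
import MetalPerformanceShaders

// Sketch of the 2018-era macOS ray-tracing SDK: MPSRayIntersector
// traces rays against a triangle BVH on any Metal-capable GPU.
let device = MTLCreateSystemDefaultDevice()!

let intersector = MPSRayIntersector(device: device)
intersector.rayDataType = .originMinDistanceDirectionMaxDistance
intersector.intersectionDataType = .distancePrimitiveIndexCoordinates

// Scene geometry goes into an acceleration structure; vertexBuffer and
// triangleCount would be filled from real mesh data before rebuild().
let accelerationStructure = MPSTriangleAccelerationStructure(device: device)
// accelerationStructure.vertexBuffer = ...
// accelerationStructure.triangleCount = ...
// accelerationStructure.rebuild()

// Intersection is then encoded onto an ordinary command buffer via
// intersector.encodeIntersection(commandBuffer:intersectionType:...),
// so the same code path works on AMD and Apple GPUs alike.
```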

To be clear: I'm not saying I know their path forward. I'm merely pointing out that there are a lot of moving parts they have to consider, both in terms of their own GPU development (precisely what they're trying to target and do) and how that relates to buying third-party GPUs from vendors like AMD.
I would figure that Apple would continue to want to play nice with AMD, considering that AMD has been more than accommodating for years in giving Apple custom hardware, such as the new GPUs in the 16" MBP and the custom Vega IIs. And their partnership seems far less constrained than with Intel for CPU development.
They can't hope to keep pros at the top end though unless they have something that can compete at the top end. So whether that's AMD or not, Apple has their work cut out for them.
 
It's not a little reason, it's a big reason. ;)

Unfavorable economics of doing a VERY low volume, big chip.

It could happen. I just wouldn't say that slide from the WWDC presentation is some kind of proof that it's a done deal.
Thing is, Apple could just build a uniform 'compute complex' made up of a processor and a stack or two of HBM, then hook those up to a hub (which could just be a PCIe hub itself). Throw in as many complexes as they like; it's no different than using multiple Tesla or FirePro cards for compute in parallel.

And if the compute complex is reasonably sized, they could toss one or two into a MacBook, a few more into iMacs, and perhaps dozens into Mac Pros.
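For what it's worth, the software side of that model already exists on macOS: every GPU shows up as a peer MTLDevice, and fanning compute out across them would look the same whether the devices are discrete cards or hypothetical 'compute complexes'. A rough sketch:

```swift
import Metal

// Enumerate every GPU macOS can see; a hypothetical "compute complex"
// would presumably appear in this same list.
let devices = MTLCopyAllDevices()

for device in devices {
    // hasUnifiedMemory distinguishes an on-package part from a
    // discrete board hanging off PCIe.
    print("\(device.name): unified=\(device.hasUnifiedMemory), " +
          "maxBuffer=\(device.maxBufferLength / (1 << 20)) MB")
}

// One command queue per device; independent command buffers can then
// be committed in parallel, exactly like multi-card Tesla/FirePro
// compute setups today.
let queues = devices.compactMap { $0.makeCommandQueue() }
```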
 
Sure, all kinds of things could happen, and I don't feel strongly one way or another about it, but I see no real evidence yet. The more interesting questions have always revolved around what an ARM Mac Pro looks like. Both the GPU and CPU questions loom large there.

If they are ditching third-party GPUs completely, then ARM Macs are bad news for both Intel (CPU) and AMD (GPU).
 
But for future game development, especially cross platform development, they might care to have better cross compatibility between ARM and future PC graphics card architectures.
I think it's pretty clear that Apple has the skill to build gaming-capable GPUs for consumer devices. I'd assume that they'd want to continue to evolve that IP for most of their lineup, perhaps as the 'base' of whatever HPC solution they wind up deploying for heavier work.
Apple has, as an example, had ray-tracing development in macOS through an SDK since 2018, and VR support around the same time.
VR is still trying to get out of its own way; this is perhaps a suitable target for Apple given their propensity for producing higher-end computer appliances, but the one basic challenge for them is that all of the 'pieces' needed to make VR truly seamless are still fairly immature. Quite a bit of it is running at the edge of what is practicable for consumer electronics.

And of course Apple has the challenge of courting developers. I have no doubt that they'd be able to make real contributions to VR, I just wonder if there's an ROI that they'd find attractive enough to actually go after themselves.

Raytracing is an entirely different story. Nvidia has done quite a bit of the initial heavy lifting, but I think that's going to be a tough nut to crack for Apple; it simply requires significant amounts of dedicated hardware that's then not particularly useful for anything else.

I would figure that Apple would continue to want to play nice with AMD, considering that AMD has been more than accommodating for years in giving Apple custom hardware, such as the new GPUs in the 16" MBP and the custom Vega IIs. And their partnership seems far less constrained than with Intel for CPU development.
AMD has had to be 'flexible' for their few dedicated customers given their historical technology deficit; that's still the case today, really, for their GPUs, even though they're in much better financial and market positions than when they were simply the lowest bidder for the two major consoles. The problem that Apple will likely face is that, like Intel, AMD is unlikely to want to produce a proprietary solution for Apple and then not be able to use that technology to some extent elsewhere. I assume that there's still some flexibility available, but it'll likely be constrained more to a 'parts bin' than perhaps Apple is willing to continue to rely upon.
 
Sure, all kinds of things could happen, and I don't feel strongly one way or another about it, but I see no real evidence yet. The more interesting questions have always revolved around what an ARM Mac Pro looks like. Both the GPU and CPU questions loom large there.
Well, my assumption is that they want to get away from producing slightly-rejiggered versions of what Dell and HP already produce with the same basic components and overall performance envelopes. They'll likely want to produce something that's more in line with what their customers do, and less of a 'generic' system. Really, they're going to have to do that because the performance they have on offer today stops being worthwhile when it can be parallelized on a cluster. You don't buy Apple for that.

If they are ditching third-party GPUs completely, then ARM Macs are bad news for both Intel (CPU) and AMD (GPU).
Not really?

I think that HPC is going to start moving away from GPUs; in a sense, that's good for compute, because those markets will be served with products that are more efficient for their purpose, and better for gaming, because GPUs produced for gaming won't be set up for stupid levels of precision that are simply not needed for consumer workloads (including compute).

I think Apple is going to pick a different mix; they have no real market for generic compute, and they have no real market for high-end gaming. I expect something focused on image processing, supposing the Apple product in question has more compute power than is needed to run it as a terminal for either cloud-based apps or something running in a datacenter.
 
It’s rumors at this point. Both pages are citing the same slide and guessing. Not really definitive.

It is an educated guess.

One thing I have seen come up more and more in the Intel vs. AMD threads really applies to this whole move by Apple.

That is the fact that AMD CAN'T unseat Intel. Not because Intel can out-engineer everyone, but because AMD is reliant on TSMC, and TSMC doesn't have the capacity to support converting every Intel sale into an AMD sale.

Apple is going to be cutting into that TSMC capacity. They need to not be reliant on anyone they are about to piss off.
 
Well, now we have it from the horse's mouth.
So I was right, and Big Navi will likely be the last major third-party GPU released on Mac.

Can I get you folks a side of locked-in pie with that Mac Pro purchase?

Why use another company's hardware when you can do it better?
 
I wouldn't expect to see official OpenGL support on Apple's ARM GPUs. Apple is going to continue pushing every developer to Metal.

Frankly, it's for the best. Metal is probably the best of the 3D APIs right now. Yeah, we all love Vulkan, as it should technically be cross-platform. But Apple seems to be dead set against supporting it.

Will this hurt gaming on Macs... probably, possibly... and perhaps it doesn't freaking matter. Apple is going to expand the mobile gaming industry... and no matter how we PC overlords feel about it, the game industry has been going mobile-first for years now, and that isn't going to stop. As long as people can play their iPad-type games on the new iMacs, they will be happy.

Where it gets interesting is professional software. Something tells me Apple is going to (will have to) dangle some money to ensure all the big names support Metal. To be honest, considering all the interesting bits that will be baked into their silicon, developers are going to do that anyway. Creative software that uses the GPU under Apple is going to have to use Metal to expose all the AI/decode-assist hardware. All the majors will have Metal-enabled software to show off, if not when the MacBooks launch then for sure when the Mac Pros hit.
 
I wouldn't expect to see official OpenGL support on Apple's ARM GPUs. Apple is going to continue pushing every developer to Metal.

Frankly, it's for the best. Metal is probably the best of the 3D APIs right now. Yeah, we all love Vulkan, as it should technically be cross-platform. But Apple seems to be dead set against supporting it.
I'm not sure how 'best' could be quantified, and I'm certainly not going to assert that you're wrong, either; I just wonder if it really matters if it only runs on macOS.
Will this hurt gaming on Macs... probably, possibly... and perhaps it doesn't freaking matter. Apple is going to expand the mobile gaming industry... and no matter how we PC overlords feel about it, the game industry has been going mobile-first for years now, and that isn't going to stop. As long as people can play their iPad-type games on the new iMacs, they will be happy.
I agree that this does depend on how one defines 'gaming'. Given that most of the gaming that happens on an Apple product is in the form of a 'mobile' game, then I don't think it hurts one bit. For higher-end stuff, I think the answer is something along the lines of 'no more than it hurts to be on Apple in the first place'. OS X is already a unique development target; I expect most developer frameworks (e.g., whatever Adobe uses internally, stuff like Unreal Engine) to simply add support for what they need. With Vulkan and DX12 being commonly developed for today, supporting another similar API isn't going to rate very high in terms of effort.
Where it gets interesting is professional software. Something tells me Apple is going to (will have to) dangle some money to ensure all the big names support Metal. To be honest, considering all the interesting bits that will be baked into their silicon, developers are going to do that anyway. Creative software that uses the GPU under Apple is going to have to use Metal to expose all the AI/decode-assist hardware. All the majors will have Metal-enabled software to show off, if not when the MacBooks launch then for sure when the Mac Pros hit.
Realistically this is all already in place. Everyone has been developing for iOS for years, and that required implementing support for the APIs that hook up to Apple's logic. Otherwise, it meant trying to run compute-heavy code on the ARM cores, and well, that's a no-go from a productivity standpoint and every other standpoint beyond a proof of concept for whatever task is being accomplished.

Scale that up to a laptop, a desktop, a workstation? Same difference, Apple is going to have to provide hardware more suited to the task than their ARM cores, and developers are going to have to make sure that they implement the APIs necessary to utilize that hardware.
 
Scale that up to a laptop, a desktop, a workstation? Same difference, Apple is going to have to provide hardware more suited to the task than their ARM cores, and developers are going to have to make sure that they implement the APIs necessary to utilize that hardware.

I agree with you completely except for the last part, a bit... ARM cores are every bit as effective at general compute as an x86 core. No one was crazy enough to spend a few billion dollars developing an ARM-based desktop core before now, is all. It didn't make sense for AMD (although they sort of did, before Lisa said to focus on x86 for now), Intel, Samsung, HP, or anyone else to do it... as the software ecosystem didn't exist. Microsoft's half-assed RT experiment using Qualcomm was doomed to fail, as those really were just barely clock-bumped mobile cores.

Apple has decided they have expanded their mobile ecosystem with iPads enough at this point that one good two-year-long push will get them there. I know we haven't seen anything about the actual CPUs they are getting ready for these, but I would eat my shorts with a knife and fork if they're just clock-bumped iPad chips, like something Qualcomm would whip up for a bargain-basement price. They have grabbed a bunch of extremely talented designers over the last two years... folks who are not mobile chip designers. It seems clear now they have been working on an actual desktop-class ARM chip. We have NEVER seen one of those at this point. We have seen a handful of ARM supercomputer CPU designs... one of which currently powers the fastest (at least for now) HPC machine in the world. We were never going to get 48-core server-class ARM chips in our desktops... I, however, have a feeling Apple may just have a 16-core (or more) desktop-class chip to talk about sooner rather than later. When they get ready with the slimmed-down MacBook version, I am sure they will start talking about the coming iMac desktop parts just to get people hyped about switching.
 
I'm not sure how 'best' could be quantified, and I'm certainly not going to assert that you're wrong, either; I just wonder if it really matters if it only runs on macOS.

I agree that this does depend on how one defines 'gaming'. Given that most of the gaming that happens on an Apple product is in the form of a 'mobile' game, then I don't think it hurts one bit. For higher-end stuff, I think the answer is something along the lines of 'no more than it hurts to be on Apple in the first place'. OS X is already a unique development target; I expect most developer frameworks (e.g., whatever Adobe uses internally, stuff like Unreal Engine) to simply add support for what they need. With Vulkan and DX12 being commonly developed for today, supporting another similar API isn't going to rate very high in terms of effort.

Realistically this is all already in place. Everyone has been developing for iOS for years, and that required implementing support for the APIs that hook up to Apple's logic. Otherwise, it meant trying to run compute-heavy code on the ARM cores, and well, that's a no-go from a productivity standpoint and every other standpoint beyond a proof of concept for whatever task is being accomplished.

Scale that up to a laptop, a desktop, a workstation? Same difference, Apple is going to have to provide hardware more suited to the task than their ARM cores, and developers are going to have to make sure that they implement the APIs necessary to utilize that hardware.

The Metal API does that work for them; all developers have to do is leverage Metal.
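As a sanity check on "just leverage Metal": the host-side code really is hardware-agnostic. A minimal compute-dispatch sketch, where the kernel name "scale" and its .metal source are hypothetical and assumed to be compiled into the app's default library:

```swift
import Metal

// Minimal Metal compute dispatch; identical host code on AMD, Intel,
// or Apple GPUs. "scale" is a hypothetical kernel assumed to live in
// the app bundle's default Metal library.
let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!
let library = device.makeDefaultLibrary()!
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "scale")!)

var input = [Float](repeating: 1.0, count: 1024)
let buffer = device.makeBuffer(bytes: &input,
                               length: input.count * MemoryLayout<Float>.stride,
                               options: .storageModeShared)!

let commands = queue.makeCommandBuffer()!
let encoder = commands.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
// dispatchThreads lets Metal split the grid itself (macOS GPUs support
// non-uniform threadgroup sizes).
encoder.dispatchThreads(MTLSize(width: input.count, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
encoder.endEncoding()
commands.commit()
commands.waitUntilCompleted()
```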
 
The Metal API does that work for them; all developers have to do is leverage Metal.
To be fair, Apple has two videos out that basically amount to: how to use Metal correctly. This is because AMD/Intel/Nvidia Metal and Apple Silicon Metal have differences in utilization, performance, and quality due to the fundamentally(?) different architectures. Apple also claims these factors can be affected by how the Metal API is utilized. That being said, it does appear (referencing their videos) that Apple has done quite a bit of work in that respect, but developers still have to watch out.

Of course, Apple also lets developers know that Metal, as it has been used in iPhone and iPad for years, already provides some familiarity with the Apple Silicon approach to Metal.
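One concrete example of the kind of difference those videos get at (a sketch, not Apple's exact guidance): buffer storage modes. On unified-memory Apple silicon, shared storage avoids copies entirely; on a discrete AMD/Intel/Nvidia part, managed storage with explicit synchronization is usually the right call.

```swift
import Metal

// Pick a buffer storage mode based on whether CPU and GPU share
// memory; one of the Apple-silicon-vs-discrete differences that
// developers have to watch out for.
let device = MTLCreateSystemDefaultDevice()!

let options: MTLResourceOptions
if device.hasUnifiedMemory {
    // Apple silicon: CPU and GPU share one memory pool; no copies needed.
    options = .storageModeShared
} else {
    // Discrete GPU: mirrored CPU/GPU copies, synchronized explicitly
    // (e.g., MTLBuffer.didModifyRange after CPU writes).
    options = .storageModeManaged
}

let buffer = device.makeBuffer(length: 4096, options: options)!
```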

I agree with you completely except for the last part, a bit... ARM cores are every bit as effective at general compute as an x86 core. No one was crazy enough to spend a few billion dollars developing an ARM-based desktop core before now, is all. It didn't make sense for AMD (although they sort of did, before Lisa said to focus on x86 for now), Intel, Samsung, HP, or anyone else to do it... as the software ecosystem didn't exist. Microsoft's half-assed RT experiment using Qualcomm was doomed to fail, as those really were just barely clock-bumped mobile cores.

Apple has decided they have expanded their mobile ecosystem with iPads enough at this point that one good two-year-long push will get them there. I know we haven't seen anything about the actual CPUs they are getting ready for these, but I would eat my shorts with a knife and fork if they're just clock-bumped iPad chips, like something Qualcomm would whip up for a bargain-basement price. They have grabbed a bunch of extremely talented designers over the last two years... folks who are not mobile chip designers. It seems clear now they have been working on an actual desktop-class ARM chip. We have NEVER seen one of those at this point. We have seen a handful of ARM supercomputer CPU designs... one of which currently powers the fastest (at least for now) HPC machine in the world. We were never going to get 48-core server-class ARM chips in our desktops... I, however, have a feeling Apple may just have a 16-core (or more) desktop-class chip to talk about sooner rather than later. When they get ready with the slimmed-down MacBook version, I am sure they will start talking about the coming iMac desktop parts just to get people hyped about switching.
Another to be fair...

To be fair, Microsoft did try back in the early days of the Surface tablet, when they first provided Windows RT using an Nvidia Tegra SoC. Of course, those are basically abandonware stuck on the RT version of Windows 8 nowadays... though those were 32-bit only. MSFT abandoned the RT push right as Nvidia released the first non-Apple 64-bit ARM implementation.

I guess Qualcomm didn't want to share the limelight, or nobody else wanted in on the most recent Windows-on-ARM attempt, but having just one vendor (this time pushing $1000+ hardware instead of $400 hardware) does seem a bit more limiting. I'm more surprised Qualcomm took so long with the Snapdragon 7c hardware; it seemed like it would better expand their Windows on ARM (WoA) opportunities if the entry-level Surface was also running their chips, instead of a rather anemic Intel 14nm dual-core (again).
 