Apple ARM-Based MacBooks and iMacs to Come in 2021

I guess there goes the Hackintosh community. I do like macOS, but I don't like Apple's hardware: old tech at top-dollar prices.

And there go Bootcamp, Parallels, and Wine users running Windows software on Macs.

IMO, this makes Macs worse by removing this very useful option.

Needless mess, IMO; they could go dual-source with AMD's excellent new 7nm APUs/CPUs. That would be a much better option than the mess of swapping architectures.
 
IMO, this makes Macs worse by removing this very useful option.
As 'useful' as this option is, it doesn't make Apple money on any useful scale. They'd rather be selling through their app store.

Needless mess, IMO; they could go dual-source with AMD's excellent new 7nm APUs/CPUs. That would be a much better option than the mess of swapping architectures.
They're likely to go AMD for CPUs in 'traditional' systems, but ARM for now is most likely going to be used to provide a passable desktop experience at a lower cost for iMacs, and at lower cost and longer battery life for Macbooks.
 
It will be a tablet mainboard hooked up to a bus for expansion hardware like a hard drive and input devices running a windows-based iOS UI. (Not Microsoft Windows, just windows like every other desktop/laptop OS UI uses.)
...you mean a GUI?
You realize that ARM processors aren't limited to mobile and embedded devices, right?

There are high-end workstations and servers with ARM processors with PCI-E, Infiniband, 10GBase-T NICs, etc., just FYI.
So, why would it use a "tablet" mainboard?

What will be interesting, hardware-wise, is how Apple transitions to running apps on their actual laptop/desktop machines. Because they're going to have to have some shared capabilities. People will be upset if software they use on their iPad-based MacBooks and iMacs doesn't work on the higher-end machines.
They already did this with Rosetta in 2006 when migrating from PowerPC to x86.
I'm not sure why this is suddenly a massive hurdle in 2020...
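(A minimal Swift sketch, not Apple's actual transition tooling: most application code has no idea which ISA it runs on, which is why a recompile covers the bulk of a transition and only the rare architecture-specific path needs attention. The function and names below are purely illustrative.)

```swift
// Illustrative only: plain Swift compiles unchanged for x86_64 or arm64.
func checksum(_ data: [UInt8]) -> UInt32 {
    data.reduce(0 as UInt32) { ($0 &* 31) &+ UInt32($1) }  // overflow operators, ISA-agnostic
}

// Only code like this cares which architecture it was built for.
#if arch(arm64)
let isa = "arm64"
#elseif arch(x86_64)
let isa = "x86_64"
#else
let isa = "other"
#endif

print("Built for \(isa); checksum:", checksum([1, 2, 3]))
```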

I think the plan, based on what Apple's done so far, will be to use ARM co-processors in their x86 machines as a device within a device, and from there move on to merging iOS and MacOS.
That isn't what any of these articles are stating, though, and if Apple were going to do that, it would have happened 3-5 years ago.
Also, MacOS and iOS share the same kernel, resources, etc., and only vary on the front-end, so they are, and have been for over a decade, already "merged".
 
And there go Bootcamp, Parallels, and Wine users running Windows software on Macs.

IMO, this makes Macs worse by removing this very useful option.

Needless mess, IMO; they could go dual-source with AMD's excellent new 7nm APUs/CPUs. That would be a much better option than the mess of swapping architectures.
For the end-users and customers, I agree.
Apple isn't in the market for us, it is in the market for the almighty profit. ;)
 
And there go Bootcamp, Parallels, and Wine users running Windows software on Macs.

IMO, this makes Macs worse by removing this very useful option.

People don't get on any ecosystem with the intent of running emulation. If you're on a Mac it's because you want to be on macOS and run Mac apps. If you're on Windows it's because you want to run Windows apps. You pick the platform to do the work you need to do. This isn't rocket science. I don't complain that Windows doesn't run a single macOS app.

Needless mess, IMO; they could go dual-source with AMD's excellent new 7nm APUs/CPUs. That would be a much better option than the mess of swapping architectures.
We'll see.

Apple isn't in the market for us, it is in the market for the almighty profit. ;)
Tell me of one company that isn't. By definition and tax law, if you're not a "non-profit" organization, you're "for profit". You're arriving at very common knowledge for any Fortune 500 company.
 
I'll have to say that I'm still quite skeptical here. I get the laptop angle; running MacOS X on ARM with the desktop experience shouldn't be too difficult once Apple has the appropriate CPU parts available, but in general, these are likely to succeed mostly in terms of providing said desktop experience as opposed to a tablet experience -- not as being higher performance computers.

To me, the iMac seems to exist for those instances where either the computer is not expected to move, or where it is expected to provide higher performance than would be available from a Macbook. In the first case the iMac could be cheaper, so I get that angle for switching to ARM, but in the second, I expect ARM to fall far short.

Maybe the first case is enough for Apple to produce them?
I agree, and while I do think the performance will be there, we don't have any true and concrete numbers at this time outside of their mobile offerings.
What I see happening from this is that, with Apple moving to ARM across the board, it will signal and allow the rest of the market to follow suit.

Thus, this will open new doors for software optimization and for software and hardware development with an ARM-based focus, and will eventually gear everything away from x86-64.
This isn't going to happen overnight by any stretch, but by 2030, this might not be too far outside the realm of possibility.

It would make sense even for businesses if the heavy lifting, whatever that may be, is largely constrained to local or remote cloud resources, with the iMac simply being a responsive 'terminal' so to speak.
It's ironic you say that, since I thought this would have happened nearly a decade ago with mobile and desktop environments alike, much in the vein of VDI.
I think the reason this hasn't fully happened, at least yet, is the limited ISP capabilities and infrastructure in place (assuming consumer and not enterprise), which still leaves a large share of end-users requiring standalone performance, without cloud/3rd-party compute assistance, in order to function and operate.

It could eventually be a possibility, but again, it will probably be a ways down the road.
Given how taxed cellular networks are with the COVID-19 pandemic, sadly, I think we are seeing what happens when there is excessive strain on our current infrastructure, and what would happen if everything suddenly moved to cloud-based assistance.
 
Tell me of one company that isn't. By definition and tax law, if you're not a "non-profit" organization, you're "for profit". You're arriving at very common knowledge for any Fortune 500 company.
Hey now, the dark cyberpunk future is upon us, and it might be a wise decision to pick the megacorporation you want to side with, before they decide to pick you... :borg:
 
...you mean a GUI?

No, I mean a windows environment. An Apple Watch has a GUI. A Kindle has a GUI. It will be iOS made to emulate MacOS.

Apple wouldn't have done a thing differently 5 years ago. 5 years ago Intel was still the safe bet. How long do you think it takes these companies to turn around new hardware architectures? It takes 5-6 years of development to make a processor.

Core was a miracle in that regard; Intel had two separate processor development teams going at the same time.
 
I agree, and while I do think the performance will be there, we don't have any true and concrete numbers at this time outside of their mobile offerings.
What I see happening from this is that, with Apple moving to ARM across the board, it will signal and allow the rest of the market to follow suit.
It's absolutely possible for ARM or another ISA to replace x86 for "desktop" computing. Most users simply don't need the performance that higher-power devices provide.

However, as x86 development hasn't at all stagnated and is likely to continue to push into higher-performance envelopes, while ARM ISAs remain focused on more efficient envelopes, it's hard to imagine ARM outright replacing x86. Particularly if our everyday computing paradigm changes, i.e., demand for performance at most or all 'desktop' platform levels increases too.

Remember that there's no real reason to move away from x86. ARM largely succeeds at being more efficient because that's what it's targeted for; that's what sells. x86 succeeds at being higher-performance for the same reason, that's where it's targeted.

If the same effort went into either or both to target the other's current advantage, it's very likely that parity would be achieved.

It's ironic you say that, since I thought this would have happened nearly a decade ago with mobile and desktop environments alike, much in the vein of VDI.
The biggest issue is the software, after the internet. The software has to be designed around this 'interface here, workload there' paradigm, and that's simply extremely difficult to do well. It's far easier to just give users more performance, since that's also available.

Essentially, the business case for the necessary investment hasn't been there - yet.

Cloud computing, IoT, and so on are still in their infancy with respect to things like integration and security. We're getting there, and perhaps Apple is seeing the work that's been done by cloud-accelerated app developers (including Adobe, for example), as a milestone to begin pushing ARM as a desktop interface ISA for their average user.
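(A toy Swift sketch of that 'interface here, workload there' split, with a made-up endpoint and payload, just to show the shape of it: the local app stays a responsive front-end while a remote service does the heavy compute. It's the coordination, error handling, and offline cases around code like this that make it hard to do well.)

```swift
import Foundation

// Hypothetical job types; the endpoint and JSON shape are invented for illustration.
struct RenderRequest: Codable { let sceneID: String; let quality: Int }
struct RenderResult: Codable { let imageURL: String }

func submitRenderJob(_ request: RenderRequest,
                     completion: @escaping (Result<RenderResult, Error>) -> Void) {
    var urlRequest = URLRequest(url: URL(string: "https://render.example.com/jobs")!)
    urlRequest.httpMethod = "POST"
    urlRequest.setValue("application/json", forHTTPHeaderField: "Content-Type")
    urlRequest.httpBody = try? JSONEncoder().encode(request)

    // The heavy lifting happens server-side; the client just waits for a small result.
    URLSession.shared.dataTask(with: urlRequest) { data, _, error in
        if let error = error { return completion(.failure(error)) }
        guard let data = data,
              let result = try? JSONDecoder().decode(RenderResult.self, from: data) else {
            return completion(.failure(URLError(.cannotParseResponse)))
        }
        completion(.success(result))
    }.resume()
}
```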
 
No, I mean a windows environment. An Apple Watch has a GUI. A Kindle has a GUI.
So, you mean a desktop GUI, then.

It will be iOS made to emulate MacOS.
Why would it emulate itself...?
Might want to re-read my post.

Apple wouldn't have done a thing differently 5 years ago. 5 years ago Intel was still the safe bet.
I said that because MacOS and iOS are effectively the same OS, so if Apple were going to add in an ARM co-processor for such functions, they would have already done it.
There is no need for Apple to add another layer, though, and especially not at this point.

How long do you think it takes these companies to turn around new hardware architectures? It takes 5-6 years of development to make a processor.
If Apple had wanted to move to ARM-based desktops and laptops back in 2015, they certainly could have, but the performance would not have been there yet, and their (proprietary) Metal API wasn't enforced across the board as the new standard, so it would have been a much steeper uphill battle for them to get this done.

Core was a miracle in that regard; Intel had two separate processor development teams going at the same time.
What does this have to do with anything we are talking about?
I'm not talking about 2006, I'm talking about 2020/2021.
 
Post 49 snip
I said it wouldn't happen overnight, and may take over a decade to get ARM to replace x86-64, assuming that happens at all.
I really hope you are right that the x86-64 ISA will continue to innovate and not stagnate, though I do still believe it will reach its true physical limits sooner rather than later.

As for the software development side of cloud-assist, that is true as well, agreed - we are getting there, but it will take time.
Hopefully by the time the software is ready and matured, the Internet infrastructure will be as well.
 
People don't get on any ecosystem with the intent of running emulation. If you're on a Mac it's because you want to be on macOS and run Mac apps. If you're on Windows it's because you want to run Windows apps. You pick the platform to do the work you need to do. This isn't rocket science. I don't complain that Windows doesn't run a single macOS app.

Windows is the dominant desktop ecosystem, so of course people don't care about missing Mac applications.

Macs OTOH have a much sparser software ecosystem, and people will often need to run applications that are Windows only. This was often a way to get Macs into Windows shops.

Parallels appears to have a thriving business, enabling Windows emulation on Macs, running for nearly as long as there have been Intel Macs.

There is obviously a significant market for Windows emulation on Macs, and losing that ability will undoubtedly cost Apple future Mac business from those users that NEED to run Windows applications.
 
Windows is the dominant desktop ecosystem, so of course people don't care about missing Mac applications.
I do. It's one of the reasons I'm not on Windows. And why people are on Macs in the first place. Being the dominant platform doesn't mean you don't want software from other platforms. That's ridiculous. That's like saying you don't want Nintendo or Xbox exclusives because Playstation is the dominant platform.

Macs OTOH have a much sparser software ecosystem, and people will often need to run applications that are Windows only. This was often a way to get Macs into Windows shops.
Not sure for whom. I think there are probably about zero Windows PC users that would ever switch to macOS or a Mac just because it's capable of emulating Windows while not being Windows.
Let's try an informal poll starting with you. Hey Snowdog, Macs can emulate Windows, want to buy one? No? Weird....

Parallels appears to have a thriving business, enabling Windows emulation on Macs, running for nearly as long as there have been Intel Macs.

There is obviously a significant market for Windows emulation on Macs, and losing that ability will undoubtedly cost Apple future Mac business from those users that NEED to run Windows applications.
The bottom line is if you "need" to run Windows applications you're running Windows on a PC. Your whole argument contradicts itself. Why are you on a Mac if Windows is the dominant system? Why are you on a Mac if all you want to do is run Windows software? It's a fallacy.
People emulate and run Bootcamp, sure, but you're grossly overstating how necessary it is. Generally speaking, doing either or both is a hassle. People get trained out of that habit pretty quickly unless it's absolutely necessary, because it's not a particularly pleasant loop to have to do.
 
I do. It's one of the reasons I'm not on Windows. And why people are on Macs in the first place. Being the dominant platform doesn't mean you don't want software from other platforms. That's ridiculous. That's like saying you don't want Nintendo or Xbox exclusives because Playstation is the dominant platform.

No, it's like saying, if you could run PS4 exclusives on the XB1, why wouldn't you want to?

Hey Snowdog, Macs can emulate Windows, want to buy one? No? Weird....

Sure I would, if there were a decent, affordable, expandable Mac (AKA xMac). I actually started looking at buying a Mac from the moment they started using x86 parts and it was obvious that you could run Windows software on them. I use Windows not because I love the OS, but because that is where the HW and SW are. If Apple built an affordable HW form factor that suited me, and with plentiful Windows emulation options, I would be in. The missing piece was the HW.

People emulate and run Bootcamp, sure, but you're grossly overstating how necessary it is. Generally speaking, doing either or both is a hassle. People get trained out of that habit pretty quickly unless it's absolutely necessary, because it's not a particularly pleasant loop to have to do.

If people quickly got trained out of it, Parallels would have gone out of business a while back, and there wouldn't be multiple options for dual-booting or running Windows applications on MacOS.
 
No, it's like saying, if you could run PS4 exclusives on the XB1, why wouldn't you want to?
And then what is your reasoning again for thinking people wouldn't want the reverse of that? Something something dominant platform?
But to the point: where this analogy breaks down is that it isn't necessary. There are plenty of software alternatives that most people on macOS use if a piece of software isn't available on both platforms; the analogy breaks down because of substitute goods. However, to my argument, the Mac substitutes are better than the Windows alternatives, and Windows itself is a poor substitute for macOS.

Sure I would, if there were a decent, affordable, expandable Mac (AKA xMac). I actually started looking at buying a Mac from the moment they started using x86 parts and it was obvious that you could run Windows software on them. I use Windows not because I love the OS, but because that is where the HW and SW are. If Apple built an affordable HW form factor that suited me, and with plentiful Windows emulation options, I would be in. The missing piece was the HW.
So never. And you're more or less proving my point. You are on the platform that does what you want to do. You're not on a platform that doesn't do what you want to do. Regardless of reason. Regardless of what is necessary. You're on the platform that does what you want. I don't see why you're having a hard time figuring out that it's the same for Mac users. We're on that platform because it does what we need it to do. Not because it is capable of emulating other platforms.
Also, why would you ever want to emulate things? It's a band-aid. Generally speaking people using VMware or Parallels are doing so because they have to do some sort of dev work, but emulation is certainly never ideal in comparison with running native. Emulation is always slower and takes more system resources. I mean, you're making it sound like literally people boot into macOS, open Parallels, and then do everything inside of Parallels. Which is why I'm telling you a) they don't, and b) that's a terrible way to do things. It's most certainly not fast enough to game on, I can tell you that. The penalty is way too high.

If people quickly got trained out of it, Parallels would have gone out of business a while back, and there wouldn't be multiple options for dual-booting or running Windows applications on MacOS.
I don't disagree that people buy and use the application. But I also think they use it far less than you think they do. People also buy full-blown Microsoft Office all the time but don't necessarily use Excel or Access all the time either. Or get the entire Adobe Suite but never run Adobe Illustrator or InDesign.
I have VMware Fusion and a Bootcamp installation. I use it exclusively to play games. There is zero software on Windows outside of games that I run. But I haven't bothered to boot into Windows for probably close to two months at this point. I finished Outer Worlds and that was that. If these companies would port games to macOS I wouldn't ever bother and even as it is it's incredibly rare for a game to be worth it. In other words, I would game a lot more if these titles were on macOS, because I don't like booting into Windows and dislike Windows.

Gaming is non-essential. I could lose my ability to Bootcamp and it wouldn't be a major loss. I would probably be more productive that way honestly.
 
And then what is your reasoning again for thinking people wouldn't want the reverse of that? Something something dominant platform?

Some would, but more would probably want the PS5 emulation because they have more/better exclusives. People tend to want Macs for MacOS (or Apple HW design), not SW that is exclusively on Mac.

I mean, you're making it sound like literally people boot into macOS, open Parallels, and then do everything inside of Parallels.

Where? I think it's quite the opposite really. What I am portraying is more like: "I would have a Mac, but I really want to run these three pieces of Windows SW", or "I need to run this piece of SW for work", or "I want to run a few old favorite games." (bootcamp in that case).

Just because you belittle the value of this, doesn't mean everyone sees it your way.
 
Some would, but more would probably want the PS5 emulation because they have more/better exclusives. People tend to want Macs for MacOS (or Apple HW design), not SW that is exclusively on Mac.
One of the great criticisms of Apple (which I disagree with, but nonetheless) is that they sell overpriced hardware. It makes zero sense to pay more to get less and then also not natively run the software that you need to get actual work done.

Where? I think it's quite the opposite really. What I am portraying is more like: "I would have a Mac, but I really want to run these three pieces of Windows SW", or "I need to run this piece of SW for work", or "I want to run a few old favorite games." (bootcamp in that case).

Just because you belittle the value of this, doesn't mean everyone sees it your way.
You're describing again why you're not on a Mac (because you need to run Windows software and play Windows games). You've already stated you're not on one now as the proposition doesn't make sense for you (hardware configurations not to your liking). So I'm on a Mac and telling you that it's not all it's cracked up to be. And you're not on a Mac telling me it's the best thing since sliced bread.
You can disagree with my "opinion", but we can also see who has actually voted with their dollars. However, I don't disagree that there are others that have differing opinions. I'm just telling you that far fewer people are emulating regularly than you think, and it's far less important to the user base.
 
Where? I think it's quite the opposite really. What I am portraying is more like: "I would have a Mac, but I really want to run these three pieces of Windows SW", or "I need to run this piece of SW for work", or "I want to run a few old favorite games." (bootcamp in that case).

Just because you belittle the value of this, doesn't mean everyone sees it your way.
Apparently Apple finds little to no value in this, since they are going to be migrating away from x86-64, so don't blame UnknownSouljer for pointing out the obvious, blame Apple for making an efficient business change.
 
Apparently Apple finds little to no value in this, since they are going to be migrating away from x86-64, so don't blame UnknownSouljer for pointing out the obvious, blame Apple for making an efficient business change.
The thing is, and I had this argument in another similar ARM-coming-to-Mac thread, there isn't even a guarantee that you'll lose virtualization support anyway. Granted, the software would have to be more complex to emulate x86-64 on ARM, but it's not as if it couldn't be done (this is a particular strong suit of VMware). Bootcamp functionality would be lost, however, unless Windows has an ARM version, but then that would also have to be worth running, which it isn't.
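(Not how VMware or Parallels actually do it; just a toy Swift interpreter for a made-up mini instruction set, to show why running a foreign ISA in software costs performance: every guest instruction pays a decode-and-dispatch overhead that native code never pays. Real products amortize this with dynamic binary translation, but the principle is the same.)

```swift
// A made-up "guest" instruction set, interpreted one instruction at a time.
enum GuestOp {
    case loadImmediate(reg: Int, value: Int)
    case add(dst: Int, src: Int)
    case halt
}

func run(_ program: [GuestOp]) -> [Int] {
    var registers = [Int](repeating: 0, count: 4)
    var pc = 0
    execution: while pc < program.count {
        switch program[pc] {                      // per-instruction dispatch = the overhead
        case .loadImmediate(let reg, let value):
            registers[reg] = value
        case .add(let dst, let src):
            registers[dst] += registers[src]
        case .halt:
            break execution
        }
        pc += 1
    }
    return registers
}

// r0 = 40, r1 = 2, r0 = r0 + r1
let registers = run([.loadImmediate(reg: 0, value: 40),
                     .loadImmediate(reg: 1, value: 2),
                     .add(dst: 0, src: 1),
                     .halt])
print(registers[0])  // 42
```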

Bootcamp and virtualization were a draw for Apple when Steve Jobs came back and they needed to attract people back to the Mac. This was after Apple had a long decline in the late 90s and early 00s. It was a way of saying: "hey, we can run all the software, try us again." But the point in doing that was just to open the door to get people onto their hardware. I would say that obviously most people who stayed with the platform found that they liked the platform and the ecosystem and ultimately Apple software (namely first-party software). And again, as substitutes for Windows software became more plentiful, Windows and virtualization became less useful and less necessary. I would argue that it's not necessary at all at this point, and it hasn't been for about 4-5 years at minimum, although I would argue longer, since about 2010 or so (with the specific exception being if you're a programmer).
 
Why would it emulate itself...?

They are not the same operating system.

What does this have to do with anything we are talking about?

We're talking about hardware development and CPU architecture. It takes about 5 years to go from an idea to working silicon. If you want to know what a company is thinking now, you have to look back 4-5 years and consider what their options were then. They can't look at what developments happened even two years ago and make a determination for what will be this year. That's not enough time.
 
They are not the same operating system.
They are more similar, and go hand-in-hand, much more than you are alluding to.
This is another big push that Apple has made in the last few years.

If you want to know what a company is thinking now, you have to look back 4-5 years and consider what their options were then.
Yep, which is basically what I have been saying in every single Apple ARM thread on here.
Instead of looking back 4-5 years ago, most comments are focusing on what happened around 14-15 years ago, and again, technology and the market change.
 
Apparently Apple finds little to no value in this, since they are going to be migrating away from x86-64, so don't blame UnknownSouljer for pointing out the obvious, blame Apple for making an efficient business change.

It's rumor and speculation at this point, not fact. Apple has given no indications they are ditching x86-64 for ARM.
 
It will be interesting to see what they can do. Apple has dominated the SoC performance market for a long time now, even against Intel's offerings. If they can make an ARM processor that even competes with Intel's and AMD's mid-range, it will change the market.
 
It's rumor and speculation at this point, not fact. Apple has given no indications they are ditching x86-64 for ARM.
So, why would Apple not make a public statement of it yet, especially if they aren't 100% ready to go with their new platforms and replacement strategy?
I will give you a hint.
 
You have statistics on that?
Do you?

I have anecdotal evidence. There isn't a ton that is available online about sales data and direct stats from either Corel or VMware. But even in 2008 when Virtualization was new and exciting it was only a million.
https://www.computerworld.com/artic...-software-market-up-50--in-north-america.html
Even at a 4x increase it would be less than 25% of the Macs sold within a year. It is definitely not ubiquitous. As for the anecdotal evidence, most people I know with Macs don't do virtualization. VMware/Parallels/Bootcamp isn't Microsoft Office, or I guess Pages/Numbers/Keynote (which is free).

It's rumor and speculation at this point, not fact. Apple has given no indications they are ditching x86-64 for ARM.
Yes, but said 'rumor' is given by an insider that has a long track record of being right. For what it's worth, I think it's wise to take all rumors with a grain of salt and take a long view with a wait and see approach (to put all the colloquialisms in one sentence).
However, the writing really is on the wall for this one. Even if you're skeptical about 2021, it's still a matter of when and not if.
 
Who cares? It doesn't affect anyone that still wants a real PC. The typical Apple user couldn't care any less if it's ARM or Intel in the computer.
 
I said it wouldn't happen overnight, and may take over a decade to get ARM to replace x86-64, assuming that happens at all.
I really hope you are right that the x86-64 ISA will continue to innovate and not stagnate, though I do still believe it will reach its true physical limits sooner rather than later.
I'm not really aware of any 'physical limits'; to the extent that they exist, they apply to all architectures.

As we approach real physical limits with respect to per-core performance, a greater focus on coordinated software and hardware development will be needed. ARM is indeed headed in this direction, or rather, their licensees, but given the mobile / efficiency focus over chasing outright performance, we're back to waiting for someone like Apple to even try and run desktop software on something other than x86.

I'd posit that the ISA itself will come to mean very little so long as it is extensible and doesn't get in the way of getting work done. x86 itself is limited, but it has been extended heavily without real consequence, and internally it's something different anyway.

The ISA becomes the language that instructions are stored in before being given to the CPU, but it's not indicative of how a particular CPU works; thus, the same code running on otherwise normalized x86 and ARM CPUs (that do not exist) should be able to run at the same speed and within the same power envelopes.

The only advantage of ARM is that ARM is a company that isn't Intel. That may seem like an advantage today, but again, there's no reason to think that ARM would act differently in Intel's shoes, or vice versa. See the last time AMD had a better CPU core.
As for the software development side of cloud-assist, that is true as well, agreed - we are getting there, but it will take time.
Hopefully by the time the software is ready and matured, the Internet infrastructure will be as well.
The internet infrastructure can be there tomorrow, if there's a business case for it. In general consumers simply don't have the required workloads and that level of internet connectivity isn't even a significant business need for most traditional small and medium businesses. Even remote desktop functionality / VDI isn't that taxing in terms of typical desktop work, and that's usually done for security reasons more than compute reasons (though there are certainly traditional compute reasons as well).
 
The only advantage of ARM is that ARM is a company that isn't Intel.
You just keep crushing my hopes and dreams.

That may seem like an advantage today, but again, there's no reason to think that ARM would act differently in Intel's shoes, or vice versa. See the last time AMD had a better CPU core.
I'm sure SoftBank would never do that with the ARM ISA, and AMD never sold a $1600 dual-core CPU back in 2006 and is a saint of a corporation.
 
But they don't have keyboards built-in....and iMacs are hardly workstations; I had one as a workstation for a few years and it was a struggle. These will be big iPads on a stand.

The iPad Pro 2020 has a keyboard lifted directly from the 16” Macbook Pro...

Apple has been planning this for ages, going back to when they went to Metal for OS X to make app porting easy. It is one of the reasons the MacBook Pros ran so hot. Why the hell can we get the same performance out of an iPad Pro with passive cooling while this stupid thing needs 2 fans running flat out?
 
iPads already have ARM CPUs in them.

And IMO it would be the preferable mobile platform for Apple and most consumers going forward. The iPad supports pen/touch/KB/mouse, it already outsells Macs, and the recently added mouse support is another shot in the arm for iPad in the push to full laptop replacement. On the mobile side, where you get the only tangible benefit of ARM (battery life), Macs are just the dwindling old school with only KB/M support, and it looks like Apple is setting iPads up to cannibalize Macs.

Switching Macs to ARM seems like deck chair rearrangement on the Titanic.

So, why would Apple not make a public statement of it yet, especially if they aren't 100% ready to go with their new platforms and replacement strategy?
I will give you a hint.

Transitions are quite different. The Intel Transition was announced at WWDC, June 2005. First machines January 2006. They need to get developers going early, or there is little SW to release with the HW.

IMO, if there is nothing at WWDC, then this is another in a long line of empty rumors.

This has been rumored since at least 2011. It may happen someday or it may not.

But given a nearly a decade of rumors on this, I'll believe it when Apple says so.
 
The iPad Pro 2020 has a keyboard lifted directly from the 16” Macbook Pro...

Thanks for proving my point.

Why the hell can we get the same performance out of an iPad Pro with passive cooling while this stupid thing needs 2 fans running flat out?

Because they're different operating systems, and they can't run the same software well.
 
Why?
Given that iPads now support KB/Mice/Touch/stylus, and run on ARM, it seems kind of pointless to switch KB/Mouse-only Macs to ARM.

I would agree. Macbooks are very different beasts than iPads. Making them more similar doesn't seem like a win.

Apple's silicon will be a game changer. i7-i9 performance with passive cooling and 1/4 the power consumption!

Apple aren't idiots like Microsoft; this has been in the works for years, and most of the apps will already be ready to go for the platform.

Unless the damn thing is 64 cores, you're not going to get i9 performance at 1/4 the wattage. Not happening. Compared to x86, ARM performance is much lower.

Yes because they were completely stupid switching from IBM to Intel.

VERY big difference there. Intel was commodity. IBM PowerPC was not. I can see the argument that ARM is the definition of commodity. However, the performance is much, much lower.

Getting rid of an expensive (and failing) Intel contract, in place of using in-house CPUs and designs - how is that not a win for Apple?
It will eventually be a win for everyone else, as Apple will lead the way to mainstream (non-mobile) ARM technologies and support, which will thus eventually become a true competitor (and hopefully killer) of x86-64.

Forces vendors to re-code VERY popular software that has been x86 for a decade or longer. Developers are lazy.

It saves them money. It does nothing to assist their market share; if anything, it will hurt it.

I completely agree.
 
Who cares? It doesn't affect anyone that still wants a real PC. The typical Apple user couldn't care any less if it's ARM or Intel in the computer.

I agree. The average Mac user wouldn't even notice.
 
IMO, if there is nothing at WWDC, then this is another in a long line of empty rumors.
If you mean WWDC 2021, then I agree. If you mean this year, then no. They would never announce a product a year in advance. Generally they announce a product and have it released within a month. For the latest on that, you can look into the recent iPad Pro 4 launch. The track record on this behavior is long.

I agree. The average Mac user wouldn't even notice.
True, but neither would the average PC user. Most computer users in general aren't particularly knowledgeable. If I asked someone on the street the difference between RISC and CISC, they'd have no idea. Or an i5 or i7. Or AMD and Intel. If I even asked a general question like: what is the fastest graphics card? I'd expect that they'd not be able to answer in any meaningful way (as in a failure so bad that they could probably not even name "a" graphics card, let alone the fastest one).
 
True, but neither would the average PC user. Most computer users in general aren't particularly knowledgeable. If I asked someone on the street the difference between RISC and CISC, they'd have no idea. Or an i5 or i7. Or AMD and Intel. If I even asked a general question like: what is the fastest graphics card? I'd expect that they'd not be able to answer in any meaningful way (as in a failure so bad that they could probably not even name "a" graphics card, let alone the fastest one).

The average PC user would notice if they changed to a completely different architecture, because the hardware is the only thing that sets them apart and the OS's main selling point is backward compatibility. Look at Surface RT. You don't think the people that bought them noticed?
 
The average PC user would notice if they changed to a completely different architecture, because the hardware is the only thing that sets them apart and the OS's main selling point is backward compatibility. Look at Surface RT. You don't think the people that bought them noticed?
If they are a general browser/office user that occasionally uses Facebook and spends an inordinate amount of time on YouTube? No. In fact, I would bet there are people that bought them and then got upset after the fact when they couldn't do certain things they could before. Then they likely had to be educated, and some percentage of those people didn't understand the education. I guarantee it. I also guarantee that there was a large subset that didn't use any applications that required backwards compatibility. Especially if they are a general office/social media user as described above.

But, as a counterpoint, that would have also required that enough Surface RTs got sold to ordinary people to even be a relevant point of conversation. It got canned quickly for a reason. I'm guessing the sales numbers weren't particularly good.
 
If you mean WWDC 2021, then I agree. If you mean this year, then no. They would never announce a product a year in advance. Generally they announce a product and have it released within a month. For the latest on that, you can look into the recent iPad Pro 4 launch. The track record on this behavior is long.

First you need a transition announcement which isn't the same as a product announcement. The only track record that counts in this regard is other CPU transitions. For those you need to give lead time to the development community.

The Apple switch to IBM-based RISC chips was announced in 1991 with the AIM partnership, years before the first consumer PPC computer debuted in 1994.

The Intel Transition was announced ~6 months before the first Intel-powered Macs.

Going by these precedents, WWDC seems a requirement for a late 2020 or early 2021 ARM Mac, to give a heads-up to the development community and give them a head start on application porting.
 
First you need a transition announcement which isn't the same as a product announcement. The only track record that counts in this regard is other CPU transitions. For those you need to give lead time to the development community.

The Apple switch to IBM-based RISC chips was announced in 1991 with the AIM partnership, years before the first consumer PPC computer debuted in 1994.

The Intel Transition was announced ~6 months before the first Intel-powered Macs.

Going by these precedents, WWDC seems a requirement for a late 2020 or early 2021 ARM Mac, to give a heads-up to the development community and give them a head start on application porting.
That's only necessary if the move to ARM requires any extra work from developers.
Look, whatever, you think it's bogus. That's fine. A year and a half isn't that long a wait. We can have this discussion then, if either of us even cares at that point to get our "just deserts" by smearing it in the other's face.
 