Apple M1

You’ve seen zero benchmarks, but you already know the claims are BS?

Sometimes things don't pass the smell test.

Things in the CPU industry are typically evolutionary and predictable. There might be small surprises here or there, but major leaps out of nowhere essentially don't happen.

So, their 2x and 3x claims don't pass the smell test, unless - as has been previously mentioned - they are comparing them to extremely low-end budget machines (whatever happened to dollar-to-dollar comparisons?).

Either that, or they are touting areas where these SoCs have hardware acceleration for one or two small things, benching the hell out of that while ignoring overall performance.

I think this Apple chip will be fit for its purpose, that purpose being passable general desktop performance with a few tricks up its sleeve to accelerate codec encoding and Photoshop filters to the point where it is usable. They wouldn't be launching it if it weren't. And this is fine and exactly what some people buy their laptops for.

It's not about to take on any real computers any time soon though.

Heck, my 8 year old Latitude with an Ivy Bridge i5-3320M is perfectly responsive in basic desktop apps. Just add some ASIC hardware accelerators to that, and you have the same thing, just a bit bulkier and lacking in Apple "magic".
 

How fast can that iPhone compile Chrome? (Or the Linux kernel, if you prefer. Or, really, any large project.)

Oh, but at least it'll have 20 hours of battery life...which it'll probably need to get that big compile done.
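
Not that this would even be hard to measure once somebody has the hardware. Here's a rough sketch of the kind of timed kernel build I mean - purely illustrative, assuming a kernel checkout at a made-up path and a working toolchain:

```python
# Time a clean defconfig kernel build - the classic "big compile" benchmark.
import multiprocessing
import subprocess
import time

KERNEL_TREE = "/path/to/linux"  # hypothetical checkout location


def timed_build(tree: str) -> float:
    """Clean, configure, and build the kernel; return wall-clock seconds."""
    jobs = multiprocessing.cpu_count()
    subprocess.run(["make", "mrproper"], cwd=tree, check=True)   # scrub old artifacts
    subprocess.run(["make", "defconfig"], cwd=tree, check=True)  # default config
    start = time.monotonic()
    subprocess.run(["make", f"-j{jobs}"], cwd=tree, check=True)  # parallel build
    return time.monotonic() - start


if __name__ == "__main__":
    print(f"defconfig build took {timed_build(KERNEL_TREE):.1f}s")
```

Run that on an M1 and on a comparable x86 laptop, and the question answers itself.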
 
I think this Apple chip will be fit for its purpose.

Yeah. I think there's a lot of talking past each other here. This new chip could both be fast at the stuff it's designed to be fast[1] at - like juanrga likes to keep highlighting benchmarks it's good at - but not nearly as good at other stuff (still waiting for an estimate, or even a wild-ass guess, on how fast it can do a big compile, but I guess I'm not going to get one until someone who's not fluffing the M1 tries it out).

[1] and that's perfectly fine. But it's not the whole story. Personally I don't have any particular use for a CPU that's really fast only at stuff I don't do. YMMV.
 
I've avoided Mac hardware all my life, but I was in the room when Steve Jobs unveiled the first iPhone at Macworld in 2007 and knew it was objectively a big deal; and seeing the announcements today, it also feels like a major turning point - despite the protests of entrenched x86ers and PC gamers who think scoffing at it does anything.

Like it or not, the paradigm is shifting, and these are only the beginning. I'll continue to run x86 for the foreseeable future, but I know change when I see it - and I'm not exactly a genius; it just happens to be hitting all of us right over the head.

Right.

The iPhone was not revolutionary. It was 100% evolutionary. Absolutely everything in the industry was moving in the direction of candybar touch-screen phones with more desktop-like capability. Android was already under development as well.

Don't get me wrong. Apple made it to market first, and their release was much sleeker and more polished - for the first couple of years, at least - but it wasn't anything exceptional.
 
Yeah. I think there's a lot of talking past each other here. This new chip could both be fast at the stuff it's designed to be fast[1] at - like juanrga likes to keep highlighting benchmarks it's good at - but not nearly as good at other stuff (still waiting for an estimate, or even a wild-ass guess, on how fast it can do a big compile, but I guess I'm not going to get one until someone who's not fluffing the M1 tries it out).

[1] and that's perfectly fine. But it's not the whole story. Personally I don't have any particular use for a CPU that's really fast only at stuff I don't do. YMMV.

Yeah. After JuanRGA's comments on that other thread about the Cavium (now Marvell) ThunderX2, I was talking to some of the Proxmox devs the other day about why they haven't released Proxmox for these platforms.

Their answer? They have an ARM64 release ready, but just don't have anything to validate it on. The ThunderX2 may post good benchmarks in some reviews, but in practice they found it impossible to work with: difficult to get to boot right, and impossibly slow in real-world applications like a kernel compile.

Now, this Apple chip is likely faster. Apple has really done some good work on the ARM front, but I still don't think ARM is quite there yet for desktop applications.
 
Sometimes things don't pass the smell test.

Things in the CPU industry are typically evolutionary and predictable. There might be small surprises here or there, but major leaps out of nowhere essentially don't happen.

So, their 2x and 3x claims don't pass the smell test, unless - as has been previously mentioned - they are comparing them to extremely low-end budget machines (whatever happened to dollar-to-dollar comparisons?).

Either that, or they are touting areas where these SoCs have hardware acceleration for one or two small things, benching the hell out of that while ignoring overall performance.

I think this Apple chip will be fit for its purpose, that purpose being passable general desktop performance with a few tricks up its sleeve to accelerate codec encoding and Photoshop filters to the point where it is usable. They wouldn't be launching it if it weren't. And this is fine and exactly what some people buy their laptops for.

It's not about to take on any real computers any time soon though.

Heck, my 8 year old Latitude with an Ivy Bridge i5-3320M is perfectly responsive in basic desktop apps. Just add some ASIC hardware accelerators to that, and you have the same thing, just a bit bulkier and lacking in Apple "magic".

Well, things do in fact change overnight in the chip industry - just never from the same company. Once a chip company has a market, they have little incentive to do anything but incremental updates. Intel over the last 10 years has had little to no competition; of course they didn't do anything revolutionary.

Predictability comes from having no competition. Intel has only ever been pushed twice. The first time, they dumped the P4 and changed directions to keep up. Would Intel have ever released a chiplet design? Not likely. Would AMD have ever done it if they were not miles behind? Also not likely. Would NV have swung hard with Ampere, or built a more conservative chip, if they didn't get wind of AMD building something special? I am guessing they would have played it safer. Competition is what is often lacking in the chip industry, which is why generation to generation we get 8-10% bumps instead of 50%+.

And you're right about x86 with stuff bolted on. Buckle up, PC master race: Alder Lake and beyond, Intel will be doing exactly that. big.LITTLE is coming to x86.
 
Yes and no, but look at what PCs you could buy in that price range, and compare it to the Mac Mini. Then ask yourself: which would you rather troubleshoot with your mother/grandmother?

Honestly, I'll troubleshoot any PC with anyone over any Mac any day.

Macs have this reputation for being user friendly. That may have been the case - comparatively - back in the '80s and '90s, but when I had to set up and configure my fiancée's 2013-era 27" iMac, I was tearing my hair out, I was so frustrated with the thing.
 
Apple has always wanted end-to-end control over not just the UI experience but also the hardware experience. That of course can lead to some interesting products. I liken it quite a bit to the way that consoles are currently being designed, although the integrated RAM could act like a giant cache instead of more traditional RAM. I would be interested to see how that would perform in other, more desktop-oriented CPUs regardless of manufacturer.

We can infer a few things, though. This is based on the A14, which is a known quantity, so we can extrapolate a rough performance increase from that. I'm betting that a lot of their performance claims are related to memory speeds, and in reference to their own past products.

The tight integration of OS to hardware is hard to deny. Apple has shown that with their iPhone line. Clearly this is something that PCs and other mobiles will trail in.
In the end, this is why I am not interested.

I want the control over all products I buy.

I don't want to be dictated to on how to use the software, and I want the hardware to be as customizable as possible.

I don't think they have taken any performance crowns, but even if they have, in the end it is completely irrelevant to me. Customization wins every time.
 
That's it: Apples are great if you fit in their mold.

I know far too many Apple fans IRL who think they can do everything and anything, but won't even sit at a PC to test their bias. Yet I always have to sit at their Apple to see just how great it is.

Yep. Apple = Great for happy path consumer use plus a sprinkling of photo, audio and video editing.

And only as long as you want to do everything exactly the Apple Way.

If you have any other preferences? Well screw you.
 
The current Firestorm core, constrained in an A14 phone, is able to outperform the fastest Intel core and is only a hair behind a top Zen3 core in single-thread performance.

[attached chart: single-thread benchmark comparison]

The cores in the M1 will run circles around any x86 core. Moreover, the Apple design is ultra-efficient. The A14 is below 5W, whereas the R9 5950X goes up to 49W in 1T.

More of these bullshit meaningless SPEC benchmarks.
 
That's the face all the network/sysadmin guys around me are making too...

Well, considering how terrible Macs are in multi-platform environments, they are seeing all of this buzz and just know idiots are going to buy the damned things and then demand that IT get them working on their networks, making their lives absolute hell on earth, like any corporate environment that includes Apple products.
 
First bench of an M1 from an early owner. It is Affinity Photo. (Don't ask me anything about this application. I don't know.)

[screenshot: Affinity Photo benchmark results on the M1]

For comparison, this is a 2019 iMac with a 6-core 3.7GHz CPU and an AMD 580X:

[screenshot: Affinity Photo benchmark results on the 2019 iMac]

Source: https://forum.affinity.serif.com/index.php?/topic/124022-benchmark-1900-results/page/6/
 
So many people voicing their opinions here should wait until the facts and tests play out. Just because you don't like Apple doesn't mean they can't make an awesome CPU, which they have proven they can make in their iPads and iPhones compared to Android devices.
 
I don’t think Apple cares at all for the “spec game”.

I mean sure they are happy to charge you (an arm and a leg) for more RAM or a bigger SSD, but they play that game on their own terms.

If you look at phones, they don't really talk about how much memory they have, but they definitely get by with a lot less than Android. They don't really talk about the size of the battery; rather, they focus on battery life. They happily equip all phones from the cheapest to the most expensive with the same processor. They don't usually focus on clock speed / cores etc. over generations, but rather on what end-user functionality the changes will bring.

In general this scheme actually works really well. If you are a manufacturer stuck in a competition of specs, you have little reason to optimise. For Apple, making the system more efficient means they can put in a smaller battery and keep battery life the same. They have an incentive to think end to end.

I totally get your point on this and to some extent agree.

I do think it is silly of Apple to do this, however. 8GB of RAM - even if the OS/apps/everything are 25-30% more efficient - just isn't going to perform the same as 16GB of real RAM. By increasing the base cost a little and giving much more reasonable pricing for more RAM/storage, I would see the computers as more attractive in terms of price/performance.

For someone who wants a nice computer for web browsing, email, and light school work, these look like nice systems (provided older code works well); the moment you need to do some real work, I don't think the system is going to do so well.

Another thought I just had... Anyone know how well Boot Camp works on the M1 machines? I would be very interested to see some benchmarks from within Boot Camp.
 
That’s like saying it isn’t so much about AMD as it is x86 for the new Ryzen CPUs.

Which is true. Countless former loyal Intel users are now running Ryzen CPUs, not because it's an AMD chip but because it's x86 and faster than the competition in most cases. It obviously also serves as an example of what can be done with the architecture.

That's what people almost always want, something that is faster. It's the same reason why it's important to know in which ways the new apple laptops are faster or not.

Some of you guys seem to bend over backwards to take a jab at Apple and downplay anything they do.

Explain to me how the desire for actual verifiable benchmark numbers equates to a "jab" against Apple? If it turns out that the M1 really is "faster than the chips in 98 percent of PC laptops sold in the past year," then I will be happy to congratulate them on a job well done. Marketing means nothing. Bring on the benchmarks.
 
Explain to me how the desire for actual verifiable benchmark numbers equates to a "jab" against Apple? If it turns out that the M1 really is "faster than the chips in 98 percent of PC laptops sold in the past year," then I will be happy to congratulate them on a job well done. Marketing means nothing. Bring on the benchmarks.
And one they could easily do, which would show good results if the chip was truly a number crunching powerhouse, would be x264. It's open source, there's already ARM code, which they could further optimize if they wanted, and it is one of those that scales very well with CPU power so is a good quasi-synthetic kind of benchmark. It is also really common on desktop benchmarking sites. Every time a new CPU comes out, x264 gets run on it by all the review sites so there's lots of comparison data.

However, that is nowhere to be seen. Maybe it just hasn't been done yet, and maybe it'll be just staggering how good it is... but of course it leaves people to wonder and to say "Well let's see some numbers!"
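
The harness is trivial, too. A minimal sketch, assuming the x264 CLI is installed and a raw .y4m test clip sits at a made-up path (x264 prints its final fps figure to stderr):

```python
# Run an x264 encode to /dev/null and report the throughput it achieved.
import re
import subprocess

TEST_CLIP = "/path/to/clip.y4m"  # hypothetical raw test clip


def x264_fps(clip: str, preset: str = "slower") -> float:
    """Encode the clip and pull the fps figure from x264's summary line."""
    result = subprocess.run(
        ["x264", "--preset", preset, "--threads", "auto",
         "-o", "/dev/null", clip],
        capture_output=True, text=True, check=True,
    )
    # Final stats look like "encoded 500 frames, 35.42 fps, ..." on stderr.
    match = re.search(r"([\d.]+)\s*fps", result.stderr)
    if match is None:
        raise RuntimeError("no fps figure in x264 output")
    return float(match.group(1))


if __name__ == "__main__":
    print(f"{x264_fps(TEST_CLIP):.2f} fps at --preset slower")
```

Same clip, same preset, different machines: that's the comparison data all the review sites already have for x86 parts.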
 
And one they could easily do, which would show good results if the chip was truly a number crunching powerhouse, would be x264. It's open source, there's already ARM code, which they could further optimize if they wanted, and it is one of those that scales very well with CPU power so is a good quasi-synthetic kind of benchmark. It is also really common on desktop benchmarking sites. Every time a new CPU comes out, x264 gets run on it by all the review sites so there's lots of comparison data.

However, that is nowhere to be seen. Maybe it just hasn't been done yet, and maybe it'll be just staggering how good it is... but of course it leaves people to wonder and to say "Well let's see some numbers!"

Agreed. That would be a good comparative benchmark, if you can actually make it apples to apples, instead of having it run in a special-case ASIC with particular settings, like Apple has built into these things.
 
Honestly, I'll troubleshoot any PC with anyone over any Mac any day.

Macs have this reputation for being user friendly. That may have been the case - comparatively - back in the '80s and '90s, but when I had to set up and configure my fiancée's 2013-era 27" iMac, I was tearing my hair out, I was so frustrated with the thing.
As somebody who routinely works with Windows, Linux, and Mac, I can safely say that the Macs are much easier to troubleshoot, because the users can't really touch anything and Apple keeps their ecosystem pretty locked down and clean. The biggest annoyance there for me is when my users do their OS updates but forget their app updates, which breaks their dependencies, and they call me in a panic because Adobe or Final Cut won't launch. But outside things like that, they are very hands-off machines. I would say over a 5-year lifecycle they tend to be my lowest-cost devices once I factor in management time, repairs, technician time, and software licensing. Windows comes in second, with Linux being my most costly of the systems, because when they go bad they go really bad; there is no "yeah, let's just uncheck this" option there.

Honestly, Mac OS is the blend of Windows and Linux most people have been asking for; it just has a high cost of entry. In terms of mass deployments, the only systems that are faster and cheaper to deploy in large numbers are the Chromebooks, and that is because Google's admin tools there are awesome. Jamf for the Apple stuff also works great, but like all Apple partners they tend to only be told about the iOS changes a few days in advance, so if I get any users who are too eager for the new features it tends to be an issue. I'm hoping to get Intune configured this winter so I can better manage the Windows stuff, and that should bring my Win10 machines more in line with the Apples for TCO.
 
As somebody who routinely works with Windows, Linux, and Mac, I can safely say that the Macs are much easier to troubleshoot, because the users can't really touch anything and Apple keeps their ecosystem pretty locked down and clean. The biggest annoyance there for me is when my users do their OS updates but forget their app updates, which breaks their dependencies, and they call me in a panic because Adobe or Final Cut won't launch. But outside things like that, they are very hands-off machines. I would say over a 5-year lifecycle they tend to be my lowest-cost devices once I factor in management time, repairs, technician time, and software licensing. Windows comes in second, with Linux being my most costly of the systems, because when they go bad they go really bad; there is no "yeah, let's just uncheck this" option there.

Honestly, Mac OS is the blend of Windows and Linux most people have been asking for; it just has a high cost of entry. In terms of mass deployments, the only systems that are faster and cheaper to deploy in large numbers are the Chromebooks, and that is because Google's admin tools there are awesome. Jamf for the Apple stuff also works great, but like all Apple partners they tend to only be told about the iOS changes a few days in advance, so if I get any users who are too eager for the new features it tends to be an issue. I'm hoping to get Intune configured this winter so I can better manage the Windows stuff, and that should bring my Win10 machines more in line with the Apples for TCO.

I've never managed an enterprise environment, so I can't speak to the shit that the lowest common denominator may screw up.

Maybe I'm just lucky, but people in my house don't seem to screw much up.

My "user friendliness" frustrations have been with stuff that is just painful or frustrating to get working the way you expect out of the box.

Just getting her 2013 iMac to remember the NAS network shares and their credentials took a goddamn exorcism.

Windows "just worked". Linux was a little bit more complicated, but easy for me because I'm used to it and had fought those battles before.

I have no more recent experience than ~2013 though, because after that damn iMac died I swore to never let another Apple product on my network ever again.
 
I've never managed an enterprise environment, so I can't speak to the shit that the lowest common denominator may screw up.

Just getting her 2013 iMac to remember the NAS network shares and their credentials took a goddamn exorcism.

Windows "just worked". Linux was a little bit more complicated, but easy for me because i'm used to it and had fought those battles before.
Multiply that x100 or x1000 or more... that's Enterprise.
 
I've never managed an enterprise environment, so I can't speak to the shit that the lowest common denominator may screw up.

Maybe I'm just lucky, but people in my house don't seem to screw much up.

My "user friendliness" frustrations have been with stuff that is just painful or frustrating to get working the way you expect out of the box.

Just getting her 2013 iMac to remember the NAS network shares and their credentials took a goddamn exorcism.

Windows "just worked". Linux was a little bit more complicated, but easy for me because I'm used to it and had fought those battles before.

I have no more recent experience than ~2013 though, because after that damn iMac died I swore to never let another Apple product on my network ever again.
Yeah... Apple and most NASes are the Devil, and even when they are working, the file menus for them are just... bleh. The Buffalo ones have been pretty solid for us, but I have all that going back to a SharePoint server now, and that has been pretty bulletproof to date - you know, once you get past all the SharePoint, let's say, niceties... But I am loving the 2018 Mac Minis; they have been little tanks. Granted, they are just being used as caching servers for all the iPads, but getting those in place has cut down on so much BS it's amazing.
 
Yeah... Apple and most NASes are the Devil, and even when they are working, the file menus for them are just... bleh. The Buffalo ones have been pretty solid for us, but I have all that going back to a SharePoint server now, and that has been pretty bulletproof to date - you know, once you get past all the SharePoint, let's say, niceties... But I am loving the 2018 Mac Minis; they have been little tanks. Granted, they are just being used as caching servers for all the iPads, but getting those in place has cut down on so much BS it's amazing.

I'm glad it's not just me.

I was trying to get it to play nicely with a FreeNAS install at the time (I have since migrated to a home-spun ZFS-on-Linux & KVM & LXC box).

I started with SMB. That didn't work well. Then I tried NFS. That was also a cluster. I even tried setting up AFP using Netatalk, and that was even worse.

In the end I think I wound up using some combination of a symlink hack and SMB. Can't remember though. It's been a while. I do recall hating the damn Apple credential library or whatever it was called. (Keychain?)
 
I totally get your point on this and to some extent agree.

I do think it is silly of Apple to do this, however. 8GB of RAM - even if the OS/apps/everything are 25-30% more efficient - just isn't going to perform the same as 16GB of real RAM. By increasing the base cost a little and giving much more reasonable pricing for more RAM/storage, I would see the computers as more attractive in terms of price/performance.

For someone who wants a nice computer for web browsing, email, and light school work, these look like nice systems (provided older code works well); the moment you need to do some real work, I don't think the system is going to do so well.

I would not get that with 8GB even for my kids. Personally I can’t get by at work without 32.

But I assume going above 16 would require moving from 2 memory banks to 4, and that likely requires more engineering. Obviously they'll be working on that. For Mac Pros they'll need to go way higher, and they surely already know it's doable.
 
I would not get that with 8GB even for my kids. Personally I can’t get by at work without 32.

But I assume going above 16 would require moving from 2 memory banks to 4, and that likely requires more engineering. Obviously they'll be working on that. For Mac Pros they'll need to go way higher, and they surely already know it's doable.

Curious. What do you do at work?

I've been working from home since March. Ever since my work machine started to refuse connecting to the work VPN, I've been working on my old 2012-era Dell Latitude E6430s. It has a dual-core (with HT) Ivy Bridge i5-3320M and 8GB of DDR3 RAM.

I have never once noticed it slow down from lack of RAM.

As I am sitting here editing a document change, Task Manager is telling me I am using 39% of the RAM.

Granted, my professional work is not very taxing on a computer. It consists mainly of Office 365, a web browser for some web apps, Minitab and an occasional light jaunt in SolidWorks, but that's about it.

I see the reasoning for 16GB if buying an un-upgradeable machine today. You have to have future compatibility. 32GB, however, seems a bit superfluous.

(I mean, I have 64GB in my Threadripper, but that's not for work :p )
 
Curious. What do you do at work?

I've been working from home since March. Ever since my work machine started to refuse connecting to the work VPN, I've been working on my old 2012-era Dell Latitude E6430s. It has a dual-core (with HT) Ivy Bridge i5-3320M and 8GB of DDR3 RAM.

I have never once noticed it slow down from lack of RAM.

As I am sitting here editing a document change, Task Manager is telling me I am using 39% of the RAM.

Granted, my professional work is not very taxing on a computer. It consists mainly of Office 365, a web browser for some web apps, Minitab and an occasional light jaunt in SolidWorks, but that's about it.

I see the reasoning for 16GB if buying an un-upgradeable machine today. You have to have future compatibility. 32GB, however, seems a bit superfluous.

(I mean, I have 64GB in my Threadripper, but that's not for work :p )
I run several VMs at the same time every day and I chew through RAM like nothing. Also exporting/editing huge video files eats up a lot of RAM as well. It's entirely dependent on your workflow. If I had 8 GB I couldn't work. 16GB is a bare minimum.
 
I run several VMs at the same time every day and I chew through RAM like nothing. Also exporting/editing huge video files eats up a lot of RAM as well. It's entirely dependent on your workflow. If I had 8 GB I couldn't work. 16GB is a bare minimum.

Ah, I do that kind of stuff for fun (Which is why the Threadripper has 64GB) but work is just basic Office stuff, and 8GB is fine.
 
I run several VMs at the same time every day and I chew through RAM like nothing. Also exporting/editing huge video files eats up a lot of RAM as well. It's entirely dependent on your workflow. If I had 8 GB I couldn't work. 16GB is a bare minimum.
To be fair, though, the Mac Mini platform as a whole would be 100% ineffective for your workflow. Even when they did offer 32GB of RAM and had an upgradable HDD, the CPU and power limitations of the platform and the form factor would leave you completely incapable of doing any of that.
For most people I recommend Mac Minis to, they want a computer for their living room or a small office. It's mostly for email, light photo editing, and making things like greeting cards and calendars; most of the time it's just a Facebook/Pinterest machine that connects to a decent inkjet printer. I mean, really, they could be getting by with an iPad if it weren't for the small screens. For a lot of artsy things or simple media tasks, Apple has most of that built in with their typical Apple quality; on a PC they are more or less featured but usually more expensive. But they are small and quiet, and if you just want one to live under your TV in the living room with a wireless keyboard and mouse/touchpad, then they are hard to beat at that price point.
 
To be fair, though, the Mac Mini platform as a whole would be 100% ineffective for your workflow. Even when they did offer 32GB of RAM and had an upgradable HDD, the CPU and power limitations of the platform and the form factor would leave you completely incapable of doing any of that.
For most people I recommend Mac Minis to, they want a computer for their living room or a small office. It's mostly for email, light photo editing, and making things like greeting cards and calendars; most of the time it's just a Facebook/Pinterest machine that connects to a decent inkjet printer. I mean, really, they could be getting by with an iPad if it weren't for the small screens. For a lot of artsy things or simple media tasks, Apple has most of that built in with their typical Apple quality; on a PC they are more or less featured but usually more expensive. But they are small and quiet, and if you just want one to live under your TV in the living room with a wireless keyboard and mouse/touchpad, then they are hard to beat at that price point.
Well, that depends on how powerful the M1 chip ends up being. Not that I would use an M1 Mac anyway, since Macs are going back to the days of virtualized Windows instead of being able to run it natively, and that's just going to be awful - unless Microsoft has something in the works for Apple Silicon.
 
Well, that depends on how powerful the M1 chip ends up being. Not that I would use an M1 Mac anyway, since Macs are going back to the days of virtualized Windows instead of being able to run it natively, and that's just going to be awful - unless Microsoft has something in the works for Apple Silicon.
Well, even if they were super powerful, attempting to run multiple simultaneous VMs with only 4 cores is hard. VMs do not like being assigned more virtual cores than there are logical ones; some versions of VMware will let you do it, but sweet Jesus is it painful for everyone involved.
 
Anyone know how well Boot Camp works on the M1 machines?

It is not supported on these new systems at all.

Not sure what you would Boot Camp it to anyways, though. Ubuntu on ARM, maybe? Can you even buy standalone licensing for Windows on ARM?

Better to think of it as a juiced-up iPad running macOS. It likely won't ever support running another OS.
 
It is not supported on these new systems at all.

Not sure what you would Boot Camp it to anyways, though. Ubuntu on ARM, maybe? Can you even buy standalone licensing for Windows on ARM?

Better to think of it as a juiced-up iPad running macOS. It likely won't ever support running another OS.
Yeah, Boot Camp is... annoying. Who has the time to save their work, shut down, restart into the new OS, and start their workflow all over? Too annoying. If it's for work I will recommend Parallels every time; the break in the workflow from Boot Camp is far greater than most people give it credit for.
 
I’ve also seen a lot of people asking about it because “I want to play games on it so I want Boot Camp” but... I don’t think those people understand what the whole ARM thing actually means.
 
Yeah, Boot Camp is... annoying. Who has the time to save their work, shut down, restart into the new OS, and start their workflow all over? Too annoying. If it's for work I will recommend Parallels every time; the break in the workflow from Boot Camp is far greater than most people give it credit for.

I don't know, man. I dual boot all the time. Having lived through Windows 95 through ME, I'm obsessively hitting Ctrl-S every few seconds and never keep more than one project open at a time, so it is a piece of cake to shut everything down and reboot, especially with how fast NVMe-based systems boot these days.
 
Honestly, I'll troubleshoot any PC with anyone over any Mac any day.

Macs have this reputation for being user friendly. That may have been the case - comparatively - back in the '80s and '90s, but when I had to set up and configure my fiancée's 2013-era 27" iMac, I was tearing my hair out, I was so frustrated with the thing.

Having done internet tech support for a major carrier, as well as currently doing application support for a different large company, I prefer helping people with Windows over Mac.

Windows people are much more willing to work with you and tend to have a much better handle on how to do things (open a directory, copy a file, etc.). While some Mac users are quite good, it is much more common to have someone who is saying it isn't working and doesn't know a thing about where their data is or how to do so much as a ping. (Yes, many Windows customers are clueless when it comes to a ping,

*pause*

OK, back from a call - working at home and all that. Just had a Mac user who couldn't double-click an application to get it started, and we had to work around it.
 
I would not get that with 8GB even for my kids. Personally I can’t get by at work without 32.

But I assume going above 16 would require moving from 2 memory banks to 4, and that likely requires more engineering. Obviously they'll be working on that. For Mac Pros they'll need to go way higher, and they surely already know it's doable.

My current X570 board has the ability to use 32GB DIMMs. I have two of them installed. I do not know what the memory management capabilities of the M1 are; you'd think it'd be doable, especially considering everything is soldered.
 
Just guessing here, but the M1 might be sitting on a memory size limitation that comes from its roots in Apple's A-series designs. The highest-end (Apple) mobile devices have 6GB RAM, I think? So maybe they just designed the chipsets to support up to 16GB in whatever bank configuration - you know, a size that's still a ways off for mobile devices. When they started prepping the chip for use in macOS systems, they looked at it like: 16GB will be fine for "regular users" who are just browsing the web and doing some simple video editing or whatever. So no need to make it support more for this first wave of systems.

The puzzler for me is the MacBook Pro. With the MacBook Air having the same available config, the only advantage it seems to provide is a larger battery and active cooling. Making it able to support more memory would have really helped to differentiate it from the Air. Since they didn't, I'm assuming the 16GB memory config is some kind of limitation of the M1.

I'd bet that the second wave of systems - bigger MacBook Pros, iMac, Mac Pro - will have a different chip with support for much more memory. But they're probably also still a ways off.
 