Apple announces M1 Pro and M1 Max

I haven't seen many benchmarks, unfortunately, because most people seem to focus on the "wow" numbers, like "wow, that H.264 encode went fast!" and not on more general-purpose stuff. Now, Intel's benchmarks are biased of course, but if their testing is any indication, then yes.

I would love to see some neutral, objective, general-purpose benchmarks for comparison purposes.
You'll see them once people get them in their hands. Apple doesn't release these early to reviewers like they do the iPhones.
 
Apple doesn't really send out test systems, so the only way to get one to review is to order it up and give it a go. So once they start shipping, we should start seeing reviews creep in.
For you and longblock454: Apple historically seeds review units to a handful of outlets (and no, not just flattering ones) right after an introduction and gives them a review embargo for the Tuesday or Wednesday of the following week. Things are a bit different as Apple is shipping sooner than usual for events like this, but you'll probably see reviews popping up right as the first units reach customers.
 
For you and longblock454: Apple historically seeds review units to a handful of outlets (and no, not just flattering ones) right after an introduction and gives them a review embargo for the Tuesday or Wednesday of the following week. Things are a bit different as Apple is shipping sooner than usual for events like this, but you'll probably see reviews popping up right as the first units reach customers.
Not this time. As many of the reviewers have said, they ordered theirs at the time of the event as well so they will get them when we do.
 
Just an anecdote about software optimization.

My MacBook Pro M1 can take quite a while converting an image to PNG in Photoshop. It can hang for maybe up to 10 seconds at 99%.

My 11th-gen i7 VAIO does it instantaneously. Like, before-you-hit-Save instantaneously.

I may get the new one regardless.
 
I look forward to seeing some proper tests on these. The first effort was really surprising with that crazy efficient GPU.

Now we get to see if it scales.
 
I look forward to seeing some proper tests on these. The first effort was really surprising with that crazy efficient GPU.

Now we get to see if it scales.
Anandtech has a decent article on it based on the information released so far.

“In terms of performance, Apple is battling it out with the very best available in the market, comparing the performance of the M1 Max to that of a mobile GeForce RTX 3080, at 100W less power (60W vs 160W). Apple also includes a 100W TDP variant of the RTX 3080 for comparison, here, outperforming the NVIDIA discrete GPU, while still using 40% less power.”

https://www.anandtech.com/show/1701...m1-max-giant-new-socs-with-allout-performance
 
Just an anecdote about software optimization.

My MacBook Pro M1 can take quite a while converting an image to PNG in Photoshop. It can hang for maybe up to 10 seconds at 99%.

My 11th-gen i7 VAIO does it instantaneously. Like, before-you-hit-Save instantaneously.

I may get the new one regardless.

I'd imagine that is a case where you are falling back on the general-purpose cores for something they don't have an FPGA/ASIC to accelerate.
 
Anandtech has a decent article on it based on the information released so far.

“In terms of performance, Apple is battling it out with the very best available in the market, comparing the performance of the M1 Max to that of a mobile GeForce RTX 3080, at 100W less power (60W vs 160W). Apple also includes a 100W TDP variant of the RTX 3080 for comparison, here, outperforming the NVIDIA discrete GPU, while still using 40% less power.”

https://www.anandtech.com/show/1701...m1-max-giant-new-socs-with-allout-performance

I'll believe that when I see it.

I mean, most of their performance claims with the original M1 have been complete marketing nonsense, so I imagine these claims are the same.

It's sad what Anandtech has become. They used to actually challenge marketing bull and test things for themselves, not just post sponsored content.
 
Do you feel the systems will not handle general-purpose workloads very well?

At a meta level, what is “general purpose?” I mean, back in the day CPUs didn’t have FPUs in them. Most software could get by without needing an FPU, but that’s only because it had to. FPU logic was more expensive, so it became “specialized”. What if Apple is simply mainstreaming specialized processing capabilities so that more software can now readily take advantage of them? Maybe by tuning macOS to take advantage of more specialized units, Apple can get the performance numbers in while keeping battery life great?

Just throwing that out there.
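Just to illustrate the idea (not a claim about what the M1 Pro/Max do internally): Apple's Accelerate framework already works this way. You make one documented call, and the framework picks the silicon. A minimal Swift sketch:

```swift
import Accelerate

// Scalar version: an ordinary loop on the general-purpose cores.
func scalarSum(_ values: [Float]) -> Float {
    var total: Float = 0
    for v in values { total += v }
    return total
}

// Accelerate version: vDSP routes the same reduction through the
// CPU's vector hardware; the caller never names the hardware.
func vectorSum(_ values: [Float]) -> Float {
    vDSP.sum(values)
}

let samples = (0..<1_000_000).map { _ in Float.random(in: 0...1) }
print(scalarSum(samples), vectorSum(samples))
```

Same result either way (modulo floating-point rounding); the difference is which silicon does the work.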
 
I’m quite excited and ordered the 64GB M1 Max, although I expect the 32GB model would have been sufficient for me. I have to admit, my TR3 3960X desktop has struggled at times editing ProRes 8K30 video (with an RTX 3080!) and I’m really curious how well the new MBP will hold up under the same use case (in Premiere). I’ll know in a few weeks!

Gosh, how I wish Unreal Engine ran under macOS natively, or that I lived in a universe in which DirectX frameworks existed on Apple silicon platforms, because I’d love to be playing Far Cry 6 on this laptop :). Alas, at least for gaming, Windows and x86 aren’t going anywhere for me anytime soon….
I went with the base 32GB version. I'm hoping that should be enough. Yeah, I mean, once you're over $4000, $400 shouldn't mean anything in the grand scheme of things, but I'm debating cancelling this and just going 64 GB.
 
I'd imagine that is a case where you are falling back on the general-purpose cores for something they don't have an FPGA/ASIC to accelerate.
How did you come to this conclusion? I would say this is probably some strange bug/oddity that needs to be fixed for M1.

https://www.anandtech.com/show/16252/mac-mini-apple-m1-tested/4
The Apple M1 already competes with higher-TDP Intel and AMD chips on a broad range of single-threaded workloads (and even some of the multithreaded workloads). Are you saying that they have dedicated hardware for all of those tests?

Also, the Apple M1 does not have any reconfigurable logic (FPGA) that I have heard of. And I don't believe the Mac minis have any FPGA cards. Let me know if you know otherwise.
 
Anandtech has a decent article on it based on the information released so far.

“In terms of performance, Apple is battling it out with the very best available in the market, comparing the performance of the M1 Max to that of a mobile GeForce RTX 3080, at 100W less power (60W vs 160W). Apple also includes a 100W TDP variant of the RTX 3080 for comparison, here, outperforming the NVIDIA discrete GPU, while still using 40% less power.”

https://www.anandtech.com/show/1701...m1-max-giant-new-socs-with-allout-performance
In a sense, this is what PC vendors were dreading the most. Apple now has laptops that theoretically hang with the latest gaming-grade laptops, but without the ridiculous power consumption or short battery lives. There were always asterisks with previous GPUs: they're workstation chips, they have to operate within tight thermal limits... no more. Not that you're about to play the latest AAA games on a MacBook Pro, but Apple appears to have caught up in performance in many respects (ray tracing notwithstanding). And arguably has some advantages in using system memory with the GPU.

I do wonder how this might skew the pro software market. I wouldn't expect a flood of ports, but if you know a MacBook Pro can handle a task better than an equivalent mobile Windows workstation? That's a convincing reason to write a Mac version.
 
Just finished watching the presentation.
They look and perform great for the specs provided, but dang, it's pricey for the one I would consider buying.
[Attached image: MacBookPro M1Max-Price.jpg]
 
At a meta level, what is “general purpose?” I mean, back in the day CPUs didn’t have FPUs in them. Most software could get by without needing an FPU, but that’s only because it had to. FPU logic was more expensive, so it became “specialized”. What if Apple is simply mainstreaming specialized processing capabilities so that more software can now readily take advantage of them? Maybe by tuning macOS to take advantage of more specialized units, Apple can get the performance numbers in while keeping battery life great?

Just throwing that out there.

Not quite a good comparison. Sure, an FPU helps with floating-point math, and as such it is a kind of special-purpose accelerator, but the workload it supports, floating-point math, is still pretty general-purpose. You can utilize it to do almost anything in software.

The types of special-purpose chips (ASICs and FPGAs) Apple is using to overcome the weaknesses of ARM are specifically designed for one task, and one task only: one for encoding H.264 video, another for encoding HEVC, another for compressing JPEGs, etc. By their very nature these are discrete and targeted at a very specific task, and are thus inflexible. What happens if the codec changes? Buy a new Mac? Lol. What happens if you want to use a codec that Apple hadn't thought of or condoned?

As an example, Apple's first foray into this, the $2000 Afterburner card for the Mac Pro, only accelerated Apple's proprietary "ProRes" codecs, nothing else. If you wanted to use something else, you had to fall back on general-purpose CPU/GPU compute, which luckily the x86 Mac Pro had LOTS of. These new ARM designs, not so much.
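To be fair, the fallback at least shows up in a documented API. Here's a quick Swift sketch of how an app would ask whether a given codec has a dedicated decode block (VideoToolbox, macOS 10.13+); anything that comes back false is exactly the general-purpose fallback I'm describing:

```swift
import CoreMedia
import VideoToolbox

// Ask VideoToolbox whether this machine has a dedicated hardware
// decoder for a given codec; false means CPU/GPU software fallback.
let codecs: [(String, CMVideoCodecType)] = [
    ("H.264",      kCMVideoCodecType_H264),
    ("HEVC",       kCMVideoCodecType_HEVC),
    ("ProRes 422", kCMVideoCodecType_AppleProRes422),
]

for (name, codec) in codecs {
    let hasHardware = VTIsHardwareDecodeSupported(codec)
    print("\(name): \(hasHardware ? "hardware decode" : "CPU/GPU fallback")")
}
```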
 
Not too bad versus some of the past, IMO: 64GB (if the talk of what it can do with only 16GB is relevant to someone's workflow), a 4TB SSD, 6000x3300 resolution on a 14-inch, 3 Thunderbolt 4 ports; someone needing that either has a giant workflow or is quite the rich hobbyist. And AppleCare is cheap for something at that price tag.

This isn't the "one could get close or even better at a lower price, but you're buying it for the software" situation of some of the Apple era.

In a sense, this is what PC vendors were dreading the most. Apple now has laptops that theoretically hang with the latest gaming-grade laptops, but without the ridiculous power consumption or short battery lives
Is battery life while gaming on a laptop that big of a factor? It must exist, but that sounds like quite a niche issue: people with that kind of money who spend a long time gaming unplugged on a laptop.
 
Not quite a good comparison. Sure an FPU helps with floating point math, and as such it is a kind of special purpose accelerator, but the workloads it supports, floating point math, is still pretty general purpose. You can utilize that to do almost anything in software.

The types of special purpose chips (ASIC's and FPGA's) Apple is using to overcome the weakness of ARM are specifically designed for one task, and one task only. Like one for encoding h.264 video, another for encoding HEVC, another fro compressing jpegs, etc. etc.) By their very nature these are discrete and targeted at a very specific task, and are thus inflexible. What happens if the codec changes? Buy a new mac? Lol. What happens if you want to use a codec that Apple hadn't thought of or condoned?
At a meta level, what is “general purpose?” I mean, back in the day CPU’s didn’t have FPU’s in them. Most software could get by without needing an FPU but that’s only because they had to. FPU logic was more expensive so it became “specialized”. What if Apple is simply mainstreaming specialized processing capabilities so that more software can now readily take advantage of it? Maybe by tuning Mac OS to take advantage of more specialized units, Apple can get the performance numbers in while keeping battery life great?

I think there are two levels (and a bit circular) of who it is good for: the consumer and the developer.

I feel like it could be really good for a large group of consumers who do common things, but I also feel it gives a giant edge to large (not to say in-house) developers versus the rest, where they get synergy with specific hardware, or hardware designed for what they do, sometimes proprietary; a bit like if video cards had never supported something open like OpenGL/Vulkan and only DirectX. It really depends on how it is done and how open it is, but at first glance the potential to be really good for the masses while hurting people with rarer workflows seems obvious and really big.

Imagine doing for the most common operations computers perform what the GPU did for graphics... obviously coming at the cost of predetermining a lot of what will be done on them.
 
Not too bad versus some of the past, IMO: 64GB (if the talk of what it can do with only 16GB is relevant to someone's workflow), a 4TB SSD, 6000x3300 resolution on a 14-inch, 3 Thunderbolt 4 ports; someone needing that either has a giant workflow or is quite the rich hobbyist. And AppleCare is cheap for something at that price tag.

This isn't the "one could get close or even better at a lower price, but you're buying it for the software" situation of some of the Apple era.


Is battery life while gaming on a laptop that big of a factor? It must exist, but that sounds like quite a niche issue: people with that kind of money who spend a long time gaming unplugged on a laptop.
Not as much, but then again we also haven't seen many laptops where you can play for more than a short while on battery. This might buck the trend, at least for titles that aren't too taxing.

I suspect it'd matter more for noise and retaining a battery charge. Lots of gaming laptops sound like jet turbines under load, and even plugged in they will drain substantial power. It'd be nice to have a relatively quiet system that will maintain or even gain charge while you're playing.
 
Just finished watching the presentation.
They look and perform great for the specs provided, but dang, it's pricey for the one I would consider buying.
$5k. Ouch. Apple charges crazy prices for upgrading memory/storage/whatever else. And since it's all integrated/soldered, you don't really have a choice if you want to upgrade anything :/
 
Yep: noise, worrying about heat if it's placed on a bed or anything soft, and so on. Especially as work from home has risen and the long-train-commute gaming scenario has diminished (does any train have no plug..?), it's hard to imagine someone with that kind of money spending a significant amount of time playing games far from power.
 
In a sense, this is what PC vendors were dreading the most. Apple now has laptops that theoretically hang with the latest gaming-grade laptops, but without the ridiculous power consumption or short battery lives. There were always asterisks with previous GPUs: they're workstation chips, they have to operate within tight thermal limits... no more. Not that you're about to play the latest AAA games on a MacBook Pro, but Apple appears to have caught up in performance in many respects (ray tracing notwithstanding). And arguably has some advantages in using system memory with the GPU.

I do wonder how this might skew the pro software market. I wouldn't expect a flood of ports, but if you know a MacBook Pro can handle a task better than an equivalent mobile Windows workstation? That's a convincing reason to write a Mac version.
Metal has had raytracing for a while.
https://developer.apple.com/documentation/metalperformanceshaders/metal_for_accelerating_ray_tracing
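If you want to see what your own machine reports, here's a tiny Swift sketch (assuming macOS 11 or later, where the `supportsRaytracing` query exists):

```swift
import Metal

// Query the default GPU for Metal's ray tracing support. On Apple
// silicon the GPU is part of the SoC, so this reflects the M1-family GPU.
if let device = MTLCreateSystemDefaultDevice() {
    print("GPU: \(device.name)")
    print("Ray tracing: \(device.supportsRaytracing ? "yes" : "no")")
}
```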

The new consoles all share CPU memory with the GPU. There are some benefits as well as some drawbacks, but the drawbacks can be mitigated: if the CPU cores have enough cache and the storage read speeds are fast enough, it becomes something a user won't really notice.

As far as pro markets go, Mac users are more entrenched than ever, but CUDA and AVX-512 usage has also gone up significantly in many fields, so it comes down to what you are doing/using. It's a crapshoot up there, because regardless you are spending $5k+ on your workstations, so it comes down to the specifics of software and hardware requirements.
 
Yep: noise, worrying about heat if it's placed on a bed or anything soft, and so on. Especially as work from home has risen and the long-train-commute gaming scenario has diminished (does any train have no plug..?), it's hard to imagine someone with that kind of money spending a significant amount of time playing games far from power.

I thought there were safety requirements laptops and other mobile devices had to pass, for outer shell temperature. I can't remember the details though.
 
As far as pro markets go, Mac users are more entrenched than ever, but CUDA and AVX-512 usage has also gone up significantly in many fields, so it comes down to what you are doing/using. It's a crapshoot up there, because regardless you are spending $5k+ on your workstations, so it comes down to the specifics of software and hardware requirements.

That I'll agree with. If you are going to cross-shop a high-end professional Mac against a pro-model PC, you are going to land in similar price territory. We make fun of their prices in our little hobby, but we are also not buying top-end enterprise parts (Xeons, EPYCs, large quantities of fully buffered ECC RAM, professional GPUs (Quadros, Radeon Pros), etc.). Do that in a PC build and you'll quickly run up the price tag as well.

Where the Mac price winds up being insane is when an ordinary non-professional buys one to use for web browsing and office apps, something you can do just fine with a $150 Chromebook.
 
$5k. Ouch. Apple charges crazy prices for upgrading memory/storage/whatever else. And since it's all integrated/soldered, you don't really have a choice if you want to upgrade anything :/

They pretty much all do, I think, charging crazy prices for memory and storage; in that regard the difference is mostly between DIY and the rest of the laptop makers, rather than between Apple and the rest.

Build a laptop on a competitor with a 4TB SSD, a similarly high-res screen, 64 gigs of RAM, and a good GPU, without that connectivity, and it will cost something similar. Dell's US store charges $450 to go from 16 to 64 gigs of DDR4-3200 on an XPS, and $800 to go from 1 to 4 TB.....

https://www.dell.com/en-us/work/sho...0-laptop/ctox15w11p1c3002?view=configurations

That comes to $4,378 when you spec it close, without the software, and with much less screen resolution, connectivity, and CPU/GPU.
 
That I'll agree with. If you are going to cross-shop a high-end professional Mac against a pro-model PC, you are going to land in similar price territory. We make fun of their prices in our little hobby, but we are also not buying top-end enterprise parts (Xeons, EPYCs, large quantities of fully buffered ECC RAM, professional GPUs (Quadros, Radeon Pros), etc.). Do that in a PC build and you'll quickly run up the price tag as well.

Where the Mac price winds up being insane is when an ordinary non-professional buys one to use for web browsing and office apps, something you can do just fine with a $150 Chromebook.
Yeah, when your computer pays your bills, $5k is cheap. I have a buddy who recently purchased a round of the upgraded Afterburner cards for their workstations. It cost them some $40k, but it let them take on an extra 4 or 5 projects per month as the render times were reduced so drastically, so they expect that $40k to pay for itself in 3-5 months based on the business they no longer have to turn away.
 
$5k. Ouch. Apple charges crazy prices for upgrading memory/storage/whatever else. And since it's all integrated/soldered, you don't really have a choice if you want to upgrade anything :/
In a pro market, $5k is not a lot at all; I have had clients whose needs for single workstations easily put them in the $20k-per-box range, and Apple's prices for memory aren't too exaggerated when you compare them to any other OEM. Apple's only actual drawback here is the lack of repair options, so AppleCare and UPSes become a necessity to protect the investment. In business deployments, things like memory or drive upgrades aren't really a thing. Local storage is temporary: you might keep a project there, but once it's done it goes to the appropriate network storage location. And software requirements for big things don't change too frequently in any drastic way, so for single-purpose machines, by the time your software has moved on to the point where you need to worry about RAM differences, you probably need a CPU and GPU upgrade as well; that is assuming you upgrade your software at that time and don't hold off for a year or three while you get training, resources, and other requirements in line.

I mean, yeah, there are going to be lots of hobbyists and enthusiasts who want the new MBPs because "ooh, shiny," but they are the same subset of users who would build a Threadripper system for gaming. Apple is more than happy to take their money, but they aren't the target audience for the product.
 
I went with the base 32GB version. I'm hoping that should be enough. Yeah, I mean, once you're over $4000, $400 shouldn't mean anything in the grand scheme of things, but I'm debating cancelling this and just going 64 GB.
Yeah. I don’t expect to actually get $400 of resale value out of the 64GB when I’m through with it, but it might let me or a family member eke a few more years of life out of it, and I don’t know what my use cases will look like in a few years, so I figured: why not?
 
Best Buy has preorders for the Pro, $2000-$3000 or so depending on config.
 
Yeah. I don’t expect to actually get $400 of resale value out of the 64GB when I’m through with it, but it might let me or a family member eke a few more years of life out of it, and I don’t know what my use cases will look like in a few years, so I figured: why not?
I'm more concerned about the unified memory. I didn't really see that at the time when I placed the order. I mean, my home desktop machine has 128 GB of RAM along with a 3090 & Titan RTX. I've been doing a lot of ML and computer vision lately. That stuff can eat up video RAM like crazy. (Regular RAM, not so much, since that work depends more on the GPU.)

I don't normally go for laptops, but my old Mac is now out of Xcode compatibility range, and lately, with a little more traveling, I've come to respect laptops a bit more than I did in the past.
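Since the unified memory question is what worries me, here's a tiny Swift sketch I'd use to sanity-check it (these are real Metal properties; the numbers will vary by machine):

```swift
import Metal

// On Apple silicon the GPU shares system RAM, so its recommended
// working set is a large slice of total memory; a discrete card
// reports roughly its own VRAM instead.
if let device = MTLCreateSystemDefaultDevice() {
    let gib = Double(device.recommendedMaxWorkingSetSize) / 1_073_741_824
    print("Unified memory: \(device.hasUnifiedMemory)")
    print(String(format: "Recommended max working set: %.1f GiB", gib))
}
```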
 
I thought there were safety requirements laptops and other mobile devices had to pass, for outer shell temperature. I can't remember the details though.
It's not about burning the stuff the laptop is on; the issue is more that a soft surface can take the laptop's shape and block the natural airflow many laptops depend on to cool themselves (laptops usually sit slightly elevated off a hard table surface to leave some room for air).
 
I wonder how they are measuring the performance for those claims.

Certainly this is not per-thread, general-purpose CPU computing?

We are talking about work mixed in and done on special-purpose FPGA/ASIC blocks, right?

I am very much anti special-purpose chips as an alternative to CPUs, as it is a huge affront to computing freedom. I mean, it's highly effective in certain applications (like video encoding), but I could also buy a PCIe video-encoding accelerator board if I really wanted to.
This is weird. If you’re given documented, programmatic APIs to use the hardware, how is it an affront to “computing freedom?”
 
The x86 MOV instruction is Turing-complete; every other instruction is a special-purpose accelerator.

First time I am looking at a Mac laptop seriously since they nixed the 17-inch models. It's comparing pretty well to my desktop too... except for the price.
 
The x86 MOV instruction is Turing-complete; every other instruction is a special-purpose accelerator.

First time I am looking at a Mac laptop seriously since they nixed the 17-inch models. It's comparing pretty well to my desktop too... except for the price.
It's a bit tricky to compare, more so than with most laptops, since the upgrades in one area are often tied to another. Get a higher-end CPU and you get a higher-end GPU; upgrade your system RAM and it also increases your GPU memory.
 
Feels like Apple is creating these chips for mobile first. I am interested to see how they perform vs. x86 when there are no power and thermal constraints of a mobile platform.
 
Just an anecdote about software optimization.

My MacBook Pro M1 can take quite a while converting an image to PNG in Photoshop. It can hang for maybe up to 10 seconds at 99%.

My 11th-gen i7 VAIO does it instantaneously. Like, before-you-hit-Save instantaneously.

I may get the new one regardless.
It's because of the integrated video/audio operations.
 
Feels like Apple is creating these chips for mobile first. I am interested to see how they perform vs. x86 when there are no power and thermal constraints of a mobile platform.
In many ways they are. ARM's currently at its strongest in mobile, and the clear majority of Mac sales (as with many computer brands) are laptops. Apple either thinks it can scale up to workstation performance or is willing to make a tradeoff knowing that its core sales will be stronger. Apple's market share is up even with only the M1 Macs in play; that could go up further with the new MacBook Pros, and any growth in desktops will be icing on the proverbial cake.
 