Apple announces M1 Pro and M1 Max

In many ways they are. ARM's currently at its strongest in mobile, and the clear majority of Mac sales (as with many computer brands) are laptops. Apple either thinks it can scale up to workstation performance or is willing to make a tradeoff knowing that its core sales will be stronger. Apple's market share is up even with only the M1 Macs in play; that could go up further with the new MacBook Pros, and any growth in desktops will be icing on the proverbial cake.
They are 60 W chips, so very much smack in the mobile and SFF power-usage range. ARM's biggest strength is its efficiency at lower wattages. It can scale up very well too; Amazon's Graviton server chips and Nvidia's Grace CPUs are awesome examples of this, but then you start playing on x86's home turf and it becomes a much more level playing field.

I am sure they have a desktop-class workstation chip in the works, something in the 95-120 watt range, but at that level many of the design decisions they made for the M1 make far less sense, because that is just too much to cool in such a small space; you are going to want to separate the CPU and GPU. Up there you would be pairing a 120 W CPU with a 300 W GPU, and having that packaged as a single SoC is just a good way to make a big mess unless you get really fancy with cooling, which gets needlessly complex even for Apple's tastes.
 
Just an anecdote about software optimization.

My MacBook Pro M1 can take quite a while converting an image to PNG in Photoshop. It can hang for maybe up to 10 seconds at 99%.

My 11th-gen i7 VAIO does it instantaneously. Like, before-you-hit-save instantaneously.

I may get the new one regardless.
It's because of the video audio integrated operations.
I've never really considered converting to PNG to be that intensive of an operation. I don't have Photoshop, but with FastStone, converting a 4608x3456 image to PNG took less than a second on my 3900X.
Either something's wrong with Photoshop for ARM Macs or ARM really sucks at converting images.
I'm going to lean towards Adobe being the problem, since it doesn't take that long for my Android phone (Snapdragon 750G) to convert images to PNG.
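For reference on where PNG encode time actually goes: almost all of it is the deflate compression pass. Here is a toy stdlib-only PNG encoder timing that pass on a synthetic gradient (the 1024x768 size and the gradient image are made-up stand-ins for a real photo; a camera-sized frame would just scale the time up):

```python
import struct
import time
import zlib

def png_chunk(tag: bytes, data: bytes) -> bytes:
    # Length + tag + payload + CRC32, per the PNG chunk layout.
    return (struct.pack(">I", len(data)) + tag + data
            + struct.pack(">I", zlib.crc32(tag + data)))

def encode_png(width: int, height: int, rgb_rows) -> bytes:
    # Minimal 8-bit truecolor PNG: no interlace, filter type 0 on every row.
    sig = b"\x89PNG\r\n\x1a\n"
    ihdr = struct.pack(">IIBBBBB", width, height, 8, 2, 0, 0, 0)
    raw = b"".join(b"\x00" + row for row in rgb_rows)
    idat = zlib.compress(raw, 6)  # the deflate pass is where the time goes
    return (sig + png_chunk(b"IHDR", ihdr)
            + png_chunk(b"IDAT", idat) + png_chunk(b"IEND", b""))

W, H = 1024, 768
rows = [bytes(((x + y) % 256 for x in range(3 * W))) for y in range(H)]

start = time.perf_counter()
png = encode_png(W, H, rows)
elapsed = time.perf_counter() - start
print(f"{len(png)} bytes in {elapsed:.3f}s")
```

A 4608x3456 frame is roughly 18x more pixels, so a second or two of encode is plausible on any CPU, but a 10-second hang still points at the application rather than the hardware.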
 
Also, with that much GPU power wouldn't they be advertising game performance and FPS? You would think they would do that but we don't know what they are measuring. Could just be GPGPU compute stuff...
Also, we all know how GPU drivers can be hard to get right for the latest games and Apple is pretty new to this whole GPU thing, just look at Intel...
 
Apple doesn't give a shit about gaming because game makers don't give a shit about Apple. Apple would need to be less douchy to companies to try to stimulate some Apple native game development.

That ain't gonna happen. At least I wouldn't expect it to happen.

I'm not a member of the creepy Apple cult or the even creepier Apple-Hate cult but Apple has a pretty crappy attitude toward companies it feels it can't control.

I'd love to see them take some steps to make those Macs decent gaming machines and to see those new 27" cinema displays support high(ish) refresh rates, but I won't be holding my breath.
 
Mac users aren't exactly avid gamers; there are very few AAA titles that get Mac ports, and even fewer that get native releases.
https://www.macgamerhq.com/apple-m1/native-mac-m1-games/

In terms of GPU power and FPS, the two aren't exactly a linear relationship. I have dozens of boxes running RTX 8000s and dual-socket Xeons, but they get trounced FPS-wise by an i5 running an old 1080 Ti.

The MacBook Pros, in terms of both price and feature set, are aimed at people who use these machines to pay their bills, and if they use one of the many software suites out there that have already been optimized for Apple Silicon, then these boxes are a steal even fully loaded at $5K. If their software isn't optimized yet but will be soon, then it's still a powerful machine. And if you just want one for your day-to-day browsing and Netflix usage, then good for you on having that kind of cash to burn.

The only way that gaming on Apple is going to take off (outside of iOS, which is huge) is if Apple gets their own studio together and starts publishing their own titles.
 
Apple is technically the largest game publisher on the planet; more money is spent on iOS games than on any other individual platform. Gaming on laptops/desktops hasn't been a priority for Apple, and their laptop/desktop business strategy does not at all play into it. Any market they would aim to serve with AAA titles at this stage could be better served by a console, and that would be incredibly hard for them to break into out of the blue.

I do think Apple could do some great stuff if they were to launch their own gaming studio; they certainly have the ability to bring in talent, and more than enough resources to throw at it. I could see them doing it just to spite Epic, really. I would buy an MBP just to play their blatant Fortnite rip-off.
 
The x86 MOV instruction is Turing complete; every other instruction is a special-purpose accelerator.

First time I'm looking at a Mac laptop seriously since they nixed the 17-inch models. It's comparing pretty well to my desktop too... except for the price.
True.

I keep being reminded of this every time I run into artificial limitations in automation control software (*cough* Rockwell Automation), specifically the fact that their compiler implements every possible operation as an inline call, which balloons memory usage from the simplest combination of operations, leading to memory exhaustion on a 'modern' controller assembled in 2021 that rocks 640KB of memory.

Please also forgive the non-deterministic performance of their so-called “motion controllers.”

It’s a $1500 processor before adding any IO hardware to the BOM. Programming the thing costs a minimum of $3000 in software licensing. That’s before you run into being up-charged another $3000 to unlock IEC-standard language packs.

Want to be able to reliably source-protect your hard-won code with a uniquely generated private key stored on a $5 SD card? $5000.

Oh wait. You need operator visualization and interaction on a deprecated Windows CE platform with software written in the Cretaceous period? $4500.

Need tech support to figure all this out? That’ll be another $1000.

Then you realize they've artificially limited your feature set in anti-competitive fashion to favor their own "premier" code, and you're told to pound sand or threatened with litigation when you point it out.

Minus the processor, which is a brick without the above, these are annual figures.
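Tallying the figures listed above (using the post's numbers as given; the split between the one-time processor cost and the recurring items is my reading of the post):

```python
processor_one_time = 1500  # the controller itself, before any IO hardware

# Recurring items as quoted in the post, all annual.
annual = {
    "programming software license": 3000,
    "IEC language pack unlock": 3000,
    "source protection key": 5000,
    "Windows CE visualization software": 4500,
    "tech support": 1000,
}

total_annual = sum(annual.values())
print(f"one-time: ${processor_one_time}, recurring: ${total_annual}/yr")
```

That $16,500-a-year recurring bill is what the once-a-decade Apple hardware comparison is set against.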

Or:

I can pony up $3-4K every decade for an Apple machine and $100 in developer licensing, and get support, verbose documentation, and a solid test bed for vetting embedded code on readily available STM32 dev kits with ARM SoCs that are openly documented, with a bundle of assorted IO interfaces, also freely documented.

Throw in a fantastic display, battery life, touchpad, secure OS, x86 (when will it die?) virtualization support, and bleeding edge specialized ASICs onboard with open API access?

“But I can’t freedom of compute the Secure Enclave!”

Okay.
 
Not exactly the same game market/audience, though. But I agree that Apple couldn't care less about "real" games.
With these products it seems like you can only use all of the power the M1 chip has with Apple's own software, or with whatever Apple "says" you can use it with. Does it have AV1 hardware decoding/encoding? It seems all the newer stuff includes AV1 except Apple... but of course they add their proprietary ProRes format...

I would like to see HandBrake x264/x265/AV1 CPU encoding comparisons against other x86 CPUs.
All of that memory bandwidth available to the CPU sounds like it would be a massive advantage...
 
If ProRes was bad I’d give them shit for that but it’s actually good.

Apple targets a specific market and they seem to nail it more than not.
 
Looks like they are claiming 70% faster multi-core CPU performance for the M1 Max vs the M1, which puts it a good 10% faster than a 7700K, but at a low TDP.
5nm really pays off!
 
I found a benchmark with an ARM version of HandBrake. A 7700K is 46% faster than the M1. Still impressive, but a 3950X is about 4x faster than an M1 (4.8x when CCX-overclocked).

Not exactly fair putting a 14 W ultrabook chip against a 190 W desktop CPU. If you want a closer comparison, try the M1 vs the 5900HS.

Those two are about neck and neck, but the 5900HS is usually paired with at least a RTX 3060, so graphics wise it dominates the M1. But if you compare the M1 to the HS’s Vega 8 based GPU then again they are about even.
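A rough perf-per-watt check on those numbers (the 14 W and 190 W figures and the ~4x HandBrake result are the thread's; treating them as sustained package power is my simplification):

```python
m1_power_w = 14.0        # ultrabook package power, per the post above
desktop_power_w = 190.0  # 3950X under load, per the post above

m1_perf = 1.0            # normalize multithreaded throughput to the M1
perf_3950x = 4.0         # "about 4x faster" in the ARM HandBrake benchmark

ratio = (m1_perf / m1_power_w) / (perf_3950x / desktop_power_w)
print(f"M1 perf/W advantage over the 3950X: {ratio:.1f}x")
```

So even while losing 4x on raw throughput, the M1 comes out roughly 3.4x ahead per watt, which is the "not exactly fair" point expressed as a number.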
 
ProRes having proprietary encoders is the bane of my existence. It might dominate the market if they opened it up. As it stands, companies are trying to use it as leverage against Blackmagic Design, who are taking no prisoners right now.
 
I have to renew my licenses in Feb.... not looking forward to that
 
Yeah, undervolting a 5950X and setting a low cTDP like 35 W would be interesting...
We haven't really seen a high-power ARM CPU yet (95 W+), so the M1 Max will at least be more interesting.
 
Yeah, I think the whole M1 Max package tops out at 90 W. That's fairly low power for a GPU, CPU, and RAM combined, but I want to see real reviews from actual testing. It'll be interesting when outlets I trust give these a few vigorous floggings.

I miss those [H] reviews at times like this, even if Macs weren't something they tested here.
 

Too complex for Apple does NOT make an ounce of sense. Remember the liquid-cooled G5 systems? They'll do it if they want. There's also nothing stopping them from making their own GPU either.
 
Yeah, but the G5s weren't trying to move 450+ watts from a surface area the size of a credit card. You would need a full custom liquid cooling loop for that, not one of those AiO jobs, and that isn't viable to ship or enclose since those require maintenance, which isn't what Apple is selling. They would need to split it into two packages, a CPU and a dedicated GPU, which is not the M1 package, so modifying that for workstations probably isn't what Apple is doing; they would want to build a new design for that.
 
450+ watts from the surface area of a credit card is very doable. Look at the 3090 and 6900XT, and their watts/sq cm.
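The power-density comparison checks out arithmetically. A sketch using the ISO ID-1 credit-card dimensions (85.60 x 53.98 mm) and the commonly cited ~628 mm² GA102 die at the ~350 W quoted later in the thread (the die area is an assumption, not from this thread):

```python
card_area_cm2 = 8.560 * 5.398   # ISO ID-1 credit card, ~46.2 cm^2

credit_card_density = 450 / card_area_cm2  # the 450+ W scenario above
ga102_density = 350 / 6.28                 # RTX 3090: ~628 mm^2 die at ~350 W

print(f"450 W over a credit card: {credit_card_density:.1f} W/cm^2")
print(f"RTX 3090 die:             {ga102_density:.1f} W/cm^2")
```

About 10 W/cm² spread over a credit-card footprint versus ~56 W/cm² concentrated on the 3090's die, so "very doable" is reasonable on paper; the catch is doing it in a laptop chassis rather than a triple-slot card.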
 
Those are 330 W and 350 W respectively, but yeah, I suppose so. I would still think they would want to separate the two into individual chips on their own boards; otherwise, you would think they would have announced such a machine this go-around with the MBPs.
 
I have an overclocked 3990X that puts out 400+ W and it stays chilly on an Enermax AIO with Noctua industrial fans. It's pretty loud though.
 
Workstations can be multiple chips quite happily. Apple is also used to building the case around the heatsink.

With PCIe 5 there are some just silly-fast options for interconnecting various processors and RAM.
 
This design reminds me of AMD's previous ideas using HBM on an interposer with an APU or CPU, and of course GPUs do that now. AMD already mapped out those small-footprint designs, which never came to fruition. Anyway, HBM would make Apple's memory bandwidth look paltry. With the high price of these upper-end Macs, will AMD be bold enough to ship their own solution and wipe the floor? And will Apple expand this line in a future generation to use HBM?
 
Don't think AMD will. They're making as many desktop, server, and embedded chips as they can right now, and selling most or all of them. And Apple has this market pretty well covered already–either their customers already have Apple hardware and this is just a welcome upgrade, or they don't use the Apple ecosystem and won't benefit from it anyway (although that may change in the future).
 
Apple did sign a big agreement with Hynix, and Hynix did just announce its HBM3. With the unified system/GPU memory Apple is using now, that could be a thing.
 
Yes, considering one stack of HBM3 is specced at 819 GB/s of bandwidth with up to 32 GB (64 GB?) of capacity, though probably on a 2022/23 time frame. Form factor, size, and maybe simplicity will win out. If Apple can push further into laptop workstations and a larger share of the mobile market, that may push AMD/Intel to pack more punch into their mobile solutions. I don't see anything that competes with the M1 Max from AMD or Intel. AMD TR Mobile :D, HBM + 16-core beast + optional GPU on an interposer.
 
Yeah, AMD's current best mobile chip is the 5900HS, and that goes back and forth with the standard M1 in terms of both CPU and GPU performance, due to AMD still using the old Vega 8 architecture in their current mobile lineup. The Pro and Max are untouched in the mobile space overall; Intel and AMD can beat them on specific things depending on the system pairings, but as a single chip in that power/thermal envelope, Apple is currently unmatched.

There are some leaks of the new AMD Rembrandt APU making the rounds and it is looking OK, but unless that engineering sample gets some serious tuning it will still lose out to the Pro, let alone the Max.
 
I have a friend who does YouTube videos, and his 4K renders went from just under 40 minutes on the Mac Pro to less than 5 minutes today (he got the 16" version). To be honest that is jaw-dropping, especially when you consider it was on battery...

So far the consensus seems to put it roughly in the ballpark of an 11980HK for non-optimized workloads, basically in its own league for software that has been optimized for it, and somewhere between an RTX 3060 and 3070 for unoptimized video performance. I have also not seen anything run in the high-performance mode available on the 16".

I have a hard time justifying a laptop that expensive for what I do, but I look forward to drooling over WebKit and Firefox compile times on the 16".
 
That's consistent with what I've seen in some reviews. Not that video encoding is everything, but many of the people who need pro laptops buy them with video production in mind. Before the pandemic, Marques Brownlee took an iMac with him to trade shows just to make sure his videos rendered in a decent amount of time; now, he could bring a MacBook Pro that's far faster than that (and likely any comparable Windows laptop, for that matter). Hell, I've heard even the M1 Pro is pretty great for these tasks.

I still wouldn't get a MacBook Pro for gaming (unless you're a big Apple Arcade fan) or some pro 3D modelling tasks, at least not yet, but Apple is making a very compelling case if you're involved in any kind of audio or visual editing.
 
Ya, I don't know how true it is, but apparently Apple is starting to heavily invest in gaming. You could make a fantastic console out of these, but I have no idea what the cost would end up being. Most of the most popular games like WoW, FF14, and PoE mostly run on the Mac now, so it wouldn't be a huge leap.
 
They have allocated a significant budget increase towards Apple Arcade, and I really do hope they can pull it off. I would be very interested to see what Apple could do with their Apple TV boxes in a few years' time; they could be rather competent gaming consoles if they can get the right set of developers in their pockets.
 
Considering how huge the gaming industry is (bigger than video), it's perplexing that Apple has basically stayed on the sidelines. An Apple game console working with iPhones, Apple TV, and Apple Watch would be very interesting, maybe. Yet Apple is working on a car??? Why? At least Apple Glass (AR) is coming.
 
Apple's been happy to participate in gaming — on mobile devices. That's partly because its mobile hardware has historically been more powerful than a lot of the Android crowd, and partly because developers supported gaming more on iOS than on the Mac. And I also suspect Apple 'gets' gaming more on handhelds for a number of reasons, including its tendency to see other devices as having other goals (Apple TV as a media hub, Macs as creative tools).

The car is a long-term project that might never pan out, but I suspect Apple is tackling that like it does when it enters other categories: it sees a 'broken' field where Apple can offer a fix. Think of how many cars have clunky interfaces, and how EVs and automation promise to simplify commuting... that's where Apple could come into play.
 

An Apple TV with an M1 Max as a gaming platform would be interesting assuming they get the developers on board. They could do it right now and it would be superior to either console. At least from the benchmarks I’ve seen.

Subscription service to Apple Arcade and Apple TV+ already exists, just have a special package for those and cloud storage for saves and you’d be set.

Apple could easily buy a few good studios to get it going without much effort.
 
I ran some benchmarks on my M1 Max 16" last night, and it blew me away in both Cinebench and Geekbench 5. It scored above Intel's i9 processors. I'll reply with the screenshots as soon as I find them.
 
Attached are the performance tests and the build I did on mine. Very shocking results, in auto mode for the performance setting.
 

I mean, I think that's expected performance for Cinebench, right? The M1 has the same single-core performance, and we know they didn't change much in the individual cores. The multicore looks like it didn't scale perfectly, though: the M1 has 4 high-performance cores and the M1 Pro/Max have 8, so I was expecting the multicore score to double from ~7500 to ~15000. Still looks good.

https://www.notebookcheck.net/These...impressive-for-a-10-W-processor.504380.0.html
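The shortfall is largely the core mix: the M1 pairs 4 performance cores with 4 efficiency cores, while the M1 Pro/Max pair 8 performance cores with only 2 efficiency cores. A back-of-the-envelope model (the 0.3x relative E-core throughput is purely an illustrative assumption):

```python
E_FACTOR = 0.3  # assumed throughput of one E-core relative to a P-core

def units(p_cores: int, e_cores: int) -> float:
    # Naive model: cores contribute throughput independently.
    return p_cores + e_cores * E_FACTOR

m1 = units(4, 4)       # 5.2 "P-core equivalents"
m1_max = units(8, 2)   # 8.6

predicted = 7500 * m1_max / m1  # scale from the M1's ~7500 R23 multicore
print(f"predicted M1 Max multicore: ~{predicted:.0f}")
```

Under that assumption you'd expect something like ~12,400 rather than a full doubling to ~15,000, before any thermal or memory-bandwidth effects even enter the picture.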
 
Apple likes gaming on iOS because they get 15-30%. If they got that on MacOS, they'd be pushing Mac gaming. Since they don't, they do little to nothing for gaming on MacOS.
 