Apple leaks M1 Max Duo and M1 Ultra

There are plenty of AMD-based laptops that outperform the M1s. The M1 isn't about best performance but best battery life vs. x86 offerings.

Then why do you have a 9900K? Obviously the M1 can't do certain things the 9900K machine can, including taking an 8TB SSD. I highly doubt the M1 would outperform the 9900K, but I can't find any benchmarks, since the M1s are always compared to other laptops, for good reason.

There is no 13- or 14-inch AMD laptop that outperforms a loaded 14-inch M1 Max system, especially not when taking into account overall CPU, GPU, SSD, and RAM performance. If you want the fastest 13- or 14-inch laptop in the world, you've only got one choice - Apple.
 
Not exactly. The TCO on a Mac is lower over a four-year period than on a PC in an enterprise environment: a $2,000 MacBook Pro will easily outlive a $2,000 PC in a grab-and-go environment, and once you factor in IT costs, software licensing, MDM systems, asset management, and all those other things, Apple pulls way ahead on total cost of ownership. There are a lot of studies to back this up.
My company's clients mostly aren't enterprise environments. Regular small and medium businesses that have always done things a certain way, I don't see jumping on the new ARM Macs, because they're different--or, like I said, the corporate procurement system is "we buy Dells, period", like my company. (In my company's case, it's a moot point anyway, as the software we develop our applications in is Windows-only, but that's already been discussed endlessly. Imagine a situation with something like Java, only it doesn't run on MacOS.) Plus, at little 50-person companies, the stereotypical middle-aged HR lady who's been using Windows for 20 years and knows just enough to get by isn't going to suddenly switch, even to a cheaper iMac as someone suggested above. Now, the next generation of people, who're more familiar with iOS? Sure, some of them, as they age into the workforce.
 
It'd be nice to see Apple cater more to gamers, but it doesn't "need" to embrace the gaming community. It really, truly doesn't.

Computer gaming is significant, but here's a dirty secret: most people don't buy a computer with gaming as a major factor. Research indicates 69 percent of the graphics hardware sold at the end of 2020 was Intel's, and a lot of the AMD graphics in that mix is likely integrated... in other words, few people buy PCs capable of smoothly playing the latest first-person shooter (or even ones from a few years back). So Apple would be chasing after a subset of the market, and one where it's difficult to get meaningful market share even for heavyweights like Dell and HP.

Apple is still generally best-served by appealing to everyday computer users and creatives. That covers a lot of people and plays to its strengths. Any attempt to court 'traditional' PC gamers wouldn't just involve new hardware and a more gaming-friendly framework; Apple would have to spend a very, very long time persuading major developers to release AAA titles on the Mac. And given that the PC gaming camp is practically the home of Anything But Apple fanatics, even that might not be enough.
Apple is a lifestyle product company that is quite focused in its execution. Additionally, they have been making steady progress in the perf/watt GPU segment for a while now. Finally building out their own GPU solutions in a desktop product is the first step in creating a desktop gaming ecosystem. With that in place, they would actually have a reason to better integrate 'desktop' class gaming into their portfolio. It's a huge market, and not many know that better than... Apple.

Up until now they've sold billions of mobile gaming devices... with the most powerful cellphone gaming GPUs released annually.
 
Apple is a lifestyle product company that is quite focused in its execution. Additionally, they have been making steady progress in the perf/watt GPU segment for a while now. Finally building out their own GPU solutions in a desktop product is the first step in creating a desktop gaming ecosystem. With that in place, they would actually have a reason to better integrate 'desktop' class gaming into their portfolio. It's a huge market, and not many know that better than... Apple.

Up until now they've sold billions of mobile gaming devices... with the most powerful cellphone gaming GPUs released annually.
I'm envisioning that an Apple TV rocking an A17X or M2 chip a few years down the road would make a hell of a console. Apple has the money; if they wanted to start up some gaming studios, I have no doubt they could make a serious go of it. Their unified architecture and graphics model would give them a very large install base.
 
A lot of this, I would bet, is pent-up demand for gaming, but I would also bet a good portion of it is crypto miners. People have long been buying Dell or whatever prebuilt machines just for the GPUs. And the folks that are willing to pay a ridiculous premium on GPUs are mostly people making money from them.
What does that say about Apple and how they're handling the growing gamer market? If not that then what about the crypto mining market? It's not beneath Apple to support gamers and crypto miners since a sale is a sale. They just don't have the hardware that can do that.

That raises the question: trade them for what? There are only two Intel-based Macs at this point: the 27” iMac and the Mac Pro. If people are trading their $1,200-$3,000 computer in for ‘up to’ $1,000 credit toward an $1,800 to $80,000 one, then I guess Apple is good with that?
Wait, $80k? Does Apple actually sell $80k computers? I mean they do sell overpriced monitor stands so I could see Apple doing that. Also I imagine Apple will trade a range of years for credit towards an M1 based machine.
(Edit: the article itself also clearly states this is to get people to upgrade to the new M1 Pro and M1 Max machines. And as another addendum, Apple is committed to being Arm-only by the end of 2022. The 27” iMac will likely be replaced no later than late Q2 ’22, and the Mac Pro will likely be the last to come, in November ’22.)
Yea and that's what I said. Apple wants more people to jump on their ARM based Macs and this is meant to kick start that move. I just don't see people dumping their Intel Macs so easily for the new M1 based Macs.
No one cares. The group of people that need to boot windows while on a Mac is a very small group of people. ARM Macs can already virtualize Windows through VMware or Parallels.
Yes but the performance and compatibility will not be the same.
If you need a PC to run things natively then you buy a PC.
We can see that with Apple's competitors growing in sales faster.
People just get what they want/need and I think it’s absurd to have to explain this to you.
I'm telling you what people will want/need when they're done trying to use an ARM based Mac. You wanna know what Mac users had to deal with when they weren't on x86?

Are Teslas for everyone? No. Neither is any Mac. Neither is any PC.
Everyone wants a Tesla because gas prices are insane, but most people can't afford one. More than 50% of Americans couldn't afford the cheapest new car back in 2016, so imagine today's post-COVID economy. A car isn't the same as a PC, as you usually don't need a loan to buy a PC. Though with graphics card prices...
If it needs to be a laptop, there are sales on M1 Macs more or less constantly, and they can be had for $1,200, sometimes less.
Ever wonder why Dell and Lenovo are selling their machines by the butt load? Dell had 50% growth in market share in the third quarter. Because they don't charge as much as Apple does.
 
Ever wonder why Dell and Lenovo are selling their machines by the butt load? Dell had 50% growth in market share in the third quarter. Because they don't charge as much as Apple does.

Actually, it's because they use less sophisticated, slower CPUs that were not nearly as supply constrained. TSMC could not deliver enough chips for Apple, as they talked about during their earnings and guidance calls. Apple was backordered and everything shipped late and ship dates are still slow for custom configs. They literally sold every single new Macbook Pro that they made, and they could not manufacture enough to meet demand.

Right now Apple Store Pickup is unavailable for the majority of configurations due to supply constraints, and custom configurations are listing delivery mid Jan.

Yes but the performance and compatibility will not be the same.

You're right - the M1 Max "emulating" x86 is faster than the fastest Intel mobile chips running native x86. With less power draw.

I don't get why so many people base their identities on being fanboys for a particular architecture or brand. Do what I do, just buy the best tool for the job regardless of who makes it.

  • My 3990X blows away my M1 Max and every other processor for multithreaded high-performance computing.
  • My loaded M1 Max MBP blows away every other 14-inch laptop on the market from a performance standpoint.
  • My 9900K is the best rig I own for gaming, though I mostly use it as an Ethereum mining rig now.
No need to spread FUD trying to spin sales figures during a pandemic and chip shortage to fit your narrative. Just buy what's best for your needs and enjoy it.
 
Just reading this now, but did Apple announce that the M2 was supposed to come over the summer? Or are they just milking the M1 line now?
 
Just reading this now, but did Apple announce that the M2 was supposed to come over the summer? Or are they just milking the M1 line now?
Apple hasn't announced anything about the M2. The M1 Pro and Max are ultimately 'just' supersized M1s... but that's like saying a Bugatti Chiron's engine is just two V8s stuck together. Unofficially, many expect Apple to launch the M2 in early 2022 in a new standard MacBook (it might not be called the Air). The M1 Pro and Max will likely still be faster; it's just that the M2 will represent the start of Apple's next wave of Mac silicon.
 
What does that say about Apple and how they're handling the growing gamer market?
They don't particularly care - and haven't, for a very long time. Hasn't stopped them from making massive margins and revenue.
If not that then what about the crypto mining market?
They REALLY don't care.
It's not beneath Apple to support gamers and crypto miners since a sale is a sale. They just don't have the hardware that can do that.
And they don't particularly care. It's a concept called blue ocean vs. red ocean - the gaming market is highly saturated and not worth particularly pursuing if you're not already in that market. Let other people solve that problem; aim for other markets where folks aren't really competing in the same way (or redefine the market). I don't use my Mac laptop for gaming - I've got a lot of other systems for that.
Wait, $80k? Does Apple actually sell $80k computers? I mean they do sell overpriced monitor stands so I could see Apple doing that. Also I imagine Apple will trade a range of years for credit towards an M1 based machine.
They do sell massively overpriced bits - the maxed out Mac Pros are pretty competitive now price-wise to actual apples-to-apples comparisons, although you do pay a premium. But the stands and shit? LOL.
Yea and that's what I said. Apple wants more people to jump on their ARM based Macs and this is meant to kick start that move. I just don't see people dumping their Intel Macs so easily for the new M1 based Macs.
Of the Intel mac folks I know, all but two have either ordered an M1 based one or are planning on it next year. I'm not because mine is a brand new maxed out 16" that work provided (free > not free, even if I'd rather have an M1 Max based system), and the other guy is doing x86 coding - he uses it really as FreeBSD on a laptop.
Yes but the performance and compatibility will not be the same.
For most things, that doesn't matter :-/
We can see that with Apple's competitors growing in sales faster.
A big part of that has been the push over the last 18 months by corporate sales to finish the WFH migrations. Apple does not have enterprise-scale MDM solutions - they just don't. And the ones that exist kinda suck except at small to medium scale. Dell/Lenovo/etc, in the meantime, will sell you 5000 or 10000 or 20000 laptops all with the MDM software pre-installed, SCOM ready to go, and once delivered directly to the end user, it'll effectively image itself (or Dell/Lenovo/etc will fix it if something goes wrong) to the corporate image. Can't do that with Apple, which limits MacBook sales to more niche solutions in the large enterprise (even if those niches are often quite large).

I used to sell for one of the OEMs in question - no one compared Dell to Apple or Lenovo to Apple at that scale - you went with the random x86 vendor you picked and did 20000 of them with whatever MDM solution preinstalled, and they were all identical. But the execs and sales folks on the other side often had Apple laptops. At the same time, ARM isn't really a concern for those folks either - the bigger issue is the latest release of MacOS, which isn't fully supported by Airwatch/etc yet.
I'm telling you what people will want/need when they're done trying to use an ARM based Mac. You wanna know what Mac users had to deal with when they weren't on x86?

And most of them don't particularly care either. Apple primarily sells direct to consumer. Dell/etc sell both to consumer and B2B.
Ever wonder why Dell and Lenovo are selling their machines by the butt load? Dell had 50% growth in market share in the third quarter. Because they don't charge as much as Apple does.
Partially right - they don't charge what Apple does because most end-users (esp. at the corporate level) don't need a lot of power. Thanks to Intel/AMD, you can be running just fine on a five-year-old level of processor performance and not even notice - Apple doesn't play in that space or price range, by choice (note: margins at that level are tiny). On top of that, my prior response (all the MDM imaging/etc) boosts corporate sales through the roof.

I'm a mixed user. I've had RISC, SPARC, Power, x86 - you name it, I've used it. I'm not a fanboy for anything except revolutionary technology (I have a 10980XE AND a 3960X, Nvidia and AMD gpus, etc) - I've used Apple laptops extensively (and Dell, and Asus, and ...) for the last 15 years. You're absolutely right that there is a trade-off to be made - they're historically BAD at gaming... but most of the people buying them, and most of the people using them, don't particularly care that much. That's not a use case in their mind. Even with the Intel switch, which let you game on a Mac laptop - it wasn't their goal, just a nice side effect.

From that standpoint, would I buy an M1 Max? Eh... I like to game. Doesn't fit my use case entirely - but I also don't tend to use my laptop as a gaming device. It's a work machine and sometimes media box. I've got my phone, or I can buy a switch if I need to game on the road. Or use GeforceNow/etc if I really needed to. I'm damned impressed by the design and the horsepower, although I do wonder if the cyclical nature of "encoding/decoding in silicon / custom hardware for tasks instead of general purpose CPUs and software" is going to make them less relevant faster, but we'll see. I'd love one to play with - but I don't know if I'd personally invest in one. But - I've also recommended them for folks who love the shit out of the ones they have.
 
lopoetve Apple is making advances in the MDM space and in asset management. With Apple School Manager, for example, they have business and enterprise versions as well. Jamf has also been making huge strides for large enterprise, and Intune has come a long way and does work with Macs there too.
 
Linus Tech Tips just did a video on the M1 Max and the results aren't pretty. The most impressive thing about the M1 Max is that the video encoder seems to be unmatched by anything right now, but the GPU performance is not at RTX 3060 levels. A common problem is that nobody makes use of the Metal API, and the CPU does become a bottleneck for games. They also use Dolphin, a Wii/GameCube emulator, for tests, and MSAA seems to tank on the M1 in general. The M1s are very powerful at video editing, but beyond that the GPU is nothing special, and the Zephyrus M16, a laptop that's half the price, is usually delivering twice the performance. Also, the M1 Max seems to run at 90C just like their Intel chips were running. That seems oddly hot for something that's not supposed to be?

 
I don't understand, who is buying pickup trucks? They suck at gaming! What is wrong with all these people? My Intel laptop games way better than a $100k F250 Platinum! Can't believe how overpriced they are!

I mean, sure - the F250 tows better than my laptop, and drives better than my laptop, and sounds better than my laptop but who needs that? I highly doubt Ford will be successful selling them. Fucking truck can't even mine crypto for $100k.
 
Linus Tech Tips just did a video on the M1 Max and the results aren't pretty. The most impressive thing about the M1 Max is that the video encoder seems to be unmatched by anything right now, but the GPU performance is not at RTX 3060 levels. A common problem is that nobody makes use of the Metal API, and the CPU does become a bottleneck for games. They also use Dolphin, a Wii/GameCube emulator, for tests, and MSAA seems to tank on the M1 in general. The M1s are very powerful at video editing, but beyond that the GPU is nothing special, and the Zephyrus M16, a laptop that's half the price, is usually delivering twice the performance. Also, the M1 Max seems to run at 90C just like their Intel chips were running. That seems oddly hot for something that's not supposed to be?


I'd say the GPU's performance varies heavily depending on the task. There are some situations where the M1 Max performs like a very high-end GPU, which is something Apple couldn't really claim with the GPUs it had to use in Intel-based MacBook Pros, but it's definitely true that the stars need to align (the app, the OS, the hardware) for that to happen. It seems like Apple's initial goal was to cater to both standard computer tasks (browsing, playing videos) and its well-known base of audiovisual editors. I'd have a hard time picking anything other than the MacBook Pro for AV edits unless my particular apps weren't optimized.

Two things I'd note. To start, part of Apple's pitch is that you can sustain high performance on battery. The test doesn't show how much the M16 throttles back (and it will), and that could matter if you're doing in-the-field work. And the M1 Max is running hot both because it's in a 14-inch chassis (the 16-inch runs a bit cooler) and because Apple doesn't spin the fans as aggressively as ASUS does. The M16 will sound like a jet turbine under load; the MacBook Pro will still be relatively quiet. Now, that won't matter much if you're more interested in churning through a 3D render than hearing the world around you, but it is important if you're an audio/video editor.
 
lopoetve Apple is making advances in the MDM space and in asset management. With Apple School Manager, for example, they have business and enterprise versions as well. Jamf has also been making huge strides for large enterprise, and Intune has come a long way and does work with Macs there too.
Oh yes, come a LONG way, but have a VERY long way to go. I can effectively update the corporate image on 50,000 Dell laptops in a couple of days, assuming they connect to the internet. I can do the same with iPads, but... not MacOS yet (at least not as of last year). Nor do I have the same level of control. Yet. Nor the flexibility to define optional apps and corporate policies for licensing and distribution, limits, automation, etc. And those developers aren't standing still either - especially with Covid - this is big business right now.

Will it get there? Probably - but corporate use of Macs is still a niche business in comparison; the addressable market is much smaller because of the lack of MDM support, but you don't get full levels of management without an addressable market, and the extra cost makes it harder too, and so on - plus the lack of repairability makes it harder.
 
Linus Tech Tips just did a video on the M1 Max and the results aren't pretty. The most impressive thing about the M1 Max is that the video encoder seems to be unmatched by anything right now, but the GPU performance is not at RTX 3060 levels. A common problem is that nobody makes use of the Metal API, and the CPU does become a bottleneck for games. They also use Dolphin, a Wii/GameCube emulator, for tests, and MSAA seems to tank on the M1 in general. The M1s are very powerful at video editing, but beyond that the GPU is nothing special, and the Zephyrus M16, a laptop that's half the price, is usually delivering twice the performance. Also, the M1 Max seems to run at 90C just like their Intel chips were running. That seems oddly hot for something that's not supposed to be?


Games are not the only use for GPU cores, and while it matters for a lot of us (heck, it makes ~me~ pause, and I don't travel with a laptop anymore so it honestly doesn't matter for my use cases), it's not their market. For what they're targeting - it's selling as fast as they can make the things. It's a general purpose system and video/photo editing powerhouse with stupid long battery life. And games? We'll see.

Heck, even before Covid I had an Alienware 13 with the 1060 6G in it - and I took that on trips maybe 6 times over 3 years? After work, I don't generally feel like gaming. I wanted a book and a drink and sleep.
 
Oh yes, come a LONG way, but have a VERY long way to go. I can effectively update the corporate image on 50,000 Dell laptops in a couple of days, assuming they connect to the internet. I can do the same with iPads, but... not MacOS yet (at least not as of last year). Nor do I have the same level of control. Yet. Nor the flexibility to define optional apps and corporate policies for licensing and distribution, limits, automation, etc. And those developers aren't standing still either - especially with Covid - this is big business right now.

Will it get there? Probably - but corporate use of Macs is still a niche business in comparison; the addressable market is much smaller because of the lack of MDM support, but you don't get full levels of management without an addressable market, and the extra cost makes it harder too, and so on - plus the lack of repairability makes it harder.
Yeah, not fully there yet. I'm having my Jamf migrated from on-site to the cloud right now and updating it a few versions. I'm looking forward to playing with the new features.

Apple repair options are bad and until those are better large enterprise is a no go.
 
They don't particularly care

They REALLY don't care.

And they don't particularly care.
Seems like Apple doesn't care. In that case, Apple doesn't care about losing market share either.
For most things, that doesn't matter :-/
Windows applications don't matter? As a Linux guy, that's the #1 reason why I haven't completely abandoned Windows: performance and compatibility do matter.

I used to sell for one of the OEMs in question - no one compared Dell to Apple or Lenovo to Apple at that scale - you went with the random x86 vendor you picked
Random you say? Who has the biggest year over year growth so far?
[Chart: Strategy Analytics notebook market report, Q3 2021]

And most of them don't particularly care either.
We'll find out in time won't we?
I'm a mixed user. I've had RISC, SPARC, Power, x86 - you name it, I've used it.
I use PowerPC, RISC, and x86 all the time, and so have most of us. What's your point?
I don't understand, who is buying pickup trucks? They suck at gaming! What is wrong with all these people? My Intel laptop games way better than a $100k F250 Platinum! Can't believe how overpriced they are!

I mean, sure - the F250 tows better than my laptop, and drives better than my laptop, and sounds better than my laptop but who needs that? I highly doubt Ford will be successful selling them. Fucking truck can't even mine crypto for $100k.
You know what the problem is with your analogy besides being a bad one? Apple is the one making the comparison to a RTX 3080 and not me. Apple, APPLE, APPLE is the one. You don't compare your product to a gaming GPU without gaming being involved. So yes, Ford would compare the F250 to a Porsche and claim to be as fast, if this analogy were to actually make sense.

I'd say the GPU's performance varies heavily depending on the task. There are some situations where the M1 Max performs like a very high-end GPU, which is something Apple couldn't really claim with the GPUs it had to use in Intel-based MacBook Pros, but it's definitely true that the stars need to align (the app, the OS, the hardware) for that to happen. It seems like Apple's initial goal was to cater to both standard computer tasks (browsing, playing videos) and its well-known base of audiovisual editors. I'd have a hard time picking anything other than the MacBook Pro for AV edits unless my particular apps weren't optimized.
Like I said, Apple is the one comparing their GPU performance to that of an RTX 3080, which is a gaming GPU. Fact is, it isn't really Apple's GPU that matters but the video encoder they put alongside it - that's less about the GPU and more about the dedicated encoder. People tend to think the GPU does this work, and it does, but to a lesser extent. Intel has Quick Sync, which is similar to what you see on Apple's GPUs.
And the M1 Max is running hot both because it's in a 14-inch chassis (the 16-inch runs a bit cooler)
Linus Tech Tips does show the 16-inch one isn't different.
and because Apple doesn't spin the fans as aggressively as ASUS does.
They also didn't when they used Intel chips, which can reach over 100C. They should feel free to spin those fans up to prevent long-term damage to those chips.
The M16 will sound like a jet turbine under load; the MacBook Pro will still be relatively quiet. Now, that won't matter much if you're more interested in churning through a 3D render than hearing the world around you, but it is important if you're an audio/video editor.
We making excuses for Apple again? I'm not even sure how the M16 sounds vs the Apple, but I'd rather have a cooler running laptop than a hotter one, just to avoid it damaging itself over time. Also avoid burning my lap. It is called a LAPtop for a reason after all.
 
Random you say? Who has the biggest year over year growth so far?
Apple has had bigger sales growth in 2020 and 2021 than Acer has, but your chart only covers PC makers, so they will not be on that list.
Most of those vendors' massive sales boosts were on low-cost ChromeOS devices, which actually caused a global shortage of the Intel Celeron N4020; it was then mostly replaced by the Ryzen 3 3200U when Intel couldn't get more to the OEMs on time.
Chromebook sales these last two years have been very good for Google and Intel.
 
Seems like Apple doesn't care. In that case, Apple doesn't care about losing market share either.
It's market share they didn't have. They are aiming at a different market - one arguably with higher margins, as their revenue and profit numbers show.
Windows applications don't matter? As a Linux guy, that's the #1 reason why I haven't completely abandoned Windows: performance and compatibility do matter.
Not to Apple users, apparently. I don't have Boot Camp on either of my Intel-based Macs, and I dropped Fusion long ago. They're productivity machines, not gaming boxes. I've yet to find a Windows app that I need that doesn't have a native MacOS version, outside of games - and that's not what these machines are for.
Random you say? Who has the biggest year over year growth so far?
You misunderstood me - "you pick a random x86 vendor" meant you as in the business buying a crap ton of laptops, not you as in DukenukemX. The corporate side buys Lenovo/Dell/HP/Asus/whatever. Individuals tend to buy Apple on the higher end (corporate does too, but in far smaller amounts than the other vendors) or x86 if they're gamers or cost-conscious. A large portion of that growth is in the enterprise space, which is mostly MDM and cost focused.
We'll find out in time won't we?

I use PowerPC, RISC, and x86 all the time, and so have most of us. What's your point?
I'm pointing out that Apple has ALWAYS had "not-for-gaming" attached - it wasn't a priority with x86 and MacOS either! Are you bothered that your RISC and PowerPC systems can't run games or windows apps?
You know what the problem is with your analogy besides being a bad one? Apple is the one making the comparison to a RTX 3080 and not me. Apple, APPLE, APPLE is the one. You don't compare your product to a gaming GPU without gaming being involved. So yes, Ford would compare the F250 to a Porsche and claim to be as fast, if this analogy were to actually make sense.
GPUs are used for a lot more than just games. Especially in laptops, where there are rarely any form of prosumer or professional GPUs (Asus latest release aside).
Like I said, Apple is the one comparing their GPU performance to that of an RTX 3080, which is a gaming GPU. Fact is, it isn't really Apple's GPU that matters but the video encoder they put alongside it - that's less about the GPU and more about the dedicated encoder. People tend to think the GPU does this work, and it does, but to a lesser extent. Intel has Quick Sync, which is similar to what you see on Apple's GPUs.

Linus Tech Tips does show the 16 inch one isn't different.

They also didn't when they used Intel chips, which can reach over 100C. They should feel free to spin those fans up to prevent long-term damage to those chips.

We making excuses for Apple again? I'm not even sure how the M16 sounds vs the Apple, but I'd rather have a cooler running laptop than a hotter one, just to avoid it damaging itself over time. Also avoid burning my lap. It is called a LAPtop for a reason after all.
Look. It's clear that the M1 series is not aimed at your use cases. That's fine. It's not particularly aimed at mine either, but at the moment, laptops in ~general~ aren't aimed at my use cases - COVID sucks after all, and why use a laptop when I have both an Intel Extreme Edition and a Threadripper sitting right here? But it IS aimed at a huge number of people that are buying them like hotcakes, and the M1 series of chips is incredibly powerful at the tasks it was designed for - video and photo work. Gaming never came into their equation - it's not a particular priority for Apple on laptops or desktops, just on their mobile devices, because it drives App Store revenue. If you want an x86 laptop for games and other uses - cool, buy one. I don't personally know what I'd get if I was back on the road and needed a system that could do everything, but that's me. For productivity? I'd take an M1 system in an instant - but I don't particularly need a laptop either.
 
Apple has had bigger sales growth in 2020 and 2021 than Acer has, but your chart only covers PC makers, so they will not be on that list.
Most of those vendors' massive sales boosts were on low-cost ChromeOS devices, which actually caused a global shortage of the Intel Celeron N4020; it was then mostly replaced by the Ryzen 3 3200U when Intel couldn't get more to the OEMs on time.
Chromebook sales these last two years have been very good for Google and Intel.
As well as basic low-cost $500-700 laptops for office folk. A %@#%@# ton of those were sold to big business these last two years.
 
We making excuses for Apple again? I'm not even sure how the M16 sounds vs the Apple,
I've got a Zephyrus G14, the one with the 4900HS and RTX 2060, and while the fans spin up, they're not obnoxious for a gaming laptop. Sure, you can hear them, but the sound has been tuned, for one thing, so it's not the obnoxious whine you sometimes get, and, like I said, it's not all that loud. A lot of it clearly depends on how much effort the designer puts into it.
 
I've got a Zephyrus G14, the one with the 4900HS and RTX 2060, and while the fans spin up, they're not obnoxious for a gaming laptop. Sure, you can hear them, but the sound has been tuned, for one thing, so it's not the obnoxious whine you sometimes get, and, like I said, it's not all that loud. A lot of it clearly depends on how much effort the designer puts into it.
My Dell G5 with a 9750H and an RTX 2060 sounds like a god damned jet engine unless I have it on a cooling pad. But it does everything I've needed it to and most of the things I've wanted it to.
 
Dell G5 with a 9750H and an RTX 2060 sounds like a god damned jet engine
Yeah, Dell's exactly the people who don't seem to put a lot of design into stuff except "how can we lower our BOM cost by a nickel with stuff like replacing the 24-pin ATX connector with a smaller one, or cheaping out on the CPU fan?" I've got coworkers with Latitudes (and I owned one of the last model of single-core 17" gaming Inspirons), and the Zephyrus is just engineered better. If I could talk my company into getting one for my next PC I'd be in heaven, but they'd never not buy Dell.
 
Yeah, Dell's exactly the people who don't seem to put a lot of design into stuff except "how can we lower our BOM cost by a nickel with stuff like replacing the 24-pin ATX connector with a smaller one, or cheaping out on the CPU fan?" I've got coworkers with Latitudes (and I owned one of the last model of single-core 17" gaming Inspirons), and the Zephyrus is just engineered better. If I could talk my company into getting one for my next PC I'd be in heaven, but they'd never not buy Dell.
Dell's billing and admin features make accounting and administrative stuff really easy, and when things are in warranty, their support and parts availability (under usual circumstances) are really good. Right now they can't get me replacement laptop screens to save my life; the new disinfectant spray destroyed a room of them when the cleaning crew wiped down a conference room... placed the request in Oct, ETA early Feb 2022.

But Dell is all about utility for most things, just get it done fast, cheap, and make it easy to fix.
 
You know what the problem is with your analogy besides being a bad one? Apple is the one making the comparison to a RTX 3080 and not me. Apple, APPLE, APPLE is the one. You don't compare your product to a gaming GPU without gaming being involved. So yes, Ford would compare the F250 to a Porsche and claim to be as fast, if this analogy were to actually make sense.

What you don't seem to understand - probably because you do not do any actual high-performance computing outside of gaming - is that the RTX series of video cards are not just "gaming cards." I have four RTX 2080 Tis and they have never once loaded a game. They are used for compute in my Threadripper box.

Do you really think you can't buy an RTX series card right now because everyone loves to game and gamers are dropping $2-3k on GPUs left and right? Lol, no. An RTX 3090 is worth more than the entire computer in your signature. You can't buy them because you're competing with businesses like mine who will gladly pay $2,500-$3k for an RTX 3090 or $2k for an RTX 3080 - in bulk - because buying five of them is cheaper than two Tesla / RTX A-series / whatever cards that do the same damn thing. Add crypto - which is not gaming - and video/3D rendering into the equation, and I'd say the majority of "gaming" RTX cards are not used for gaming.

If you want to understand just how many RTX cards were being used for compute (ML and crypto) instead of their workstation cards, this was such a big deal to NVIDIA that they actually forced all of their partners to discontinue blower RTX 3090 cards because companies like mine were scooping them up in bulk and loading them up in 4u servers for ML and saving tens of thousands of dollars.
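To make that concrete, here's roughly what "compute, not gaming" looks like in practice - a minimal sketch using PyTorch, assuming a CUDA build of PyTorch is installed; the card names are just whatever happens to be in the box:

```python
# Minimal sketch: using consumer RTX cards purely as compute devices via PyTorch.
# Assumes a CUDA-enabled PyTorch build; device names and counts depend on the machine.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        print(i, torch.cuda.get_device_name(i))  # e.g. "NVIDIA GeForce RTX 2080 Ti"

    device = torch.device("cuda:0")
    # A throwaway matrix multiply just to exercise the card - no game in sight.
    a = torch.randn(4096, 4096, device=device)
    b = torch.randn(4096, 4096, device=device)
    c = a @ b
    torch.cuda.synchronize()
    print(c.shape)
```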
 
What you don't seem to understand - probably because you do not do any actual high-performance computing outside of gaming - is that the RTX series of video cards are not just "gaming cards." I have four RTX 2080 Tis and they have never once loaded a game. They are used for compute in my Threadripper box.

Do you really think you can't buy an RTX series card right now because everyone loves to game and gamers are dropping $2-3k on GPUs left and right? Lol, no. An RTX 3090 is worth more than the entire computer in your signature. You can't buy them because you're competing with businesses like mine who will gladly pay $2,500-$3k for an RTX 3090 or $2k for an RTX 3080 - in bulk - because buying five of them is cheaper than two Tesla / RTX A-series / whatever cards that do the same damn thing. Add crypto - which is not gaming - and video/3D rendering into the equation, and I'd say the majority of "gaming" RTX cards are not used for gaming.

If you want to understand just how many RTX cards were being used for compute (ML and crypto) instead of their workstation cards, this was such a big deal to NVIDIA that they actually forced all of their partners to discontinue blower RTX 3090 cards because companies like mine were scooping them up in bulk and loading them up in 4u servers for ML and saving tens of thousands of dollars.
And a lot of those cards can be stuffed into a server box en-masse, and generate less heat / use less power than the Tesla cards, AND are the same price even at markup. Plus you often get MORE cores that way. I had a customer at my last place with 10 R730s all filled with 2080TIs for that use case - they tried an R740 with V100s - got more out of the 2080s than they did the V100s, as they could simply stuff a pile of them in there in comparison.

Heck, one of the friends that bought an M1 Max (32 GPU core version) got it specifically to see what he could do on the ML side - he's a data scientist by trade, and is really curious what they can do.
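For what it's worth, the same kind of workload can be pointed at the Apple GPU as well - a minimal sketch using PyTorch's Metal Performance Shaders (MPS) backend, assuming a PyTorch build on macOS that includes it (it falls back to CPU elsewhere):

```python
# Minimal sketch: targeting the Apple-silicon GPU via PyTorch's MPS backend.
# Assumes a PyTorch build with MPS support on macOS; otherwise runs on CPU.
import torch

device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")
x = torch.randn(4096, 4096, device=device)
y = torch.randn(4096, 4096, device=device)
z = x @ y
print(device, z.shape)
```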
 
And a lot of those cards can be stuffed into a server box en-masse, and generate less heat / use less power than the Tesla cards, AND are the same price even at markup. Plus you often get MORE cores that way. I had a customer at my last place with 10 R730s all filled with 2080TIs for that use case - they tried an R740 with V100s - got more out of the 2080s than they did the V100s, as they could simply stuff a pile of them in there in comparison.

Heck, one of the friends that bought an M1 Max (32 GPU core version) got it specifically to see what he could do on the ML side - he's a data scientist by trade, and is really curious what they can do.

Yup, every single card I own is a blower for exactly that reason. Fan noise? Who cares, set it to 100% and eject the heat out the back of the rackmount server. Absolutely sucks they are so hard to find nowadays but NVIDIA got wise to where they all were going. Gamers? LOL no. The whole idea behind the Founders series cooler where it's more than 2 slots and requires clearance on the back of it due to the awkward fan location was 100% specifically to prevent rackmount corporate server usage.

We are itching to upgrade our 2080TIs and will probably resort to overpaying for the Gigabyte Turbo RTX 3090s on StockX.
 
Like I said, Apple is the one comparing their GPU performance to that of an RTX 3080, which is a gaming GPU. Fact is, it isn't really Apple's GPU that matters but the video encoder they put alongside it - that's less about the GPU and more about the dedicated encoder. People tend to think the GPU does this work, and it does, but to a lesser extent. Intel has Quick Sync, which is similar to what you see on Apple's GPUs.
Gonna reiterate what others said. We know you think PC gaming is the center of the universe, but here's a fun fact: you can compare performance of mainstream GPUs for things other than gaming. It's a shocker, I know. While Apple is definitely cherry-picking its benchmarks, it's also under no pretense that you're buying a MacBook Pro with gaming as a major focus.

You're right in that Apple clearly leans on the video encode/decode hardware, but we shouldn't discount the GPU too much. And that still means M1-based Macs are very good choices if you're editing video. They're also good picks for audio and photo editing, although those obviously don't lean on GPUs much.
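To illustrate the distinction: the dedicated encode blocks (Apple's media engine via VideoToolbox, Intel Quick Sync, NVIDIA NVENC) show up to software as their own encoders, separate from raw GPU shader performance. A rough sketch in Python calling ffmpeg, assuming ffmpeg is installed with these encoders compiled in; the file names are just placeholders:

```python
# Rough sketch: picking a fixed-function hardware encoder in ffmpeg.
# Assumes ffmpeg is installed with these encoders built in; paths are placeholders.
import subprocess

HW_ENCODERS = {
    "apple":  "h264_videotoolbox",  # Apple media engine via VideoToolbox
    "intel":  "h264_qsv",           # Intel Quick Sync
    "nvidia": "h264_nvenc",         # NVIDIA NVENC
}

def transcode(src: str, dst: str, vendor: str) -> None:
    """Re-encode src to dst on the vendor's dedicated encode block."""
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", HW_ENCODERS[vendor], "-b:v", "8M", dst],
        check=True,
    )

transcode("clip.mov", "clip_out.mp4", "apple")
```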


We making excuses for Apple again? I'm not even sure how the M16 sounds vs the Apple, but I'd rather have a cooler running laptop than a hotter one, just to avoid it damaging itself over time. Also avoid burning my lap. It is called a LAPtop for a reason after all.
Here's the thing, though: while I'm not sure I'd want my CPU running at 90C for long stretches, it often doesn't... and importantly, it also doesn't cook your legs in the process. My guess is that Apple prioritized quietness and either believed the temperatures were viable or that there were diminishing returns for ramping the fans higher. It's hard to say if Apple made the right choice without long-term tests or knowing what was happening in the lab, but it's still quieter and more lap-friendly than the M16. And, as mentioned earlier, it doesn't significantly degrade performance while on battery.
 
Not to Apple users, apparently. I don't have Boot Camp on either of my Intel-based Macs, and I dropped Fusion long ago. They're productivity machines, not gaming boxes.
Apple seems to be pushing video editing as their niche, which is understandable since video editing is at an all-time high with all the game streamers on Twitch and YouTube.
What you don't seem to understand - probably because you do not do any actual high-performance computing outside of gaming - is that the RTX series of video cards are not just "gaming cards." I have four RTX 2080 Tis and they have never once loaded a game. They are used for compute in my Threadripper box.
Nvidia doesn't agree with you. Nvidia isn't advertising their RTX cards for productivity. Can they be used that way? Yes, but Nvidia is trying hard to prevent that from being a thing, with all the limits they put on crypto mining.

Do you really think you can't buy an RTX series card right now because everyone loves to game and gamers are dropping $2-3k on GPUs left and right? Lol, no. An RTX 3090 is worth more than the entire computer in your signature. You can't buy them because you're competing with businesses like mine who will gladly pay $2,500-$3k for an RTX 3090 or $2k for an RTX 3080 - in bulk - because buying five of them is cheaper than two Tesla / RTX A-series / whatever cards that do the same damn thing. Add crypto - which is not gaming - and video/3D rendering into the equation, and I'd say the majority of "gaming" RTX cards are not used for gaming.
Is Nvidia pushing RTX cards for mining, or is Nvidia crippling their cards to prevent people from buying them for mining? Again, you're not wrong, but we're going by the fact that Nvidia is pushing RTX cards as gamer cards. Therefore, Apple comparing their GPU performance to RTX is like comparing their product to gamer cards, not to products for video editing.
If you want to understand just how many RTX cards were being used for compute (ML and crypto) instead of their workstation cards, this was such a big deal to NVIDIA that they actually forced all of their partners to discontinue blower RTX 3090 cards because companies like mine were scooping them up in bulk and loading them up in 4u servers for ML and saving tens of thousands of dollars.
Tell your company I hate them.
Gonna reiterate what others said. We know you think PC gaming is the center of the universe,
Gaming plays such a big role in computing that GPUs wouldn't exist today if it wasn't for gaming. Doom outsold Windows 95 to the point where Bill Gates had to dress up as the Doom guy to push Windows 95. Microsoft now puts serious resources into Xbox, and even named its virtual assistant after Cortana from Halo.
but here's a fun fact: you can compare performance of mainstream GPUs for things other than gaming. It's a shocker, I know. While Apple is definitely cherry-picking its benchmarks, it's also under no pretense that you're buying a MacBook Pro with gaming as a major focus.
I think that some people who do work on their computers probably also play games. The M1 is not going to make playing games easy or even possible.
Here's the thing, though: while I'm not sure I'd want my CPU running at 90C for long stretches, it often doesn't... and importantly, it also doesn't cook your legs in the process. My guess is that Apple prioritized quietness and either believed the temperatures were viable or that there were diminishing returns for ramping the fans higher. It's hard to say if Apple made the right choice without long-term tests or knowing what was happening in the lab, but it's still quieter and more lap-friendly than the M16. And, as mentioned earlier, it doesn't significantly degrade performance while on battery.
Apple isn't the only one who chooses to let their laptops cook at 90C or higher, but I feel this is done as planned obsolescence. I will not buy a laptop that at any point runs its temperature at 90C or higher. I plan to use my hardware for a long time and don't want to deal with repairing expensive things when they break. Though that's just me.
 
Linus Tech Tips just did a video on the M1 Max and the results aren't pretty. The most impressive thing about the M1 Max is that the video encoder seems to be unmatched by anything right now, but the GPU performance is not at RTX 3060 levels. A common problem is that nobody makes use of the Metal API, and the CPU does become a bottleneck for games. They also use Dolphin, a Wii/GameCube emulator, for tests, and MSAA seems to tank on the M1 in general. The M1s are very powerful at video editing, but beyond that the GPU is nothing special, and the Zephyrus M16, a laptop that's half the price, is usually delivering twice the performance. Also, the M1 Max seems to run at 90C just like their Intel chips were running. That seems oddly hot for something that's not supposed to be?



With Apple's rather lackluster GPU performance, how are they going to even produce/compete in the VR/AR metaverse-type market?

https://finance.yahoo.com/news/appl...-can-take-vr-and-ar-mainstream-170246228.html

Is the metaverse overhyped? Probably. Will it take off and be the next Internet of Things? At this time I would say no, mostly because a lot of folks just plain get sick in VR, don't stick with it once the novelty wears off, etc. Hard sell.
 
Gaming plays such a big role in computing that GPUs wouldn't exist today if it wasn't for gaming. Doom outsold Windows 95 to the point where Bill Gates had to dress up as the Doom guy to push Windows 95. Microsoft now puts serious resources into Xbox, and even named its virtual assistant after Cortana from Halo.
You... missed the point, and proved ours.

The crux is that you have a highly distorted, irrational view of the importance of PC gaming. You literally have difficulty imagining that Apple isn't concerned about selling to hardcore gamers, and will do just fine without them (it might sell more units, but probably not enough to seriously move the needle). You have trouble picturing people using consumer GPUs for non-gaming purposes, even though other people here have shared direct, real-world experiences to that effect. It's as if you're physically incapable of understanding that not everyone uses a computer the same way you do.

(Also, side note: there is zero evidence to suggest Gates' Doom pitch had a meaningful effect on Windows 95 sales. Some people lined up to buy Windows 95 without even realizing they needed a computer. Jay Leno and the Rolling Stones likely had a much larger impact.)
 
With Apple's rather lackluster GPU performance, how are they going to even produce/compete in the VR/AR metaverse-type market?

https://finance.yahoo.com/news/appl...-can-take-vr-and-ar-mainstream-170246228.html

Is the metaverse overhyped? Probably. Will it take off and be the next Internet of Things? At this time I would say no, mostly because a lot of folks just plain get sick in VR, don't stick with it once the novelty wears off, etc. Hard sell.
I love the idea of VR and I can play beat saber for maybe up to an hour, but any title that has me sitting down is a one-way trip to Pukesville.
 
With Apple's rather lackluster GPU performance, how are they going to even produce/compete in the VR/AR metaverse-type market?

https://finance.yahoo.com/news/appl...-can-take-vr-and-ar-mainstream-170246228.html

Is the metaverse overhyped? Probably. Will it take off and be the next Internet of Things? At this time I would say no, mostly because a lot of folks just plain get sick in VR, don't stick with it once the novelty wears off, etc. Hard sell.
It's pretty easy: Apple's GPU performance for portable devices is very good, and it already has a strong mobile app ecosystem.

Think about it. Right now, the best stand-alone VR headset is the Quest 2, which uses a headset variant of the Snapdragon 865. Meanwhile, Apple is already shipping tablets with an M1 chip that obliterates the XR2 and 865 and competes more with PCs than phones. It wouldn't be hard for Apple to ship a headset with an M2 variant (or a similar chip) that stays well ahead of Qualcomm's next SoC. And the apps? However much you might dislike Apple's lock-in, the company clearly knows how to run app stores for mobile devices.

The biggest challenge is, as you said, the appeal of AR/VR. You gotta be content with strapping something to your head for prolonged periods, however capable and comfortable it is.
 
It's pretty easy: Apple's GPU performance for portable devices is very good, and it already has a strong mobile app ecosystem.

Think about it. Right now, the best stand-alone VR headset is the Quest 2, which uses a headset variant of the Snapdragon 865. Meanwhile, Apple is already shipping tablets with an M1 chip that obliterates the XR2 and 865 and competes more with PCs than phones. It wouldn't be hard for Apple to ship a headset with an M2 variant (or a similar chip) that stays well ahead of Qualcomm's next SoC. And the apps? However much you might dislike Apple's lock-in, the company clearly knows how to run app stores for mobile devices.

The biggest challenge is, as you said, the appeal of AR/VR. You gotta be content with strapping something to your head for prolonged periods, however capable and comfortable it is.
An M1 runs too hot for something strapped to your face; there isn't enough surface area to move that heat. The A15 would be more than enough at this stage, let alone any future A-series variants: the A16, assuming no other changes and a move to the N4P node, would be roughly 11% faster while using 22% less energy and being 6% smaller than the existing A15.

When the M1 is running at low power (7-14W), yeah, it's not going to put out that much heat, but in that range it doesn't really outperform an A15 while costing significantly more. The M1 really pulls ahead in the mobile space at 15-54W, but that's not something you want strapped to your face.
 
Apple seems to be pushing video editing as their niche, which is understandable since video editing is at an all-time high with all the game streamers on Twitch and YouTube.
And all the folks doing video for every damned news story / event / thing / TikTok / etc now. Oh yes, games matter - but there are other markets, and that's what Apple aims at.
Nvidia doesn't agree with you. Nvidia isn't advertising their RTX cards for productivity. Can they be used that way? Yes, but Nvidia is trying hard to prevent that from being a thing, with all the limits they put on crypto mining.
Oh certainly - that'd cut into the stupidly high-margin GRID licenses they sell (not the cards, mind you - the software, which is literally just a driver, but they charge a $30-a-month subscription for it. Per SEAT. That's pure profit.)
Is Nvidia pushing RTX cards for mining, or is Nvidia crippling their cards to prevent people from buying them for mining? Again, you're not wrong, but we're going by the fact that Nvidia is pushing RTX cards as gamer cards. Therefore, Apple comparing their GPU performance to RTX is like comparing their product to gamer cards, not to products for video editing.
Mining is a bit different - other productivity uses (CUDA / etc) aren't crippled. :) There's a LOT that can use those cards. Apple aimed at something most people would know - saying it's like an A16 would confuse most consumers, since they don't know the Tesla cards.
Tell your company I hate them.
Mine too then. But yep.

Gaming plays such a big role in computing that GPUs wouldn't exist today if it wasn't for gaming. Doom outsold Windows 95 to the point where Bill Gates had to dress up as the Doom guy to push Windows 95. Microsoft now puts serious resources into Xbox, and even named its virtual assistant after Cortana from Halo.
Oh no doubt. But computers were originally built to process transactions and do scientific work - gaming wouldn't exist the same way without that. All things are cyclical - computers drove gaming, gaming drove GPUs, GPUs drove ML work, etc. Vector processing is handy for a LOT of stuff - and general-purpose processors aren't great at that. GPUs are.
I think that some people who do work on their computers probably also play games. The M1 is not going to make playing games easy or even possible.
A lot of those have gaming systems too. Don't get me wrong - lack of gaming, if I needed a laptop, would make me hesitate to buy Apple. But... I also have 4 monster machines at home. And if I just needed a productivity laptop, Apple might be near the top of the list.
 
An M1 runs too hot for something strapped to your face; there isn't enough surface area to move that heat. The A15 would be more than enough at this stage, let alone any future A-series variants: the A16, assuming no other changes and a move to the N4P node, would be roughly 11% faster while using 22% less energy and being 6% smaller than the existing A15.

When the M1 is running at low power (7-14W), yeah, it's not going to put out that much heat, but in that range it doesn't really outperform an A15 while costing significantly more. The M1 really pulls ahead in the mobile space at 15-54W, but that's not something you want strapped to your face.

I wouldn't expect the M1 as such. My theory is that it'll be an M2 or even M3 variant (depending on timing) with optimizations for headsets. So power consumption will be in check, but the performance will still be relatively strong (particularly for graphics; it might be 4K per eye). I certainly wouldn't rule out an A-series chip, but the rumors suggest Apple wants to transcend the "phone with goggles" performance we get from stand-alone headsets today.
 