Apple leaks M1 Max Duo, and M1 Ultra

I wouldn't expect the M1 as such. My theory is that it'll be an M2 or even M3 variant (depending on timing) with optimizations for headsets. So power consumption will be in check, but the performance will still be relatively strong (particularly for graphics; it might be 4K per eye). I certainly wouldn't rule out an A-series chip, but the rumors suggest Apple wants to transcend the "phone with goggles" performance we get from stand-alone headsets today.
I would think instead they would do a wireless headset that connects back to a laptop or desktop. There are lots of design tools taking advantage of VR; some exist already, others are in various stages of development. Microsoft is doing very well with the HoloLens in medical and a number of industrial fields, and in complex design and CAD work too. I can see Apple wanting in on that, because they build stuff for creators and all that jazz… But by the time they come out with the M2 or an M3, they will likely be on the A17 or A20, which for all we know could rival the M1 while maintaining the 7-14 W envelope.

If it weren't for the fact that VR makes me hurl and that there are no good headsets out there I can comfortably fit over my glasses, I would be very tempted to use them in place of a multi-monitor setup for work. My desk is up to four monitors now and it's getting ridiculous.
 
Oh no doubt. But computers were originally built to process transactions and do scientific work - Gaming wouldn't exist the same way without that. All things are cyclical - computers drove gaming drove GPUs drove ML work/etc. Vector processing is handy for a LOT of stuff - and general purpose processors aren't great at that. GPUs are.
Was going to say just this.
While gaming was a large driving force for NVIDIA and ATI (later AMD) in the 1990s through the 2010s, the pendulum is certainly swinging back the other way with scientific and business sectors greatly benefiting from such hardware, while gaming is taking a backseat.

Gaming was certainly important for sales and for the development of this hardware; that can't be disputed historically.
What Apple is doing is swinging back the other way as well, focusing again on audio and video editing along with prosumer application workloads just as they did in the 2000s, while general-purpose usage (this includes gaming and Windows compatibility) is taking a backseat.
 
Gaming has always taken a backseat for Apple.
 
But for how much longer? On iOS they've made billions every year as the middleman for all those tasty in-game transactions. If they do lose out on those (which is likely), how long until they decide to be the front man and not just the middleman? I mean, the biggest FU to Epic I can imagine would be beating them at their own game.
 
I would think instead they would do a wireless headset that connects back to a laptop or desktop. There are lots of design tools taking advantage of VR; some exist already, others are in various stages of development. Microsoft is doing very well with the HoloLens in medical and a number of industrial fields, and in complex design and CAD work too. I can see Apple wanting in on that, because they build stuff for creators and all that jazz… But by the time they come out with the M2 or an M3, they will likely be on the A17 or A20, which for all we know could rival the M1 while maintaining the 7-14 W envelope.

If it weren't for the fact that VR makes me hurl and that there are no good headsets out there I can comfortably fit over my glasses, I would be very tempted to use them in place of a multi-monitor setup for work. My desk is up to four monitors now and it's getting ridiculous.
Wouldn't rule that out, as some rumors have suggested as much, but the most recent claims are that it's fully standalone. I could see why Apple would rather do standalone, though — that provides a consistent hardware target where tethering to a computer or phone will lead to wildly varying experiences. That and tethering will limit the potential audience, unless Apple sees this solely as a way to sell iPhones/Macs or is prepared to write VR companion software for Android and Windows.

On this note, I'm very thankful VR doesn't make me sick... and that's a problem Apple might need to overcome, in the same way that the AirPods Pro solved the air pressure issues that can make some earbuds uncomfortable.
 
Looking into why it makes me sick, it very well could be because headsets don't fit well over my glasses; as a result, it messes with the depth of field and screws with my brain. I'm told I could improve or eliminate the problem by getting prescription inserts for the goggles, which I may have to do.

But yeah, I would think they would want to do standalone. I think an A15 would be nice, but having looked into it more, I could see them doing a variant with a heavier GPU, like an A15X, to drive those extra screens. I wonder what they would launch with it, though. Apple isn't the kind of company that just launches a product with nothing to back it up; if they were to launch a VR kit, they would launch it alongside software that uses it. Probably a VR section of Apple Arcade and a bunch of solid titles that really take advantage of it, along with the existing favorites (Beat Saber comes to mind). But if Apple is getting heavy into the game development side of things, they have managed to keep that extremely quiet.
 
The rumor mill on Apple VR has been in full swing for at least 1.5 years.
Most people will see the first-gen Apple VR and balk, as insiders suggest the price will be around $3000 and that it will target devs and early adopters. Then a second 'consumer friendly' set in the $1000 range will come 1-2 years afterwards.

They also suggest, though, that part of the price (perhaps a large part of it) is that it will be driving 4K 120Hz per eye and have a hybrid Fresnel lens system. So the target covers both the display cost and the cost of a chip that can drive those panels smoothly. And with that comes the expectation that the hardware will have to be top of the line, as it will not require tethering to any iOS or Mac based device.
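For a sense of scale, the raw pixel math behind "4K 120Hz per eye" is already brutal. Quick back-of-the-envelope sketch below - the UHD panel size and uncompressed 30-bit color are my own assumptions, not anything from the rumor mill:

Code:
// Back-of-the-envelope only: panel resolution and bit depth are assumed, not leaked specs.
let width = 3_840                 // assume "4K" means a UHD-class panel per eye
let height = 2_160
let eyes = 2
let refreshHz = 120
let bitsPerPixel = 30             // assume 10 bits per channel, uncompressed

let pixelsPerSecond = width * height * eyes * refreshHz
let rawGigabitsPerSecond = Double(pixelsPerSecond * bitsPerPixel) / 1_000_000_000

print(pixelsPerSecond)            // ~1.99 billion pixels per second
print(rawGigabitsPerSecond)       // ~59.7 Gbit/s of raw video if nothing is compressed

Even before any rendering work, just feeding panels like that uncompressed is a DisplayPort-class data rate, not something you would casually push over a wireless tether - which is part of why a standalone, top-of-the-line chip makes sense.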

For the full roundup of all Apple VR and AR projects, the compendium can be found here:

https://www.macrumors.com/roundup/apple-glasses/
 
At that price they are firing shots at the HoloLens, which is doing well in relative silence. And, well, shit: a Microsoft announcement from earlier this year about the HoloLens 3 mentions that they expect it to launch in 2024 alongside stiff competition from HTC and Apple. Given that the HoloLens 2 currently goes for $3500 a pop, if Apple were to get a solid standalone out at $3000, that would follow the approach they took when they started launching their monitors. Yeah, people always remember the $1000 stand, but their $5000 Pro Display XDR still outclassed the competition while being significantly cheaper than what was available from them at the time.

"Recently Microsoft have announced a consumer grade Hololens 3. With a predicted release date of March 2024. The Hololens 3 will drop right after the predicted release date of the Apple’s XR headset;"

Double shit, apparently Apple has dumped $430B into their VR launch....
"Apple taking a serious stand alongside AR. Getting an investment of $430B in the US and are already applying AR to their Clips app for IPhone and Ipads with LiDAR scanners."

2024 is gonna get fun! I'd better get my VR sickness sorted out because I don't want to miss out on that.
 
Mining is a bit different - other productivity uses (CUDA, etc.) aren't crippled. :) There's a LOT that can use those cards. They aimed at something most people would know - saying it's like an A16 would confuse most consumers, since they don't know Tesla cards.
Keep in mind that's because Nvidia wants to sell miners special cards that have no resale value. Notice that both AMD and Nvidia aren't producing anything in the $250 range for consumers, and anything they do produce feels like a minor upgrade over the RX 580 and GTX 1060. Nvidia knows they screwed up producing so many GTX 1060s and they don't want to make that mistake again. CUDA and the like are standards Nvidia wants to push over other open standards, and they don't make direct money the way mining does.
Oh no doubt. But computers were originally built to process transactions and do scientific work - Gaming wouldn't exist the same way without that. All things are cyclical - computers drove gaming drove GPUs drove ML work/etc. Vector processing is handy for a LOT of stuff - and general purpose processors aren't great at that. GPUs are.
Gaming brought about GPUs, which eventually evolved to do compute, but anyone using modern GPUs for stuff outside of gaming is doing it because the hardware is there and it's cheap. At least it was cheap. Because the hardware evolves so fast and does so much, it works out that it can also handle tasks a CPU isn't suited for. Speaking of which, modern consumer CPUs focus heavily on IPC because gaming does as well. There's a reason Apple and many other smartphone manufacturers use SoCs with big and little cores: you simply don't need a 5GHz IPC monster for most productivity, and multiple cores that run more efficiently are better suited for productivity tasks. If productivity were the main focus, computers would use more Xeon-like chips with many cores at lower clocks and ECC memory. Last I checked, the Apple M1s don't use ECC memory. You all don't want to admit it, but you're using gaming-like hardware for productivity because it's far cheaper than buying Nvidia Teslas and Intel Xeons. Apple with their M1 hardware is trying to mimic gaming PCs, even if they're not aware of it. They did compare their new M1s to an RTX 3080.


A lot of those have gaming systems too. Don't get me wrong - lack of gaming, if I needed a laptop, would make me hesitate to buy Apple. But... I also have 4 monster machines at home. And if I just needed a productivity laptop, Apple might be near the top of the list.
That's you, but that's not most people. I'm surprised that some people are looking at my signature and using it to demoralize me simply because I don't buy the latest, very expensive hardware. You are out of touch is the best way I can put it. Most people would rather buy one device that does it all instead of buying multiple devices that specialize; that's not very fiscally responsible. This is why the mobile gaming market is booming: smartphones are something everyone needs, they're more affordable than buying a PlayStation 5, and they also handle productivity needs like checking mail and browsing the web. Likewise, buying a laptop you need for productivity that also serves as a gaming device saves money over buying multiple machines. There's a reason the GTX 1060 is still the most popular gaming GPU on Steam.
 
What Apple is doing is swinging back the other way as well, focusing again on audio and video editing along with prosumer application workloads just as they did in the 2000s, while general-purpose usage (this includes gaming and Windows compatibility) is taking a backseat.
Apple would never enter the gaming market because Apple knows they could never compete. Acting like Apple never thought of gaming is ignoring the history of the company.


But for how much longer? On iOS they've made billions every year as the middleman for all those tasty in-game transactions. If they do lose out on those (which is likely), how long until they decide to be the front man and not just the middleman? I mean, the biggest FU to Epic I can imagine would be beating them at their own game.
Gaming on iOS is big, and many people don't want to admit that most iPads and iPhones end up in children's hands to babysit them while their parents do more important things like care about themselves. If it isn't children playing Roblox, then it's middle-aged women playing Candy Crush. Gaming is the single largest category of iOS usage at 21%, with business at 10% and education at less than 9%. Productivity is a mere 3%.

Apple could enter the gaming market but would again lose easily to the open nature of the PC market. They would have to let others make hardware that can run Mac OS X; they tried that once, decades ago, and Apple pulled out fast.
 
Keep in mind that's because Nvidia wants to sell miners special cards that have no resale value. Notice that both AMD and Nvidia aren't producing anything in the $250 range for consumers, and anything they do produce feels like a minor upgrade over the RX 580 and GTX 1060. Nvidia knows they screwed up producing so many GTX 1060s and they don't want to make that mistake again. CUDA and the like are standards Nvidia wants to push over other open standards, and they don't make direct money the way mining does.

Gaming brought about GPUs, which eventually evolved to do compute, but anyone using modern GPUs for stuff outside of gaming is doing it because the hardware is there and it's cheap. At least it was cheap. Because the hardware evolves so fast and does so much, it works out that it can also handle tasks a CPU isn't suited for. Speaking of which, modern consumer CPUs focus heavily on IPC because gaming does as well. There's a reason Apple and many other smartphone manufacturers use SoCs with big and little cores: you simply don't need a 5GHz IPC monster for most productivity, and multiple cores that run more efficiently are better suited for productivity tasks. If productivity were the main focus, computers would use more Xeon-like chips with many cores at lower clocks and ECC memory. Last I checked, the Apple M1s don't use ECC memory. You all don't want to admit it, but you're using gaming-like hardware for productivity because it's far cheaper than buying Nvidia Teslas and Intel Xeons. Apple with their M1 hardware is trying to mimic gaming PCs, even if they're not aware of it. They did compare their new M1s to an RTX 3080.



That's you, but that's not most people. I'm surprised that some people are looking at my signature and using it to demoralize me simply because I don't buy the latest, very expensive hardware. You are out of touch is the best way I can put it. Most people would rather buy one device that does it all instead of buying multiple devices that specialize; that's not very fiscally responsible. This is why the mobile gaming market is booming: smartphones are something everyone needs, they're more affordable than buying a PlayStation 5, and they also handle productivity needs like checking mail and browsing the web. Likewise, buying a laptop you need for productivity that also serves as a gaming device saves money over buying multiple machines. There's a reason the GTX 1060 is still the most popular gaming GPU on Steam.

Man, it's apparent you're out of touch. At this point, with all of your hand-waving, you're going to have to tell us why Apple is the most profitable company on the planet.
And I bet the answer will include more hand-waving tantamount to "consumers are dumb" while ignoring the tons of industries invested in macOS.

Apple would never enter the gaming market because Apple knows they could never compete. Acting like Apple never thought of gaming is ignoring the history of the company.


Gaming on iOS is big, and many people don't want to admit that most iPads and iPhones end up in children's hands to babysit them while their parents do more important things like care about themselves. If it isn't children playing Roblox, then it's middle-aged women playing Candy Crush. Gaming is the single largest category of iOS usage at 21%, with business at 10% and education at less than 9%. Productivity is a mere 3%.

Apple could enter the gaming market but would again lose easily to the open nature of the PC market. They would have to let others make hardware that can run Mac OS X; they tried that once, decades ago, and Apple pulled out fast.

Yeah, except that Apple Arcade makes more money than Nintendo, Sony, and Microsoft combined:
(EDIT: okay, as discussed below, it's technically not "Apple Arcade", but all of their game sales on the App Store - either way, the point is, Apple makes more from gaming than all of the console makers combined).
https://www.therichest.com/rich-pow...ts-than-microsoft-nintendo-and-sony-combined/
https://www.wsj.com/articles/apple-...-its-the-hottest-player-in-gaming-11633147211
https://www.forbes.com/sites/anthon...s-first-subscription-success/?sh=4b78d89db53a

You're not even thinking correctly about the sort of box Apple would make. At this point all they really need to do is make a new Apple TV (as in, running tvOS, which is a fork of iOS) with a sufficiently fast ARM chip, and they would already have a graphical rival to all of their console competitors (target 4K/30p) - especially when the programming is optimized for their hardware.
Sell it for $250, add a controller sold separately for $50, and it's profit all the way down. It could be an A15 (or A16 or some future chip). It could be an M1 (or M2+). It could be whatever.

Then they can sell an Apple Arcade subscription (or really a combined Apple subscription bundle, like Apple TV+, iCloud, and Apple Arcade). AND on top of that, people could stop playing at home, pick up their phone, and continue to play, as essentially all the hardware would be the same and their progress could be saved through the cloud. They could even make a first-party Apple-branded controller that fits around people's phones specifically for that purpose.

People are already playing CoD on iOS and there is a fairly large community built around it (EDIT: and that's just one example; as another, major racing games such as "Burnout" are on iOS). All it would take is a few compelling killer apps - and guess what, like you say, Apple would be able to continue their market dominance. If they do that, it's a VERY short trip onto macOS after that: with the M1 and universal libraries, running iOS apps natively on macOS is already possible, and porting from iOS to macOS, should a dev want to, is incredibly simple and straightforward. Frankly, the more forward-thinking companies are already porting both ways - Firaxis with Civ VI, Feral Interactive with XCOM 2, and Larian Studios with games like Divinity: Original Sin 2.
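To make the "porting is straightforward" point a bit more concrete, here's a minimal sketch - my own toy example with made-up names, not code from any of the studios above - of the kind of SwiftUI view that builds unchanged for an iOS target and a macOS target:

Code:
import SwiftUI

// Toy example: one SwiftUI source file that compiles for both the iPhone/iPad target
// and the Mac target. View and property names are invented for illustration.
struct ScoreboardView: View {
    let playerName: String
    let score: Int

    var body: some View {
        VStack(spacing: 8) {
            Text(playerName)
                .font(.headline)
            Text("Score: \(score)")
                .font(.largeTitle)
            #if os(macOS)
            // Platform-specific tweak; this branch is compiled out of the iOS build.
            Text("Mac build")
                .font(.footnote)
            #endif
        }
        .padding()
    }
}

The real work in a port is usually input handling and window management rather than the core logic, but that shared-framework story is why the back-and-forth ports mentioned above keep getting easier.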

All of this without even starting up their own game dev studio, which is also another option for them. Apple has already shown they can procure smart people to create content for them (Apple TV+), albeit slowly. It's not like they couldn't headhunt the best game devs and actually pay them a living wage to make excellent games. I'm sure they could poach people from Activision Blizzard - essentially doing the same thing Microsoft is doing now. But really, as noted above, they don't even need to make first-party games to make money; they're already winning.

Your thinking is INCREDIBLY constrained on what 'gaming' hardware could be and what is even profitable. Going back to your 1060 example, you're already aiming too high. Phones are already on top - Apple is already making more money than Nvidia and AMD. They have a $3 trillion market cap compared to Nvidia at a paltry $750 billion and AMD last at $180 billion. Even throwing in Intel, they're only at $207B.

Frankly, Nvidia and AMD need to get on Apple's level. All they have is "sales numbers," but Apple is clearly winning on the scoreboard.

(And this is ignoring Apple moving toward being a first mover with VR and the metaverse, which they are uniquely positioned to lead: they can make their own leading, best-in-class hardware, they do excellent hardware/software integration, they obviously know how to run app stores, they know how to make novel products, and connectivity is somewhat of a specialization of theirs. Most other companies only do one of those things well - for example, only the hardware side. Apple can uniquely build the entire product stack like no other company can.)


EDIT: And honestly, I think Apple is more or less content to do nothing, even in light of everything I said above. Eating the console makers' lunch wouldn't even make a dent in their bottom line; adding $300 billion to their market cap, against their $3 trillion, is probably not even worth pursuing compared with much bigger markets such as EVs and VR/AR, both of which they are known to be invested in. Taking the car market's lunch and getting ahead on what will essentially be the next form of computing (VR/AR) is the real way to stay dominant, rather than pursuing some people who want to play games.
 
Yeah, I was gonna say: isn't Apple already the gaming winner? Just because a niche set of "elite desktop gamers" don't use them, the majority does.

Looking at Apple's ARM lineup, it's only a matter of time before a MacBook Pro is outperforming most $1500 gaming PCs at their own game.
 
Apple would never enter the gaming market because Apple knows they could never compete. Acting like Apple never thought of gaming is ignoring the history of the company.


Gaming on iOS is big, and many people don't want to admit that most iPads and iPhones end up in children's hands to babysit them while their parents do more important things like care about themselves. If it isn't children playing Roblox, then it's middle-aged women playing Candy Crush. Gaming is the single largest category of iOS usage at 21%, with business at 10% and education at less than 9%. Productivity is a mere 3%.

Apple could enter the gaming market but would again lose easily to the open nature of the PC market. They would have to let others make hardware that can run Mac OS X; they tried that once, decades ago, and Apple pulled out fast.

You realize you're proving my point, right? Apple isn't interested in the gaming market - not as you see it, at least. They don't want to open up, and they have no interest in catering to that market, because they're making money hand over fist going after a ~different~ market. Why doesn't John Deere start making cars? I mean, they've got 4 wheels, and Ford makes (or made) tractors, so they have to be in the same space, right? Also, the Pippin was released in 1996 - BEFORE Steve Jobs reset the path of the company (https://lowendmac.com/2006/beleague...). That was when they were trying to find ANY way to make money. And yeah, Halo (being Bungie) was something Steve Jobs hoped would excite folks about the resurgence of Apple, but did you see them mention gaming ever again?

Apple literally prints money picking their markets and approach carefully. General purpose gaming is not a market they care about. They're looking to enable content creators and professionals - folks who generally (especially since they're spending a premium, or having a premium spent on them) have other systems for doing things like playing games, rather than doing it all on one system.
 
I am sure it is not Apple Arcade but the revenue generated from the App Store.
Good job not clicking the links or reading any of the posted information:

"To obtain the number The Wall Street Journal used Sensor Tower’s report to determine the company made $15.9 billion in revenue from the App Store in fiscal 2019. Sensor Tower claims that 69% of that amount is coming from games. Using numbers revealed during the Apple v. Epic court battle they resolved that the store had an implied operating profit of $12.3 billion that year.
According to a Journal analysis that equates to $8.5 billion in earnings for gaming alone. That figure is $2 billion more than the combined operating profit of videogame developers in the same time frame."


and then further in the article:

"In fact, it’s basically just a game store if you look at numbers alone. In 2020, game transactions accounted for 68% of total App Store revenue, making the App Store “primarily a game store and secondarily an ‘every other’ app store,” Judge Yvonne Gonzalez Rogers wrote in her final ruling on Epics case against Apple."

In other words, this is publicly available information because of the Epic v Apple lawsuit, and it has been verified by multiple sources. Including, literally, a judge.
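And the quoted math multiplies out the way the Journal says it does - quick sketch below using the article's own figures (Sensor Tower's estimate and the trial's implied-profit number, not independent data):

Code:
// Reproducing the quoted arithmetic with the article's own inputs.
let appStoreRevenue2019 = 15.9e9          // USD, Sensor Tower estimate
let gamesShareOfRevenue = 0.69            // share of App Store revenue attributed to games
let impliedOperatingProfit = 12.3e9       // USD, implied profit from the Epic v. Apple filings

let impliedGamesRevenue = appStoreRevenue2019 * gamesShareOfRevenue
let impliedGamingProfit = impliedOperatingProfit * gamesShareOfRevenue

print(impliedGamesRevenue / 1e9)          // ≈ 11.0 (billion USD of game revenue)
print(impliedGamingProfit / 1e9)          // ≈ 8.5 (billion USD), the article's headline figure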
 

Not that I disagree with most of your points, but games in the App Store =/= Apple Arcade. 69% of revenue coming from games in the App Store is not revenue coming from the Apple Arcade subscription service. It is not correct to state that "Apple Arcade" makes more money than Nintendo, Sony, and MS combined.
 
This is a fair point. The point of the platform generating crap tons of revenue still stands. And they don’t have to battle Microsoft or Sony or anyone for that revenue.
 
I am sure it is not Apple Arcade but the revenue generated from the App Store.
Paid games made Apple $476M.
F2P games made Apple $21.3B.
Apple Arcade subscribers paid just shy of $3.5B this year, and they are showing yearly growth.
If the current trend continues, Apple is estimating $4.6B in subscription fees and a user base of 1 in 10 iOS devices by 2024.
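For what it's worth, the growth those figures imply is fairly modest - quick sketch, and the two-to-three-year horizon is just my reading of "by 2024":

Code:
import Foundation

// Implied compound growth from ~$3.5B now to the estimated ~$4.6B by 2024.
let currentBillions = 3.5
let projectedBillions = 4.6

for years in 2...3 {
    let cagr = pow(projectedBillions / currentBillions, 1.0 / Double(years)) - 1.0
    print("\(years)-year horizon: \(Int((cagr * 100).rounded()))% per year")
}
// Works out to roughly 15% per year over two years, or about 10% per year over three.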
 

Oh no, I am not disagreeing that Apple makes more revenue from gaming software/subscriptions than the console manufacturers combined, just saying that "Apple Arcade" making more than them isn't a correct statement.
 
Yeah 100% I was just providing the numbers to back you up.
 
Yeah, I was gonna say: isn't Apple already the gaming winner? Just because a niche set of "elite desktop gamers" don't use them, the majority does.

Looking at Apple's ARM lineup, it's only a matter of time before a MacBook Pro is outperforming most $1500 gaming PCs at their own game.
I'd say it's one of the winners, but yes... it's not just that the mobile market eclipses PC gaming by both revenue and sheer community size, it's that Apple owns disproportionately large slices in both those criteria. For Apple, chasing hard after PC gaming would be a fool's errand — it'd spend a ton of money to grab a small slice of a smaller market than it participates in, and would try to court an audience that in some cases is openly hostile to the brand.

I don't think Apple will really care much about beating gaming PCs, but the company's emphasis on constant iteration could make for some more flattering comparisons in a few years. Chipmakers ignore Apple at their own peril — look at how it's thrashing Qualcomm on virtually every front.
 
Somewhat related, and while not exactly confirmation, interesting nonetheless:
while macOS only uses one set of IRQ control registers, there was indeed a full second set, unused and apparently unconnected to any hardware. [. . .] interrupts delivered from [AIC2] popped up with a magic “1” in a field of the event number, which had always been “0” previously. Yes, this is the much-rumored multi-die support.

Sauce: https://asahilinux.org/2021/12/progress-report-oct-nov-2021/
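In other words, each interrupt now carries which die it came from. Purely as an illustration of what "a field of the event number" means - the bit positions below are invented for the sketch; the real AIC2 layout is whatever Apple's silicon and the Asahi docs say it is:

Code:
// Illustrative only: the field position and width here are made up for the example.
let dieFieldShift: UInt32 = 24
let dieFieldMask: UInt32 = 0xFF

func decode(event: UInt32) -> (die: UInt32, irq: UInt32) {
    let die = (event >> dieFieldShift) & dieFieldMask
    let irq = event & ~(dieFieldMask << dieFieldShift)
    return (die, irq)
}

// A single-die M1 Max would always report die 0 here; a second die reporting 1
// is the "magic 1" the Asahi write-up is talking about.
print(decode(event: 0x0100_002A))         // (die: 1, irq: 42)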
 
Wait isn't the M1 Max 57 billion transistors? So the M1 Ultra is 228 billion transistors? Just how chungus is this thing? Isn't this like the transistor budget of 8 x RTX 3090s?
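The arithmetic behind that guess, for anyone who wants it - a sketch that assumes the "Ultra" rumor means four Max dies (pure speculation) and uses the commonly cited ~28.3 billion transistor figure for the RTX 3090's GA102:

Code:
// Assumes a hypothetical four-die part; only the 57B-per-Max figure is an announced number.
let m1MaxTransistors = 57.0e9
let assumedDieCount = 4.0
let rtx3090Transistors = 28.3e9           // GA102

let hypotheticalUltra = m1MaxTransistors * assumedDieCount
print(hypotheticalUltra / 1e9)            // 228 (billion)
print(hypotheticalUltra / rtx3090Transistors)   // ≈ 8.06, i.e. roughly eight 3090s' worth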
 
You have to consider the machines it would be placed inside.

Rumorsville says a new iMac Pro is coming. The original iMac Pro started at $5000. This in theory could also go in the Mac Pro, which currently starts at $5,999.

I very much doubt, should the Ultra be a real product, that it will be in any of their lower-tiered machines like the Mac mini, or any lower-cost system, period. I'd expect a price premium and something big enough to facilitate cooling the densely packed chips. If you've seen the iMac Pro internals, it's set up for something like this.
 
The current rumors suggest the new iMac Pro will actually be a redesigned 27-inch iMac with the M1 Pro/Max inside. Think of it more as a realignment of Apple's strategy where the iMac Pro is the direct counterpart to the MacBook Pro instead of a niche model — it's still meant for demanding customers, but it's not total overkill as a home computer in some configurations.

Any higher-end chips will probably be reserved for Mac Pro-class workstations, if just due to the heat and power considerations.
 
I've seen the rumors. I guess it will come down to what Apple finds worthy of the "Pro" moniker. They may not want to reveal any multi-chip design until the Mac Pro comes out, in order to preserve its hype. But I would say that if they are going to bring back the "iMac Pro" name, it would be a missed opportunity if it contains the exact same chip as what I more or less expect the next Mac mini to contain (although they may not, and may simply use an M2 in the next Mac mini, which I think is another missed opportunity for those wanting compact, fast machines). The iMac Pro was basically Xeons strapped to a reasonably fast (at the time) graphics chip, with a massive cooling system that let it run silent. While it's unlikely they want that form factor moving forward, it would be a shame not to use that engineering in this next iteration of hopefully silent and lightning-quick AIOs. At least allow two processors, if perhaps not quad. But whatever, it's not up to me.
 
though they compared it to a RTX 3090.
The fps are a bit different than, say, here (with a 9900K) - maybe different settings?
https://www.guru3d.com/articles-pages/geforce-rtx-3090-founder-review,17.html
or here:
https://www.gpucheck.com/game-gpu/s...nvidia-geforce-rtx-3090/intel-core-i9-10900k/

With DX12:
169 fps at 1080p, 153 fps at 1440p, 95 fps at 4K (where it starts to stretch its legs - which seems true for the Ultra as well if we look at the delta with the Max or Mac Pro).

60 fps at 4K (if that is with the highest details, Pure Hair on, HBAO+, etc.) would be 2080 Ti-level performance, which is quite good. But according to the reviewer, despite the good average numbers you cannot really play on it yet ("Now, this is Apple gaming, of course, so Tomb Raider was not a perfect or even particularly good experience: there was substantial, noticeable micro stutter at every resolution we tried"), so maybe there is driver/software room to grow here.
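Putting the two 4K numbers side by side - rough sketch only, since the figures come from different reviews, different test rigs, and possibly different settings:

Code:
// Comparing the 4K averages quoted above; not an apples-to-apples benchmark.
let ultraTombRaider4K = 60.0       // M1 Ultra, highest detail, per the review discussed
let rtx3090TombRaider4K = 95.0     // RTX 3090 system, from the numbers quoted above

print(ultraTombRaider4K / rtx3090TombRaider4K)   // ≈ 0.63, roughly two-thirds of the 3090's result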
 
I would sincerely hope a chip that is 2 nodes ahead with 4x the transistors of an RTX 3090 can perform somewhat close to it at a fraction of the power...
 
You don’t buy a Mac to game on…
Sure, but what are you paying for then? Why buy this GPU if you can't game? Why compare it against gaming GPUs?...

IMO it's kind of a silly thing to say after it gets crushed in a benchmark Apple said it could compete in.

Then again, the mental gymnastics of paying 2x the price for hardware so you can get macOS never made sense to me.

Might as well go Arch/SteamOS or Windows 11. You can actually do far more on either of these from any perspective.

Apple sure knows how to market, I'll give them that. They've effectively brainwashed their clientele into accepting that they can spend thousands of extra dollars for a console-esque experience with severe limitations.
 
I have a quad RTX 8000 setup that games like shit.
In the limited workloads that Mac users care about, I bet the M1 Ultra is going to melt faces. Outside of those it's going to be a mixed bag.
 
Sure, but what are you paying for then?
Horsepower and unified memory. I’ll explain.
Why buy this GPU if you can't game? Why compare it against gaming GPUs?...
Because a lot of places use “gaming GPUs” for non gaming tasks.

Example 1: my best friend is a data scientist working on graph algorithms. He bought a 3090 the day it came out - yes, to game, but also to run CUDA-enabled ML tests against his data set locally instead of feeding it to the AWS farm for early testing. He then bought an M1 Max MacBook to do similar tests, since the unified memory lets him go a lot farther with it. His company is considering the Studio with the Max and Ultra for similar use cases before feeding the farm.

Example 2: a major satellite imaging company here locally uses 2080 Ti FEs to feed their image-processing ML library. They stuff 6 of them in an R730 server, and run 8 of those servers. They've tinkered with the P40 and RTX 8000 series for this, as those allow for a deeper algorithm (more memory, thus more simultaneous processing runs), but found that the compute horsepower can't keep up, they can only fit 3 of those in a 730/740 vs. 6 2080s, and those 3 use more power and run hotter than the 2080s. They're considering the Max because it LOOKS like it'll allow for the deeper runs (unified memory) while also keeping power and cooling down, and it might have the compute power to compensate for the deeper run. Oh, and no GRID licensing required either - consumer cards.

Example 3: my sister is an Emmy-winning editor for a news station (one of the big 4). She wants one to replace her MacBook Pro because it appears it'll buy her 5 minutes, which is enough to get more detail or take more time tweaking the evening and noon run-ins - or breaking news ones - before they have to send them for final render to show on TV. FCP loves that hardware and the dedicated encoders.

Other than example 1, none of these care a lick about gaming. And 1 mostly plays PS5 these days.
IMO it's kind of a silly thing to say after it gets crushed in a benchmark Apple said it could compete in.
The crypto benchmarks and some of the others are promising for the workloads the buyers want to see (Ars review)
Then again, the mental gymnastics of paying 2x the price for hardware so you can get macOS never made sense to me.
None of my examples care about the OS. It’s an API to hardware.
Might as well go Arch/SteamOS or Windows 11. You can actually do far more on either of these from any perspective.
Depends on what you’re doing.
Apple sure knows how to market, I'll give them that. They've effectively brainwashed their clientele into accepting that they can spend thousands of extra dollars for a console-esque experience with severe limitations.
Depends on what you’re doing.

Me? I agree with your points for myself. I built Threadripper and Intel boxes for that reason. I don't need what my customers need - or when I do, I don't need the dedicated hardware and options they do, and I'm willing to take the trade-offs for gaming and the like. But none of those customers care about cost; they care about results - and the results look promising for their use cases. Especially when you could stack 50 of the Studios where those Dell servers used to sit and pull less power. That pays for itself in less than a year.

Outside of prosumers, I see no reason for your average person to buy a Studio. But I expect to see piles of them in data centers and video studios, if not the AI/ML world too.
 
Sure, but what are you paying for then? Why buy this GPU if you can't game? Why compare it against gaming GPUs?...
An RTX 3090 with 24GB of VRAM is arguably more a pro card than a gaming GPU, it's used for similar stuff a Mac will be used for (the Adobe suite, OpenCL/CUDA/AI work, rendering and so on), and it's quite close to some of the Ampere pro-card versions.

Both in price and capacity (games that take advantage of 24GB instead of, say, 16GB of VRAM out of the box must be quite rare, if they exist at all).

Might as well go Arch/SteamOS or Windows 11. You can actually do far more on either of these from any perspective.
You think SteamOS is better for someone making movies (from any perspective)?

Apple sure knows how to market,
They make extra good computers for the workloads computer reviewers have, which I imagine makes it easy for reviewers to get overexcited, because of how good the machines are at the reviewers' own work.
 
"We used some recycled bits to make a tiny piece of the casing!!!"

No upgrade path, no user service options.

E-waste by design.

I know pollution is just a farradicalleftcommunistfascist conspiracy to take our guns, but do we have to shit up our only planet so aggressively?

Apple continues to be a pretty shitty company.
 
"We used some recycled bits to make a tiny piece of the casing!!!"

No upgrade path, no user service options.

E-waste by design.

I know pollution is just a farradicalleftcommunistfascist conspiracy to take our guns, but do we have to shit up our only planet so aggressively?

Apple continues to be a pretty shitty company.

Upgrade path is a joke across the industry if you buy high end. I bought a 3990X and AMD just tanked the whole fucking Threadripper line. Zero upgrade path. DDR4 to DDR5 will require new motherboards. Storage changes connector type virtually every couple years at this point.

Upgrade path might matter if you're someone buying an $800 computer and planning to upgrade to hand-me-down tech after a few years and stretch your platform for a decade, but that's not the high-end or Mac customer anyway.

I prefer to look at my 14-inch MacBook Pro as very good for the environment, since it's so much faster than any comparably sized PC laptop in existence (M1 Max, 64GB RAM, 8TB SSD) that I would have to buy a brand-new PC laptop every year for the next three years just to hit the performance my current machine has. That'd be a lot of junked PCs.
 

I get that it's industry wide, but that doesn't excuse it. Hell it's universal, not just in the "tech industry".

All this disposable shit is so incredibly shortsighted. I guess shortsighted is the rule in a speculation driven global economy.

Like paying $500K for a football.
 