Apple launched the M2 Pro and the M2 Max

I think Apple is cost cutting because the development of Apple silicon wasn't cheap. They clearly could have done more. They did cost cut with the M2 256GB and M2 Pro 512GB SSDs for a reason. While Apple is a big player in the laptop market, they aren't that big. Not enough to justify the R&D they put into making their silicon. Also, their CPU tech is mainly from ARM, which is in financial limbo in terms of bankruptcy and who might end up buying them. Apple's GPU tech is Imagination's PowerVR tech reimagined. Which is now owned by the Chinese government. It's only going to get more expensive for Apple to compete against AMD and Intel. I said three years ago that AMD and Intel would catch up to Apple in efficiency, and this year they will. Maybe if Apple went 3nm and ARMv9 they'd still have the edge, but as far as I know the M2s are 5nm parts using ARMv8.
Based on the profits they've posted, Apple's ARM R&D is more than paying for itself. Apple brags about making roughly $500 in profit per iPhone sold, and they sell something like 125 million of them a year; that alone is more than enough to keep Apple silicon funded for a long time.
If ARM goes belly up, it's probably still going to be Nvidia who buys them, or maybe Apple. ARM China will get spun off on its own, and cheap Android phones will be plentiful.
ARMv9 doesn't really add much aside from security features needed for servers, and it integrates a lot of features that make it easier to integrate into SoC packages, something Apple already did. It also adds more cache and changes the memory layout to be more like how Apple redesigned theirs. ARMv9 basically just reverse-engineers the work Apple already did for their silicon.
In terms of efficiency, I am still waiting to see any Intel or AMD product that does what an MBP does while only using a 45W peak draw. They have announced them, sure, but so far we have only AMD's slide show and their promise that Microsoft has been working on integrating the features. And while I use Microsoft all day at work, I don't trust their ability to fry an egg, let alone deliver good returns on a first-generation part with new features from a single vendor on a niche bit of silicon expecting moderate to low sales.
 
Like I've said before, the Apple-only reviewers aren't very good. The Premiere test he did was with what codec? It's going to be power efficient when using a codec the MacBook is capable of accelerating, but the catch with specialized hardware is that it doesn't always stay in style. AV1 is the codec of choice for those who upload videos online, and I really doubt he did those tests in AV1. There is still a use case for H.264 and H.265, but YouTube and Netflix have already moved over to AV1. Not sure if that MSI laptop has an AV1 hardware encoder, but willing to bet either way that AV1 is much faster on it.
Not sure either. AV1 in NVENC is pretty new, and I don’t know if the mobile parts got it (I believe the 4000 series did).
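For what it's worth, the easiest check is just asking ffmpeg what it exposes on a given machine. A minimal sketch, assuming ffmpeg is installed and on PATH (av1_nvenc, av1_qsv, av1_amf, libaom-av1, and libsvtav1 are the standard ffmpeg encoder names; which ones actually appear depends entirely on the build and the hardware):

```python
# List the AV1 encoders the local ffmpeg build exposes. Hardware ones to
# look for: av1_nvenc (NVIDIA), av1_qsv (Intel), av1_amf (AMD); software
# fallbacks are libaom-av1 and libsvtav1.
import subprocess

def available_av1_encoders() -> list[str]:
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [
        line.split()[1]
        for line in out.splitlines()
        if line.strip().startswith("V") and "av1" in line.lower()
    ]

if __name__ == "__main__":
    print(available_av1_encoders())
```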

And doing a final render you wouldn’t use the GPU either way. Too low quality - you always do that on CPU. The encoders are more to help with editing and scanning as you construct the video in real time.
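To make that split concrete, here's a minimal sketch driving ffmpeg from Python: a fast hardware-encoded proxy for real-time scrubbing, then a slow CPU encode for the final render. The encoder names (hevc_videotoolbox for Apple's media engine, libsvtav1 for CPU AV1) are real ffmpeg encoders, but the file names, bitrate, and CRF are purely illustrative:

```python
# Editing vs. delivery: hardware encoder for speed, CPU encoder for quality.
import subprocess

SOURCE = "master.mov"  # hypothetical camera master

# 1) Quick proxy for real-time editing - the Mac's media engine favors
#    speed over compression efficiency, which is fine for scrubbing.
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-vf", "scale=1920:-2",
    "-c:v", "hevc_videotoolbox", "-b:v", "8M",
    "proxy.mov",
], check=True)

# 2) Final render on the CPU - SVT-AV1 with a quality-targeted CRF,
#    much slower but noticeably better per bit.
subprocess.run([
    "ffmpeg", "-y", "-i", SOURCE,
    "-c:v", "libsvtav1", "-crf", "30", "-preset", "5",
    "-c:a", "copy",
    "final.mkv",
], check=True)
```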
You didn't see that as a joke? I'm comparing a very expensive MacBook to a cheap Chromebook.
Nope. With how crazy the hyperbole has gotten, it wouldn't be the weirdest thing I've seen today.
That may be, but AMD and Intel have finally started to get their shit together. I have to applaud Apple for releasing their silicon at the right time, which was probably the lowest point for both AMD and Intel.
Bulldozer and Sledgehammer would like to have a word. 😂
Intel, for sitting around thinking they're the best and barely have to do anything to improve their products, and AMD, who for some reason didn't use their latest tech in their mobile parts for nearly 3 years.
Mobile is harder than fully wired. Lucrative but difficult.
That, and they still used Vega graphics up until last year... mostly. They still pull this crap today with their mobile 7000 series, as I still see them selling Zen2+ and Zen3 cores along with RDNA2 and Vega. AMD has to stop gatekeeping their tech like this. Apple, though, isn't doing much better. For the most part the M2 series is just a beefed-up M1.
That’s all anyone expected. Anyone who follows Apple at least - it was always going to just be a faster M1 for now.
I was expecting Apple to go 3nm with the M2s and switch over to the ARMv9 architecture, but that didn't happen.

I think Apple is cost cutting because the development of the Apple silicon wasn't cheap. They clearly could have done more.
Such as? Their license let’s them backported features from later instruction sets if they want - they’ve already done that with versions of V8
They did cost cut with the M2 256GB and M2 Pro 512GB SSD's for a reason. While Apple is a big player in the laptop market, they aren't that big. Not enough to justify the R&D they put into making their silicon.
Given the sales and profit from everything using Apple designed SoCs, we're going to have to disagree here. Remember that it all grew out of the existing designs from the A series chips - they didn't start from scratch for a laptop.

They also didn’t have an alternative. AMD competition is barely out - and Intel still doesn’t compete in the market Apple is aiming for. The 13980 is impressive but built for a different use case. Where’s the 35w chip that can match what Apple has done?

It’s also made them unique - which drives sales. Right now their 10K certainly makes it look successful, even after this last quarter.
Also their CPU tech is mainly from ARM, which is in financial limbo in terms of bankruptcy and who's who in terms of buying them.
ARM is not bankrupt. The holding company SoftBank isn’t bankrupt. I’m seriously not sure where you’re getting this information.

SoftBank has had several down years from bad investments; they need cash infusions, and you normally get that by selling off valuable (producing) assets. ARM is huge for that.
Sure. The original chips and even A series chips used PowerVR for a long time. So what?

They have a licensing agreement:

https://www.theregister.com/2020/01/02/imagination_apple_ip_deal/



Which is now owned by the Chinese government.
So? The tech they licensed doesn't become tainted in any way just because the ownership changed afterward.
It's only going to get more expensive for Apple to compete against AMD and Intel.
Of course it will get more expensive; that’s what competition does. So what? Either the product works and is worth the price charged or it isn’t.
I said three years ago that AMD and Intel would catch up to Apple in efficiency, and this year they will.
We’ll see, won’t we?

My bet is they’ll get closer, but they’re not going to catch up.
Maybe if Apple went 3nm and ARMv9 they'd still have the edge, but as far as I know the M2s are 5nm parts using ARMv8.
We’ll see.
 
Maybe if Apple went 3nm and ARMv9 they'd still have the edge, but as far as I know the M2s are 5nm parts using ARMv8.

We’ll see.
M2 was very much supposed to be a 3nm part, but TSMC pushed 3nm back 6 months, which fell outside Apple's launch window. TSMC has only recently started up their 3nm facilities for customer orders (December 29, 2022, if the posts were true), which means we have a few more months before Apple can actually launch anything made on it.

ARMv9 is mostly a reverse-engineering job of Apple silicon and doesn't offer much Apple doesn't already have, with the exception of some security features that Apple instead integrates into their T2 chip.

ARM proper's biggest weakness right now is Mali; that needs to go. It's generic as all hell, which makes it super flexible and easy to program for, but it is just so very weak.

I hope that Apple can figure out its graphics sooner rather than later. The more I dive into that on my M1, the more I miss certain acceleration functions that aren't there. Apple approximates them well enough that my work gets done all the same (better than on my old system at least), but when I do specific tasks I can watch its poor little hamster wheel spinning, and it's kinda sad.
 
You are fundamentally misunderstanding Apple and their goals. On purpose, most likely.
Like you do. Apple's goals are to be as independent as possible when it comes to making their products. This is why Apple had legal fights with Qualcomm over IP, and why Apple's GPU tech is "borrowed" from Imagination.
Based on the profits they've posted, Apple's ARM R&D is more than paying for itself. Apple brags about making roughly $500 in profit per iPhone sold, and they sell something like 125 million of them a year; that alone is more than enough to keep Apple silicon funded for a long time.
Cell phones and laptops are two different things. Apple's laptop silicon has to compete against AMD and Intel, and also has to deal with legacy x86. Whereas their cell phone and tablet chips have to compete against Qualcomm and maybe Samsung.
If ARM goes belly up, it's probably still going to be Nvidia who buys them, or maybe Apple. ARM China will get spun off on its own, and cheap Android phones will be plentiful.
I wouldn't high five the idea of either Apple or Nvidia buying ARM.
ARMv9 doesn't really add much aside from security features needed for servers, and it integrates a lot of features that make it easier to integrate into SoC packages, something Apple already did. It also adds more cache and changes the memory layout to be more like how Apple redesigned theirs. ARMv9 basically just reverse-engineers the work Apple already did for their silicon.
According to the wiki, Apple's A16 cell phone chips use 4nm and ARMv8.6-A, while their latest M2s use 5nm and ARMv8.5-A. Kinda seems like Apple got cheap with the M2. If Apple does eventually end up using ARMv9, then everything you said was false.
And doing a final render you wouldn’t use the GPU either way. Too low quality - you always do that on CPU. The encoders are more to help with editing and scanning as you construct the video in real time.
Depends on the encoder, but on Apple you'd probably use the hardware encoder.
Bulldozer and Sledgehammer would like to have a word. 😂
Bulldozer is what I'm referencing.
Given the sales and profit from everything using Apple designed SoCs, we're going to have to disagree here. Remember that it all grew out of the existing designs from the A series chips - they didn't start from scratch for a laptop.
Nothing Apple did started from scratch. Also, we're not talking about Apple SoCs in general, but their laptop chips specifically, since you can't really use those in phones. Apple probably still makes money, but they are also a publicly traded company that has shareholders who want infinite growth. It's clear Apple cut costs with their SSDs, but the question is why? Given that Apple didn't go 4nm like AMD will be doing, and didn't improve their architecture in any meaningful way, I think they're avoiding the need to spend on R&D.
They also didn’t have an alternative. AMD competition is barely out - and Intel still doesn’t compete in the market Apple is aiming for. The 13980 is impressive but built for a different use case. Where’s the 35w chip that can match what Apple has done?
Both AMD and Intel make laptop chips that aren't meant to be efficient. For example, AMD's 7045 series has no support for LPDDR5 and is using RDNA2 instead of the more efficient RDNA3. Pretty clear that these chips are meant to be paired with a discrete GPU from AMD or Nvidia. That MSI laptop with the 13900HK has enough lights on it to illuminate a room.
ARM is not bankrupt. The holding company SoftBank isn’t bankrupt. I’m seriously not sure where you’re getting this information.
If SoftBank is bankrupt then so is ARM. You can spin it how you want, but Nvidia almost bought them for this reason.
Sure. The original chips and even A series chips used PowerVR for a long time. So what?
GPUs are hard to make. Who's going to be making them for Apple? China?
 
According to the wiki, Apple's A16 cell phone chips use 4nm and ARMv8.6-A, while their latest M2s use 5nm and ARMv8.5-A. Kinda seems like Apple got cheap with the M2. If Apple does eventually end up using ARMv9, then everything you said was false.
Apple probably didn't get cheap here. It's more likely that this is because the M2 is an expansion of the previous chip architecture, not a parallel development with A16. That and I suspect Apple doesn't mind spreading out production across multiple processes to minimize supply constraints. I'd expect this year's M3 to be a 4nm part.


Nothing Apple did started from scratch. Also, we're not talking about Apple SoCs in general, but their laptop chips specifically, since you can't really use those in phones. Apple probably still makes money, but they are also a publicly traded company that has shareholders who want infinite growth. It's clear Apple cut costs with their SSDs, but the question is why? Given that Apple didn't go 4nm like AMD will be doing, and didn't improve their architecture in any meaningful way, I think they're avoiding the need to spend on R&D.
Again, the evidence suggests Apple isn't being cheap so much as dealing with supply realities. In storage: 128GB chips are becoming rarer, and Apple would need to use more chips overall for a given capacity. It's much easier to ship a computer with two readily available 256GB chips than four 128GB chips that may be hard to get.
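To put rough numbers on the chip-count point (an SSD controller reads its NAND packages in parallel, so fewer packages at the same capacity means less parallelism), here's a back-of-the-envelope sketch; the per-package throughput is made up purely for illustration:

```python
# Why package count matters: sequential throughput scales roughly with
# the number of NAND packages the controller can read in parallel.
PER_PACKAGE_MBPS = 1500  # hypothetical per-package sequential read

for config, packages in [
    ("256GB as 2 x 128GB packages", 2),
    ("256GB as 1 x 256GB package", 1),
    ("512GB as 2 x 256GB packages", 2),
]:
    print(f"{config}: ~{PER_PACKAGE_MBPS * packages} MB/s")

# Same capacity with half the packages -> roughly half the sequential
# throughput, which matches the pattern reviewers reported on the
# single-chip base-model M2 SSDs.
```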

On the CPU front, there has been talk (from Bloomberg's well-connected Mark Gurman, if I recall correctly) that M2 is more of an interim upgrade, something to keep the lineup fresh while the bigger leap (M3) is in development. It just happens to be a sizeable upgrade in some areas, particularly GPU performance.
 
Like you do. Apple's goals are to be as independent as possible when it comes to making their products. This is why Apple had legal fights with Qualcomm over IP, and why Apple's GPU tech is "borrowed" from Imagination.
Licensed, and who cares? You act like this is a big deal, but it's not - the outcome doesn't change based on who they license GPU technology from.
Cell phones and laptops are two different things. Apple's laptop silicon has to compete against AMD and Intel, and also has to deal with legacy x86. Whereas their cell phone and tablet chips have to compete against Qualcomm and maybe Samsung.
Or generic ARM Cortex cores, sure - different markets. But right now, it sure looks like they're competing just fine.
I wouldn't high five the idea of either Apple or Nvidia buying ARM.

According to the wiki, Apple's A16 cell phone chips use 4nm and ARMv8.6-A, while their latest M2s use 5nm and ARMv8.5-A. Kinda seems like Apple got cheap with the M2. If Apple does eventually end up using ARMv9, then everything you said was false.
No, it just means there was no reason to use it right ~now~. What in the ARMv9 ISA do you think Apple should have adopted?
Depends on the encoder, but on Apple you'd probably use the hardware encoder.
Maybe, maybe not - but you definitely don't use NVENC if you want quality (or even Quick Sync, really). That's not its job - it's supposed to be fast and "good enough" for early passes.
Bulldozer is what I'm referencing.
M1 came out in 2020; Bulldozer was nearly a decade prior. AMD was already on Zen 2 at that time - I'm confused? They were already well into their resurrection by then.
Nothing Apple did started from scratch. Also, we're not talking about Apple SoCs in general, but their laptop chips specifically, since you can't really use those in phones.
sigh.
Apple probably still makes money,
It's in their 10-K; you can read it yourself.
but they are also a publicly traded company that has shareholders who want infinite growth.
Sure they do. Same for EVERY company out there.
It's clear Apple cut costs with their SSDs, but the question is why?
Because it didn't matter, and the only people that care are ones that wouldn't buy it anyway. Seriously, you're not going to buy it - why do you ~care~?
Given that Apple didn't go 4nm like AMD will be doing, and didn't improve their architecture in any meaningful way, I think they're avoiding the need to spend on R&D.
Or maybe it takes longer to design and tape out a full new CPU generation than you think? I mean heck, Intel stayed on 14nm Skylake for what... 6 years? We'll see if it matters once we get actual numbers on the new AMD chips - till then, it's just rumors and guesses.
Both AMD and Intel make laptop chips that aren't meant to be efficient. For example, AMD's 7045 series has no support for LPDDR5 and is using RDNA2 instead of the more efficient RDNA3. Pretty clear that these chips are meant to be paired with a discrete GPU from AMD or Nvidia. That MSI laptop with the 13900HK has enough lights on it to illuminate a room.
Which is a completely different market than the one Apple is competing in. The real question is if Intel and AMD can compete in that market or not. Or if they even really care to.
If SoftBank is bankrupt then so is ARM. You can spin it how you want, but Nvidia almost bought them for this reason.
SoftBank is not bankrupt. They just don't have the cash reserves to make the moves they used to make, so they're trying to sell assets to make investments. Heck, their startup investment fund still has 6.5 BILLION dollars in it; they just want more cash flow because they make BIG bets. This is like a dude at Vegas who's down to $500 in cash in his hand after starting with $5,000, but he's still got $50k in the bank. He's not bankrupt, he's just not flexible in the bets he can make right now. Their shareholders expect returns, and they have to find new bets to generate those returns, as ARM has already paid out what it can - that's how VC and PE work.

https://archive.ph/EnSfg (Bloomberg paywall without the archive).


GPUs are hard to make. Who's going to be making them for Apple? China?
TSMC, the same people currently fabbing them. You do understand that it's integrated with the SoC, right? It's not a separate device...

edit: They're buying architectural designs and IP from Imagination - not GPUs, and not completed designs or chips.
 
Does anyone think that if Apple wanted all the 128GB chips they could buy, they couldn't get them? I don't.
Yeah they could, but I would bet that they would be charged for a custom order job, and that would cost more than the stock 256GB stuff readily available. No company likes spinning up old nodes for discontinued products.
 
No company likes spinning up old nodes for discontinued products.
Let me move my question back one step: does anyone think Apple couldn't have seen this coming and convinced whoever's making those chips to not stop making them?

In fact, that made me think: given how committed Apple is to vertical integration, it seems reasonable to think they might want to buy a foundry.

custom order job

And even disregarding what I wrote above, how many billions of dollars do they have sitting in the bank? I bet they could negotiate one hell of a discount.
 
Let me move my question back one step: does anyone think Apple couldn't have seen this coming and convinced whoever's making those chips to not stop making them?

In fact, that made me think: given how committed Apple is to vertical integration, it seems reasonable to think they might want to buy a foundry.



And even disregarding what I wrote above, how many billions of dollars do they have sitting in the bank? I bet they could negotiate one hell of a discount.

Suddenly, paying GloFo (for example) to make a few million 128GB flash chips doesn't seem so farfetched, does it? :)

They could have. What they should have done is just not release the 256GB model: 512 or bust.
 
Let me move my question back one step: does anyone think Apple couldn't have seen this coming and convinced whoever's making those chips to not stop making them?
Probably not, honestly. Foundry space is often taken up years in advance. And with the cell density increase, they'd be passing up more lucrative contracts on lines that could potentially be retrofitted in place.
In fact, that made me think: given how committed Apple is to vertical integration, it seems reasonable to think they might want to buy a foundry.
REALLY different businesses - especially when we talk EUV and the like. There's ONE company in the world (IIRC) that builds the tools for them - and it's a 6-month setup process from delivery to online, and each machine comes in 6 full-size shipping crates. Plus the water burden, etc.
And even disregarding what I wrote above, how many billions of dollars do they have sitting in the bank? I bet they could negotiate one hell of a discount.
Maybe? But even then, what does it buy you? Running a foundry is expensive, and that's not their expertise. Wasn't AMD's either - that's why they got out of GloFo. Foundries are a very different beast now.
 
But even then, what does it buy you?
I'm not suggesting Apple is going to or would necessarily want to, nor trying to say it'd be simple. I was pointing out that Apple is rapidly moving towards massive vertical integration, and buying a foundry so they wouldn't have to depend on anyone else for the chips they need isn't as farfetched as you might think, for a company with hundreds of billions of dollars in the bank. Flash doesn't need 3nm; GloFo (just to pick one foundry) could probably do it.
 
I'm not suggesting Apple is going to or would necessarily want to, nor trying to say it'd be simple. I was pointing out that Apple is rapidly moving towards massive vertical integration, and buying a foundry so they wouldn't have to depend on anyone else for the chips they need isn't as farfetched as you might think, for a company with hundreds of billions of dollars in the bank. Flash doesn't need 3nm; GloFo (just to pick one foundry) could probably do it.
Sure. But now you need environmental lawyers, environmentalists, a whole new set of engineering staff and operations people, etc etc. There's a lot more to it than just buying a foundry. Given their lack of expertise in that space, I'm not sure there's a benefit to taking it on that wouldn't be eaten by higher costs.

Doing it yourself and vertical integration works when no one has the product you need - or that is your product. But I’m not sure it has enough benefit here, and they’re not getting into the foundry business for others.

Dunno. I need to think about that more, but my business experience says that one is not a profit making move.
 
Does anyone think that if Apple wanted all the 128GB chips they could buy, they couldn't get them? I don't.
I do. Apple can negotiate deals with Samsung or other flash memory makers, but it can't necessarily force companies to stay on old manufacturing processes or pretend there aren't industry-wide chip shortages. And buying a foundry would be a very expensive alternative to just using 256GB chips and accepting a performance hit on certain configurations.
 
Sure. But now you need environmental lawyers, environmentalists, a whole new set of engineering staff and operations people, etc etc.
In my hypothetical, Apple didn't fire anyone, and presumably the foundry already has all those people.
 
In my hypothetical, Apple didn't fire anyone, and presumably the foundry already has all those people.
That's assuming the foundry company wants to sell those people - and they're willing to go. You might have to buy the entire ~company~ for that rather than just a foundry. Now we're talking a MAJOR purchase - one they can afford, aside from TSMC or Samsung itself - but one that would shake the markets quite a bit. That may piss off the board/shareholders/etc.

I've got training in M&A, although I've not done it personally (for precisely these reasons) - this gets SUPER deep and involved for a major acquisition outside of your specialty, and that freaks the fuck out of people outside.
 
Apple probably didn't get cheap here. It's more likely that this is because the M2 is an expansion of the previous chip architecture, not a parallel development with A16. That and I suspect Apple doesn't mind spreading out production across multiple processes to minimize supply constraints. I'd expect this year's M3 to be a 4nm part.
Apple is going to release the M3 this year? That's odd considering that it took Apple nearly 3 years to release the M2s. This supports the idea that the M2s are just M1s with more cores and clocks.

Again, the evidence suggests Apple isn't being cheap so much as dealing with supply realities. In storage: 128GB chips are becoming rarer, and Apple would need to use more chips overall for a given capacity. It's much easier to ship a computer with two readily available 256GB chips than four 128GB chips that may be hard to get.
Nobody here has provided any evidence of a shortage of supply. Also, if 128GB chips are rare, then maybe the base model should be 512GB for the M2, and 1TB for the M2 Pro? Also, Apple did reduce the price of their M2 products by $100 compared to the M1s, so that wouldn't make sense either way.
On the CPU front, there has been talk (from Bloomberg's well-connected Mark Gurman, if I recall correctly) that M2 is more of an interim upgrade, something to keep the lineup fresh while the bigger leap (M3) is in development. It just happens to be a sizeable upgrade in some areas, particularly GPU performance.
Maybe if you're an M1 user then the M2 should be skipped? Would feel bad to buy the M2s when the M3 with 3nm and ARMv9 is just around the corner, but at the same time the plain M3 is meant to replace the plain M2, and therefore shouldn't be superior to the M2 Pro and Max.
 
Apple is going to release the M3 this year? That's odd considering that it took Apple nearly 3 years to release the M2s. This supports the idea that the M2s are just M1s with more cores and clocks.
That's what the more credible rumors indicate. My hunch is that you see the M3-based MacBook Air and iMac later in the spring, possibly at WWDC. It was only 1.5 years (fall 2020 to mid-2022), but you're right in that M2 is mostly a refinement of M1 — albeit a big one for graphics and media editing.


Nobody here has provided any evidence of a shortage of supply. Also, if 128GB chips are rare, then maybe the base model should be 512GB for the M2, and 1TB for the M2 Pro? Also, Apple did reduce the price of their M2 products by $100 compared to the M1s, so that wouldn't make sense either way.
I've pointed to analyst data regarding the supply chain. And you can't argue that Apple is being cheap unless you have evidence to that effect. Now, I'd love it if Apple had upped the base storage in response, but I suspect the margins were going to dip lower than Apple was comfortable with (whether or not they were actually too high is something we'll likely never know).

Apple only lowered the prices for the Mac mini family, and that's likely because it sees the system as a gateway for potential switchers. It's easier to reel someone in with a $599 price than $699, even if you realistically have to spend more to be truly happy. The MacBook Air M2 actually starts $200 above the M1 model (I hope Apple can bring that down for the M3).


Maybe if you're an M1 user then the M2 should be skipped? Would feel bad to buy the M2s when the M3 with 3nm and ARMv9 is just around the corner, but at the same time the plain M3 is meant to replace the plain M2, and therefore shouldn't be superior to the M2 Pro and Max.
That's the prevailing wisdom. The M1 Macs are still very fast, especially those with Pro/Max/Ultra chips. You buy M2 if you have anything else, or if you're a pro who absolutely values the performance gains for certain tasks (say, editing 8K video or many-track audio compositions).

How M3 will stack up to the M2 Pro/Max is unclear, but Apple clearly isn't afraid of generational updates that blur the lines a bit. A Mac mini M2 Pro can sometimes outperform the base Mac Studio (with an M1 Max). If M3 is a more substantial upgrade, I wouldn't be surprised if even the base Macs give current pro computers a run for their money.
 
Nobody here has provided any evidence of a shortage of supply. Also, if 128GB chips are rare, then maybe the base model should be 512GB for the M2, and 1TB for the M2 Pro? Also, Apple did reduce the price of their M2 products by $100 compared to the M1s, so that wouldn't make sense either way.
Toshiba still appears to be Apple's exclusive supplier for TLC NAND, and they don't seem to have a 128Gb module in any of their current parts lineups. Looking at what is available out there, unless I go with some Chinese brands whose names don't even load in my browser (unless they actually used Wingdings in their names), 128Gb TLC NAND currently costs more than the 256Gb options.
It looks like Toshiba and the lot started phasing out the 128Gb modules back in 2020, as there just isn't a lot of demand for 128 or 256GB NVMe drives. The process they use also isn't suitable for PCIe 4 or 5, as the modules can't keep up there, so I expect the 256Gb modules to be disappearing shortly as well. For PCIe 4 and 5 devices they need to use the 96L or 128L modules, which start at 512Gb per chip.
So it looks like Toshiba probably gave Apple a sweet deal on those 256Gb modules, because I can't imagine there is huge consumer demand currently for 1TB and smaller PCIe 3 NVMe drives.
 
Toshiba still appears to be Apple's exclusive supplier for TLC NAND, and they don't seem to have a 128Gb module in any of their current parts lineups. Looking at what is available out there, unless I go with some Chinese brands whose names don't even load in my browser (unless they actually used Wingdings in their names), 128Gb TLC NAND currently costs more than the 256Gb options.
It looks like Toshiba and the lot started phasing out the 128Gb modules back in 2020, as there just isn't a lot of demand for 128 or 256GB NVMe drives. The process they use also isn't suitable for PCIe 4 or 5, as the modules can't keep up there, so I expect the 256Gb modules to be disappearing shortly as well. For PCIe 4 and 5 devices they need to use the 96L or 128L modules, which start at 512Gb per chip.
So it looks like Toshiba probably gave Apple a sweet deal on those 256Gb modules, because I can't imagine there is huge consumer demand currently for 1TB and smaller PCIe 3 NVMe drives.
I scoured Digikey for a bit myself - there are 128GB modules out there, at about the same price as the 256GB modules as well (all controller integrated, which skews pricing quite a bit). Same for ones where the SATA controller is integrated (they're all BGA modules).

There's just no reason to make them when you can make a bigger one for about the same price.
 
Apple is going to release the M3 this year? That's odd considering that it took Apple nearly 3 years to release the M2s. This supports the idea that the M2s are just M1s with more cores and clocks.
No one argued otherwise on the M2? Zen 3 was just Zen 2 with more clocks and optimizations as well.
Maybe if you're an M1 user then the M2 should be skipped? Would feel bad to buy the M2s when the M3 with 3nm and ARMv9 is just around the corner, but at the same time the plain M3 is meant to replace the plain M2, and therefore shouldn't be superior to the M2 Pro and Max.

I doubt anyone (I certainly haven't seen any reviews) is seriously considering upgrading from M1 -> M2, unless it's M1 base to M2 Pro/Max. Just like I didn't bother upgrading my Zen 2 CPUs to Zen 3 - for anything I'm doing the improvement is minimal, but I'd consider upgrading older Zen 1 / Zen+ stuff, or anything older than that. Most reviews even say it's not worth it unless you need a specific thing enabled on M2 that isn't on M1 - it's just the continual, evolutionary growth of the product line (much like half of Intel or AMD's releases). Having two revolutionary products in a row is exceedingly rare.
 
I scoured Digikey for a bit myself - there are 128GB modules out there, at about the same price as the 256GB modules as well (all controller integrated, which skews pricing quite a bit). Same for ones where the SATA controller is integrated (they're all BGA modules).

There's just no reason to make them when you can make a bigger one for about the same price.
I would also point out that vendors usually provide end-of-life dates for components as well (regardless of whether there is a part shortage or not). When a company is designing a new product, they will look at this along with other parameters to decide which part to use. If you're as big as Apple you can probably get vendors to continue manufacturing a certain part, but I would think they would need to have a pretty good reason to do so. Personally, I do not think that increasing the SSD bandwidth of the base Apple MacBook would be a very compelling reason.
 
No one argued otherwise on the M2? Zen 3 was just Zen 2 with more clocks and optimizations as well.


I doubt anyone (I certainly haven't seen any reviews) is seriously considering upgrading from M1 -> M2, unless it's M1 base to M2 Pro/Max. Just like I didn't bother upgrading my Zen 2 CPUs to Zen 3 - for anything I'm doing the improvement is minimal, but I'd consider upgrading older Zen 1 / Zen+ stuff, or anything older than that. Most reviews even say it's not worth it unless you need a specific thing enabled on M2 that isn't on M1 - it's just the continual, evolutionary growth of the product line (much like half of Intel or AMD's releases). Having two revolutionary products in a row is exceedingly rare.
It depends. If you're on an M1 Max and do certain GPU-intensive tasks, the M2 Max is actually more than just a 20% bump, as they not only added more GPU cores but increased the cache on the SoC, which was a huge bottleneck with the M1 Max.
 
It depends. If you're on an M1 Max and do certain GPU-intensive tasks, the M2 Max is actually more than just a 20% bump, as they not only added more GPU cores but increased the cache on the SoC, which was a huge bottleneck with the M1 Max.
Hence the specific thing 😁😁. Sometimes there's a need. But for most of us it's good enough. Same reason I've been sweating out one last year on my 10th-gen and Zen 2 boxes.
 
Seems the new Apple M2s haven't fixed the huge display latency the M1s have. We're talking about displays with over 30ms, compared to most PC displays with 5ms or less. The gaming experience is still horrible. Apple has sacrificed display speed significantly for a better picture, so long as that picture has no motion in it. Even outside of gaming that would be terrible.
 
Seems the new Apple M2s haven't fixed the huge display latency the M1s have. We're talking about displays with over 30ms, compared to most PC displays with 5ms or less. The gaming experience is still horrible. Apple has sacrificed display speed significantly for a better picture, so long as that picture has no motion in it. Even outside of gaming that would be terrible.

Yes, believe it or not, Apple optimizes their displays for accuracy above all else. No one is buying a MacBook to play CS:GO on.
 
Yes, believe it or not, Apple optimizes their displays for accuracy above all else. No one is buying a MacBook to play CS:GO on.
Or moving a window on the screen and the mouse cursor, apparently. The mouse would have huge input lag. Anything in motion is going to look like a blurry mess. This includes web browsing, video playback, and scrolling in general. This also means the high refresh rate is utterly useless on the MacBooks, since there's no reason to go beyond 60Hz or possibly lower. The display is only really good for photo editing and not even for video editing. Took a gamer to find out the display's response time sucks, but it is important to have a screen that doesn't take nearly 1/20th of a second to update.

https://youtube.com/clip/UgkxJUby7u7JY8cPWQ68asqLyJRGi7JzXoeU
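For rough scale, take the ~30ms figure above at face value; the rest is just division. Each refresh delivers a new frame every 1000/Hz milliseconds, so a pixel transition slower than that smears across multiple frames:

```python
# How many refresh intervals a slow pixel transition spans.
RESPONSE_MS = 30.0  # the response-time figure claimed above

for hz in (60, 120):
    frame_ms = 1000.0 / hz  # time per refresh in milliseconds
    print(f"{hz}Hz: {frame_ms:.1f}ms per frame, a {RESPONSE_MS:.0f}ms "
          f"transition spans ~{RESPONSE_MS / frame_ms:.1f} frames")

# At 120Hz a 30ms transition spans ~3.6 frames, so most refreshes show
# pixels still mid-transition - that's the blur being described.
```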
 
I am not sure why video playback would be affected by input lag (input does not affect what happens in a non-real-time video, outside of a bit more time to pause when you press pause), and videos will usually be between 24 and 60 fps. Video looked good, or at least not bad, on LCD TVs with giant input lag.
 
I am not sure why video playback would be affected by input lag (input does not affect what happens in a non-real-time video, outside of a bit more time to pause when you press pause), and videos will usually be between 24 and 60 fps
If the video has a lot of motion, like a sports event, then you will see the blur. Can MacBook users read the text in this video?
 
Can MacBook users read the text in this video?
Considering I can on my terrible third monitor (a 60Hz Acer LCD that was cheap in 2009, plugged in with some cheap DP-to-DVI cable), I would imagine yes. Hardware Unboxed seems to say it manages to be fast enough to match a 60Hz monitor.
 
If the video has a lot of motion, like a sports event, then you will see the blur. Can MacBook users read the text in this video?

Yes, without any issues. In fact, it's smoother and easier to read compared side-by-side with my OLED panel, which has a much higher refresh rate (along with much better pixel response); because the OLED updates so fast, it's actually not fluid looking.

For most content the Apple display is better outside of twitch gaming.
 
If the video has a lot of motion, like a sports event, then you will see the blur. Can MacBook users read the text in this video?

That’s refresh rate not display latency. Display latency is the time between moving your mouse and it showing up on screen. It still displays fine.

Haven’t watched the videos yet to see what the test was
 
The Mac's scrolling is completely fucked without programs running to fix it, so I imagine most Mac users don't really notice anyway.
 
I have no issues editing video using Premiere Pro on a MacBook Pro (M1 Pro). It sure is a lot faster and smoother than it ever was on my 2017 MacBook Pro (Intel).
 
I don't know what the point of this tangent is... well I do, Duke wants a total OS monopoly and the death of choice, but there's no practical point here. It's well-established that modern Macs are plenty powerful, and that people — gasp! — buy computers for reasons other than gaming. A MacBook Pro with an M2 Pro or Max will be a strong media editing machine and general-purpose workhorse, especially if you need to work on battery power.
 