Several macOS Monterey Features Unavailable on Intel-Based Macs

Apple is entirely to blame here: they're forcing obsolescence out of spite and creating artificial market segmentation.

They don't *need* to optimize anything, they just need to stop being slimy and two-faced, but that'll never happen. When they introduced the M1 processor, they also introduced Rosetta 2 and put on a whole dog and pony show about how easy it is to cross-compile applications and have the same functionality on both architectures. Then they do an about-face with all of these new features and the lie that their buzzword "neural" engine is required.

Their "Neural" engine is probably just some GPGPU or customized ARM core with specific extensions to enhance whatever code they want to run. They pulled this same stunt back in the G3 to G4 transition with their "velocity engine", which just turned out to be vector extensions similar to SSE and had little performance implications outside highly specific workloads, which is not a thing here. We won't really know for a few years what it exactly is because they like keeping things a secret.
Apple's Neural Engine is exactly what they say it is. Go find the Core ML API and the rest of their developer info; it's not hard to get a good picture of what the hardware does. No, it's not complicated, and no, they are not full GPU cores... they are actually fairly stripped down, which is what makes them fast.
What all these features are is software that uses the Core ML API... Apple has started shipping Core ML-based features to show off what it can do, and third-party developers will start using it more as well. There really isn't a good way to translate those API calls to a CPU. Could they code them to work on a GPU? Perhaps, but it would probably be drastically slower... and then Apple would run into the other obvious issue of being accused of purposely making things run like ass on Intel. As they likely would.
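For anyone curious what "using the Core ML API" looks like in practice, here's a minimal sketch (the model path is a made-up placeholder): the same prediction call can be pointed at the Neural Engine, the GPU, or the CPU through MLModelConfiguration, and on an Intel Mac there is simply no ANE for it to land on.

```swift
import Foundation
import CoreML

// Minimal sketch. "SomeModel.mlmodelc" is a made-up placeholder, not a real Apple model.
let config = MLModelConfiguration()
config.computeUnits = .all          // use the Neural Engine when present (Apple Silicon);
                                    // on an Intel Mac this quietly falls back to CPU/GPU
// config.computeUnits = .cpuAndGPU // roughly what an Intel Mac is limited to
// config.computeUnits = .cpuOnly   // slowest option, runs anywhere

do {
    let url = URL(fileURLWithPath: "/tmp/SomeModel.mlmodelc")
    let model = try MLModel(contentsOf: url, configuration: config)
    // model.prediction(from:) then runs on whichever compute units were selected.
    print(model.modelDescription)
} catch {
    print("couldn't load model: \(error)")
}
```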

To put it in terms gamers can more easily understand (even though, I know, I know, the M1 is just a crap phone SoC, nothing Intel can't beat, ahem), the situation here would be like this: Nvidia has DLSS and RT hardware on their GPUs. Some games use only a very light bit of RT, and nobody insists that it be coded to run on the regular shader cores (which it could be), and DLSS-quality upscaling could surely run on the shaders as well, if Nvidia were to code it to engage a few GPU cores instead. I mean, performance would blow... but it can still be run on plain GPU cores. They also have very little financial incentive to make their new features work on old cards.

In this situation, yeah, it may seem silly that Apple won't let some 3D globe feature run on anything. As I see it, they are just writing Core ML modules to leverage their new hardware bits. Not to keep going back to Nvidia, but another example would be CUDA. If you wrote something that used CUDA... yes, that same thing could be written to use OpenCL and run on any GPU... but it would run slower, and it would be a pretty good amount of work to rewrite said module to also work with OpenCL.

I don't think this one really falls under planned obsolescence. The M1 actually has some hardware Intel's chips don't have... in the same way Nvidia GPUs have tensor cores and AMD's newest Radeons have RT bits.
 
Lot of Apple haters in here. The only people that care about these "issues" are ones that don't even own Apple products or are 1% of the Apple userbase.

For 99% of Apple users, they have a MacBook or iPhone that they either use until it breaks or replace when a new model comes out, and they go on about their happy lives without thinking about "issues" like this.

It's no different than the enthusiast on this site who upgrades to a new GPU every generation....
 
It's no different than the enthusiast on this site who upgrades to a new GPU every generation....
Now it's a race to see who can price gouge the best though.

Apple user == gaming enthusiast == begging for food
 
Lot of Apple haters in here. The only people that care about these "issues" are ones that don't even own Apple products or are 1% of the Apple userbase.

For 99% of Apple users, they have a MacBook or iPhone that they either use until it breaks or replace when a new model comes out, and they go on about their happy lives without thinking about "issues" like this.

It's no different than the enthusiast on this site who upgrades to a new GPU every generation....
It is rather amusing. I do think we need to hold Apple's feet to the fire on a frequent basis, as it does do things that rub a lot of people the wrong way, but a lot of tech enthusiasts have this weirdly outsized view of their influence, as if they not only reflect the majority opinion but define it.

The boring reality? Limiting a few features to M1 chips isn't going to spur a huge wave of early upgrades, nor will it leave people up in arms. At best, it might prompt people who were already itching to upgrade to make the leap with current products instead of waiting one refresh longer. Most everyone else? They'll hang on to their Mac for several years and upgrade when it's not powerful enough to meet their needs.
 
The only thing I really see here is Apple working to clean up their code base, ditching "legacy" parts in favor of their new and improved blah blah blah blah. The lack of legacy code is sort of what keeps Apple being Apple, so this doesn't surprise me at all, but I figured they would have gone a generation or two longer before cutting that out of the older Intel product lineup.
 
The only thing I really see here is Apple working to clean up their code base, ditching "legacy" parts in favor of their new and improved blah blah blah blah. The lack of legacy code is sort of what keeps Apple being Apple, so this doesn't surprise me at all, but I figured they would have gone a generation or two longer before cutting that out of the older Intel product lineup.
If it is one more nail in the coffin for Intel...

 
It is rather amusing. I do think we need to hold Apple's feet to the fire on a frequent basis, as it does do things that rub a lot of people the wrong way, but a lot of tech enthusiasts have this weirdly outsized view of their influence, as if they not only reflect the majority opinion but define it.

Apple has had a monumental effect on many industries outside their own; there is no outsized view of their influence. Companies around the world haven't been blind to Apple's success over the past 20 years, and many have tried to copy them, including all of their cancerous elements, like the poor repairability of their devices and the other anti-consumer behavior I outlined earlier. It all goes back to Jobs' vision of having a closed box that nobody is allowed to mess with, and he's entirely responsible for that aspect of Apple. During the years when Apple was without Jobs, Apple was actually moving to open up their architecture, and in the late '90s was on the way to making an ATX-compatible standard for their machines. As soon as Jobs came back, all of that was swiftly and decisively killed by him personally.

Aurelius said:
The boring reality? Limiting a few features to M1 chips isn't going to spur a huge wave of early upgrades, nor will it leave people up in arms. At best, it might prompt people who were already itching to upgrade to make the leap with current products instead of waiting one refresh longer. Most everyone else? They'll hang on to their Mac for several years and upgrade when it's not powerful enough to meet their needs.

Just like with the 68k->PowerPC and PowerPC->x86 splits, the M1 is going to leave x86 behind fairly rapidly. People are going to move quickly to M1 machines as application vendors drop x86. But Apple is again going to see an exodus of applications by going to a proprietary architecture. Even though x86 is losing steam, it is still ubiquitous enough that many vendors won't bother porting to the M1, and M1 users will be stuck with emulation, just like in the PowerPC days.

If it is one more nail in the coffin for Intel...

While Intel has been a slimy company over the years, them going under isn't good for anyone. AMD wouldn't have a counterbalance and would turn into Intel. There also aren't many other companies that make x86, you can more or less count them on one finger - VIA. They license a few other companies to make x86 parts, like the DM&P Vortex86 and Zhaoxin.
 
Apple has had a monumental effect on many industries outside their own; there is no outsized view of their influence. Companies around the world haven't been blind to Apple's success over the past 20 years, and many have tried to copy them, including all of their cancerous elements, like the poor repairability of their devices and the other anti-consumer behavior I outlined earlier. It all goes back to Jobs' vision of having a closed box that nobody is allowed to mess with, and he's entirely responsible for that aspect of Apple. During the years when Apple was without Jobs, Apple was actually moving to open up their architecture, and in the late '90s was on the way to making an ATX-compatible standard for their machines. As soon as Jobs came back, all of that was swiftly and decisively killed by him personally.
This is mostly true, and I do wonder why companies copy so much of what Apple does, not just the good stuff (it's also ironic how many of the people who claim Apple isn't innovative also complain that companies are following Apple's lead).

With that said, the end to the open architecture was almost a business necessity at the time. Remember, the Mac clones nearly killed Apple — a company built around hardware sales was losing sales to standardized machines from third-parties, but those machines didn't sell in nearly enough numbers to make up the difference from lost hardware sales. And the software roadmap was a muddled mess, so a mostly software-driven Apple wouldn't have done well at the time.

Jobs might have locked things down, but in doing so he also saved the company by returning it to what it did best (iconic hardware) with a narrower focus. We wouldn't have a viable alternative consumer PC platform (no, Linux doesn't count) these days if it hadn't been for him. I suspect Apple without Jobs would have shared the same fate HP dealt to Palm; the software would've had a second life, but only as a shadow of its former self. And that wouldn't have been good for the tech industry.


Just like with the 68k->PowerPC and PowerPC->x86 splits, the M1 is going to leave x86 behind fairly rapidly. People are going to move quickly to M1 machines as application vendors drop x86. But Apple is again going to see an exodus of applications by going to a proprietary architecture. Even though x86 is losing steam, it is still ubiquitous enough that many vendors won't bother porting to the M1, and M1 users will be stuck with emulation, just like in the PowerPC days.
I don't think the market or software development environments are the same, though. Even for the PPC-to-x86 transition, Apple was a relative underdog in the tech world, didn't have the greatest transition tools, and was switching more for survival than anything. Apple in 2021 is much stronger, has a more robust set of tools and is transitioning because the dominant PC architecture is holding it back. I don't think you'll see a sudden surge of companies porting apps and games that weren't already there, but I don't think Apple will regret this, either. Especially not if Apple Silicon chips progress the way iPhone chips did and become the clearly faster option in most cases.
 
Just like with the 68k->PowerPC and PowerPC->x86 splits, the M1 is going to leave x86 behind fairly rapidly. People are going to move quickly to M1 machines as application vendors drop x86. But Apple is again going to see an exodus of applications by going to a proprietary architecture. Even though x86 is losing steam, it is still ubiquitous enough that many vendors won't bother porting to the M1, and M1 users will be stuck with emulation, just like in the PowerPC days.
AMD just announced their V-Cache technology that is going to give them a 15% boost in IPC and you think x86 is dead? I think Apple's future ARM products are going to have a bad time, not x86. The moves from 68k to PowerPC, PowerPC to x86, and now x86 to Apple Silicon happened because Motorola, IBM, and Intel were each left behind by a competitor. Apple was sick of the 68k and was convinced RISC was the future, so they went with PowerPC. IBM failed to compete with Intel, so Apple went x86. Now that Intel is falling behind, Apple went their own way. The 68k is now history while the Power10 architecture lives on only in someone's fantasy. Because of AMD's and Intel's relationship with x86, it will almost certainly never die out. At some point fairly soon, Apple's ARM chips won't be able to compete with x86, and that's going to force Apple to go with other chip vendors. Probably Nvidia.
While Intel has been a slimy company over the years, them going under isn't good for anyone. AMD wouldn't have a counterbalance and would turn into Intel. There also aren't many other companies that make x86, you can more or less count them on one finger - VIA. They license a few other companies to make x86 parts, like the DM&P Vortex86 and Zhaoxin.
I'm not worried about Intel. I'm worried that when Intel does make something good, they'll increase their prices. Unless someone like Nvidia takes ARM seriously, you'll see a lot of tech companies who pay for the ARM license and "make" their own chips. At this point even a breakfast cereal company could make an ARM chip.
This is mostly true, and I do wonder why companies copy so much of what Apple does, not just the good stuff (it's also ironic how many of the people who claim Apple isn't innovative also complain that companies are following Apple's lead).
They copy Apple for a few reasons. One, because whatever Apple does will sell, so they want to copy it. Good or bad, Apple products sell. Secondly, because Apple is the only company I know that will do bad things to their customers and get away with it, so other companies want to do bad things and get away with it too. Things like no headphone jack, no SD card slot, etc. Finally, because Apple takes care of all the nasty research and development stuff. If Apple is doing it, then you know it must be well tested and proven to sell.
We wouldn't have a viable alternative consumer PC platform (no, Linux doesn't count) these days if it hadn't been for him.
 
AMD just announced their V-Cache technology that is going to give them a 15% boost in IPC and you think x86 is dead?

I never said it was dead, I said it was losing steam. 20 years ago, there was lots of excitement around x86, hundreds of companies made products for x86 and there were always exciting advances going on. Today, x86 is stagnant. The number of companies that still make stuff for the DIY market has dwindled down to a handful in each category. There used to be over a dozen motherboard manufacturers; now we have just four: ASUS, Gigabyte, MSI and ASRock (and ASRock started as an ASUS spin-off, so really three). We used to have brands like Abit, FIC, Shuttle, DFI, Soyo, Soltek, IWill, Biostar, PC Chips, Foxconn, Amptron and more.

It doesn't help that we've been having a worldwide trade war, a pandemic, an <everything> shortage, a mining frenzy and scalpers that peddle parts like contraband. This has turned off many people that could have come into the DIY x86 market because of insane prices and unobtainium parts. If you aren't already in the ecosystem, it's a tall order to get started, because even used parts have skyrocketed in price while availability has cratered.

Who cares if AMD's V-Cache gives you a 15% or 1000% performance increase? It's completely irrelevant when you can't get parts, or they're so expensive that they're completely out of budget. So what if you have this smokin' fast CPU? It doesn't do squat when all you can buy is a 10-year-old GT 210 video card for what midrange cards cost 3-4 years ago.
 
So... if you just shelled out tens of thousands for a Mac Pro, and perhaps even the Apple monitor if you're an extra special person, should you be worried about becoming the redheaded stepchild of the Apple lineup?

For tens...Of...Thousands...Of dollars (bearing in mind prices gain an extra 'outside US tax' the second tech is sold outside the USA). At least you can say it looks pretty I suppose? But is it as pretty as the iPad on a stick called the iMac?
 
So... if you just shelled out tens of thousands for a Mac Pro, and perhaps even the Apple monitor if you're an extra special person, should you be worried about becoming the redheaded stepchild of the Apple lineup?

For tens...Of...Thousands...Of dollars (bearing in mind prices gain an extra 'outside US tax' the second tech is sold outside the USA). At least you can say it looks pretty I suppose? But is it as pretty as the iPad on a stick called the iMac?
It happened back in 2006 when so many customers purchased the Apple Quad G5 from late 2005, and the following year it was almost totally abandoned in favor of x86.
History repeats itself.
 
It happened back in 2006 when so many customers purchased the Apple Quad G5 from late 2005, and the following year it was almost totally abandoned in favor of x86.
History repeats itself.
Just like the Mac Pro trash can in 2013. It was released with great fanfare and... abandoned. After the 2013 release it never got an update, despite multiple assurances that it would. The machine suffered severe overheating issues due to the cramped internals, even worse than the G4 Cube. It wasn't until four years later, in 2017, that Apple admitted it was a terrible design and that it would be abandoned in favor of a rehash of the old G5 Mac Pro case, which finally shipped in 2019. People paid over $3000 for the trash can and were left high and dry. Mind you, the machine was still usable if it didn't brick itself from thermal death, but a high-end workstation is getting pretty long in the tooth after six years.
 
I like my M1 Mac and Monterey is coming with a number of features I am really looking forward to. Can't say the same for any Windows release ever. If my Intel Mac is missing some minor features then so be it.
 
I like my M1 Mac and Monterey is coming with a number of features I am really looking forward to. Can't say the same for any Windows release ever. If my Intel Mac is missing some minor features then so be it.
I run a PC, I don't run Windows, and I'm constantly receiving updates that I look forward to.
 
I never said it was dead, I said it was losing steam.
x86 is clearly not losing steam. If Intel screws up, then AMD exists to take market share away. If AMD screws up, then Intel will be there to take market share. If Apple screws up, then Apple is there to convince you they're still the best. If Apple continues to screw up, then Apple will show you benchmarks about how amazing their products are. x86 has a unique situation that you don't see anyplace else.
20 years ago, there was lots of excitement around x86, hundreds of companies made products for x86 and there were always exciting advances going on.
20 years ago Intel screwed up with the release of the Pentium 4 while AMD's Athlon CPUs were gaining momentum with the XP line of products. Then AMD released the Athlon 64, which brought 64-bit to x86 and a built-in memory controller on the CPU. This isn't too different from today, with Intel screwing up by being stuck on 14nm while AMD is about to release V-Cache technology.
Today, x86 is stagnant.
Seriously, AMD V-Cache. How many times do I gotta say "AMD V-Cache"? I believe ARM has announced L3 cache as well, so we know cache matters to CPU performance.
The number of companies that still make stuff for the DIY market has dwindled down to a handful in each category. There used to be over a dozen motherboard manufacturers; now we have just four: ASUS, Gigabyte, MSI and ASRock (and ASRock started as an ASUS spin-off, so really three). We used to have brands like Abit, FIC, Shuttle, DFI, Soyo, Soltek, IWill, Biostar, PC Chips, Foxconn, Amptron and more.
This is just capitalism doing capitalism-like things. That's unfortunately normal. You have less choice because competition got harder. We used to have 3dfx and VIA/S3 Savage 3D graphics. We used to have Gateway and Compaq computers; they technically still exist, but as rebranded Chinese and HP machines. We used to have Pontiac, Oldsmobile, Saturn, etc.

None of that means the DIY market has dwindled; it just means that consumers have a preference and it wasn't for those companies. In fact the DIY market is doing better than ever, otherwise we wouldn't have a graphics card shortage. Or do you think people are buying graphics cards and sticking them into Apple M1s?
It doesn't help that we've been having a worldwide trade war, a pandemic, an <everything> shortage, a mining frenzy and scalpers that peddle parts like contraband. This has turned off many people that could have come into the DIY x86 market because of insane prices and unobtainium parts. If you aren't already in the ecosystem, it's a tall order to get started, because even used parts have skyrocketed in price while availability has cratered.
So you think people just buy Apple M1s and call it a day? This has moved on to wishful thinking.
Who cares if AMD's V-Cache gives you a 15% or 1000% performance increase? It's completely irrelevant when you can't get parts, or they're so expensive that they're completely out of budget. So what if you have this smokin' fast CPU? It doesn't do squat when all you can buy is a 10-year-old GT 210 video card for what midrange cards cost 3-4 years ago.
I'm sorry, but what does this have to do with anything we're talking about? Last I checked, Apple doesn't make graphics cards, or anything to put a graphics card in. Nor do they have graphics good enough for gaming. So in that regard, forget about Apple.
So... if you just shelled out tens of thousands for a Mac Pro, and perhaps even the Apple monitor if you're an extra special person, should you be worried about becoming the redheaded stepchild of the Apple lineup?

For tens...Of...Thousands...Of dollars (bearing in mind prices gain an extra 'outside US tax' the second tech is sold outside the USA). At least you can say it looks pretty I suppose? But is it as pretty as the iPad on a stick called the iMac?
Those who don't learn from history are doomed to repeat it. Apple users love to repeat history.
It happened back in 2006 when so many customers purchased the Apple Quad G5 from late 2005, and the following year it was almost totally abandoned in favor of x86.
History repeats itself.
I have a PowerBook G4 that isn't getting support from anyone. The few people that give a crap about it have barely gotten Linux to work on it. The difference here is that x86 is well supported, and you will likely see support from Linux for a long time. Assuming Apple lets you install Linux on their newer Intel Macs; they kinda don't, last time I heard.
I like my M1 Mac and Monterey is coming with a number of features I am really looking forward to. Can't say the same for any Windows release ever.
Was it on sale? HA! Old joke but still relevant.
If my Intel Mac is missing some minor features then so be it.
Spoken like someone who works for Apple.
 
So... if you just shelled out tens of thousands for a Mac Pro, and perhaps even the Apple monitor if you're an extra special person, should you be worried about becoming the redheaded stepchild of the Apple lineup?

For tens...Of...Thousands...Of dollars (bearing in mind prices gain an extra 'outside US tax' the second tech is sold outside the USA). At least you can say it looks pretty I suppose? But is it as pretty as the iPad on a stick called the iMac?
Not really. Those people bought Mac Pros because they needed a machine with many cores, gobs of RAM and expandability. They may be tempted to upgrade early, but a lot of the things they bought that workstation for will remain relevant for several years.

And if the Apple Silicon version is better in most respects... well, great. People aren't about to complain that Apple is improving its systems too much.
 
They copy Apple for a few reasons. One, because whatever Apple does will sell, so they want to copy it. Good or bad, Apple products sell. Secondly, because Apple is the only company I know that will do bad things to their customers and get away with it, so other companies want to do bad things and get away with it too. Things like no headphone jack, no SD card slot, etc. Finally, because Apple takes care of all the nasty research and development stuff. If Apple is doing it, then you know it must be well tested and proven to sell.
Oh, Apple definitely isn't the only company that treats customers badly at times but still manages to thrive. Weren't we just talking about how many Android vendors purposefully abandon support early, or scale back updates, unless you buy their flagship phones? Hell, I've seen major vendors who shipped phones that were stuck with the launch OS forever, even if it was already a year old. It's this weirdly classist approach where you're punished for living in the 'wrong' country (or having the 'wrong' economic opportunities) by getting an inferior update experience.
 
x86 is clearly not losing steam. If Intel screws up, then AMD exists to take market share away. If AMD screws up, then Intel will be there to take market share. If Apple screws up, then Apple is there to convince you they're still the best. If Apple continues to screw up, then Apple will show you benchmarks about how amazing their products are. x86 has a unique situation that you don't see anyplace else.

20 years ago Intel screwed up with the release of the Pentium 4 while AMD's Athlon CPUs were gaining momentum with the XP line of products. Then AMD released the Athlon 64, which brought 64-bit to x86 and a built-in memory controller on the CPU. This isn't too different from today, with Intel screwing up by being stuck on 14nm while AMD is about to release V-Cache technology.

Seriously, AMD V-Cache. How many times do I gotta say "AMD V-Cache"? I believe ARM has announced L3 cache as well, so we know cache matters to CPU performance.

This is just capitalism doing capitalism-like things. That's unfortunately normal. You have less choice because competition got harder. We used to have 3dfx and VIA/S3 Savage 3D graphics. We used to have Gateway and Compaq computers; they technically still exist, but as rebranded Chinese and HP machines. We used to have Pontiac, Oldsmobile, Saturn, etc.

None of that means the DIY market has dwindled; it just means that consumers have a preference and it wasn't for those companies. In fact the DIY market is doing better than ever, otherwise we wouldn't have a graphics card shortage. Or do you think people are buying graphics cards and sticking them into Apple M1s?

So you think people just buy Apple M1s and call it a day? This has moved on to wishful thinking.

I'm sorry, but what does this have to do with anything we're talking about? Last I checked, Apple doesn't make graphics cards, or anything to put a graphics card in. Nor do they have graphics good enough for gaming. So in that regard, forget about Apple.

Those who don't learn from history are doomed to repeat it. Apple users love to repeat history.

I have a PowerBook G4 that isn't getting support from anyone. The few people that give a crap about it have barely gotten Linux to work on it. The difference here is that x86 is well supported, and you will likely see support from Linux for a long time. Assuming Apple lets you install Linux on their newer Intel Macs; they kinda don't, last time I heard.

Was it on sale? HA! Old joke but still relevant.

Spoken like someone who works for Apple.

No because an Apple user wouldn't know how to install a graphics card.
 
Was it on sale? HA! Old joke but still relevant.
The irony: Apple hasn't charged for a macOS update since Mountain Lion in 2012, and even before that the upgrades had gotten cheap (I remember buying Snow Leopard for $29). So you've probably paid more for OS upgrades as a Windows PC user over the past decade than any Mac user has!

Yes, this is partly because Apple can fold the cost of software development into hardware sales (every legit macOS copy is attached to a Mac, after all), but Mac users are now like those Linux users... for the most part, it's just a bunch of free goodies that make your computer better.
 
The irony: Apple hasn't charged for a macOS update since Mountain Lion in 2012, and even before that the upgrades had gotten cheap (I remember buying Snow Leopard for $29). So you've probably paid more for OS upgrades as a Windows PC user over the past decade than any Mac user has!

Yes, this is partly because Apple can fold the cost of software development into hardware sales (every legit macOS copy is attached to a Mac, after all), but Mac users are now like those Linux users... for the most part, it's just a bunch of free goodies that make your computer better.
macOS goodies aren't free; you paid for them when you purchased your Mac.
 
macOS goodies aren't free; you paid for them when you purchased your Mac.
Like I said, Apple can fold the cost of macOS software development into hardware sales. And it still means that Windows users are ironically more likely to pay for OS upgrades than Mac users. I'm curious to see if Microsoft will charge for the next version of Windows after years of iterative updates to Windows 10.
 
I run a PC, I don't run Windows, and I'm constantly receiving updates that I look forward to.
I know you said you don't run Windows, but for those that do, Patch Tuesday is like a free choose-your-own-adventure book every month. It's really great of Microsoft to inject so much anticipation and a sense of excitement into their products, and not even charge for it. Will it boot?!? Is the registry corrupt?!? Did it blow away all the customer data?!?
 
Oh, Apple definitely isn't the only company that treats customers badly at times but still manages to thrive. Weren't we just talking about how many Android vendors purposefully abandon support early, or scale back updates, unless you buy their flagship phones?
Yes, and Android is shit too, for a different reason. Apple updates their devices because they have the App Store, where Apple makes money even off older iPhones. That's not the case with Android devices, where only Google makes money off the Play Store, with the exception of Amazon Fire products. This is why Google needs to treat Android like Windows and Linux, where everyone gets updates and they aren't vendor specific. As a PC user it boggles my mind that we have such fragmented OSes for smartphones.
Hell, I've seen major vendors who shipped phones that were stuck with the launch OS forever, even if it was already a year old. It's this weirdly classist approach where you're punished for living in the 'wrong' country (or having the 'wrong' economic opportunities) by getting an inferior update experience.
At least sometimes we got custom ROMs.
 
The irony: Apple hasn't charged for a macOS update since Mountain Lion in 2012, and even before that the upgrades had gotten cheap (I remember buying Snow Leopard for $29). So you've probably paid more for OS upgrades as a Windows PC user over the past decade than any Mac user has!
That depends. If you need to buy an M1-powered Mac to get all the new Monterey features, then the macOS upgrade will cost you over $1k.
Yes, this is partly because Apple can fold the cost of software development into hardware sales (every legit macOS copy is attached to a Mac, after all), but Mac users are now like those Linux users... for the most part, it's just a bunch of free goodies that make your computer better.
I don't remember a Linux distro withholding features.
 
Yes, and Android is shit too, for a different reason. Apple updates their devices because they have the App Store, where Apple makes money even off older iPhones. That's not the case with Android devices, where only Google makes money off the Play Store, with the exception of Amazon Fire products. This is why Google needs to treat Android like Windows and Linux, where everyone gets updates and they aren't vendor specific. As a PC user it boggles my mind that we have such fragmented OSes for smartphones.

Funny, I just went from Android to iOS after 10 or so years because of the update situation.
 
So... if you just shelled out tens of thousands for a Mac Pro, and perhaps even the Apple monitor if you're an extra special person, should you be worried about becoming the redheaded stepchild of the Apple lineup?

For tens...Of...Thousands...Of dollars (bearing in mind prices gain an extra 'outside US tax' the second tech is sold outside the USA). At least you can say it looks pretty I suppose? But is it as pretty as the iPad on a stick called the iMac?

Isn't that what always happens?

This is what 10+ years of Intel stagnation has done to people's expectations of how long their PC should be considered "top end". (Not picking on you, Maz, I do see your point.)

I remember being into computers way back in the '80s... now, I'm not that old, and I admit my parents bankrolled my computer-nerd childhood. But yes, that Amiga was replaced pretty quickly. Not that I had one, but the earliest Macs went from insane to paperweight quicker than I'm sure anyone who bought one liked. 286, 386, 486... yeah, those were rendered e-junk within a few years as well.

We have lived in a time, which we all complain about, where Intel has done nothing but incremental change for a decade. And as much as we bitch about it... we have to know that is why our 5+ year old (hell, sometimes 8 or 9 year old) machines are still just fine for the kids and perhaps even the wives. I suspect that is ending now, with ARM rising and AMD wanting to keep x86 relevant. I expect 2022-2030 or so is going to be very painful for people buying top-of-the-line stuff. I believe 2020 machines are not going to be all that useful in 2025. I think the days of just throwing in a new GPU a couple of times to extend a CPU's lifespan are probably over until the next bout of complacency.
 
Right. Also, once apps (especially games and engines) become fully multi-threaded, performance just scales with however many cores you can throw at it.

Whereas a 4-core system can probably skate by today, in a couple of years we will all have 16- or 32-core systems like it's normal.

Hell, Unreal Engine 5 lists a 12-core CPU as *minimum* requirements (and 32GB RAM too). So once these games come out, all our PCs will be obsolete overnight.
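Rough sketch of what "fully multi-threaded" means in practice (the workload here is obviously made up): chop the work into independent slices and let however many cores the machine has chew on them at once, so throughput scales with core count rather than clock speed.

```swift
import Foundation
import Dispatch

// Toy stand-in for per-frame engine work: the same job split across every core.
let cores = ProcessInfo.processInfo.activeProcessorCount
let totalItems = 4_000_000

DispatchQueue.concurrentPerform(iterations: cores) { core in
    // Each slice takes every Nth item, so the slices are independent of each other.
    var checksum = 0.0
    for i in stride(from: core, to: totalItems, by: cores) {
        checksum += sin(Double(i))   // pretend this is physics/animation/audio work
    }
    print("slice \(core) of \(cores) done (checksum \(checksum))")
}
```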
 
Hell, Unreal Engine 5 lists a 12-core CPU as *minimum* requirements (and 32GB RAM too). So once these games come out, all our PCs will be obsolete overnight.

Well, that requirement is for the development build of the demo running in the UE5 editor, without further optimization for lesser hardware, though. I doubt the minimum requirement will rise above 8 cores for several years, considering that is the base spec for the PS5/XSX. UE5 even supports the Switch(!), btw.
 
True, but it shows that even now there are programs requiring higher specs.

I agree that 8 cores will be the norm for consumers, but for creators I expect the higher core counts and other advances (more RAM, required SSDs, etc.) will start happening sooner than you think.
 
That depends. If you need to buy an M1-powered Mac to get all the new Monterey features, then the macOS upgrade will cost you over $1k.
Ah, but like I noted earlier, the likelihood of someone buying an Apple Silicon Mac just for those features is... slight. Strictly anecdotally, I have a 2019 iMac and I won't be in a rush to buy a new one even if they overhaul the 27-inch model with everything I'd hope for. You'd have to be pretty hardcore to buy a new Mac just for a few minor software features. If you get an M1 system, you get it for the performance and (on laptops) battery life; anything else is gravy.

I don't remember a Linux distro withholding features.
Insert "that's because there aren't many features to start with" joke here.
 
The '00s called, they want their childish Apple stereotypes back.

The only people I've seen who upgrade Macs every couple of years are enthusiasts who'd do that no matter how well their old systems were running. Hell, I'm only replacing my eight-year-old MacBook Pro this year because it won't get major OS updates, and I want to remain current beyond security updates (which I'd still get). It'd still work well enough if I didn't want the latest software features.

Think about that line for a good long minute and then come back to talk about the hypocrisy of your post.

The ONLY time Microsoft EOL'd hardware on their platform is when there was a fundamental change in hardware or processor technology. At no point did they ever say "Notepad will no longer function on i386 platforms" or gate Internet Explorer behind a hardware upgrade or stupid crap like that. There is zero sound reason to force hardware upgrades on people with a list of bullet-point "features" that could easily be done on any CPU. As previously stated, this is wholly a case of Apple's planned obsolescence to get people to buy new hardware. Apple is a hardware company. Their profit motive is to sell hardware, especially since their hardware is NOT a loss leader.
 
Think about that line for a good long minute and then come back to talk about the hypocrisy of your post.

The ONLY time Microsoft EOL'd hardware on their platform is when there was a fundamental change in hardware or processor technology. At no point did they ever say "Notepad will no longer function on i386 platforms" or gate Internet Explorer behind a hardware upgrade or stupid crap like that. There is zero sound reason to force hardware upgrades on people with a list of bullet-point "features" that could easily be done on any CPU. As previously stated, this is wholly a case of Apple's planned obsolescence to get people to buy new hardware. Apple is a hardware company. Their profit motive is to sell hardware, especially since their hardware is NOT a loss leader.
Well, that isn't true. Windows Aero required Pixel Shader 2.0. Windows still ran without a PS 2.0 video card, but Aero did not.

This is no different. Apple has developed an API, Core ML, and the way it's being used here isn't like OpenCL: these features don't target generic GPU or CPU cores, they use Apple's Neural Engine cores. They don't run on Intel because Intel is lacking the actual hardware to run them. Just like you could run Vista and Windows 7 on older machines, but without a PS 2.0 card you didn't get Aero Peek or Flip 3D, etc.

This isn't about EOLing hardware... it's about old hardware simply not having the features in silicon.

You can argue that perhaps Apple should have made Core ML run on any processor, the way OpenCL does. Really, if they had, then the headline would be "Apple drastically reduces the speed of Core ML features on Intel."
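Just to sketch what that hypothetical fallback would look like from a developer's chair (this is not how Apple actually gates these features, just an illustration): you pick a compute target per architecture and accept that the Intel path is the slow one.

```swift
import CoreML

// Hypothetical sketch of a "run it anywhere, just slower" approach.
// Not Apple's actual gating logic; just one way a third-party app could pick a path.
func featureConfig() -> MLModelConfiguration {
    let config = MLModelConfiguration()
    #if arch(arm64)
    config.computeUnits = .all        // Apple Silicon: let the Neural Engine take it
    #else
    config.computeUnits = .cpuAndGPU  // Intel: no ANE, settle for the slower GPU/CPU path
    #endif
    return config
}
```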

I don't know, get used to it folks... we are heading into new waters; there are a lot of new things coming soon to CPUs and GPUs. We have already gotten there with Nvidia: Nvidia has a bunch of stuff they COULD make work on their old cards but restrict to their new cards with tensor cores. DLSS could for sure have a basic non-tensor-core version for older cards (AMD seems to have proven that). Intel is about to big.LITTLE their chips with Alder Lake... and just wait, they are not just going to be using those little cores to reduce power usage, and the interesting things they add will not work on older chips either. AMD is going to go all in on 3D stacking, and I have a feeling we will be seeing interesting things with Zen 4 (Zen 3+ is just more cache... but Zen 4 is going to do something interesting). That is just the standard stuff... Nvidia at some point is going to go in on ARM CPUs as well, and they are going to include bits that are not GPU or CPU cores either; with Nvidia it might just be a few tensor cores, but who knows. Microsoft is probably making their own chips... Samsung has a new RDNA2 ARM chip coming... etc., etc.

Bottom line, as I said earlier in the thread: the last decade of stagnation is ending. It's what we all said we really wanted. So hold on to your money and spend wisely, because everyone is going to be throwing crap at the wall, and some of it will catch fire while some of it will end up having support die within a few years of its launch. I think we are heading into a time we haven't seen since the '90s, when there were 20 different video card companies, 4 or 5 APIs, and just as many ways to make the sausage. The next few years are going to be fun, but a lot of us are going to get burned on expensive gear that ends up abandoned a few years later.
 
Think about that line for a good long minute and then come back to talk about the hypocrisy of your post.

The ONLY time Microsoft EOL'd hardware on their platform is when there was a fundamental change in hardware or processor technology. At no point did they ever say "Notepad will no longer function on i386 platforms" or gate Internet Explorer behind a hardware upgrade or stupid crap like that. There is zero sound reason to force hardware upgrades on people with a list of bullet-point "features" that could easily be done on any CPU. As previously stated, this is wholly a case of Apple's planned obsolescence to get people to buy new hardware. Apple is a hardware company. Their profit motive is to sell hardware, especially since their hardware is NOT a loss leader.
Apple hasn't done moves like that, either... not in recent memory, at least. Apple isn't breaking existing functionality if you don't have its new silicon; it's not requiring a new chip to run the latest version of Safari. These are minor features where it's clearly just using new hardware to do them more efficiently (at least from a development perspective, and possibly from a computational perspective) than with the old hardware. You won't be groaning in agony if you run macOS Monterey on an Intel Mac; you might not even notice what you're missing.

Long support cycles are good in many ways, but one of Microsoft's core problems of the past 15-20 years has been its "legacy support at all costs" mindset — that is, where it crosses over from helping most customers to being desperate to keep every last one. That's why Microsoft has routinely struggled to get some business customers to upgrade, even in the Windows 7 days. You'll still see something like 32-bit support, for example, because some business customer wants to keep running a decades-old database app... never mind that Intel's last 32-bit desktop chip was released in 2002.

Of course Apple's goal is to sell hardware. But if its only concern was to force people to upgrade machines every few years, it wouldn't offer around eight years of major OS updates or limit new Apple Silicon-specific features to minor things that won't realistically prompt many upgrades. Apple is sometimes too aggressive on cutting off support, but it at least acknowledges that there's a balance to be struck between providing meaningful OS support to customers and drawing a technological line in the sand. I'd say it leans a bit more on the aggressive side (Catalina's 64-bit-only support may have come too soon), but it's not nearly as aggressive as you think.
 
So... if you just shelled out tens of thousands for a Mac Pro, and perhaps even the Apple monitor if you're an extra special person, should you be worried about becoming the redheaded stepchild of the Apple lineup?

For tens...Of...Thousands...Of dollars (bearing in mind prices gain an extra 'outside US tax' the second tech is sold outside the USA). At least you can say it looks pretty I suppose? But is it as pretty as the iPad on a stick called the iMac?
I'm sure all five of those guys are going to cry in their beer now.
 
I'm sure all five of those guys are going to cry in their beer now.
The monitors and the Pros actually sold well. Their pro monitors are actually cheap compared to what they were up against at that resolution and color accuracy. I know a good number of places that bought the monitors for their PC workstations because they were cheaper and better than the existing alternatives in that class. As for the Pros, as ugly as they were, those custom GPUs paid for themselves if you had the appropriate workloads for them, and their price/performance ratios were actually really good if you needed them for the work you were doing. But if you are not putting that hardware to work on the specifics it was designed for (which is different from who it was directly marketed to), then sweet Jesus those things were expensive.

You think the stands for the monitors were expensive? You should look at the roller feet for the towers... The stands were never meant to be sold in volume, because anybody buying those already has an ergonomic desk and the displays are VESA mounted, and the only people buying the roller feet were offices that had union cleaning staff, so they could move the towers easily to clean around or behind them.
 