Apple to release first ARM Mac without Intel processor in next 18 months

I think the other thing that people are missing here in regard to gaming is that it will go both ways. Imagine a desktop platform that is also your mobile platform. If games with "A"-level quality come to macOS (perhaps upwards of AAA, though I don't think CoD will be missed by most Mac/iOS users), they'll also be usable on iOS. You could have "cross-platform" iOS/macOS games that are AAA and let you pick up on one and play on the other. There isn't another system close to that other than the Switch. The benefit is that more iPhones are sold in a year than all three current-gen consoles have sold from launch to date combined. Meaning that, if done correctly, this could let devs sell $60 phone games that can then be played on the desktop, etc. All it will require is a controller, for which Apple already has a built-in SDK. The only other piece necessary is having powerful enough hardware and/or different graphics settings on different tiers of hardware.
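As a rough illustration of how little glue that controller piece needs, here's a minimal sketch against the GameController framework Apple already ships; the "Jump" action is just a placeholder, not anything from a real game:

```swift
import GameController

// Minimal sketch: wire the A button of whichever gamepad shows up to a game action.
func attach(_ controller: GCController) {
    guard let gamepad = controller.extendedGamepad else { return }
    // The same handler works on iPhone, iPad, Apple TV, and Mac.
    gamepad.valueChangedHandler = { pad, element in
        if element == pad.buttonA, pad.buttonA.isPressed {
            print("Jump")   // placeholder for a game-specific action
        }
    }
}

// Handle controllers that are already paired as well as ones that connect later.
GCController.controllers().forEach(attach)
let token = NotificationCenter.default.addObserver(
    forName: .GCControllerDidConnect, object: nil, queue: .main
) { note in
    if let controller = note.object as? GCController { attach(controller) }
}
_ = token  // hold onto the observation for the life of this sketch
```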

That can be a double-edged sword though. Two problems it can potentially have:

1) If the games are being designed for both platforms, that means lowest common denominator. So you get games that aren't as good and don't take full advantage of the desktop because they also want to run on mobile. That isn't a killer issue... but it is an issue. For two games that suffered from that, see XCOM and Shadowrun. Both were designed to run on mobile, and had mobile versions released, and both had limitations they otherwise wouldn't have had, limitations that got lifted later. Shadowrun: Hong Kong has much larger and more detailed levels because it didn't have to hit the same memory targets to work on mobile. Likewise, XCOM 2 saw more of a jump than you'd expect in graphics and interface design because there wasn't a mobile target.

2) Mobile games can poison the well. Let's face it: the mobile gaming market is utter shit. There are so few good titles that they are almost entirely drowned out by low-effort clones, aggressive pay-to-win bullshit, and so on. It is part of why the Switch has been able to succeed: the tablet/mobile market is so bad that many gamers who want to game on the go want something with better games. The same could happen if mobile games come to the desktop. The large amount of crap drowns out the good stuff, and the developers of the good stuff get discouraged and go elsewhere.
 
I think the issue there is going to come down to the limitations of the GPU more than the CPU, much as it is with current Apple workstations still using Intel Iris iGPUs or low-end AMD GPUs.
On all but the Mac Pro, the GPU is currently the biggest limiting factor, at least for gaming, on those workstations.

The 27" iMac has had reasonably competitve graphics cards since 2009. Right now it has a Vega 48 or RX580. But it previously had 395x, 295x, 7970, and 6970.
The 15" Macbook Pro has also generally had at least somewhat decent graphics. Nothing 1070 level for sure, but decent if people were willing to pay for discreet. Currently the 5500M, but Vega 20 before that, and RX460 Before that.

The other option is eGPUs, which Apple more or less supports better than any other platform. I comfortably use a Radeon VII at home on my 2019 15" MacBook Pro. I think the real buy in Apple's lineup that people kind of overlook is actually the Mini. It unfortunately hasn't been upgraded in a bit (still tops out on an 8600k), but pair it with any GPU you want and it screams, especially considering how small a form factor it is. It's basically a CPU and RAM in a box that you can attach all manner of peripherals to via Thunderbolt 3 (which I suppose is directly comparable to a higher-end Intel NUC).
 
Not really... Metal can leverage Afterburner, Intel, AMD, and Apple's own silicon. NVIDIA is the only odd one out. Apple's silicon is more than capable of taking on NVIDIA if it is scaled up / thermal limits are removed.
I think you missed my point, which was that most Apple workstations are using low-end Intel iGPUs (or low-end AMD GPUs), which are low-power and not meant for gaming-level performance, that's all. :)
I do agree with everything you are saying, though.
 
That can be a double-edged sword though. Two problems it can potentially have:

1) If the games are being designed for both platforms, that means lowest common denominator. So you get games that aren't as good and don't take full advantage of the desktop because they also want to run on mobile. That isn't a killer issue... but it is an issue. For two games that suffered from that, see XCOM and Shadowrun. Both were designed to run on mobile, and had mobile versions released, and both had limitations they otherwise wouldn't have had, limitations that got lifted later. Shadowrun: Hong Kong has much larger and more detailed levels because it didn't have to hit the same memory targets to work on mobile. Likewise, XCOM 2 saw more of a jump than you'd expect in graphics and interface design because there wasn't a mobile target.

Yes and no. I think generally the way it will play out is automated graphics settings. This is more or less what happens in games you've brought up like XCOM: on an older iPhone 5S it wouldn't render some shadows or use as much dynamic lighting, but it still allowed people to play. I think Civ VI is probably the big shining example of what this could look like if you want something that actually exists in the world. Like that, but more. Considering the power of current APUs, I'm not worried. The A10 is plenty quick, let alone the A12/A12X/A13 and whatever they're going to put into an Intel-replacement laptop. We're already WAAAAAY past Switch levels of graphics hardware. The Nintendo Switch pales in comparison to the 2018 iPad Pro in terms of graphics performance.
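To make the "automated graphics settings" idea concrete, here's a hedged sketch of tiering off Metal's GPU-family checks; the cutoffs are illustrative guesses, not anything Apple or any shipping game prescribes:

```swift
import Metal

enum GraphicsTier { case low, medium, high }

// Pick a settings tier from the GPU's Metal feature-set family.
// These cutoffs are illustrative only, not a recommendation from Apple.
func graphicsTier(for device: MTLDevice) -> GraphicsTier {
    if device.supportsFamily(.apple6) || device.supportsFamily(.mac2) {
        return .high        // A13-class Apple GPUs, or desktop-class Mac GPUs
    } else if device.supportsFamily(.apple4) {
        return .medium      // roughly A11-era iPhones/iPads and newer
    } else {
        return .low         // older devices: skip some shadows, dynamic lighting, etc.
    }
}

if let device = MTLCreateSystemDefaultDevice() {
    print("Using \(graphicsTier(for: device)) settings")
}
```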

2) Mobile games can poison the well. Let's face it: the mobile gaming market is utter shit. There are so few good titles that they are almost entirely drowned out by low-effort clones, aggressive pay-to-win bullshit, and so on. It is part of why the Switch has been able to succeed: the tablet/mobile market is so bad that many gamers who want to game on the go want something with better games. The same could happen if mobile games come to the desktop. The large amount of crap drowns out the good stuff, and the developers of the good stuff get discouraged and go elsewhere.
It is only that way until it's not, though. And you want to be ahead of the curve, not behind it. Let's be honest, Nintendo and Atari were in the realm of nerds. They were boxy, hard to play on, and had limited libraries. But things mature and suddenly (35 years later) a greater percentage of the population is gaming than ever. The mobile market will flip when the value proposition changes. We literally have a thread about E.T. copies getting pulled out of the New Mexico landfill or whatever. So yeah, shovelware has existed on every platform. But to say that the past dictates the future is to ignore how different the future is. The landscape is changing.
We aren't where we were 10 years ago when the iPhone launched. The fact that XCOM and Civ VI are available on iOS is a testament to that, let alone the older AAA games that are getting ported over for nostalgia reasons. People are playing every SNES Square RPG on iOS as well as hardcore RPGs like KOTOR. I think those games really were a test bed for game devs to see if this was viable, and I think history will eventually prove that it is. Perhaps not this year, but who can say in 18 months?
 
The other option is eGPUs, which Apple more or less supports better than any other platform. I comfortably use a Radeon VII at home on my 2019 15" MacBook Pro. I think the real buy in Apple's lineup that people kind of overlook is actually the Mini. It unfortunately hasn't been upgraded in a bit (still tops out on an 8600k), but pair it with any GPU you want and it screams, especially considering how small a form factor it is. It's basically a CPU and RAM in a box that you can attach all manner of peripherals to via Thunderbolt 3.
You know, using an eGPU over Thunderbolt is a really good idea. While I think the average Mac user (mainly focused on productivity) wouldn't really need it, it is certainly a viable option.
It will be interesting to see whether that compatibility continues with their ARM-based workstations and how games (assuming they are available for the platform) will perform in such a setup.

Good call!
 
I use Macs for work but all my gaming is on a PC. I was interested to learn that Civ VI was available on the iPad/iPhone but then I found out that:
  1. A purchase on Valve/Steam does not give you the right to play it on iPad/iPhone, unlike how Steam games are cross-playable with one purchase if there's a Mac and PC version
  2. The iPad/iPhone version of Civ VI cannot be played with the Mac and PC versions
As long as cross-platform play remains neutered and I have to purchase separate versions for each platform, I won't be considering iOS/macOS games.
 
It works... but that doesn't mean it's a good idea. Thunderbolt is still limited in terms of bandwidth.

More of a last resort sort of thing than something to plan for.
Getting an Apple workstation for productivity along with an eGPU for occasional gaming costs much less overall than getting a Mac Pro, or another PC just for gaming.
I do agree with you on the limited data transfer rate, though Thunderbolt 3 is still about 1 GB/s faster than a PCI-E 3.0 x4 slot, which is also feasible for average (not high-end) gaming and workloads.

I won't get into use-case scenarios for this on an Apple workstation; I'm just going with the assumption that macOS is required for the user's productivity.
Maybe "good idea" wasn't quite right - "serviceable idea" would be more accurate. ;)
 
It works... but that doesn't mean it's a good idea. Thunderbolt is still limited in terms of bandwidth.

More of a last resort sort of thing than something to plan for.

It's a great idea really: maintain a laptop's true purpose as a battery-powered portable with a GPU optimised for running off a 65W-95W charger etc. Then you get home or work and dock IO/display/GPU for more flexibility.

Most high-powered NVIDIA laptops can't run off a plane charger because of the power requirements.
 
Then you get home or work and dock IO/display/GPU for more flexibility.

Flexibility, sure; but as with the laptop itself, you're overpaying for performance. Significantly so, when you consider the price of all the kit needed to make a high-performance laptop dockable compared to a desktop that does the same thing...

Yeah.

You're paying for convenience, and I can see it being 'worth it', just don't want to beat around the bush as to what you're getting, at least if/until Thunderbolt increases in bandwidth closer to PCIe.
 
Flexibility, sure; but as with the laptop itself, you're overpaying for performance. Significantly so, when you consider the price of all the kit needed to make a high-performance laptop dockable compared to a desktop that does the same thing...

Yeah.

You're paying for convenience, and I can see it being 'worth it', just don't want to beat around the bush as to what you're getting, at least if/until Thunderbolt increases in bandwidth closer to PCIe.

Depends what you define as performance. Being able to edit 4K video timelines in real time on a plane, compared to a Windows laptop that flat out can't without tripping the power limit, means you are 100% more productive.

OS stability is also a “performance” metric for me.
 
It works... but that doesn't mean it's a good idea. Thunderbolt is still limited in terms of bandwidth.

More of a last resort sort of thing than something to plan for.
Whether you're on a Mac or a PC, it's the only way to have a single-system setup that delivers speed. It's significantly less expensive than having both a high-powered laptop and a high-powered desktop, no matter how you slice it.
The performance penalty is negligible. I imagine with PCIe 4 and beyond it won't even matter, although I would argue that it doesn't even matter now. The tipping point is already there.
 
ARM is going to provide a cheaper product with a shorter life cycle. They are going to gain more market share by dropping the price, and the refresh cycle will increase 50+%.
 
You're paying for convenience, and I can see it being 'worth it', just don't want to beat around the bush as to what you're getting, at least if/until Thunderbolt increases in bandwidth closer to PCIe.
Thunderbolt 3 runs at 40 Gbps, or ~5 GB/s.
PCI-E 3.0 x4 is about 4 GB/s, and the performance drop from it (due to the reduced bandwidth) is less than 10% compared to PCI-E 3.0 x8 or x16.
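For anyone who wants the back-of-envelope math behind those figures (nominal link rates only; real-world Thunderbolt 3 reserves part of that 40 Gbps for DisplayPort and protocol overhead):

```swift
// Nominal link-rate arithmetic only; sustained throughput is lower on both links.
let tb3GBps  = 40.0 / 8.0                          // 40 Gbps              -> 5.0 GB/s
let pcieGBps = 4.0 * 8.0 * (128.0 / 130.0) / 8.0   // 4 lanes @ 8 GT/s,
                                                   // 128b/130b encoding   -> ~3.94 GB/s
print(tb3GBps, pcieGBps, tb3GBps - pcieGBps)       // ~1 GB/s in Thunderbolt's favor
```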

I do agree that while it is not the ideal setup, the option (for a price) is there, and it is definitely serviceable and viable for the niche group that requires such a setup.
It is definitely cheaper than going with a desktop-replacement laptop (huge and expensive) or buying a secondary gaming desktop, all while allowing the user to have a smaller, more portable laptop and the ability to run macOS.

Again, it is definitely a niche group, but the option to have it is definitely a good one. :)
 
ARM is going to provide a cheaper product with a shorter life cycle. They are going to gain more market share by dropping the price, and the refresh cycle will increase 50+%.
Why do you think that?
 
ARM is going to provide a cheaper product with a shorter life cycle. They are going to gain more market share by dropping the price, and the refresh cycle will increase 50+%.

I highly doubt that. It generally takes Apple a year to release a new phone, and other devices take longer. The iPad Pro is set to be refreshed this year and it's been around for almost 2 years. Whatever their "extra higher end" ARM chips look like, they will certainly take more dev time than you're letting on.
All things being equal, though, I believe that making their own ARM chips in-house will make the update schedule significantly more consistent than it has been in the past. Right now only the iPhone has an expected release window every year, while when the rest of the Mac product line will see a refresh is anyone's guess. Not needing to depend on Intel or any other company for launch timing, or to have them in the supply chain at all, means they could launch their new laptops (as an example) every year at WWDC.

Releasing 50% more often doesn't actually benefit them either, at least in terms of making more money. That would mean 50% more redesigns and 50% more changes in manufacturing, increasing their costs accordingly. It's far more profitable to have a single product you can charge the same price for as long as possible. Apple for the most part tries to strike a balance in between: they generally do a refresh when they feel the tech is different enough, or adds enough benefit/performance for the user, to justify an update. The other half of that, like I mentioned, is the design/procurement time.
 
I highly doubt that. It generally takes Apple a year to release a new phone, and other devices take longer. The iPad Pro is set to be refreshed this year and it's been around for almost 2 years. Whatever their "extra higher end" ARM chips look like, they will certainly take more dev time than you're letting on.
All things being equal, though, I believe that making their own ARM chips in-house will make the update schedule significantly more consistent than it has been in the past. Right now only the iPhone has an expected release window every year, while when the rest of the Mac product line will see a refresh is anyone's guess. Not needing to depend on Intel or any other company for launch timing, or to have them in the supply chain at all, means they could launch their new laptops (as an example) every year at WWDC.

Releasing 50% more often doesn't actually benefit them either, at least in terms of making more money. That would mean 50% more redesigns and 50% more changes in manufacturing, increasing their costs accordingly. It's far more profitable to have a single product you can charge the same price for as long as possible. Apple for the most part tries to strike a balance in between: they generally do a refresh when they feel the tech is different enough, or adds enough benefit/performance for the user, to justify an update. The other half of that, like I mentioned, is the design/procurement time.

I am sorry; when I say refresh I don't mean more releases, I mean people replacing their existing MBPs. Regular consumers hang on to their MBPs for 5+ years, and even the resale value of a 5+ year old MBP isn't that bad. ARM won't age as gracefully with software/OS updates.
 
I highly doubt that. It generally takes Apple a year to release a new phone, and other devices take longer. The iPad Pro is set to be refreshed this year and it's been around for almost 2 years. Whatever their "extra higher end" ARM chips look like, they will certainly take more dev time than you're letting on.
All things being equal, though, I believe that making their own ARM chips in-house will make the update schedule significantly more consistent than it has been in the past. Right now only the iPhone has an expected release window every year, while when the rest of the Mac product line will see a refresh is anyone's guess. Not needing to depend on Intel or any other company for launch timing, or to have them in the supply chain at all, means they could launch their new laptops (as an example) every year at WWDC.

Releasing 50% more often doesn't actually benefit them either, at least in terms of making more money. That would mean 50% more redesigns and 50% more changes in manufacturing, increasing their costs accordingly. It's far more profitable to have a single product you can charge the same price for as long as possible. Apple for the most part tries to strike a balance in between: they generally do a refresh when they feel the tech is different enough, or adds enough benefit/performance for the user, to justify an update. The other half of that, like I mentioned, is the design/procurement time.

You do know this is all rumor about the switch to ARM?

Apple designs their products to have an end of life. They want to sell you a new one every year on phones, every 2 years on Macs/laptops if they can get away with it. Engineering costs are small fries compared to getting people to ditch a 2-year-old Mac.
 
You do know this is all rumor about the switch to ARM?

Apple designs their products to have an end of life. They want to sell you a new one every year on phones, every 2 years on Macs/laptops if they can get away with it. Engineering costs are small fries compared to getting people to ditch a 2-year-old Mac.
I am sorry; when I say refresh I don't mean more releases, I mean people replacing their existing MBPs. Regular consumers hang on to their MBPs for 5+ years, and even the resale value of a 5+ year old MBP isn't that bad. ARM won't age as gracefully with software/OS updates.
You two need to read each other's posts.

But anyway, to directly oppose what has been stated: Apple really doesn't attempt to sell you new things constantly. In fact, iOS devices last significantly longer than those from ANY Android manufacturer. As in, they receive far more support and OS updates than anyone else's. Most iPhones now receive support for 5+ years (currently we have 6 years of phones, from the SE and iPhone 6S and up, that are supported). That isn't just bug fixes, that's also OS versions. The irony is that it's more like Samsung wants you to buy every 2 years. They are the ones who will barely release 1-2 OS updates for a phone and likely only offer 1 year of support. And Android is the OS that is fragmented, with poor ability to push updates because every manufacturer needs to put their skin on things and then takes their time getting their update validated. It's not uncommon for Samsung devices to be 6 months or more behind Google in terms of updates. Frankly, why anyone buys anything other than a Pixel on Android is beyond me.
In terms of what Apple does for Mac sales, it's very similar. I used my 2011 iMac for close to 9 years. It took that long to just barely reach what Apple calls legacy hardware. Frankly, if I had bought one year later (2012), the machine would likely still be good for another 3-4 years, as the only reason for removal of support on the 2011 was its inability to use Metal. The 2012s are still fully supported at this point, and until anything without Metal 2 becomes legacy, that will likely stay true for some time. The 2009 Mac Pro can still be updated with new graphics cards and as a result still chugs along just fine. Another thing: there is definitely a difference between a legacy-status system and an "unusable" system. A lot of folks are still running older versions of macOS intentionally. Even my "old" 2011 iMac will continue to run just fine provided it has the hardware capability for whatever I'm trying to do.

tl;dr: No one is buying a new Mac every two years except companies and organizations that want to issue a new laptop to their employees every 2 years (although most sys-admin policy is usually 3, and that would be done REGARDLESS of whether the user was using a PC or a Mac) or enthusiasts who really like to have the latest hardware.
It is an incredibly dumb assertion that you "must" buy a new Mac or iPhone constantly. And frankly it isn't true any more than it would be true of any other piece of hardware.

Regarding ARM's longevity, I don't think that can be true either. The move to ARM would be because Apple wants to make better, faster, less energy-hungry hardware. Certainly Apple's ARM architecture out of the gate will be better than any and all entry-level laptop processors. And if that's the case, why would it age more poorly than an Intel one? Can you see how logically that makes no sense? More to the point, people still have ancient iOS devices around that they use. Granted, the iPad 2 (2011) and iPad Mini 2 (2013) aren't supported in iOS anymore, but there are plenty of people who still use them as YouTube and browsing devices.
Considering that current ARM processors made by Apple are orders of magnitude faster than the A4/A5, it's only going to favor the consumer, especially compared to Intel's stagnated chip lineup. The A12X is already capable of editing 4K video and playing it back in real time. That is not something most laptops under $2000 can even do. And that isn't even considering that whatever Apple puts into a Mac will be a full-fat, laptop-level ARM chip, whereas the A12X is a low-power, low-heat, passively cooled tablet chip.
 
Are these ARM units really fast enough to be taken seriously?

For a Mac user, most of whom are using MacBook Airs, sure. They don't need much. I have used DeX with Samsung phones and it's perfectly viable. Now if you want to talk about high-end gaming or Photoshop or other intensive tasks, it's not really there yet, but it's getting close. Ultimately though, it's more a question of how long and how rough the transition of software will be. For MS and Windows that transition is painfully long and slow. For Apple it's really going to be fast; average users are just riff-raff they can throw around and who will put up with any major change Apple makes. Professional users will keep shelling out big bucks for Intel CPUs. The real loser here is Google: they have no real app development or market share to speak of for Chrome, so they really should be pushing down this road faster than anyone given it's their only real avenue into the laptop and desktop space. But for some reason they are just slow.
 
You two need to read each other's posts.

But anyway, to directly oppose what has been stated: Apple really doesn't attempt to sell you new things constantly. In fact, iOS devices last significantly longer than those from ANY Android manufacturer. As in, they receive far more support and OS updates than anyone else's. Most iPhones now receive support for 5+ years (currently we have 6 years of phones, from the SE and iPhone 6S and up, that are supported). That isn't just bug fixes, that's also OS versions. The irony is that it's more like Samsung wants you to buy every 2 years. They are the ones who will barely release 1-2 OS updates for a phone and likely only offer 1 year of support. And Android is the OS that is fragmented, with poor ability to push updates because every manufacturer needs to put their skin on things and then takes their time getting their update validated. It's not uncommon for Samsung devices to be 6 months or more behind Google in terms of updates. Frankly, why anyone buys anything other than a Pixel on Android is beyond me.
In terms of what Apple does for Mac sales, it's very similar. I used my 2011 iMac for close to 9 years. It took that long to just barely reach what Apple calls legacy hardware. Frankly, if I had bought one year later (2012), the machine would likely still be good for another 3-4 years, as the only reason for removal of support on the 2011 was its inability to use Metal. The 2012s are still fully supported at this point, and until anything without Metal 2 becomes legacy, that will likely stay true for some time. The 2009 Mac Pro can still be updated with new graphics cards and as a result still chugs along just fine. Another thing: there is definitely a difference between a legacy-status system and an "unusable" system. A lot of folks are still running older versions of macOS intentionally. Even my "old" 2011 iMac will continue to run just fine provided it has the hardware capability for whatever I'm trying to do.

tl;dr: No one is buying a new Mac every two years except companies and organizations that want to issue a new laptop to their employees every 2 years (although most sys-admin policy is usually 3, and that would be done REGARDLESS of whether the user was using a PC or a Mac) or enthusiasts who really like to have the latest hardware.
It is an incredibly dumb assertion that you "must" buy a new Mac or iPhone constantly. And frankly it isn't true any more than it would be true of any other piece of hardware.

Regarding ARM's longevity, I don't think that can be true either. The move to ARM would be because Apple wants to make better, faster, less energy-hungry hardware. Certainly Apple's ARM architecture out of the gate will be better than any and all entry-level laptop processors. And if that's the case, why would it age more poorly than an Intel one? Can you see how logically that makes no sense? More to the point, people still have ancient iOS devices around that they use. Granted, the iPad 2 (2011) and iPad Mini 2 (2013) aren't supported in iOS anymore, but there are plenty of people who still use them as YouTube and browsing devices.
Considering that current ARM processors made by Apple are orders of magnitude faster than the A4/A5, it's only going to favor the consumer, especially compared to Intel's stagnated chip lineup. The A12X is already capable of editing 4K video and playing it back in real time. That is not something most laptops under $2000 can even do. And that isn't even considering that whatever Apple puts into a Mac will be a full-fat, laptop-level ARM chip, whereas the A12X is a low-power, low-heat, passively cooled tablet chip.

Well, Apple just today agreed to a $500 million penalty payout in court over deliberately slowing older phones without informing consumers, in violation of law.

Apple has been described as an anti-consumer company: they sell locked down hardware with few options to upgrade, not counting the new $6k base Mac Pro. In this age of eco-awareness, climate change and conservation, forced obsolescence isn't the direction we need right now. Apple is presently facing a class-action lawsuit for using children in sourcing of rare earth metals in the Third World, which includes the death of children in mines. Apple may also be complicit in having its iPhones used to track down Uighur Muslims in China thereby facilitating genocide.

I don't know if these things mean anything to you, but they do to me. I rewarded HP with my money for building the ZBook 17 G5 mobile workstation that is completely modular, completely upgradeable, built to military standards and will limit the need for me to keep replacing things in a time when we need to restrict wanton consumerism. We have a planet to save.
 
Well, Apple just today agreed to a $500 million penalty payout in court over deliberately slowing older phones without informing consumers, in violation of law.
I'd like to see the citation for most of this, including what you've written below. But for this one in particular I'm fairly certain it was in relation to older batteries.
Apple, trying to preserve battery life and prevent iPhones from shutting down, throttled them as a preventative measure, because for most people a slower phone is better than a phone that turns off under load. However, general consumers don't see it that way; despite a battery replacement fixing all the performance issues, Apple was 'at fault'. The law rules as it rules. But if that's the issue you're talking about, I'm unsympathetic. I had one of these phones, an iPhone 5S. I replaced the battery myself and then just used it for another 2 years. Big whoop. Problem solved.

Apple has been described as an anti-consumer company:
By whom? And by what definition?

they sell locked down hardware with few options to upgrade, not counting the new $6k base Mac Pro.
Most people do not upgrade their computers at all, referring to anything that is built by an OEM. It wouldn't matter if every part in every computer was 100% modular. It doesn't matter if it's Dell, HP, or Apple in this case.
Most businesses/sysadmins replace their hardware every 3 years. Regardless of if it's PC or Mac. And especially in those cases, they aren't upgrading their machines.

So the question becomes: what is more relevant, upgrade-ability or other factors in terms of cost of ownership, form factor, support, OS stability, updates, etc? You got what you wanted: a modular machine. Other people weigh other factors as more important. That's why we have this thing called: choice.

In this age of eco-awareness, climate change and conservation, forced obsolescence isn't the direction we need right now. Apple is presently facing a class-action lawsuit for using children in sourcing of rare earth metals in the Third World, which includes the death of children in mines. Apple may also be complicit in having its iPhones used to track down Uighur Muslims in China thereby facilitating genocide.
There isn't forced obsolescence. This has been covered more than once. That's a talking point. You'd have to actually prove that. But the short version is: most Mac hardware has greater longevity than PC hardware due to a more optimized OS and better base specs. Apple's minimums are generally PC midrange, with the exception of the MacBook, which uses an ultra-low-power CPU.
iPhones last more than 5 years. There isn't a single 5-year-old Android device that I would use, but I would happily use an iPhone 6s+, and did, literally, until this past May. The only reason I upgraded was that the trade-in value was stupidly high and I had broken my camera, which they didn't care about.

On rare earth metals: guess what, if you're in the electronics manufacturing space you're using rare-earth metals. No one is sinless in this category. There isn't such a thing as "fair trade" rare-earth metals. China controls a massive percentage of the supply and there isn't any way around the decisions they choose to make in mining it.

As for genocide, that's a high claim. There's an incredibly high burden of proof necessary for that. Most of everything you're saying is more feelings based than factual. Saying "may have" is a waste of time.

I don't know if these things mean anything to you, but they do to me. I rewarded HP with my money for building the ZBook 17 G5 mobile workstation that is completely modular, completely upgradeable, built to military standards and will limit the need for me to keep replacing things in a time when we need to restrict wanton consumerism. We have a planet to save.
That's fine. Buying any product is up to the one who is using it. But buy your laptop because it does what you need it to do and it will work into the future like you need it to. Buy it because it has the OS you want and or need. Buy it because it supports the software you need to run.
If you think that HP and Microsoft, along with all the other IC companies involved in the manufacture of that laptop (likely companies like Foxconn and Samsung), are all 100% clean, you're a fool. And as modular as your machine is, eventually it too will land in the garbage bin. But to your note about saving the planet, at least Apple recycles literally every part of their machines through their recycling programs and builds their machines using the least amount of carbon and toxic chemicals possible, standards which no other company has met.
https://www.apple.com/environment/
Feel free to scroll to the bottom and look at each product's environmental report card. Also feel free to show me HP's.

So if you don't want to buy a Mac, that's fine. But it seems to me that you're not weighing other manufacturers in the same way that you are weighing Apple. Apple has a lot of spotlights on it. It happens when you're at the top of any industry. Any piece of "news" is a "big deal" when it's about Apple. There is no hype machine at that level for HP. So their dirty deeds are far more likely to be hidden. And if obscurity is enough for you to feel comfortable, fine. But when you're flinging mud just be prepared to also get dirty.
 
You do know this is all rumor about the switch to ARM?

Apple designs their products to have an end of life. They want to sell you a new one every year on phones, every 2 years on Macs/laptops if they can get away with it. Engineering costs are small fries compared to getting people to ditch a 2-year-old Mac.

Quit the BS; that is all unsubstantiated garbage. iPhones typically see software support for 5-6 years (and they have no requirement to do so) and my 5-year-old MacBook still supports the latest Catalina OS.

If Apple wanted to really make your phone a brick, they would cut all software updates after 24 months like Android.
 
Apple will not adopt Vulkan, as all it accomplishes is fragmenting their environment and splitting developers' work. Everyone writes for Metal and all hardware except NVIDIA (idiots) supports it.

It's got nothing to do with fragmentation, supporting standards that are already in place is far from fragmentation. It's got everything to do with the fact that Apple want to be the next Sun Microsystems with their locked down API.

If Apple wanted to really make your phone a brick, they would cut all software updates after 24 months like Android

It's longer than 24 months and even then the fact that you're not running the latest bleeding edge OS doesn't make your device a brick.
 
It's got nothing to do with fragmentation, supporting standards that are already in place is far from fragmentation. It's got everything to do with the fact that Apple want to be the next Sun Microsystems with their locked down API.
All the code for Metal, Xcode, Swift, etc. is out there. Other people could adopt these standards but they won't. Apple created OpenCL, but they moved away from it because they wanted a single unified low-level API that covered not only graphics but also GPGPU and is significantly more parallel than any of those APIs alone. (As a side note, this is why Apple's pro apps are as parallel as they are, and why few other programs in the industry are nearly as efficient.) To program for the open standards that you think are wonderful, you'd have to work in 2-3 programming languages, whereas Metal is one. Metal's shading language is based on C++; it's easy to program for and familiar to basically anyone who has done any level of graphics programming. It's also the only API on both mobile (phones/tablets) and desktop that is anywhere close to this flexible, powerful, and low-level.
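For a sense of what "one API" looks like in practice, here's a minimal sketch of a GPGPU dispatch through Metal from Swift; the "add_arrays" kernel name is a hypothetical function assumed to live in the app's default Metal library, and the same device/queue objects would also drive rendering:

```swift
import Metal

// "add_arrays" is a hypothetical compute kernel assumed to exist in the app's
// default .metallib; the point is that one API covers GPGPU and rendering alike.
guard let device = MTLCreateSystemDefaultDevice(),
      let queue = device.makeCommandQueue(),
      let library = device.makeDefaultLibrary(),
      let kernel = library.makeFunction(name: "add_arrays"),
      let pipeline = try? device.makeComputePipelineState(function: kernel),
      let commandBuffer = queue.makeCommandBuffer(),
      let encoder = commandBuffer.makeComputeCommandEncoder() else {
    fatalError("Metal not available")
}

let count = 1024
let bytes = count * MemoryLayout<Float>.stride
let inputA = device.makeBuffer(length: bytes, options: .storageModeShared)!
let result = device.makeBuffer(length: bytes, options: .storageModeShared)!

encoder.setComputePipelineState(pipeline)
encoder.setBuffer(inputA, offset: 0, index: 0)
encoder.setBuffer(result, offset: 0, index: 1)
encoder.dispatchThreads(MTLSize(width: count, height: 1, depth: 1),
                        threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
encoder.endEncoding()
commandBuffer.commit()
commandBuffer.waitUntilCompleted()   // the Float results are now in `result`
```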

This is a far cry from trying to make an obscure format like Sun Microsystems did. It's also not as if Apple will ever have dominant market share in the PC desktop space (if Apple ever exceeded 20% adoption in the consumer space in the lifetime of the company, that would be unbelievably high), so I wouldn't particularly worry about their "locked down API", as it will likely never even affect you. But for people inside the ecosystem it has quite a bit of benefit. You're probably right, even if you only implied it rather than said it, that there won't be a lot of gaming companies rushing to port their games over to macOS, even though there isn't much restriction on doing so and, provided a game is coded in Vulkan, porting wouldn't take a particularly long amount of time or money. But again, considering this affects you in no way if all you do is work in Windows, I don't really see why you care. Just don't buy a Mac. Problem solved.

EDIT: I would also argue that nVidia is far more of a problem than Apple when it comes to making APIs that are self-serving and meant to dominate the market. They have repeatedly fractured the gaming and GPGPU markets. CUDA is a long-term problem. Their control over ray tracing will become a problem if they aren't checked. What they did with physics was a problem (PhysX). Hell, even syncing with G-Sync is a problem. All of these APIs are completely closed (CUDA, PhysX, RTX, G-Sync, etc.), and can only be used on or in conjunction with nVidia hardware. But I imagine you have less of a problem with them as they are the "winners" on the PC platform, regardless of the fact that they are the ones directly splintering the PC platform, whereas Apple doesn't really affect the Windows platform at all.
 
All the code for Metal, Xcode, Swift, etc. is out there. Other people could adopt these standards but they won't. Apple created OpenCL, but they moved away from it because they wanted a single unified low-level API that covered not only graphics but also GPGPU and is significantly more parallel than any of those APIs alone. (As a side note, this is why Apple's pro apps are as parallel as they are, and why few other programs in the industry are nearly as efficient.) To program for the open standards that you think are wonderful, you'd have to work in 2-3 programming languages, whereas Metal is one. Metal's shading language is based on C++; it's easy to program for and familiar to basically anyone who has done any level of graphics programming. It's also the only API on both mobile (phones/tablets) and desktop that is anywhere close to this flexible, powerful, and low-level.

This is a far cry from trying to make an obscure format like Sun Microsystems did. It's also not as if Apple will ever have dominant market share in the PC desktop space (if Apple ever exceeded 20% adoption in the consumer space in the lifetime of the company, that would be unbelievably high), so I wouldn't particularly worry about their "locked down API", as it will likely never even affect you. But for people inside the ecosystem it has quite a bit of benefit. You're probably right, even if you only implied it rather than said it, that there won't be a lot of gaming companies rushing to port their games over to macOS, even though there isn't much restriction on doing so and, provided a game is coded in Vulkan, porting wouldn't take a particularly long amount of time or money. But again, considering this affects you in no way if all you do is work in Windows, I don't really see why you care. Just don't buy a Mac. Problem solved.

EDIT: I would also argue that nVidia is far more of a problem than Apple when it comes to making APIs that are self-serving and meant to dominate the market. They have repeatedly fractured the gaming and GPGPU markets. CUDA is a long-term problem. Their control over ray tracing will become a problem if they aren't checked. What they did with physics was a problem (PhysX). Hell, even syncing with G-Sync is a problem. All of these APIs are completely closed (CUDA, PhysX, RTX, G-Sync, etc.), and can only be used on or in conjunction with nVidia hardware. But I imagine you have less of a problem with them as they are the "winners" on the PC platform, regardless of the fact that they are the ones directly splintering the PC platform, whereas Apple doesn't really affect the Windows platform at all.

You can't accuse others of 'fragmenting' the API situation while praising the addition of yet another API that only works on Apple devices, that's not how it works. You also cannot accuse Nvidia of making closed proprietary platforms when that's exactly what Metal is all about.

Developers aren't exactly going to be jumping over one another rejoicing over yet another API they have to support for one make. Furthermore, I run Linux, not Windows or macOS, and my OS is a far better gaming platform than macOS for all the reasons I mentioned earlier.
 
You can't accuse others of 'fragmenting' the API situation while praising the addition of yet another API that only works on Apple devices, that's not how it works. You also cannot accuse Nvidia of making closed proprietary platforms when that's exactly what Metal is all about.
The irony here is that this is exactly what you're doing, but in reverse.

Developers aren't exactly going to be jumping over one another rejoicing over yet another API they have to support for one make. Furthermore, I run Linux, not Windows or macOS, and my OS is a far better gaming platform than macOS for all the reasons I mentioned earlier.
Cool. Choice in the market. Which is also more fragmentation. Developers aren’t exactly going to be jumping over one another rejoicing over yet another operating system to have to support either.
Which platform is better is a matter of opinion as well. But to cement an opposing view: Metal has well over 100,000 games written against it through iOS. You can argue they're simple, but no matter how you slice it, that's way more devs working on Apple platforms than on Linux. I find it ironic that you want to single out Apple as a less suitable OS when Linux runs on far fewer consumer desktops. The only real dev support that Linux has is Valve, and even they couldn't prop up the store and really get devs to want to develop for Steam on Linux or the "Steambox". If you're a gamer, it's Windows all the way down.
 
If Apple wanted to really make your phone a brick, they would cut all software updates after 24 months like Android.

No, they just have batteries that die within 2 years, then patch your OS to run slower. It's insidious... sure, your phone still 'works', but now it's slow and old. You go out and buy the new hotness, and Apple $$$
 
No, they just have batteries that die within 2 years, then patch your OS to run slower. It's insidious... sure, your phone still 'works', but now it's slow and old. You go out and buy the new hotness, and Apple $$$

Which is why Apple entered into the settlement agreement; because people upgraded. I didn't upgrade my phone nor did I get upset that Apple throttled my phone.
 
No, they just have batteries that die within 2 years, then patch your OS to run slower. It's insidious... sure, your phone still 'works', but now it's slow and old. You go out and buy the new hotness, and Apple $$$

Seriously, people can't be this stupid. As phones age, batteries degrade and you can't pull as much current. Apple applied power draw limits to extend a device's useful life without it crapping out completely. I had 2 other phones that didn't have this, and they would suddenly turn off at, say, 30% because the battery couldn't meet power needs.

The alternative is your device suddenly turns off or crashes. Devices age, and performance degrades with use. Everyone is affected by this.

If Apple wanted you to buy a new device, they could cut all software updates after 2 years instead of releasing iOS updates that actually make your device faster.
 
Well, some people still care about Mac gaming (not this guy). I'm on Slickdeals as well, and EVERY.SINGLE.TIME. there is a game sale thread, the posts pop up... Will this work on a Mac?

Then again, one cannot compare their user base to this one. No one comes to sites like this to find deals on diapers or rechargeable batteries. There "might" be a "tiny" bit of a tech ed difference. LOL
 
CUDA is a long-term problem. Their control over ray tracing will become a problem if they aren't checked. What they did with physics was a problem (PhysX). Hell, even syncing with G-Sync is a problem. All of these APIs are completely closed (CUDA, PhysX, RTX, G-Sync, etc.), and can only be used on or in conjunction with nVidia hardware.
Of those, only G-Sync is actually closed. It exists because it was first and remains the best.

CUDA exists for the same reason.

RTX is based on DXR.

PhysX is the only real example, as Nvidia 'closed' it, but then they opened it back up.
 
Of those, only G-Sync is actually closed. It exists because it was first and remains the best.

CUDA exists for the same reason.

RTX is based on DXR.

PhysX is the only real example, as Nvidia 'closed' it, but then they opened it back up.
Great. If the goal is "perfect unification of standards" then none of those "explanations" matter. Personally I think nVidia is pretty toxic. Significantly more toxic in terms of standards, as if allowed to do what they do unchecked they would gladly get rid of every competitor that operates in the same space.
The point is, if you're going to argue that "Metal is a destructive and closed standard because it fragments the marketplace", then there should be the same level (or really a much higher level) of outrage at nVidia, as Apple doesn't affect Windows machines at all, while nVidia is directly responsible for fracturing the code base on the PC side with no intention of ever stopping.
 
I think nVidia is pretty toxic.
They do their corporate strong-arming thing that catches them flak in the community, but the basic reality is that they're an innovation powerhouse.

Their competition is lucky to halfass a response a year or two later.
 
They do their corporate strong-arming thing that catches them flak in the community, but the basic reality is that they're an innovation powerhouse.
Debatable. There is no question that they make the fastest hardware. But they could have easily used and developed open standards (the things people are complaining about in here) or adopted the open standards that are in place (like Vulkan and OpenCL). What's more, their 'innovation' is far from perfect; they are also just as likely to be disruptive jerks for no reason. Just read about TressFX vs Hairworks as an example.

Their competition is lucky to halfass a response a year or two later.
In what way? If you're referring to straight performance, sure, but nothing nVidia is doing is particularly noteworthy. They've invested a huge amount of money (and die space) into RTX, but basically after nearly a year there hasn't been any utilization for all of the tensor cores and RT cores sitting on an RTX. Most of what we see as "ray tracing" is due to the super-fast cache that nVidia have placed on their cards. Which is why it's possible to run RT on non-RT hardware. The "penalty" isn't from the lack of specialized hardware (Tensor cores, etc), it's actually from a lack of super-fast cache. So if that is what you mean about innovation, then sure, they're ahead.

Kudos to nVidia where it's due, though: they obviously saw the writing on the wall, and their goal was to be a first mover and use that advantage, plus their monopolistic practices, to force everyone else onto their standards. They performed incredibly well at that. Here's hoping they fail so that we can use an open standard like Vulkan or DX12 instead of their intentionally proprietary garbage.

---

But more to the point, you're more or less arguing for Apple and their proprietary nature. It's okay so long as you're the winner and/or innovator. I guess Apple wins so much that everyone has to bitch about it. Everyone wants to hate the Yankees and the Packers.
 
But they could have easily used and developed open standards (the things people are complaining about in here) or adopted the open standards that are in place (like Vulkan and OpenCL).
You can complain about Vulkan while Nvidia was fleshing out DX12 with Microsoft, but that will fall on deaf ears.

CUDA exists and was taught in University before OpenCL was a thing.


Just read about TressFX vs Hairworks as an example.
Hairworks is DX-compliant.


But more to the point, you're more or less arguing for Apple and their proprietary nature.
Quite the opposite. Nvidia is straight up inventing technology and then pushing relevant solutions to market.

Apple is about control of their garden full of others' inventions.

These two are not alike beyond their respective market success.
 
You can complain about Vulkan while Nvidia was fleshing out DX12 with Microsoft, but that will fall on deaf ears.

CUDA exists and was taught in University before OpenCL was a thing.
By 2 years. But more to the point, they chose to develop a closed standard rather than an open one. AMD creates Vulkan and gives it away. nVidia creates CUDA and tries to force everyone not just to use the codebase, but also to use their hardware.
Apple developed OpenCL (I guess one of those "walled garden things full of other people's inventions", as you would say) and gave it away. You can think whatever you want, but CUDA hasn't offered an advantage over alternatives since shortly after its inception, except that it plays nice with the current leader in graphics cards.

Hairworks is DX-compliant.
Oh, so you're not aware. Hairworks was intentionally designed to run poorly on AMD hardware. nVidia then leveraged their status to get as many devs as possible to use it, in order to create more titles that would give nVidia a speed advantage. Hairworks is by definition the epitome of anticompetitive behavior. Actually, all of nVidia's "works" products are.

Quite the opposite. Nvidia is straight up inventing technology and then pushing relevant solutions to market.

Apple is about control of their garden full of others' inventions.

These two are not alike beyond their respective market success.
Apple has invented quite a few things and still does. But people's perception of what they do (and whether they like them or not) directly colors how those inventions are judged. And because they are as big as they are, they're almost immediately polarizing. Your statements to me are a direct reflection of that.
It's okay if the company you like does it, but not okay if the company you don't like does the same. And that can be rationalized by whatever you deem to be 'relevant'.
 
You can complain about Vulkan while Nvidia was fleshing out DX12 with Microsoft, but that will fall on deaf ears.

CUDA exists and was taught in University before OpenCL was a thing.



Hairworks is DX-compliant.



Quite the opposite. Nvidia is straight up inventing technology and then pushing relevant solutions to market.

Apple is about control of their garden full of others' inventions.

These two are not alike beyond their respective market success.

Apple are a customer just like everyone else; they don't have to compromise on their ecosystem just for suppliers like NVIDIA. If a supplier won't meet Apple's needs, there are plenty of others that will. Supporting a completely separate API alongside Metal and rewriting macOS and all apps is absolutely not an option.
 
Apple are a customer just like everyone else; they don't have to compromise on their ecosystem just for suppliers like NVIDIA. If a supplier won't meet Apple's needs, there are plenty of others that will. Supporting a completely separate API alongside Metal and rewriting macOS and all apps is absolutely not an option.


Ahh, because creating and then abandoning OpenCL is just too much work for Apple, RIGHT?

Which is why they did abandon OpenCL in favor of Metal? A to-the-metal proprietary API that they created as a competitor to NVIDIA's CUDA... eight years after CUDA launched.

Native Metal uptake hasn't been that huge, so if Apple hadn't gotten a bug up their ass (as is their tendency), they could have supported Vulkan instead.

A number of games/applications are just running through a Vulkan -> Metal wrapper anyway, so why not make it official?

Fuck Apple, and their shitty choices for APIs; they have a long history of choosing the wrong ones, and then taking forever to undo their mistakes. But the "Apple does no wrong" True Believers™ think everything they do is Gospel.

Like you Metal Cheerleaders, who suddenly think that 100% of their apps run it natively, as if Apple hasn't abandoned a proprietary 3D API before :rolleyes:

And the worst part of the Metal transition is, Apple is abandoning BOTH OpenGL and OpenCL in favor of their One True API (this time for sure!)

They also don't support Vulkan on Apple, so the world is working around them.

Whenever Apple has had more money than brains, they have always gone back to "Not Invented Here." And their users just pretend it's not happening again. Sometimes it's for the greater good (Apple custom ARM cores), but most of the time it's not.
 