M1 successor should be coming this summer

Because the GPU is starved for memory bandwidth due to the unified memory architecture.
If the GPU had its own dedicated VRAM, this would be a very different story, but for what it is, it's second to none in its TDP class.
I mean yea it's very power efficient but if it's missing VRAM then that's a problem Apple should solve.
That's becoming a bit of an apples-to-oranges comparison.
Certainly any modern GPU with a 150 watt TDP or greater will crush the GPU and hardware in the M1 SoC... an SoC that has a TDP of less than 35 watts.

This is why we are trying to compare the M1 to other CPUs/APUs/SoCs within a similar TDP range.
This video showcases exactly how the M1 flat out destroys similar offerings from Intel and AMD, and is very fair about what is being compared and the results:
We know Apple has the superior TDP and power consumption. Forget Intel, as they're still figuring out which way is up. AMD would have a superior APU if they stopped using the five-year-old Vega GPU architecture, but as you said, they don't have a reason to release it.
 
So to be clear, your argument boils down to: "x86 is a better and faster CPU architecture than ARM because an x86 processor with a dedicated GPU (which is not x86) can game better than ARM processors with integrated GPUs. But x86 losing to the M1 in video editing doesn't count because the M1 has dedicated hardware for that task built in."

Lol
We all know x86 is old and terrible. Modern-day x86 PCs are just a bunch of legacy crap that we drag along, but there's a reason why we drag it along. Anyone can build an IBM compatible. IBM compatibles mostly adhere to the ATX standard along with many other standards. PowerPC was superior to x86 and it failed. Intel's Itanium would have been superior, but it failed. These architectures fail because they pair proprietary hardware with proprietary software. They fail because nobody feels like rewriting all their code just to get locked into a new ecosystem.

CPU architectures stagnate, and right now ARM is stagnant. Apple had to invest big money to improve ARM enough to compete with x86. Intel's x86 is stagnant right now because they're stuck on 14nm. The key thing here is that x86 doesn't stagnate for long, as AMD keeps Intel on their toes. Who's keeping ARM on their toes? ARM is bankrupt, and we're waiting to see if Nvidia is going to buy them. Who's competing with Apple's M1? Nobody is directly competing with Apple's M1. Can anyone build an Apple M1 clone? Can anyone license Mac OS X?

You could say Qualcomm competes with Apple, but not directly. You could say Intel and AMD compete with Apple, but not directly. If you want an integrated GPU with your M1, then you'll have to wait on Apple, assuming Apple will do it. Apple is not in the business of making hardware from scratch, as the M1 uses ARM designs with PowerVR-based graphics. The M2 will be an incremental improvement, as will the M3 and M4: more GPU cores, higher clock speeds, and a 3nm process. It's not hard to predict what Apple will do. AMD will do chiplets? Intel might do silicon stacking? These companies go above and beyond to try and innovate.
 
I hear everyone blame SoftBank for ARM's finances, but it still doesn't make sense. ARM is literally everywhere, from laptops to routers, and even with mismanagement you're telling me they're broke? I expect Nvidia to increase the fees for ARM licenses.
You are correct, the ARM licensing business is making them money, but I think the other factors I mentioned (company mismanagement, poor decisions and investments, and other failing properties and products) are what's dragging them down.
ARM itself isn't failing them, though SoftBank is looking to sell the license and IP for sorely needed funds to patch their proverbial sinking ship, and NVIDIA certainly is up to the task of taking that architecture and excelling with it.

If you have the diverse market penetration that ARM has and you are still losing money on your ARM division, then, well... you're an idiot.
heh, exactly, and I expected better of SoftBank.

I mean yea it's very power efficient but if it's missing VRAM then that's a problem Apple should solve.
It is, and it makes sense to use a unified memory architecture for the Mac Mini, iMac (base unit), and standard MacBook, but anything further will definitely require the GPU to come equipped with its own dedicated VRAM.
It is almost like pairing an NVIDIA GTX 1650 with DDR4 shared with the CPU - lots of GPU performance completely held back by insufficient VRAM and memory bandwidth.
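For a rough sense of the gap being described, here's a back-of-the-envelope sketch of peak memory bandwidth (the figures are approximate public specs used purely for illustration, not measured numbers):

```python
# Peak theoretical bandwidth = bus width (bytes) x transfer rate.
# All figures below are approximate public specs, for illustration only.

def bandwidth_gb_s(bus_width_bits: int, giga_transfers_per_s: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits / 8 * giga_transfers_per_s

configs = {
    "GTX 1650 (128-bit GDDR5 @ 8 GT/s)": (128, 8.0),
    "Apple M1 (128-bit LPDDR4X @ 4.266 GT/s)": (128, 4.266),
    "Dual-channel DDR4-3200 (128-bit @ 3.2 GT/s)": (128, 3.2),
}

for name, (bits, rate) in configs.items():
    print(f"{name}: ~{bandwidth_gb_s(bits, rate):.1f} GB/s")
```

The dedicated GDDR5 card has roughly double the peak bandwidth of the M1's shared LPDDR4X pool, and it doesn't have to share that bandwidth with the CPU.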

 
Building their stuff in the US would add about $3-5 based on the labor. But the US does not have the existing supply chain for materials; that is a $60B infrastructure project nobody is willing to front. China is the only place in the world where you can go from raw material to finished product to a shipping container at a major port with less than 100km of travel from start to finish. There is a massive manufacturing gain there that can't be matched.
I'm well aware. That's why they need to start switching now and slowly building it all back up here. We have all the materials here; the issue is that the plants were all shut down decades ago. With the amount of money Apple has, there is no excuse for them not to be dumping it across the US to rebuild everything, from the mines to the assembly factories, along with paying decent wages.
 
You are correct, the ARM licensing business is making them money, but I think the other factors I mentioned (company mismanagement, poor decisions and investments, and other failing properties and products) are what's dragging them down.
ARM itself isn't failing them, though SoftBank is looking to sell the license and IP for sorely needed funds to patch their proverbial sinking ship, and NVIDIA certainly is up to the task of taking that architecture and excelling with it.
I don't know, and I'd like a good explanation of the issue, as I don't think it's something that should be shrugged off and blamed on mismanagement. ARM is too big to fail, but somehow it failed. Okay, so SoftBank failed? Whatever the case, the future of ARM will be determined by someone else. That could be a problem for Apple, who hasn't had great relations with Nvidia. If I were Nvidia, I'd be a dick to Apple and start making them pay more money. I would withhold new technology from Apple unless they paid through the nose. Apple isn't known to pay license fees for anything in their products. Apple will sue you for years just to avoid paying license fees. It's good tech gossip either way.
 
I don't know, and I'd like a good explanation of the issue, as I don't think it's something that should be shrugged off and blamed on mismanagement. ARM is too big to fail, but somehow it failed. Okay, so SoftBank failed? Whatever the case, the future of ARM will be determined by someone else. That could be a problem for Apple, who hasn't had great relations with Nvidia. If I were Nvidia, I'd be a dick to Apple and start making them pay more money. I would withhold new technology from Apple unless they paid through the nose. Apple isn't known to pay license fees for anything in their products. Apple will sue you for years just to avoid paying license fees. It's good tech gossip either way.
I wouldn't be surprised if there was some embezzlement going on, but we'll probably never hear about it if there was.
 
I don't know, and I'd like a good explanation of the issue, as I don't think it's something that should be shrugged off and blamed on mismanagement. ARM is too big to fail, but somehow it failed. Okay, so SoftBank failed? Whatever the case, the future of ARM will be determined by someone else.
This article is from September 2020:

Liquidity woes

As a multinational conglomerate with numerous investments in the technology, energy, and financial sectors, SoftBank's finances took a hit this year. The company reported record operating losses on investments in WeWork and Uber Technologies, and it reportedly needs even more funds to prop up the start-ups under its tech-focused venture capital fund VisionFund that have been hard-pressed during the pandemic. The company is also seeking to pay down its growing debt level, as current liabilities increased by 63% from fiscal 2018 to 2019.
That article probably best sums it up.

That could be a problem for Apple, who hasn't had great relations with Nvidia. If I were Nvidia, I'd be a dick to Apple and start making them pay more money. I would withhold new technology from Apple unless they paid through the nose. Apple isn't known to pay license fees for anything in their products. Apple will sue you for years just to avoid paying license fees. It's good tech gossip either way.
And thus begin the cyberpunk corporate wars. :borg:
 
Well, I'll be looking forward to the new chips. It's time to upgrade my Apple machine; I can no longer use Xcode because my machine isn't supported anymore.
 
If I were Nvidia I'd be a dick to Apple and start making them pay more money
Remember that Apple has a perpetual license. Nvidia would have to have something Apple wants, and the M1 core seems to be better than any other ARM core.
 
Remember that Apple has a perpetual license. Nvidia would have to have something Apple wants, and the M1 core seems to be better than any other ARM core.
I'm not sure of the legality of the license, but I would think that once another company buys ARM, the license agreements from before are basically toilet paper. That, and Nvidia are dicks about everything. They just are. While the M1 is pretty good now, it won't hold that performance position for long. There's a good reason why Apple is releasing the M2. AMD's Zen 4 is said to be a beast, and Qualcomm isn't sitting around watching Apple beat them at their own game. Like I said, it's not hard to see what Apple will do with the M2. Probably still 5nm, as I doubt they've moved to 3nm yet. More CPU and GPU cores. Faster memory and higher clock speeds. The usual stuff that most silicon manufacturers do to improve their products. This works for now: most of the CPU market has been complacent, and with AMD stuck on Bulldozer for years and Intel stuck on 14nm for years, it's easy for Apple to keep improving their M line of products like this.

Right now, AMD is definitely not the same company it was before Ryzen. Each new version of Ryzen is a pretty substantial upgrade, relatively speaking, plus they now have RDNA2. Intel has been exploring other ways to improve performance as well, like eDRAM and stacking technology, plus soon they'll have Xe graphics. Last I heard, Qualcomm was hiring AMD to implement their new graphics into their ARM SoCs. Nvidia definitely has something up their sleeve with ARM, as there's already talk of Nvidia putting ARM+Nvidia hardware into the server market.

At some point, the amount of R&D Apple spends to maintain their 10% laptop market share and 11% smartphone and tablet share is probably not worth it. While that is pretty substantial for a single company, it's not enough market share to justify the R&D needed to compete with the 90% of the market that AMD, Intel, Qualcomm, and Nvidia have to compete in. Not to mention that these companies, with the exception of Qualcomm, aren't even that focused on laptops or mobile devices, but on the server market. So there's even more market share for these companies to fight over, unless Apple plans to make modular server hardware that runs Linux?

There's going to be a point when whatever Nvidia's ARM produces will probably be more economical for Apple to purchase, especially graphics hardware. Apple's GPU is getting away with it for now, but realistically they have no hope of competing against AMD and Nvidia when it comes to GPUs. Apple kinda stole PowerVR, and they kinda had a problem making it work, as they had to go back to Imagination for help. I can easily see Apple first buying discrete GPUs for their near-future 'M' products.
 
I don't know and I'd like a good explanation on the issue, as I don't think it's something that should be shrugged off and blamed on mismanagement. ARM is too big to fail, but somehow it failed. Ok, Softbank failed? Whatever the case is, the future of ARM will be determined by someone else. That could be a problem for Apple who hasn't had great relations with Nvidia. If I were Nvidia I'd be a dick to Apple and start making them pay more money. I would withhold new technology from Apple unless they pay up their nose. Apple isn't known to pay license fees for anything in their products. Apple will sue you for years just to avoid paying license fees. It's good tech gossip either way.
Basically, ARM is something valuable that SoftBank can sell to fund their core business ventures.
 
I'm not sure of the legality of the license, but I would think that once another company buys ARM, the license agreements from before are basically toilet paper. That, and Nvidia are dicks about everything.
I do not have any inside information about the license agreement between ARM and Apple, but I do have experience with M&A where existing license deals like that existed, and in almost all circumstances there is a survivability clause that makes the license survive an acquisition. Companies are not stupid, and realize that when they build a significant reliance on a third party's technology, an acquisition leaves them vulnerable. The company I work for actually licenses some patents from a competitor, and the license cannot be terminated in an acquisition.

They just are. While the M1 is pretty good now, it won't hold that performance position for long. There's a good reason why Apple is releasing the M2. AMD's Zen 4 is said to be a beast, and Qualcomm isn't sitting around watching Apple beat them at their own game. Like I said, it's not hard to see what Apple will do with the M2. Probably still 5nm, as I doubt they've moved to 3nm yet. More CPU and GPU cores. Faster memory and higher clock speeds. The usual stuff that most silicon manufacturers do to improve their products. This works for now: most of the CPU market has been complacent, and with AMD stuck on Bulldozer for years and Intel stuck on 14nm for years, it's easy for Apple to keep improving their M line of products like this.

Right now, AMD is definitely not the same company it was before Ryzen. Each new version of Ryzen is a pretty substantial upgrade, relatively speaking, plus they now have RDNA2. Intel has been exploring other ways to improve performance as well, like eDRAM and stacking technology, plus soon they'll have Xe graphics. Last I heard, Qualcomm was hiring AMD to implement their new graphics into their ARM SoCs. Nvidia definitely has something up their sleeve with ARM, as there's already talk of Nvidia putting ARM+Nvidia hardware into the server market.
I am a huge AMD fan and I absolutely can't wait until the next-gen Threadrippers come out; I am a day-one buyer. But Apple is the most valuable company in the world, and has enough cash in the bank to not just purchase the absolute best talent in the semiconductor design world; they have enough cash in their account to purchase AMD several times over without even taking on financing.

At some point, the amount of R&D Apple spends to maintain their 10% laptop market share and 11% smartphone and tablet share is probably not worth it. While that is pretty substantial for a single company, it's not enough market share to justify the R&D needed to compete with the 90% of the market that AMD, Intel, Qualcomm, and Nvidia have to compete in. Not to mention that these companies, with the exception of Qualcomm, aren't even that focused on laptops or mobile devices, but on the server market. So there's even more market share for these companies to fight over, unless Apple plans to make modular server hardware that runs Linux?

There's going to be a point when whatever Nvidia's ARM produces will probably be more economical for Apple to purchase, especially graphics hardware. Apple's GPU is getting away with it for now, but realistically they have no hope of competing against AMD and Nvidia when it comes to GPUs. Apple kinda stole PowerVR, and they kinda had a problem making it work, as they had to go back to Imagination for help. I can easily see Apple first buying discrete GPUs for their near-future 'M' products.
Apple's entire MO is to vertically integrate and reduce reliance on other companies. To understand how focused and driven a company they are, I enjoy an anecdote from when they were trying to make the speaker grill holes in the MacBook Pro. There was only one company in China that made a laser drilling machine that could handle that thickness of aluminum with that fine a tolerance. Apple tried to place an order for X machines, but the company couldn't scale fast enough. So Apple purchased the entire company and dumped millions of dollars into scaling up the production and manufacturing of those laser machines.

So they could make speaker holes a certain size.

Apple is not going to purchase Nvidia ARM SoCs. They have one of the most talented semiconductor design teams in the world. They designed the most powerful smartphone and tablet SoCs and it's not close. Now they have the most powerful mobile laptop SoC and it's not close. I'd bet money in five years they have the most powerful mobile GPU in the portable category (13-16in laptop, I don't see them caring much about the high end gaming category where people lug around 17in desktop replacements).

Apple has custom silicon in almost everything they make at this point. Everything. Their freaking AirPods and headphones have custom ASICs in them. Right now, the bottleneck for Apple has nothing to do with competitors and everything to do with foundries. Honestly, the next logical move for Apple would be to get into the foundry business. They have vertically integrated everything else there is to integrate.
 
Companies are not stupid,
They aren't geniuses.
and realize that when they build a significant reliance on a third party's technology, an acquisition leaves them vulnerable. The company I work for actually licenses some patents from a competitor, and the license cannot be terminated in an acquisition.
I'm sure there are loopholes that Nvidia will exploit. Nvidia could make a new ARM design that Apple would have to fork over major money for, and maybe even force them to break up their old agreements with ARM. Like I said, Nvidia is known to be assholes in the industry.
I am a huge AMD fan and I absolutely can't wait until the next-gen Threadrippers come out; I am a day-one buyer. But Apple is the most valuable company in the world, and has enough cash in the bank to not just purchase the absolute best talent in the semiconductor design world; they have enough cash in their account to purchase AMD several times over without even taking on financing.
That's true, but you're missing the point. When does it become unprofitable for Apple to hire the best talent and spend the R&D to compete against companies like AMD, Intel, etc., who are going to spend the R&D for 90% of the market and more? Apple wouldn't be in their position if they spent more money than others. The M1 is just a heavily modified ARM CPU, along with a GPU Apple designed after learning from Imagination's PowerVR, for which they later had to go back to Imagination for licensing and help.

The Apple M1 works for the same reason Stadia works: the industry has stagnated so much that Apple can now make an SoC that competes against Intel and is technically superior. The bar is set very low.
Apple's entire MO is to vertically integrate and reduce reliance on other companies. To understand how focused and driven a company they are, I enjoy an anecdote from when they were trying to make the speaker grill holes in the MacBook Pro. There was only one company in China that made a laser drilling machine that could handle that thickness of aluminum with that fine a tolerance. Apple tried to place an order for X machines, but the company couldn't scale fast enough. So Apple purchased the entire company and dumped millions of dollars into scaling up the production and manufacturing of those laser machines.

So they could make speaker holes a certain size.
I get it, but that's not going to work out for Apple. A lot of Apple's technology, like Thunderbolt, was from Intel, and they aren't on good terms with Intel right now. Apple tried to go their own way with FireWire, and I still don't have a device that makes use of FireWire. Their CPU is from ARM and their GPU is from PowerVR, so Apple doesn't have the capability to build something from the ground up. Love or hate Intel, but they are trying to build a GPU from the ground up with Raja Koduri. Intel did poach a number of AMD GPU engineers, but they aren't going back to AMD and asking for help and license agreements like Apple. Imagination is now owned by China (yes, the Chinese government) because they went broke. We are not better off having PowerVR graphics in the hands of China.
Apple is not going to purchase Nvidia ARM SoCs. They have one of the most talented semiconductor design teams in the world. They designed the most powerful smartphone and tablet SoCs and it's not close. Now they have the most powerful mobile laptop SoC and it's not close. I'd bet money in five years they have the most powerful mobile GPU in the portable category (13-16in laptop, I don't see them caring much about the high end gaming category where people lug around 17in desktop replacements).
I bet you that by the M3 or M4, Apple is going to be using a discrete GPU with their SoC chips, and will eventually just use someone else's SoC. I don't know what engineers Apple has, but most of the stuff you see in the M1 is built on top of a lot of other companies' work. Extremely custom work done to the ARM CPU and GPU, but still built on others' work. Fun fact: Apple didn't implement certain Vulkan features in their GPU. Apple really has no intention of supporting APIs like Vulkan.
Apple has custom silicon in almost everything they make at this point. Everything. Their freaking AirPods and headphones have custom ASICs in them.
Aren't AirPods just modified Powerbeats Pro?
 
They aren't geniuses.

I'm sure there are loopholes that Nvidia will exploit. Nvidia could make a new ARM design that Apple would have to fork over major money for, and maybe even force them to break up their old agreements with ARM. Like I said, Nvidia is known to be assholes in the industry.

That's true, but you're missing the point. When does it become unprofitable for Apple to hire the best talent and spend the R&D to compete against companies like AMD, Intel, etc., who are going to spend the R&D for 90% of the market and more? Apple wouldn't be in their position if they spent more money than others. The M1 is just a heavily modified ARM CPU, along with a GPU Apple designed after learning from Imagination's PowerVR, for which they later had to go back to Imagination for licensing and help.

The Apple M1 works for the same reason Stadia works: the industry has stagnated so much that Apple can now make an SoC that competes against Intel and is technically superior. The bar is set very low.

I get it, but that's not going to work out for Apple. A lot of Apple's technology, like Thunderbolt, was from Intel, and they aren't on good terms with Intel right now. Apple tried to go their own way with FireWire, and I still don't have a device that makes use of FireWire. Their CPU is from ARM and their GPU is from PowerVR, so Apple doesn't have the capability to build something from the ground up. Love or hate Intel, but they are trying to build a GPU from the ground up with Raja Koduri. Intel did poach a number of AMD GPU engineers, but they aren't going back to AMD and asking for help and license agreements like Apple. Imagination is now owned by China (yes, the Chinese government) because they went broke. We are not better off having PowerVR graphics in the hands of China.

I bet you that by the M3 or M4, Apple is going to be using a discrete GPU with their SoC chips, and will eventually just use someone else's SoC. I don't know what engineers Apple has, but most of the stuff you see in the M1 is built on top of a lot of other companies' work. Extremely custom work done to the ARM CPU and GPU, but still built on others' work. Fun fact: Apple didn't implement certain Vulkan features in their GPU. Apple really has no intention of supporting APIs like Vulkan.

Aren't AirPods just modified Powerbeats Pro?
There's a lot of historical context here that you're missing. FireWire, as an example: it's dead now, but at the time of its release in the '90s it was absolutely a godsend for folks in the pre-USB 2.0 days.

A lot of people don't give Apple credit for how much impact they've had on the industry, outside of just Apple products. I'm not sure if Apple is still 'genius' or not, but under Jobs 2.0 they/he were absolutely genius. Regardless, Apple in 2021 has more capability than anyone else, be it Nvidia, Intel, or AMD, to push the ball forward substantially with technology. They have complete top-down control of everything, including the OS. They are capable of moving in any direction they want with nothing holding them back, and they have the capital to do it.
 
There's a lot of historical context here that you're missing. FireWire, as an example: it's dead now, but at the time of its release in the '90s it was absolutely a godsend for folks in the pre-USB 2.0 days.

A lot of people don't give Apple credit for how much impact they've had on the industry, outside of just Apple products. I'm not sure if Apple is still 'genius' or not, but under Jobs 2.0 they/he were absolutely genius. Regardless, Apple in 2021 has more capability than anyone else, be it Nvidia, Intel, or AMD, to push the ball forward substantially with technology. They have complete top-down control of everything, including the OS. They are capable of moving in any direction they want with nothing holding them back, and they have the capital to do it.
This is true, FireWire was amazing for external HDDs, media, and video cameras back in the 1990s and 2000s - I used it a ton with MiniDV camcorders, scrubbing through footage and editing.
FireWire was hardware-based, requiring only a minimal driver, and could do so much more than USB 1.1/2.0 with far less CPU overhead, and CPU time was at a premium when video editing.

Looking into it, I didn't realize that Apple released FireWire in 1994, while USB 1.0 wasn't even released until 1996.
FireWire was capable of 400Mbps (50MB/s) - that truly was a god-tier transfer rate in 1994, especially for external devices, when other ports (save for SCSI) weren't faster than 2MB/s - and I thought it was fast for 2005, wow. :eek:
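For anyone double-checking that conversion: nominal peak rates divide by 8 to go from megabits to megabytes per second, and real-world throughput is lower once protocol overhead is counted. A quick sketch using the usual published spec rates:

```python
# Nominal peak rates; MB/s = Mbps / 8 (ignores protocol overhead).
firewire_400 = 400 / 8  # 50.0 MB/s (IEEE 1394)
usb_1_1_full = 12 / 8   # 1.5 MB/s (USB 1.1 full speed)
usb_2_0_high = 480 / 8  # 60.0 MB/s (USB 2.0 high speed)

print(f"FireWire 400: {firewire_400:.1f} MB/s")
print(f"USB 1.1:      {usb_1_1_full:.1f} MB/s")
print(f"USB 2.0:      {usb_2_0_high:.1f} MB/s")
```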
 
Can't wait for the M1X/M2. My M1 MacBook Air has been amazing so far. There are still a few rough edges with stability, but overall it is an extremely good user experience.
 
There's a lot of historical context here that you're missing. FireWire, as an example: it's dead now, but at the time of its release in the '90s it was absolutely a godsend for folks in the pre-USB 2.0 days.
Like I said, I have no FireWire devices. Plenty of USB 2.0 devices and some USB 3.0 devices, but no FireWire. I think my first FireWire port came from a Sound Blaster Audigy. USB 2.0 has been around for a long time, and I don't think it was faster than FireWire in practice. There's a reason why Intel was making the Thunderbolt standard and not Apple: adoption would have been slow.
A lot of people don't give Apple credit for how much impact they've had on the industry, outside of just Apple products. I'm not sure if Apple is still 'genius' or not, but under Jobs 2.0 they/he were absolutely genius.
Let's not suck the ghost of his dick too hard.
Regardless, Apple in 2021 has more capability than anyone else, be it Nvidia, Intel, or AMD, to push the ball forward substantially with technology. They have complete top-down control of everything, including the OS. They are capable of moving in any direction they want with nothing holding them back, and they have the capital to do it.
Nothing to hold them back but their own hubris. Like I said before, Apple is not a hardware manufacturer. Apple is not going to implement a chiplet design like AMD, or stacking technology like Intel. Apple certainly has the money to do this, but not the market share. For companies like Nvidia, Intel, and AMD it makes sense to spend a lot of money on R&D because they know they'll get a lot more out of it. If Apple spends that kind of money, then Apple products are not as profitable as the competition's. Like it or not, Apple will never penetrate enough of the market because you have people like myself who refuse to buy Apple products. Not because we hate Apple, but because Apple cannot offer products that would satisfy our needs. Apple doesn't cater to gamers. Apple doesn't cater to administrators or tech support. Schools are dumping Apple for Chromebooks. For a lot of the industry to buy Apple products, they would want competition against Apple. I don't mean Windows or Chromebooks, but companies that can make Apple clone products. Apple being in charge of everything means Apple is also in charge of pricing. Apple is also in charge of how products are repaired, if at all, because Apple is also in charge of how they're built.

The funny thing is that Apple is aware of this, but they do have a lot of hubris. Intel fucked up by being stuck on 14nm, and AMD just caught up and has now slightly surpassed Intel. Apple's M1 was released at the perfect time, as AMD and Intel were in a bad state. I believe that Apple thinks this fucking up by Intel and AMD will continue, which makes it easy for Apple to maintain their market position with the M1 and soon the M2. Apple also believes they can convince a number of people to jump onto Apple and take market share away from companies like Dell and Lenovo. By the time Intel and AMD fix their shit, Apple would have enough market share to justify the cost of the R&D needed to compete against these companies. I don't think any of this is going to work out the way Apple thinks it will, and while Apple is playing 3D chess, we have Nvidia playing 4D chess by buying up ARM. We're about to enter the APU wars that I've been predicting for some time, and when Intel and AMD start fighting each other in the APU arena, I don't see Apple being able to keep up at all; they may have to buy up Imagination or add a discrete GPU to their SoC lineup. As smart as Apple is, they are in their own little world, which they unfortunately have to share with other hungry companies.

 
Like I said, I have no FireWire devices.

I'm not sure what this proves, other than showing your continued penchant for owning slower-performing hardware than what was state of the art at the time.


Nothing to hold them back but their own hubris. Like I said before, Apple is not a hardware manufacturer. Apple is not going to implement a chiplet design like AMD, or stacking technology like Intel.

Huh? AMD is not a hardware manufacturer. AMD is a silicon design company, exactly like Apple. Both of them rely on other companies to actually make their silicon. Intel is more of a silicon manufacturer than AMD or Apple, but their abilities are not really relevant on a short time horizon since they are so far behind other fabs.

Apple has - by far - the most hardware manufacturing expertise out of any of those companies, and I think you're probably the only person on the entire internet who would argue otherwise.

Apple certainly has the money to do this, but not the market share. For companies like Nvidia, Intel, and AMD it makes sense to spend a lot of money on R&D because they know they'll get a lot more out of it.

You are so focused on market share that you overlook the fact that Apple does not want to play in every space. They are the most valuable company in the world. They have insane profit margins specifically because they vertically integrate and carefully choose which markets they go after.

You keep saying how Apple can't monetize, while ignoring the fact that they ARE and HAVE BEEN monetizing better than any other company in the world. Clearly, Apple knows how to make money and I'm not sure why you think you're an oracle that has some grand strategic view that they don't. You're a dude on a hardware forum. They're literally the most valuable company in the world. Relax, they know how to print money.


If Apple spends that kind of money, then Apple products are not as profitable as the competition's.

That's not how it works; vertical integration has lowered Apple's COGS and increased margins. Their M1 is a fraction of the cost of whatever they were paying Intel, and with Apple's scale the R&D costs are paid off extremely fast.

For example, with the M1 SoC, Apple is selling about 7 million Macs per quarter - and that was before it was in the iMac. Now that they have the M1 in the iPad as well, those sell about 10 million per quarter. So, Apple will be shipping about 20 million M1 chips per quarter, or 80 million M1 CPUs per year. Even if R&D costs on the M1 chip were 5 billion dollars, that amounts to only $62.50 per unit over a single year. Subtract out the cost savings from not dealing with Intel, and it's probably almost break-even - in the first year. Everything else is gravy.
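To make the arithmetic in that paragraph explicit (the $5B R&D figure and the unit volumes are the hypothetical round numbers from the post above, not Apple's actual financials):

```python
# Amortizing a hypothetical R&D budget over one year of chip shipments.
rd_cost_usd = 5_000_000_000             # assumed M1 R&D spend (hypothetical)
units_per_quarter = 20_000_000          # ~7M Macs + ~10M iPads, rounded up
units_per_year = units_per_quarter * 4  # 80 million chips

print(f"R&D per chip over one year: ${rd_cost_usd / units_per_year:.2f}")
# -> R&D per chip over one year: $62.50
```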

To give a comparison, Nvidia ships about 9-10 million discrete GPUs per quarter, or about 40 million per year.

By leveraging their ability to vertically integrate and maximize parts compatibility across their product lines, Apple actually has more scale than Nvidia does when it comes to their silicon. And it's only going to get better once the M1 comes to iPhones. Apple will eventually be shipping a unified, scalable ARM platform across all of their products - and at that point, their scale will be larger than Nvidia and AMD combined.

Like it or not, Apple will never penetrate enough of the market because you have people like myself who refuse to buy Apple products. Not because we hate Apple, but because Apple cannot offer products that would satisfy our needs. Apple doesn't cater to gamers. Apple doesn't cater to administrators or tech support. Schools are dumping Apple for Chromebooks. For a lot of the industry to buy Apple products, they would want competition against Apple. I don't mean Windows or Chromebooks, but companies that can make Apple clone products. Apple being in charge of everything means Apple is also in charge of pricing. Apple is also in charge of how products are repaired, if at all, because Apple is also in charge of how they're built.

Apple doesn't need to penetrate all of the market. But what is going to happen is that you'll have to make purchasing decisions kind of like this:

Intel PC Pros:

Cool RGB
Can Overclock
Nostalgic
Can run more games

Apple device pros:

Six times faster at literally everything except gaming due to not running Windows
Four times better battery life for mobile devices


So yes, there will always be a steadfast market of Intel users, just like there is a steadfast market of manual transmission enthusiasts. You might love driving your manual GT3 and that's totally fine, but a lot of people are going to be blowing by you with their PDK versions.

I believe that Apple thinks this fucking up by Intel and AMD will continue, which makes it easy for Apple to maintain their market position with the M1 and soon the M2. Apple also believes they can convince a number of people to jump onto Apple and take market share away from companies like Dell and Lenovo. By the time Intel and AMD fix their shit, Apple would have enough market share to justify the cost of the R&D needed to compete against these companies. I don't think any of this is going to work out the way Apple thinks it will, and while Apple is playing 3D chess, we have Nvidia playing 4D chess by buying up ARM. We're about to enter the APU wars that I've been predicting for some time, and when Intel and AMD start fighting each other in the APU arena, I don't see Apple being able to keep up at all; they may have to buy up Imagination or add a discrete GPU to their SoC lineup. As smart as Apple is, they are in their own little world, which they unfortunately have to share with other hungry companies.



I already explained how they have more than enough scale to justify their R&D costs.
 
It sounds like what I thought might happen is already coming true: that Apple might just claim the lead in high performance computing outside of gaming, at least in laptops. It's not that Apple will suddenly dominate PC sales or even gain any significant market share, it's that Apple may safely claim the high-end home computer space in the same way it owns tablets, smartwatches, earbuds and (to some degree) phones. It's a shift in cultural perception that might help Apple's sales in the long run, or at least prevent them from sliding back.

And that's the real nightmare for Intel, AMD and Microsoft — not so much a sales hit as the fear that someone will justifiably say "if you want the fastest laptop, you have to get a Mac." If they lose that top spot, even if gamers are still 'safe,' it'll be hard for them to climb back.

Apple isn't guaranteed to reign supreme by any means, but Intel will need to sort its manufacturing problems quickly, and AMD has to be sure that future Ryzen chips deliver meaningful improvements. As Qualcomm learned first hand, you underestimate Apple's ability to continuously improve at your peril.
 
They could quickly become the preferred platform for gaming if/when they start offering the GPU for it, given how most publishers & developers will see the value of being able to make software that is shared across phone/tablet/desktop.
 
Apple device pros:

Six times faster at literally everything except gaming due to not running Windows
Four times better battery life for mobile devices

This is the M1. Now imagine the M2, M3, M4, etc...
Perhaps the leaps won't be as great, but I for one am looking forward to the not-too-distant future where I can run Linux on an M2 device.

Imagines running DX11/12 Windows games through Steam Play Proton on Linux, via Box86 emulation, on a Mac :joyful:
 
I'm not sure what this proves, other than showing your continued penchant for owning slower performing hardware than what was state of the art at the time.
I'm pointing out that the adoption of FireWire wasn't anywhere near as widespread as USB. Modern Thunderbolt ports just use USB-C connectors now.
Huh? AMD is not a hardware manufacturer. Amd is a silicon design company, exactly like Apple is. Both of them rely on other companies to actually make their silicon. Intel is more of a silicon manufacturer than AMD and Apple, but their abilities are not really relevant in the short time horizon since they are so far behind other fabs.
I think you're taking my words a little too literally. Apple doesn't engineer their own hardware from scratch like AMD, Intel, etc. They literally depend on the work of giants. One could say that's the benefit of ARM.
Apple has - by far - the most hardware manufacturing expertise out of any of those companies, and I think you're probably the only person on the entire internet who would argue otherwise.
In sheer volume? Not sure how you came to that conclusion?
You are so focused on marketshare that you overlook the fact that Apple does not want to play in every space.
That doesn't matter. At some point the R&D costs will get too high for Apple's market share. Remember, AMD/Intel/Qualcomm/Nvidia all have much bigger markets and therefore can afford to spend more money on R&D. They obviously haven't for a very long time, which is why Apple's M1 looks so good today. That will obviously start to change from now on. When it will change is not something I can actually say.
They are the most valuable company in the world. They have insane profit margins specifically because they vertically integrate and carefully choose which markets they go after.
I can tell you why, but you won't like the answer. Overpriced hardware, questionable App Store practices, unethical working conditions, and $1000 monitor stands. Also people like you who will buy their products regardless of the quality.
You keep saying how Apple can't monetize,
I did not say that.
while ignoring the fact that they ARE and HAVE BEEN monetizing better than any other company in the world. Clearly, Apple knows how to make money and I'm not sure why you think you're an oracle that has some grand strategic view that they don't. You're a dude on a hardware forum. They're literally the most valuable company in the world. Relax, they know how to print money.
I'm just looking five steps ahead. I doubt the creation of the M1 was cheap, but that's what most companies do: spend a lot on a new product and then improve on it for several years. Technically, Intel has been improving Sandy Bridge since 2011 and hasn't made any major changes since. AMD was stuck on Bulldozer until they created Ryzen. Apple will certainly spend more money on their M2 and M3 products, but I see it as a futile endeavor.

Apple has one advantage that many companies don't, and that's insight. They know more about Intel than Intel knows about itself. Intel was so excited to have Apple use their products that they called their new CPUs the 'i'Core series. You know, like the 'i'Pad and 'i'Phone? Anyway, Intel was probably disclosing everything they were planning to do over the next few years, and now Apple knows how to work around this. They probably have a decent understanding of AMD as well, since I'm sure AMD went up to Apple and showed them everything they were planning to do over the next few years. Not Nvidia, as I'm sure Nvidia probably told them to screw off. Definitely not Qualcomm either.

That being said, I believe AMD and Intel are changing so much that Apple can't accurately plan around their actions with future 'M' products. Even if they do, it wouldn't matter, since AMD and Intel can't make enough hardware to sell as it is. As bad as Intel looks, they do have a shortage due to demand.
That's not how it works, vertical integration has lowered Apple's COGS and increased margins. Their M1 is a fraction of the cost of whatever they were paying Intel is, and with Apple's scale the R&D costs are paid off extremely fast.
Definitely in the long run, but not in the short term. Though one could argue they were already doing this with the iPhone and iPad products, so what's the additional cost of engineering the M1?
For example, with the M1 SoC Apple is selling about 7 million Macs per quarter - before it was in the iMac. Now that they have the M1 in the iPad as well, those sell about 10mm per quarter. So, Apple will be shipping about 20mm M1 chips per quarter, or 80 million M1 CPUs per year.
Keep in mind we don't know if that's because of the M1 or because of the pandemic. AMD claims "Revenue grew 93 percent while net income and EPS tripled year-over-year," but that's during the pandemic. Considering how popular GPUs are, you'd think Apple would get in on this? Also, got a link to that info you posted?
To give a comparison, Nvidia ships about 9-10mm discrete GPUs per quarter, or about 40mm per year.
Link please.
Six times faster at literally everything except gaming due to not running Windows
If it were 6x faster, then gamers would move to Apple. If it were 6x faster, I'd buy an Apple product. We both know Apple isn't going to make a 6x-faster product. I am assuming we're talking about CPU and GPU performance, right?
So yes, there will always be a steadfast market of Intel users, just like there is a steadfast market of manual transmission enthusiasts. You might love driving your manual GT3 and that's totally fine, but a lot of people are going to be blowing by you with their PDK versions.
Modern automatics are mostly manumatic transmissions. Also, PDK is just Porsche's version of a DCT, a dual-clutch transmission. Just saying.
 
I can tell you why, but you won't like the answer. Overpriced hardware, questionable App Store practices, unethical working conditions, and $1000 monitor stands. Also people like you who will buy their products regardless of the quality.
Apple's switch to in-house silicon dramatically improved the value of its hardware, though. For example, the MacBook Air. Before, you had to get the higher-spec Air just to get decent performance from Intel's frankly underperforming chips. Now, even the base Air outperforms all Intel-based laptops in a comparable power class, and often more power-hungry chips, too... and all the while it's fanless.

That monitor stand is absurdly priced, but I suspect that's more due to the lack of economies of scale than anything. The market for the Pro Stand is a subset of a subset of a subset (Mac Pro buyers who want a Pro Display XDR and prefer the official stand for it), so it was never going to sell in huge numbers.

Also, I'm pretty sure many of the devices you use are made in similar working conditions, you just don't hear about it much because they're not made by Apple. What, did you think your Android phone and Windows laptop were made by people who work exactly 40 hours per week and sip lattes on breaks? Even a Samsung phone made in South Korea involves harsh conditions, like those that led to brain tumors and leukemia for chip and display workers. Labor conditions are an industry-wide problem, and the Only Apple Does Bad Things mindset actually makes things worse by letting other companies off the hook.
 
Apple's switch to in-house silicon dramatically improved the value of its hardware, though.
I agree; before this, Apple was the running joke in terms of how far behind their hardware was. Nowadays you can't say that Apple's hardware is outdated.
That monitor stand is absurdly priced, but I suspect that's more due to economies of scale than anything. The market for the Pro Stand is a subset of a subset of a subset (Mac Pro buyers who want a Pro Display XDR and prefer the official stand for it), so it was never going to sell in huge numbers.
The monitor stand is just one of many examples of where Apple overcharges. I believe it's more that Apple knows their audience, who are eager to pay for anything they make.
Also, I'm pretty sure many of the devices you use are made in similar working conditions, you just don't hear about it much because they're not made by Apple. What, did you think your Android phone and Windows laptop were made by people who work exactly 40 hours per week and sip lattes on breaks? Even a Samsung phone made in South Korea involves harsh conditions, like those that led to brain tumors and leukemia for chip and display workers. Labor conditions are an industry-wide problem, and the Only Apple Does Bad Things mindset actually makes things worse by letting other companies off the hook.
While that's probably true, Apple has unique situations that are just generally fucked up. You know, like the riot in India due to Apple not paying them.
 
While that's probably true, Apple has unique situations that are just generally fucked up. You know, like the riot in India due to Apple not paying them.
I hadn't heard that "Apple" was spelled "Wistron" in Hindi. The More You Know⭐

Also, it's kind of interesting how every sin of every entity Apple's ever interacted with is immediately communicable back to them, but they can't even take credit for their own successes. Curious.


Hint, for the English impaired: "Apple Supplier Does A Bad" is not the same thing as "Apple Does A Bad". When Apple genuinely does a bad, the tech press will, in fact, explicitly state such in their clickbait headlines. They won't be able to help themselves.

But, I guess "I Don't Use Apple Products Because I Disagree With Their Licensing Terms; Also, Their Hardware Is Just A Touch Pricy For My Tastes" just doesn't bring in the chicks...
 
Apple's switch to in-house silicon dramatically improved the value of its hardware, though.
But that is only a secondary consequence. Apple's switch to their own silicon is fundamentally not a move to save cost; it is a move to make the Mac better.
I hadn't heard that "Apple" was spelled "Wistron" in Hindi. The More You Know⭐
Apple surely knows how to control their suppliers when it comes to preventing leaks.
But when it comes to how suppliers treat employees, that is suddenly outside of Apple's control? As is whether their products are made using Uyghur forced labor.
 
Apple surely knows how to control their suppliers when it comes to preventing leaks.
But when it comes to how suppliers treat employees, that is suddenly outside of Apple's control? As is whether their products are made using Uyghur forced labor.
I wouldn't say labor conditions are out of Apple's control — rather, it can only do so much. For example, it can tell a supplier "no underaged workers" and punish that supplier if it's caught, but the only way to guarantee that would be to run background checks on every new worker, with frequent spot checks to make sure the factory isn't misrepresenting ages or sneaking workers in outside of the official record.

That WaPo piece isn't as black-and-white as it's made out to be. Apple would clearly rather not have any forced labor in its pipeline, even if it's just for cynical PR reasons, but it doesn't like something in the bill... unfortunately, without specifics we don't know if these are key provisions or minor quibbles.
 
I wouldn't say labor conditions are out of Apple's control — rather, it can only do so much. For example, it can tell a supplier "no underaged workers" and punish that supplier if it's caught, but the only way to guarantee that would be to run background checks on every new worker, with frequent spot checks to make sure the factory isn't misrepresenting ages or sneaking workers in outside of the official record.

The other way would be to use suppliers in countries where they do enforce such laws. Then they would not have to check every subcontractor worker themselves.
 
The other way would be to use suppliers in countries where they do enforce such laws. Then they would not have to check every subcontractor worker themselves.
That raises its own problems, though. It's not the pay — it's things like finding enough appropriately-trained staff (in a timely fashion) and having easy access to material and component suppliers. If Apple wants to increase production right now, for example, a contractor like Foxconn can recruit thousands more people and have them on the line in a couple of weeks. In the US, Apple would be thankful if it could get that many people in several months.
 
That raises its own problems, though. It's not the pay — it's things like finding enough appropriately-trained staff (in a timely fashion) and having easy access to material and component suppliers. If Apple wants to increase production right now, for example, a contractor like Foxconn can recruit thousands more people and have them on the line in a couple of weeks. In the US, Apple would be thankful if it could get that many people in several months.
Sounds like "while we're building the factory would be an excellent time to recruit and train people" to me.
 
That raises its own problems, though. It's not the pay — it's things like finding enough appropriately-trained staff (in a timely fashion) and having easy access to material and component suppliers. If Apple wants to increase production right now, for example, a contractor like Foxconn can recruit thousands more people and have them on the line in a couple of weeks. In the US, Apple would be thankful if it could get that many people in several months.

<shrug> Then they and their supporters should not complain that people look at their hiring practices if they are not willing to trade one problem for another.
 
Sounds like "while we're building the factory would be an excellent time to recruit and train people" to me.
Yes and no. A key problem in the States and other service-oriented economies is that it's difficult to find people who are just qualified enough. Trade education is in relatively short supply; you're more likely to find people with engineering degrees, and they're not going to work for typical factory pay. I also can't imagine Foxconn et al. having much luck convincing entitled Americans that their future will be bright sitting on an iPhone assembly line.

For that matter, let's not forget that any new or updated factories will likely include a lot of robots, wherever they're built — companies may get around labor conditions simply by having fewer people involved.
 
Yes and no. A key problem in the States and other service-oriented economies is that it's difficult to find people who are just qualified enough. Trade education is in relatively short supply; you're more likely to find people with engineering degrees, and they're not going to work for typical factory pay. I also can't imagine Foxconn et al. having much luck convincing entitled Americans that their future will be bright sitting on an iPhone assembly line.

For that matter, let's not forget that any new or updated factories will likely include a lot of robots, wherever they're built — companies may get around labor conditions simply by having fewer people involved.
You have to remember that Foxconn's largest factory, where much of the Apple stuff is made, currently employs upwards of 400,000 people; a rival company is supposedly building one in India that will get the same output with fewer than 20,000 employees.
 
Yes and no. A key problem in the States and other service-oriented economies is that it's difficult to find people who are just qualified enough. Trade education is in relatively short supply; you're more likely to find people with engineering degrees, and they're not going to work for typical factory pay. I also can't imagine Foxconn et al. having much luck convincing entitled Americans that their future will be bright sitting on an iPhone assembly line.

For that matter, let's not forget that any new or updated factories will likely include a lot of robots, wherever they're built — companies may get around labor conditions simply by having fewer people involved.
Again, I would be happy to pay upwards of $3k for whatever the latest iPhone was if that meant it was fully assembled in the US. Apple has the money to build the plants in the US and slowly build everything up here. I'm aware of the lack of people to do the job, etc. However, this is all just bullshit. Time and money fix that. If you were paying these people $20-$30 an hour with decent benefits, the problem of lack of labor would solve itself fast. It pisses me off hearing people say 'this is no longer possible here.' No, it's possible; we just need the people with the money to start re-investing that money in our country. No, it won't happen instantly. I get that.

And yes, I'll repeat it: I have no problem paying two to three times the cost of whatever product you can name if it were actually made in the USA. It's the same reason I have no problem buying expensive watches and certain cars. Where something is built matters to me. I don't give a flying fuck about people in China. People in the UK, Ireland, most of western Europe, the US & Canada? I care about those people because we generally share the same values.
 
Honestly, I loved my Mac laptop from 2003, and as a media machine it was fantastic. That said, I always ended up using either Linux or Windows for work and never got into the ecosystem. I was issued an M1 MacBook Pro for work, and GD does it blow away the 10750H CPU in my MSI laptop. Granted, that MSI laptop has a 2070 Super in it, and I think the 8-core GPU in the M1 is somewhere around a 1050 Ti in performance, but when it comes to getting actual work done that MacBook just runs circles around it. I have legitimately pulled an 18-hour day completely on battery with it, and I don't believe that is possible on any laptop I have laid my hands on. Certainly not one with desktop-CPU levels of performance. I look forward to seeing the performance of the 14 and 16 inch releases over the next couple of years.
 