Intel insider claims it finally lost Apple because Skylake QA 'was abnormally bad'

erek

I didn't even think of what this would mean for Intel, not that I care about Intel

""For me this is the inflection point," says Piednoël. "This is where the Apple guys who were always contemplating to switch, they went and looked at it and said: 'Well, we've probably got to do it.' Basically the bad quality assurance of Skylake is responsible for them to actually go away from the platform."


It has to be said that this is just the publicly stated opinion of one former Intel engineer and can't necessarily be taken as fact, and obviously isn't the only reason for the switch either. But Piednoël was always an interesting character while he was at Intel and a very outspoken one too, often much to the chagrin of his PR handlers in front of us journalists.
But however much the quality assurance of the Skylake architecture did or didn't impact Apple's decision to switch wholesale over to ARM, it's still an interesting perspective on why it has finally happened.


https://www.pcgamer.com/intel-skylake-why-apple-left/
 
I didn't even think of what this would mean for Intel, not that I care about Intel
Oh come on erek, we know... you care. :D
Nice article and good find!

Really starting to think Haswell was the last decent CPU Intel released.
 
Didn't they fire a bunch of people a few years ago? Maybe that wasn't the best move.
 
I can't really blame them; Apple has been wanting to move away for some time now. These back-to-back dumpster fires were just extra pushes toward that goal.
 
Skylake was very bad; it had a lot of baked-in bugs. The newer generations (i.e., Coffee Lake onwards) are much better in that respect, but five years is a long time to fix it.
 
Apple has wanted faster, more efficient chips for a long time, and Intel was just unable to deliver them.

Some examples:
-In the late 2000s it was really Apple pushing Intel that made the first MacBook Air possible, and eventually, as a result, Intel launched a design program to create the platform we now call Ultrabooks.
-Apple has constantly been pushing up against the TDP of Intel designs. They basically wanted to create another G4 Cube with the 2013 Mac Pro, but after one generation there were basically no processors or video cards they could update that system with. You can blame Apple for making a design that was so thermally limited, but a good portion of it was also that the Intel CPUs and AMD GPUs (the D500/D700 graphics cards) just ran really hot and were power hungry, so even after a single model release, the launch version of the 2013 Mac Pro remained the only option until 2019.
-With the iMac Pro, Apple really started taking Pro machines into their own hands by building a ridiculously over-engineered cooling system. Quite frankly, it was about time, and also awesome. The 2017 iMac Pro could run an 18-core Skylake processor along with a Vega 64X, both pushing their top clock speeds for extended periods while staying virtually silent. Only under incredible load does it start to make any real noise, and in all circumstances that machine never truly gets "hot".
-This trend continued with the 2019 Mac Pro, its 28-core Skylake-X processor, and the dual-GPU Vega II cards. Apple basically over-engineered around these inefficient Intel and AMD designs so they could stay boost-clocked for extended periods and still remain silent, owing to the efficiency of Apple's cooling. Most Apple fans rejoiced, even if it essentially took them seven years (since the 2012 Mac Pro) to finally produce another machine that operated as well.

All that said, I more or less concur with what people are saying in this thread, and with François Piednoël, that Intel's heat problems definitely contributed to Apple leaving at an accelerated pace. If anything, I would say Apple, considering their design paradigms, stuck it out far longer than anyone really expected. The ARM transition rumors have been going since around 2011, at which point Apple's silicon wasn't mature enough to power their entire Mac lineup. At this point they are probably 'just' at the tipping point where they really can cover every machine they make with a much more efficient ARM package than anything Intel has been twiddling its thumbs over.
 
The performance advantage they needed has been there since 2016 or so; the 50% higher per-core performance they now enjoy will stifle any naysayers.

I think it was coming ever since Intel spent five years stuck at 14nm for anything high-end; I'm sure all the exploits helped, but they weren't the majority of the momentum. AMD isn't consistent enough to bet your trendy business on, and Intel isn't selling new x86 licenses to a company like Apple, so they went with a platform transition (hopefully for the last time)!
 
Apple lost the plot on the professional laptop market when they dropped to only four USB-C ports on the MBP around 2016. Sure, it looked neat and the ports had high bandwidth, but who wants to carry around a sack of adapters when you go to a meeting?
Too bad, as before that the Retina MacBooks were a pretty solid all-around option, even if expensive.
 
I'm sure that was a contributing factor, but they're also designing excellent ARM processors for the iPhone, which has become Apple's main product line, surpassing traditional PCs, so I would imagine it's also easier to consolidate their phone and PC OSes onto the ARM platform.

Also, Apple is a company that traditionally wants to design their products from the ground up, entirely on their own, as much as possible. They have a lot more control over what they're doing by making this decision.
 
Apple lost the plot on the professional laptop market when they dropped to only four USB-C ports on the MBP around 2016. Sure, it looked neat and the ports had high bandwidth, but who wants to carry around a sack of adapters when you go to a meeting?
Too bad, as before that the Retina MacBooks were a pretty solid all-around option, even if expensive.

Apple is actually primarily an adapter/dongle company that also happens to make phones and computers.
 
Eh, I would blame Apple more than Intel for their ever-fanatical drive to make laptops thinner and thinner while coming up with bad cooling designs.
We can always point the finger, but Apple was a major customer for Intel, and Apple wanted designs to be a certain way that Intel was either unable or unwilling to deliver. Obviously, as I noted, Apple eventually "came around" to massively increasing their cooling performance because Intel's issues forced them to.
Still, even granting your argument, are you saying you'd rather just have big and bulky machines? Obviously this isn't currently physically possible, but if it were possible to have your whole computer in something as thin as a piece of paper, as strong as titanium, and as powerful as a 3990X, would you prefer something big and bulky just to be big and bulky?
Apple's design ethos is obviously to pack as much power into as little space as possible. Intel didn't let them do that; ARM clearly will. So whether you blame Apple or Intel for the thermal problems, at the end of the day this transition gets them closer to the paradigms and form factors they want, which Intel either would not or could not deliver. And that is completely undeniable.

Well, tbh, thin and cool don't work well together.

Sure it does. That's a function of design more than anything, and also of efficiency. This is why performance per watt is a measurement that gets talked about all the time. If Intel could quadruple their performance per watt, this would obviously be an entirely different conversation: they could perform just as well as they do currently at a quarter of the power, and power = heat. I'm not saying a change like that is "magically feasible" (clearly Intel has struggled for the better part of a decade to do it), but year over year Apple has increased the performance per watt of the A-series processors in their phones and iPads by something ridiculous like 30%, which is why they have exceeded all other phone manufacturers in performance while maintaining excellent battery life. The same measurement definitely matters for laptops. PCMR will of course argue it doesn't matter on the desktop, but it does if you want a quiet and compact machine, which for a lot of users (say designers, audio engineers, film editors, etc.) is a nice thing to have.
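To put rough numbers on that compounding claim, here's a minimal back-of-the-envelope sketch in Python. The flat 30% yearly gain and the 1.0 baseline are illustrative assumptions for the sake of the arithmetic, not measured figures:

Code:
# Sketch of how a fixed yearly perf-per-watt gain compounds.
# The 30% figure and the 1.0 baseline are illustrative assumptions.

def compound(baseline: float, yearly_gain: float, years: int) -> float:
    """Compound a fixed yearly perf/watt improvement over N years."""
    return baseline * (1 + yearly_gain) ** years

for year in range(1, 6):
    print(f"Year {year}: {compound(1.0, 0.30, year):.2f}x perf/watt")

# Five years of ~30% gains is roughly 3.7x. That's why a vendor
# compounding like this can match a rival's performance at a fraction
# of the power draw -- and power drawn is heat to be removed.

That's the whole point of the perf/watt metric: it compounds multiplicatively, so a few years of steady gains swamp a stagnant competitor.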

Please educate, sir. Very interested!
I remember back in the mid-2000s the Power Macs (G5) were all-aluminum chassis, but I doubt it made them run cool, unless it was somehow used as a giant heatsink? I've used one, but never got a chance to peek inside.
Sure. The short version is that Apple created a completely new, massive cooling system inside the iMac Pro that's 30% larger than the standard iMac's. Something ridiculous like two-thirds of the internal space is just the cooling solution. You can go to Apple's own website to see the internals; just scroll down to the cooling portion to see their nifty animation demonstrating airflow:
https://www.apple.com/imac-pro/ If you want to control+f to get there, search for: "Advanced thermal management. Cool."
If you'd prefer to just see the internals compared with a regular 5K iMac, you can see that here:
https://www.imore.com/imac-pro-vs-imac-5k Just scroll down a page or so.

Here MKBHD shows how the iMac Pro improved his life, along with his thoughts on it in his review. He basically used the iMac Pro to run his channel, and he famously edits 8K RED RAW footage, so it's a pretty good showing of how well Apple cooled this machine in a real-world setting:

Here is a more "scientific" test from Max Yuryev trying to force the iMac Pro to throttle. He shows that to keep the machine from sustaining its max boost clock of 3.9GHz for any extended period, you more or less have to throw an unrealistic workload at it, and he says so:


In short, the iMac Pro was part of a design turn showing that Apple does take thermals seriously.

Apple lost the plot on the professional laptop market when they dropped to only four USB-C ports on the MBP around 2016. Sure, it looked neat and the ports had high bandwidth, but who wants to carry around a sack of adapters when you go to a meeting?
Too bad, as before that the Retina MacBooks were a pretty solid all-around option, even if expensive.
I'd say that was an issue in 2016, but not really one now. Speaking as someone who actually owns and uses a MacBook Pro every day.

Apple is actually primarily an adapter/dongle company that also happens to make phones and computers.
Good one.
 
I'm sure that was a contributing factor, but they're also designing excellent ARM processors for the iPhone, which has become Apple's main product line, surpassing traditional PCs, so I would imagine it's also easier to consolidate their phone and PC OSes onto the ARM platform.

Also, Apple is a company that traditionally wants to design their products from the ground up, entirely on their own, as much as possible. They have a lot more control over what they're doing by making this decision.

I think this has a lot to do with it. Intel's inefficiencies, being stuck on 14nm forever, and all the vulnerabilities just pushed Apple over the edge.
 
I wouldn't believe a word that comes out of that guy's mouth. Do some research into the former Intel engineer and you'll see he is a very suspect character, as alluded to in the article. He thinks he's God's gift to earth. He's very active on Twitter, and extremely abrasive.

Bottom line: just take that story with a bucket of salt.
 
I wouldn't believe a word that comes out of that guy's mouth. Do some research into the former Intel engineer and you'll see he is a very suspect character, as alluded to in the article. He thinks he's God's gift to earth. He's very active on Twitter, and extremely abrasive.

Bottom line: just take that story with a bucket of salt.
I think we're talking about it because even if he isn't completely credible, there is a kernel of truth here. The move to ARM was inevitable. But if Intel had stayed competitive with 10nm and 7nm parts and hit their design and production targets on time, this would be a very different conversation right now.
 
We can always point the finger, but Apple was a major customer for Intel, and Apple wanted designs to be a certain way that Intel was either unable or unwilling to deliver. Obviously, as I noted, Apple eventually "came around" to massively increasing their cooling performance because Intel's issues forced them to.
Still, even granting your argument, are you saying you'd rather just have big and bulky machines?

Have you paid any attention to Intel's own NUC lineup in the last 3-4 years, or to the several other brands that have been making mini PCs? The Mac mini has been a joke for a while, whether through lack of updates or lazy design, and the new hardware has been way, way overpriced since 2018. The newest models overheat because Apple refuses to design them well, not to mention they moved to an M.2 SSD soldered to the motherboard. In their laptop lineups Apple has consistently gone for thin over a better product that actually runs at its rated speeds and doesn't cook itself. They are far from the only OEM with that problem, but they seem to be the most consistent about it. And don't give me this false-equivalence dream machine with some 3950X 16-core CPU and the footprint of a sheet of paper. All I'm asking for is half an inch here and there for decent airflow and heat pipes. This YouTuber built a Hackintosh mini for half the money with an i5 instead of an i3, and removable storage.

Video starts at 2 min and shows size comparisons.

 
Have you paid any attention to Intel's own NUC lineup in the last 3-4 years, or to the several other brands that have been making mini PCs?
Yes, and I would argue the NUC is the only one worth getting on the PC side. Intel has done a great job with it in general. I seriously considered their Skull Canyon for a while as a small rig to use with an eGPU.

The Mac mini has been a joke for a while, whether through lack of updates or lazy design, and the new hardware has been way, way overpriced since 2018. The newest models overheat because Apple refuses to design them well, not to mention they moved to an M.2 SSD soldered to the motherboard. In their laptop lineups Apple has consistently gone for thin over a better product that actually runs at its rated speeds and doesn't cook itself. They are far from the only OEM with that problem, but they seem to be the most consistent about it.
This issue has been reversed on the 16" MacBook Pro. It doesn't have throttling issues. You could say it was the MacBook Pro update we were waiting for. And it is slightly thicker than its predecessor in order to address the thermal issues the 15" models had. My point stands: Apple has been taking the thermal problems caused by Intel's inefficiency seriously. It's just that they've tackled the most important product lines first.

I would argue that the mini hasn't been updated because this ARM transition has been coming for a while and they didn't bother putting resources into it. They more or less shoehorned in the 8700K without increasing the thermal capacity, and the design is so limited they didn't even bother to "upgrade it" to the 9700K or 9900K, because at this point they didn't want to spend the resources to get it there (the 9900K is found in the current iMac, which also needs an update, and it performs just fine). So your criticisms of the mini are fair.
That said, I'm guessing that one of the 2-3 machines launched on ARM chips, either at the end of this year or early next year, will be the next Mac mini. Especially so they can get relatively inexpensive, actual final consumer hardware out to devs, rather than the dev boxes they launched using the A12Z. And at that point not only do I expect there will be no thermal issues, but I expect their finalized design will have an option faster than a 9900K at a much lower TDP. Big words, I know, but feel free to hold me to them. Considering that minis have traditionally had integrated graphics, I actually think this will be the desktop machine they push to casual users. It will have faster, more accelerated integrated graphics than any other offering, and editing multiple 4K streams on it will be possible (like the A12Z demonstration), only faster.

And don't give me this false-equivalence dream machine with some 3950X 16-core CPU and the footprint of a sheet of paper. All I'm asking for is half an inch here and there for decent airflow and heat pipes. This YouTuber built a Hackintosh mini for half the money with an i5 instead of an i3, and removable storage.
You missed the point. The bottom line is: what are the design goals? The point I was making is that Intel has consistently failed to make fast, small, low-TDP parts for its customer Apple. Apple has demonstrated their own silicon can get there. Let me ask you this: what happens when the MacBook Air is more powerful than a $2000 Lenovo? Because that's the future I'm looking forward to. It might not be "sheet of paper" thin, but it's tiny, and it's also a very real possibility. The iPad Pro has no active cooling, and CPU-wise it crushes a lot of desktop parts (GPU-wise it's decidedly mid-tier, but that's still incredible considering it's integrated into the SoC and, once again, passively cooled). When Apple releases their real dedicated "for Mac" ARM chip for laptops, they won't just be thin and light; they're going to be powerful. We know this because we've already seen it. We just don't know how crazy it's going to get.

If I could have a machine the thickness of a sheet of paper that could do everything I want it to do, then I'd absolutely want that device. And while that may not be realistic for the next 20 years, Apple is trying to make devices for those of us for whom size, weight, and efficiency are real considerations. If those aren't things you care about, then really all of our discussions about Mac hardware are probably moot.

And also for reference, I said 3990X, which is a 64-core part.
 
The bottom line is: what are the design goals?

No, I got your point just fine. Their design goals are, and have been, unrealistically thin machines full of hopium unicorn dreams that somehow get blamed on Intel's lack of low-power, high-performance chips. Sure, Intel has been stagnant, but it's not as if Apple couldn't have done decent design engineering instead of worrying about how much thinner the laptop is by half an inch or less, along with goofy features, or killing ports because everyone should run things through a pricey dongle. The Mac Pro trashcan is a neat design, a great proof of concept that really should have been a consumer machine; it's not what pro multimedia creators needed or wanted. And then Apple doesn't even bring it to consumers with a cheaper model.

Maybe ARM will save Apple and their supposed ultra-light, thin, disposable machines. I'll believe it when I see it. They also appear to have given up on their desktop/pro market over the past 8+ years, and I don't think they'll get those users back. Not with machines that have no upgradability, even just for storage.
 
No, I got your point just fine. Their design goals are, and have been, unrealistically thin machines full of hopium unicorn dreams that somehow get blamed on Intel's lack of low-power, high-performance chips. Sure, Intel has been stagnant, but it's not as if Apple couldn't have done decent design engineering instead of worrying about how much thinner the laptop is by half an inch or less
Half an inch is huge. I'm not sure you know how to measure stuff. But again, the 16" isn't thermally limited, so you're repeatedly harping on an issue that doesn't exist anymore.

along with goofy features, or killing ports because everyone should run things through a pricey dongle.
Dongles are cheap, but you don't even need them. You can use your monitor as a dock. You can buy USB Micro-B to USB-C cables, which are also cheap and are my preferred option for all my external drives. Or just buy things that are USB-C, which is the future anyway. If you're annoyed at them for being forward-looking, then you're not into technology.

The Mac Pro trashcan is a neat design, a great proof of concept that really should have been a consumer machine; it's not what pro multimedia creators needed or wanted. And then Apple doesn't even bring it to consumers with a cheaper model.
What I don't get about you is that you keep talking about problems Apple has already solved. The trashcan Mac Pro is not the current Mac Pro. The current Mac Pro is upgradeable, so your complaint is already fixed. The 16" MacBook Pro, again, already fixed the thermal issues you're talking about. So why are we talking about Apple's past when it currently isn't a problem? This literally doesn't make sense to me. Basically, what I can surmise at this point is that you're annoyed at dongles, but the rest of it isn't even relevant.

Maybe ARM will save Apple and their supposed ultra-light, thin, disposable machines. I'll believe it when I see it.
Have you looked at the 2018 iPad Pro? Have you looked at benchmarks? I'm not even talking about what their future designs will be.
If they took an A12Z, which is passively cooled, put it into a laptop form factor, and threw macOS onto it, it would already destroy most laptops under $1000, and in CPU-limited applications probably under $1500. And that's a two-year-old tablet chip. It's already a monster.
Did you watch Apple's demo? Their dev kit is just an A12Z. It's literally capable of editing three 4K streams in real time. They also showed it playing Shadow of the Tomb Raider through Rosetta, meaning emulating x86, and it played flawlessly. We haven't even seen the full-fat Mac ARM chips yet. But the point is that future is already nearly here, and we can see plenty of evidence of it on a two-year-old, passively cooled tablet chip.

They also appear to have given up on their desktop/pro market over the past 8+ years, and I don't think they'll get those users back. Not with machines that have no upgradability, even just for storage.
Really? The Mac Pro is the fastest workstation, dollar for dollar, compared with any OEM's (e.g., Dell's and HP's). It has the fastest graphics cards, the dual-GPU Vega IIs, a GPU you literally can't buy in any other workstation machine. And it uses the fastest processor Intel has, the 28-core Skylake-X. It literally doesn't get more professional than that. That is the Proest, pro-level machine a Pro can Pro.
It's also fully upgradeable and customizable. There are YouTube videos advocating for buying lower-end Mac Pros and filling them up with your own hardware, like Radeon VIIs, your own RAM, and whatever else, if for whatever reason you want to go that route.
The Afterburner card in that thing is also ridiculous as an option. It's worth the price of admission alone if you're a professional editor, as no $8000 PC would be able to keep up with it.

The iMac Pro, though in theory not user-upgradeable, is also a really hard-hitting pro-level machine. I wouldn't buy one now, as it hasn't been updated since 2017, but if you watch that MKBHD video I linked above, it is more than capable of doing serious pro work with the best of them, such as editing RED 8K compressed RAW incredibly quickly. However, Linus showed that if you're willing, you can upgrade the 2017 iMac Pro, and he did, maxing it out with his own RAM and CPU. Now that Apple has released their SSD upgrade program for the Mac Pro, you could also do that for the iMac Pro.

So I'm not sure what you're talking about in terms of not catering to pros. They are, and have been for the past 3+ years minimum; and if you only need a laptop, then basically the entire time since they moved to Intel.

EDIT: Added links.
 
Eh, I would blame Apple more than Intel for their ever-fanatical drive to make laptops thinner and thinner while coming up with bad cooling designs.

Or they were looking at the iPad Pro all this time, thinking: how can this thing with no real cooling keep pace with our actively cooled MacBook Pros...

The MacBook Pros showed what Apple wanted versus what Intel was giving them: a shit processor that didn't meet their needs.
 
Apple lost the plot on the professional laptop market when they dropped to only four USB-C ports on the MBP around 2016. Sure, it looked neat and the ports had high bandwidth, but who wants to carry around a sack of adapters when you go to a meeting?
Too bad, as before that the Retina MacBooks were a pretty solid all-around option, even if expensive.

I am so tired of seeing this statement. It might be true if all you ever do is hook your shitty laptop up to a 1995 projector that requires VGA plus USB-A. But that is hardly the only professional work people use computers for.

For the work I do (cutting-edge scientific computing/microwave data capture), I desperately need as many high-bandwidth I/O ports as possible. The MacBook Pro is the ONLY 13-, 15-, or 16-inch laptop that has four Thunderbolt 3 ports at full bandwidth. There is literally no PC laptop on the market that could do my "professional work" as well as the MacBook does. It's actually frustrating, since I can't run Linux natively on a MacBook due to the T2 chip; I'd rather be able to get a PC laptop that could handle my I/O requirements and dual boot.

But quite literally, there is no 13-inch PC laptop on the market that even remotely approaches Apple's high-speed I/O capability. Without daisy-chaining, I can have an eGPU, a TB3 external NVMe SSD, 10GbE, power charging, and a USB-C SDR attached. I look at a $3k Surface Pro or Surface Book and just shake my head in disgust; it's completely fucking insane that Microsoft charges people that much without offering even a fraction of the hardware I/O capability Apple does. If you gave me the maxed-out Surface, I literally couldn't do my job, because its hardware is deficient compared to Apple's.

I can get a dongle if I have to connect a modern laptop to an outdated projector or a thumb drive. You can't get a dongle that converts USB-A into a 40Gbps I/O interface.
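For a sense of scale on that port math, here's a back-of-the-envelope sketch; the device list and per-device rates below are illustrative assumptions using nominal link rates, not measured throughput:

Code:
# Rough aggregate-bandwidth comparison for the I/O setup described above.
# Nominal link rates in Gbps; device list and numbers are assumptions
# for illustration, not benchmarks.

TB3_PORT_GBPS = 40   # Thunderbolt 3, per port
USB_A_30_GBPS = 5    # USB 3.0 over a Type-A port

devices = {
    "eGPU (PCIe payload over TB3)": 32,
    "TB3 external NVMe SSD":        24,
    "10GbE adapter":                10,
    "USB-C SDR capture":             4,
}

needed = sum(devices.values())
print(f"Concurrent I/O needed: {needed} Gbps")
print(f"4x TB3 ports supply:   {4 * TB3_PORT_GBPS} Gbps")
print(f"One USB-A 3.0 port:    {USB_A_30_GBPS} Gbps")
# ~70 Gbps of concurrent I/O fits comfortably in 160 Gbps of TB3;
# no dongle can conjure it out of a 5 Gbps USB-A port.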
 
I am so tired of seeing this statement. It might be true if all you ever do is hook your shitty laptop up to a 1995 projector that requires VGA plus USB-A. But that is hardly the only professional work people use computers for.

For the work I do (cutting-edge scientific computing/microwave data capture), I desperately need as many high-bandwidth I/O ports as possible. The MacBook Pro is the ONLY 13-, 15-, or 16-inch laptop that has four Thunderbolt 3 ports at full bandwidth. There is literally no PC laptop on the market that could do my "professional work" as well as the MacBook does. It's actually frustrating, since I can't run Linux natively on a MacBook due to the T2 chip; I'd rather be able to get a PC laptop that could handle my I/O requirements and dual boot.

But quite literally, there is no 13-inch PC laptop on the market that even remotely approaches Apple's high-speed I/O capability. Without daisy-chaining, I can have an eGPU, a TB3 external NVMe SSD, 10GbE, power charging, and a USB-C SDR attached. I look at a $3k Surface Pro or Surface Book and just shake my head in disgust; it's completely fucking insane that Microsoft charges people that much without offering even a fraction of the hardware I/O capability Apple does. If you gave me the maxed-out Surface, I literally couldn't do my job, because its hardware is deficient compared to Apple's.

I can get a dongle if I have to connect a modern laptop to an outdated projector or a thumb drive. You can't get a dongle that converts USB-A into a 40Gbps I/O interface.

It is a breath of fresh air to see someone else who appreciates pushing forward with new standards instead of holding on to the shitty I/O of the past for fear of inconveniencing someone. If everyone had just gotten their shit together and moved on like Apple, USB 2 and 3 would have died by now and USB-C and Thunderbolt would be everywhere.
 
I wouldn't believe a word that comes out of that guy's mouth. Do some research into the former Intel engineer and you'll see he is a very suspect character, as alluded to in the article. He thinks he's God's gift to earth. He's very active on Twitter, and extremely abrasive.

Bottom line: just take that story with a bucket of salt.
Agreed; he has posted some amazingly poor lies about AMD. However, in this case he's likely right, if you compare the two offerings on their merits.
 
Half an inch is huge. I'm not sure you know how to measure stuff. But again, the 16" isn't thermally limited, so you're repeatedly harping on an issue that doesn't exist anymore.


Dongles are cheap, but you don't even need them. You can use your monitor as a dock. You can buy USB Micro-B to USB-C cables, which are also cheap and are my preferred option for all my external drives. Or just buy things that are USB-C, which is the future anyway. If you're annoyed at them for being forward-looking, then you're not into technology.
My business partner uses the Mac laptop with the crumbocaust keyboard and the 4x USB-C shitbox for audio creation with a dual-boot setup, and he regrets it. He has a fucking big bunch of dongles and half the time has compatibility or missing-function issues, because a professional PC needs more than four ports. It's a joke of an approach. If they had eight USB-C ports it might cost another ten bucks. That's very unprofessional. He won't be getting another laptop from them any time soon.
It also struggles to drive 4K30 without dropping frames... sigh.
He's building a Windows PC next, for gaming and some video editing duty.
 
My business partner uses the Mac laptop with the crumbocaust keyboard and the 4x USB-C shitbox for audio creation with a dual-boot setup, and he regrets it. He has a fucking big bunch of dongles and half the time has compatibility or missing-function issues, because a professional PC needs more than four ports. It's a joke of an approach. If they had eight USB-C ports it might cost another ten bucks. That's very unprofessional. He won't be getting another laptop from them any time soon.
I'm not entirely sure what you're saying here. I have no idea what your partner bought, what his solutions were, or why he needs the number of ports he does.
We all have our use cases, but there aren't many laptops in general with eight USB ports; in fact, I know of precisely zero. So your business partner would apparently be screwed regardless of platform.

The Dell XPS, as an example, has fewer ports than a MacBook Pro if your need is specifically USB: https://www.dell.com/en-us/work/sho...500-laptop/ctox15w10p1c2500#techspecs_section
Other lines like the Inspiron are specifically at four (combining USB-A, USB-C, and Thunderbolt together).
The Lenovo Legion is also at four ports total: https://www.lenovo.com/us/en/laptop...-y-series/Lenovo-Legion-Y740-17/p/88GMY701062
The HP Omen is also at four: https://store.hp.com/us/en/pdp/omen-by-hp-15-dh002nr

This is just what I could dig up in a few minutes, so if you can find some laptops with more USB/Thunderbolt ports, so be it. But I can see that having more than four is certainly not the "standard". It seems to me that Apple is at the same standard in terms of the number of USB ports specifically, so your business partner's issue can't specifically be with that. You'd have to be talking about using 4x USB + HDMI + Ethernet + whatever other ports, and not specifically 8x USB, because as far as I can tell no manufacturer offers that, meaning you'd have to dongle it up anyway in that use case. To that end, though, I agree with paradoxical that I'd much rather have four Thunderbolt 3 ports than be limited to "only" USB-A or USB-C. It's basically the only laptop capable of running an eGPU and a bunch of full-fat Thunderbolt data drives simultaneously. So it turns out much more of this has to do with his use case. If your partner used an eGPU, or a monitor that doubled as a hub, even the number of USB ports would likely be much less of an issue, and so would the dongles.

It also struggles to drive 4K30 without dropping frames... sigh.
I'll admit that certain older models of MBP can easily be pushed into thermal throttling and can present problems. But if he's generally dropping frames at 4K30, he's either dealing with a massive decoding issue, overheating, or a crazy amount of multitasking (or some combination thereof).
I can literally be rendering in FCPX/Compressor with all my processor cores pegged and still watch 4K videos on YouTube without dropping frames. I do that all the time, and my only machine is a MacBook Pro. In case it's not clear, I'm an independent filmmaker (primarily documentaries) and a photographer. I am constantly editing and rendering video, so I'm not saying this anecdotally; it's my life's work.

He's building a Windows PC next, for gaming and some video editing duty.
Absolutely, he should get what works for him. I'd humbly recommend a 3900X or above, 32GB of RAM or above, a 2060 Super or above (unless he can wait for the Nvidia 3000 series and/or Big Navi), an NVMe boot drive, and a RAID array to edit off of, and editing in Blackmagic Design's DaVinci Resolve. I would personally recommend avoiding Adobe Premiere.
Finding a good monitor for color grading, plus a colorimeter, is also crucial. I'd recommend an X-Rite i1Display Pro for the colorimeter, and do your best to find a display that is true 10-bit (not 8-bit + FRC) and covers at least 90% of the DCI-P3 color gamut. I found what I consider the perfect monitor for the price, but they don't make it anymore and I've stopped seeing them on eBay or Craigslist, so it's hard to recommend to others now. However, Eizo, NEC, and to a lesser extent BenQ and LG all make monitors designed to meet professional standards. BenQ is probably the best bang for your buck. The SW320 is an excellent option if you can find it used and test it before purchasing (uniformity issues). The SW321 replaced it and is a great monitor, but also expensive. There are also the SW270C and SW2700PT as excellent budget monitors that can be found used.
 
I am sure the decision to move on from x86 involved many meetings with bullet-pointed spreadsheets.

No doubt not dealing with Intel anymore had to be a major selling point. lol

Of course Apple wants to take complete design control, make more money if they can, unify all their products under one ISA, etc. But Intel has been having security issues galore, most of which they couldn't find themselves. They continue to roll out the same chip every year with a new model number, which has made it really hard for Apple to offer new products with any real CPU improvements. Intel in general has been losing performance mindshare to AMD for a few years. No doubt there are also many other behind-the-scenes customer/partner relations issues that piss off ALL their partners (not just Apple).

Intel likely did little to make the choice a hard one. If they had been delivering consistent improvements in performance and/or efficiency over the last several years, perhaps Apple would have been happy to continue with x86 and keep laying the cross-compile groundwork for the further-off future. At this point, though, when they weighed the pros and cons... Intel really hasn't done a thing to keep the business. I would also not be shocked to find out at some point that Apple saw what Intel has been cooking for 2021/22 with Xe GPUs and such, and was underwhelmed, to say the least. I can very much see Cook asking the CPU department at Apple: this is what Intel is going to be offering us in a MacBook power envelope; can you do better? At this point the answer is pretty much "we already have... but yeah, we can do even better." lol

Apple just poached the designer of the A76, Mike Filippo. He is a legit ARM rockstar: he was lead designer on the Cortex-A76 and on Zeus, ARM's next-gen server core. He is also a co-author of the AMBA SoC interconnect spec. Considering Filippo's experience with ARM server designs and higher-performance mobile, plus his past work as a chief designer for Intel's HPC efforts back when Intel was leading that market (he was lead on the 24-core parts back in '09), it looks like Apple may actually be looking to put together a real desktop-class ARM chip, and not just clock-bump their A13s. They got THE guy for high-performance ARM core design, who also happens to be a co-author of ARM's latest and greatest SoC interconnect spec, which will come in handy if Apple is planning multiple co-processors for the marketing wonks to name. lol
https://www.theverge.com/2019/6/26/18760083/apple-arm-architect-hire-cortex-a76-mac-processors-intel
 
Thanks for the gotcha journalism, and for punctuating precisely why Apple is dropping Intel for much more efficient ARM SoCs.
Are they dropping Intel because they can't afford to put thermal pads in a laptop, or because they can't mill a heatsink to spec?
By design, for the longest time. Nothing is going to change. Just wanted to give you a more recent example of thermal control at Apple.
 
Are they dropping Intel because they can't afford to put thermal pads in a laptop, or because they can't mill a die to spec?
By design, for the longest time. Nothing is going to change. Just wanted to give you a more recent example of thermal control at Apple.
It's not a recent change: the MacBook Air hasn't had a redesign, and neither have the Mac mini or the 13-inch MacBook Pro. That's why this is gotcha journalism. You're not talking about anything that people who follow Apple technology news don't already know about.

To me, it's obvious that those three product lines are the ones Apple will address first with ARM, because they haven't addressed the thermals in those designs or done any form of redesign, only installed new hardware into old enclosures. The Mac mini dev kit, I think, is a taste of what is to come, because even without a thermal design change those processors can operate without throttling.
 
The Mac mini got a new design in 2018. They didn't learn from their earlier problems on other platforms, and they made things worse on a number of fronts with the 2018 mini.
 
The Mac mini got a new design in 2018. They didn't learn from their earlier problems on other platforms, and they made things worse on a number of fronts with the 2018 mini.
Sure, I'll bite. Other than thermals, which I would attribute to putting a much hotter chip in it (and clearly to Intel's growing inefficiencies gen after gen), what's worse?
 
I am sure the decision to move on from x86 involved many meetings with bullet-pointed spreadsheets.

No doubt not dealing with Intel anymore had to be a major selling point. lol

Of course Apple wants to take complete design control, make more money if they can, unify all their products under one ISA, etc. But Intel has been having security issues galore, most of which they couldn't find themselves. They continue to roll out the same chip every year with a new model number, which has made it really hard for Apple to offer new products with any real CPU improvements. Intel in general has been losing performance mindshare to AMD for a few years. No doubt there are also many other behind-the-scenes customer/partner relations issues that piss off ALL their partners (not just Apple).

Intel likely did little to make the choice a hard one. If they had been delivering consistent improvements in performance and/or efficiency over the last several years, perhaps Apple would have been happy to continue with x86 and keep laying the cross-compile groundwork for the further-off future. At this point, though, when they weighed the pros and cons... Intel really hasn't done a thing to keep the business. I would also not be shocked to find out at some point that Apple saw what Intel has been cooking for 2021/22 with Xe GPUs and such, and was underwhelmed, to say the least. I can very much see Cook asking the CPU department at Apple: this is what Intel is going to be offering us in a MacBook power envelope; can you do better? At this point the answer is pretty much "we already have... but yeah, we can do even better." lol

Apple just poached the designer of the A76, Mike Filippo. He is a legit ARM rockstar: he was lead designer on the Cortex-A76 and on Zeus, ARM's next-gen server core. He is also a co-author of the AMBA SoC interconnect spec. Considering Filippo's experience with ARM server designs and higher-performance mobile, plus his past work as a chief designer for Intel's HPC efforts back when Intel was leading that market (he was lead on the 24-core parts back in '09), it looks like Apple may actually be looking to put together a real desktop-class ARM chip, and not just clock-bump their A13s. They got THE guy for high-performance ARM core design, who also happens to be a co-author of ARM's latest and greatest SoC interconnect spec, which will come in handy if Apple is planning multiple co-processors for the marketing wonks to name. lol
https://www.theverge.com/2019/6/26/18760083/apple-arm-architect-hire-cortex-a76-mac-processors-intel


Thanks for adding that info; it sounds like they have the right people to make it happen. Yeah, Intel was coasting for years with their four-core lineup. Plus, as Apple, how could you even believe Intel's roadmap and trust their ability to deliver, after the repeated failures with 10nm, the security issues that keep coming up, etc.?
 
The Mac mini got a new design in 2018. They didn't learn from their earlier problems on other platforms, and they made things worse on a number of fronts with the 2018 mini.

Everyone rags on the Mac mini too much; even Linus uses them as video ingest stations, because it is literally the only NUC-sized product that can be optioned with 10-gig Ethernet.
 
Everyone rags on the Mac mini too much; even Linus uses them as video ingest stations, because it is literally the only NUC-sized product that can be optioned with 10-gig Ethernet.

Nah, that's just the height of the irony of the situation. Lots of people ended up using the Mac mini as a mini server over the last 7-8 years, and the 10-gig port was probably the best part of the 2018/2020 refresh.
 