Apple Turns Its Back on Customers and NVIDIA with macOS Mojave

We know Apple does it, but has anyone ever figured out why they do it?

Apple believes in a throwaway society; constant turnover of devices is how they make money, and Steve Jobs himself was a big believer in this philosophy. The engineers tried everything they could to provide expandability in the original Macintosh, and Steve stopped them every step of the way. They even tried inserting a 'diagnostic port' into the design that could be hacked as a form of expansion; Steve 'the visionary' stopped it.

Meanwhile, the Apple IIGS, a great little machine designed by good ol' Woz, was fitted with a deliberately crippled CPU and received practically no marketing next to the Macintosh, even though the IIGS allowed for expandability with proper expansion slots and had a great OS.

Somehow, the cMP 4,1/5,1 slipped through the cracks, and now Apple would like to put an end to this '10+ year old and still perfectly capable machine' shit.
 
Apple knows best. Ask them, they will tell you that themselves. Just be prepared to be told you do not have the mental capacity to comprehend the grand plan.
 
Mojave runs fine on my Mid-2012 MBPr 15" with its NVIDIA GT 650M. No issues.

I can't complain about Apple; their stuff works year in, year out. My laptop usually runs 24/7 and has for over six years now. My last iPhone 4 lasted 6+ years without a glitch, survived 2 kids growing up, and saw Samsungs come and go.

It convinced me enough that I got myself an XS, hoping it will also last 6 or more years to pay itself back. BTW, all my apps going back to my first iPhone 3 still work, on all our iPhones, so I don't need to buy them twice or more for all the kids and the wife.

Yes, they want an arm and a leg up front, but in the end it is the cheaper product with the better experience for many tasks, not all. I use all 3 main OSes, and I dislike the current Win10 the most. I admire Apple for having the balls to do it their way. At least their stuff goes into sleep mode and wakes up again, a thing my recent game rig just won't do, and much more...
 
I think this is a bit over the top.

Quite a few users of certain creative applications have always preferred Apple as their platform of choice over anything else. In the past, that is actually how Apple built its desktop market share. In many of these scenarios, the GPU is a very important and heavily utilized component.

Sure, you wouldn't buy a Mac for gaming, and indeed I see very few people actually gaming on Macs.

People can 'prefer' whatever they wish; that doesn't mean it's a good choice.
 
From a development angle, I imagine this makes developers' lives easier: having a set combination of cards to program and optimize for, a la console development. But is it worth the loss in consumer choice? I guess that's the consumer's call.
 
I am on 10.13.6 for my GT 650M and GTX 1070.

The GT 650M is a discrete GPU inside my Macbook from 2012. Is that somehow not supported?

I bet my Mac will work fine on 10.14, but I will not be able to install Nvidia's own driver.

On 10.13 (using Nvidia's driver), there is an Nvidia Control Panel that lets users toggle which driver they want to use; Nvidia's or Apple's.

Using Apple's driver turns my 1070 into something slower than integrated graphics, while using Nvidia's boosts gaming performance even on the 'supported' GT 650M.

I regret getting a 1070 over a 980 Ti because of macOS.

I had to wait over a year just for NVIDIA and Apple to allow Pascal support, whereas Maxwell was supported on earlier versions of macOS.

The Hackintosh world is severely affected, and AMD is becoming the preferred choice. Having only one GPU vendor as an option really sucks.

Apple's response would be, "Why have you not upgraded every six months, as expected?" Even if it's the same product.
 
From a development angle, I imagine this makes developers' lives easier: having a set combination of cards to program and optimize for, a la console development. But is it worth the loss in consumer choice? I guess that's the consumer's call.

Consumer choices? I thought we were talking about Apple products.
 
Apple knows best. Ask them, they will tell you that themselves. Just be prepared to be told you do not have the mental capacity to comprehend the grand plan.

Well, Microsoft aren't much better in that regard...
 
Well, Microsoft aren't much better in that regard...

I do not think people at Microsoft are as well versed in sidestepping a question as those at Apple. Better marketing at Apple.

Two sides of the same coin. One is extremely arrogant and egomaniacal, while the other is very good... wait... hmmm. You might be on to something.
 
Having an Apple, with an nVidia card... That's 2 mistakes right there already, what do you expect :D
 
Meh, who needs good GPUs for Mac anyway? What are there, like 10 games released in the last decade that run on Mac and would benefit from something other than an Intel iGPU?

The 5 people I know who are the biggest Apple fanboys on the planet ask me all the time why I don't "Mac". I said "games?", and they were all like, "Pffft. Us superhumans don't play games." At which point I stopped talking to those fidiots about computers or technology. Change the subject. They're still my friends, but we just talk about other things.
 
I love the 4,1/5,1 cMP. A beautifully designed machine that still holds its own even today. Even if the EFI supported NUMA, I don't think the OS does? Although I could be wrong.

EDIT:

Depending on the application, there are pros and cons to NUMA vs SMP. However, with applications coded the way they are these days, NUMA generally holds a slight performance advantage as we move to more cores over outright clock speed.

Having said that, the advantage is minimal.

i appreciate you being so gracious when you've pointed out the issue to me, but you needn't be.

what i found interesting about the article is that it seems apple made a choice to run in "interleaved" mode as opposed to whatever mode allows 2-CPU NUMA.
  • i use the term "2-CPU NUMA" to acknowledge the reports that running LINUX on the 4,1/5,1 may result in only "one NUMA node" being detected.
  • this is exactly what you're saying: no NUMA support (no triple socket systems, right? i could be wrong... you never know. just making the necessary reservations ;) )
i would like to have the option, as the owner of the hardware, to be able to configure NUMA mode for my machine. i'm not sure if there are limitations on the ASIC.

from my perspective, i thought the northbridge would still have the elements of classical E-ATX OEM builds that allowed NUMA.
  • i'm not sure if the choice to use interleaved mode in the firmware was the consequence of a hardware limitation.
kind of interesting from both a computing science and electrical engineering perspective (comp E "withstanding" ;) )
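one quick way to sanity-check the "one NUMA node" reports under Linux is to count the node directories the kernel exposes in sysfs. a small sketch; the helper name `count_numa_nodes` is mine, and it assumes the standard Linux sysfs layout (it falls back to 1 elsewhere):

```python
import os
import re

def count_numa_nodes(sysfs_dir="/sys/devices/system/node"):
    """Count the NUMA nodes the Linux kernel exposes via sysfs.

    Falls back to 1 when the directory is missing (non-Linux systems,
    or kernels built without NUMA support), matching the single-node
    report that tools like numactl give on such machines."""
    try:
        entries = os.listdir(sysfs_dir)
    except OSError:
        return 1
    # Node directories are named node0, node1, ...
    nodes = [e for e in entries if re.fullmatch(r"node\d+", e)]
    return max(len(nodes), 1)
```

on a 4,1/5,1 running in interleaved mode you would expect this to report 1, consistent with the reports mentioned.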
 
i appreciate you being so gracious when you've pointed out the issue to me, but you needn't be.

what i found interesting about the article is that it seems apple made a choice to run in "interleaved" mode as opposed to whatever mode allows 2-CPU NUMA.
  • i use the term "2-CPU NUMA" to acknowledge the reports that running LINUX on the 4,1/5,1 may result in only "one NUMA node" being detected.
  • this is exactly what you're saying: no NUMA support (no triple socket systems, right? i could be wrong... you never know. just making the necessary reservations ;) )
i would like to have the option, as the owner of the hardware, to be able to configure NUMA mode for my machine. i'm not sure if there are limitations on the ASIC.

from my perspective, i thought the northbridge would still have the elements of classical E-ATX OEM builds that allowed NUMA.
  • i'm not sure if the choice to use interleaved mode in the firmware was the consequence of a hardware limitation.
kind of interesting from both a computing science and electrical engineering perspective (comp E "withstanding" ;) )

NUMA essentially allows both CPUs to access memory as if it were one big pool. So instead of having 128GB but only 64GB available per CPU, as is the case with the Mac Pro, you have 128GB available as one big pool of memory to both CPUs. This has advantages and disadvantages. One of the disadvantages is that the time taken for CPU A to access the memory pool of CPU B is greater than if CPU A simply accessed its own pool of memory. The advantages of NUMA become really apparent when software is coded in a way that takes memory access patterns into consideration, and they grow the more processors you add.

Linux has a very good NUMA implementation, thanks in part to its heavy use on clusters of CPUs in supercomputer applications.

The EFI has to support NUMA. On my Dell workstation I can switch between SMP and NUMA via the BIOS. I've found performance to be generally faster under NUMA.
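The local-vs-remote trade-off described above can be sketched with a toy latency model. The function names and the nanosecond figures below are illustrative assumptions, not measurements of any real machine:

```python
# Toy model of average memory access latency on a two-socket box.
# LOCAL_NS and REMOTE_NS are made-up illustrative figures, not measurements.
LOCAL_NS = 80.0    # access to the CPU's own memory pool
REMOTE_NS = 130.0  # access to the other CPU's pool over the interconnect

def interleaved_avg(local_ns=LOCAL_NS, remote_ns=REMOTE_NS):
    """Interleaved mode: addresses alternate between sockets, so roughly
    half of all accesses land on the remote socket."""
    return 0.5 * local_ns + 0.5 * remote_ns

def numa_avg(local_fraction, local_ns=LOCAL_NS, remote_ns=REMOTE_NS):
    """NUMA mode with a NUMA-aware allocator keeping `local_fraction`
    of accesses on the local node."""
    return local_fraction * local_ns + (1.0 - local_fraction) * remote_ns
```

With these numbers, interleaving averages 105.0 ns per access; a NUMA-aware workload keeping three quarters of its accesses local averages numa_avg(0.75) = 92.5 ns, while a naive 50/50 placement is no better than interleaving. That is the "coded to take memory access patterns into consideration" point in practice.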
 
I'll try to act surprised...Nope, it doesn't work.
 
NUMA essentially allows both CPUs to access memory as if it were one big pool. So instead of having 128GB but only 64GB available per CPU, as is the case with the Mac Pro, you have 128GB available as one big pool of memory to both CPUs. This has advantages and disadvantages. One of the disadvantages is that the time taken for CPU A to access the memory pool of CPU B is greater than if CPU A simply accessed its own pool of memory. The advantages of NUMA become really apparent when software is coded in a way that takes memory access patterns into consideration, and they grow the more processors you add.

Linux has a very good NUMA implementation, thanks in part to its heavy use on clusters of CPUs in supercomputer applications.

The EFI has to support NUMA. On my Dell workstation I can switch between SMP and NUMA via the BIOS. I've found performance to be generally faster under NUMA.

yeah, for my application having the larger pool would be beneficial even though, as you stated, there would be an added latency penalty when one CPU accesses the other's memory pool.

wondering if the brains at tsmc who engineered this board had a way to turn on NUMA.

my application can't use clusters, as it needs a large shared memory pool. it can't find the optimal solution without having the entire set of solutions in the same memory store, which is difficult in the distributed paradigm, where most of the worker boxes have 4-8GB each and primarily (if not exclusively) operate on their own pool.

cool stuff though. thanks for sharing your knowledge. now i know what the issue is!
 
I installed an AMD Vega Frontier Edition in my Mac Pro 5,1 because they use the same chipset in the new iMac Pro. Voila: support and driver updates by Apple, automatically. You have to be pickier about what you install in a Mac. You can't go full fanboy, as that won't work. You have to go with what's supported.

I also don't get the line "Apple Turns Its Back on Customers"

Apple didn't sell you a 1080 etc to put in your Mac.

However, soon they will have Nvidia driver support anyway because they are expanding external GPU support.


 
The GT 650M is a discrete GPU inside my Macbook from 2012. Is that somehow not supported?

A 7-year-old MacBook... Do you even have to ask that question? It's Apple. It wasn't supported in 2015.
 
A 7-year-old MacBook... Do you even have to ask that question? It's Apple. It wasn't supported in 2015.

It is listed as supported!
macOS 10.14 Mojave System Requirements

MacBook (Early 2015 or newer)
MacBook Air (Mid 2012 or newer)
MacBook Pro (Mid 2012 or newer)
Mac mini (Late 2012 or newer)
iMac (Late 2012 or newer)
iMac Pro (2017)
Mac Pro (Late 2013; Mid 2010 and Mid 2012 model support coming in later beta)

Apple has used one Kepler GPU in its MacBooks: the GT 650M/GT 750M, which is just a downclocked desktop GTX 650. They did use a 680MX in the iMac, which is a downclocked GTX 680, IIRC.

Before Kepler, the last NV chips used were the 300 series which was a rebadged 200 series.

That is not a lot of NV hardware to support.
 
All signs have pointed, for quite some time, to Apple going purely consumer-level with their gear.

If you are still surprised by this, you haven't been paying attention.
 