Nvidia Job Postings Suggest Chips Returning To Apple Macs

Megalith

24-bit/48kHz · Staff member · Joined Aug 20, 2006 · Messages: 13,000
Job listings suggest that the return of Nvidia tech to Apple systems is a real possibility. The last time a Mac shipped with an Nvidia GPU was, what, 2013?

Current Mac computers feature graphics chips from Advanced Micro Devices Inc. Nvidia, which is the leading manufacturer of high-end graphics chips used in gaming machines, hasn’t been an option for multiple generations of computer models from Cupertino, California-based Apple. Nvidia, in a job ad for a software engineer, said a successful applicant will “help produce the next revolutionary Apple products.” The role would require “working in partnership with Apple” and writing code that will “define and shape the future” of graphics-related software on Macs.
 
Want to see a mobile Pascal in an MBP ...

This. This time around there is no competition when it comes to mobile GPUs. The MacBook Pro 15" is expensive enough that Nvidia is probably fine. I assume they'll go with a GTX 1050Ti.
 
This. This time around there is no competition when it comes to mobile GPUs. The MacBook Pro 15" is expensive enough that Nvidia is probably fine. I assume they'll go with a GTX 1050Ti.

Either NVidia will be forced to reduce their cost or Apple will just increase their price.
 
Pascal seems to be a good fit for the MacBook Pro line. Though I wonder if this doesn’t refer to a Mac Pro refresh. A semi-custom Zen SOC would seem to be a better fit for Apple’s laptop line.
 
Apple is forced to do so, as the perf/watt difference keeps increasing. Just as they had to drop PPC, and just as there is no alternative to using Intel CPUs.

The only question left is when.
 
Apple is forced to do so, as the perf/watt difference keeps increasing. Just as they had to drop PPC, and just as there is no alternative to using Intel CPUs.

The only question left is when.
What increasing? Isn't the RX 480 using slightly more power than the 1060? That's a big difference from the R9 390 vs GTX 970 in power consumption.
 
What increasing? Isn't the RX 480 using slightly more power than the 1060? That's a big difference from the R9 390 vs GTX 970 in power consumption.

No, you probably mean the GTX 1070, in which case yes: the RX 480 uses slightly more power than the GTX 1070, which sits in a higher performance tier, and that's only comparing reference and Founders Edition models. AIB RX 480 cards easily go over 180W, and some over 210W, which puts them above even AIB GTX 1080 models that offer double the performance of the RX 480. AMD can only compete in perf/watt against old-generation Nvidia cards. Nvidia could offer a slightly underclocked GTX 1070 or GTX 1080 and put the RX 480 to shame, so as Shintai said, Apple is simply forced to go with Nvidia.
 
What increasing? Isn't the RX 480 using slightly more power than the 1060? That's a big difference from the R9 390 vs GTX 970 in power consumption.


The difference is quite large, especially in mobile. Here's an example from desktops, where power consumption is less of a constraint:

[Chart: performance vs. power consumption]
 
The difference is quite large, especially in mobile. Here's an example from desktops, where power consumption is less of a constraint:

[Chart: performance vs. power consumption]
That's pointless for 3 reasons.

#1 Cause it's not a Vulkan/DX12 game. Try using Vulkan Doom as part of the benchmark. As we all know, Mac OSX uses DirectX11.

#2 These are Apple computers, so they probably don't care about gaming benchmarks.

#3 Apple will use the Metal API, so who knows how AMD or Nvidia hardware will perform on that.
 
That's pointless for 3 reasons.

#1 Cause it's not a Vulkan/DX12 game. Try using Vulkan Doom as part of the benchmark. As we all know, Mac OSX uses DirectX11.

#2 These are Apple computers, so they probably don't care about gaming benchmarks.

#3 Apple will use the Metal API, so who knows how AMD or Nvidia hardware will perform on that.

1. AMD doesn't do any better in DX12/Vulkan. It's a myth that got busted. Nice PR attempt tho.

2. If you think the perf/watt delta is different in something more Apple-related, feel free to post it.

3. The API won't change the performance metrics of the hardware as such. Especially not by this amount.

You may also have noticed the absence of Polaris-based laptops, while Pascal-based laptops are out in force.
 
The difference is quite large, especially in mobile. Here's an example from desktops, where power consumption is less of a constraint:

[Chart: performance vs. power consumption]

They're using Metro Last Light to make the determination?!?!? That's like doing the same thing using Doom under Vulkan. This game is just about the worst case.
 
They're using Metro Last Light to make the determination?!?!? That's like doing the same thing using Doom under Vulkan. This game is just about the worst case.

Do you understand how perf/watt works? The RX 480 uses as much power as a GTX 1070 while performing roughly the same as a GTX 1060. The GTX 1060 consumes even less power. AMD's perf/watt is once again shit this time around compared to Nvidia, while cost/performance is pretty good in the case of the RX 470.
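For anyone actually asking: perf/watt is just frame rate divided by board power, so a small fps deficit plus a large power deficit compounds quickly. A minimal sketch, using made-up illustrative numbers rather than measurements from any review:

```swift
import Foundation

// Perf/watt sketch. The fps and wattage values below are illustrative
// assumptions for the cards being argued about, not figures from any review.
struct Card {
    let name: String
    let avgFPS: Double    // average frame rate across a game suite
    let avgWatts: Double  // average board power while gaming
}

let cards = [
    Card(name: "RX 480 (assumed)",   avgFPS: 60, avgWatts: 164),
    Card(name: "GTX 1060 (assumed)", avgFPS: 62, avgWatts: 120),
]

for card in cards {
    let perfPerWatt = card.avgFPS / card.avgWatts
    print("\(card.name): \(String(format: "%.3f", perfPerWatt)) fps/W")
}
```

With those assumed figures, the card drawing roughly a third more power would need roughly a third more frames just to break even on perf/watt.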
 
1. AMD doesn't do any better in DX12/Vulkan. It's a myth that got busted. Nice PR attempt tho.
Here's a comprehensive list of benchmarks from websites. It shows how AMD and Nvidia do with each game title. AMD wins 5 out of 7 DX12 games, though Nvidia wins 39 out of 44 DX11 games. Vulkan is 50/50 because of The Talos Principle. It really does depend on what API you use.


2. If you think the perf/watt delta is different in something more Apple-related, feel free to post it.
There aren't a lot of websites that benchmark using DX12/Vulkan. A lot of them refuse to for some reason. Even fewer use those results to determine performance per watt.

3. The API won't change the performance metrics of the hardware as such. Especially not by this amount.
You looked at Vulkan Doom lately?
 
Do you understand how perf/watt works? The RX 480 uses as much power as a GTX 1070 while performing roughly the same as a GTX 1060. The GTX 1060 consumes even less power. AMD's perf/watt is once again shit this time around compared to Nvidia, while cost/performance is pretty good in the case of the RX 470.
There are some situations where the RX 480 comes close to the GTX 1070 in Doom. The Adored guy kinda showed this in his review. Also, the RX 480 isn't consuming a metric ton more power than the 1060 either. At stock clocks we're talking about 19 watts. The 1070 is only using 45 watts more. Some people were even able to increase the performance of the RX 480 while lowering power consumption. If there are some games the RX 480 does better in, then the performance per watt works out to be the same.

[Chart: GTX 1060 power consumption benchmark]
 
You are using outdated Doom benches to show something AMD should be far ahead in, due to the GCN shader extensions used.

Snabbtest: Doom med Vulkan - Test

Just look at the 1440p numbers or 4K. I did look at Vulkan and Doom lately, you didn't.

And I wonder if you are aware that a card like the RX480 uses as much power when gaming as a GTX1080!

[Chart: average power consumption while gaming]

[Chart: performance per watt at 1920x1080]


And it leaves the question you avoided: where are all the Polaris laptops?
 
Do you understand how perf/watt works? The RX 480 uses as much power as a GTX 1070 while performing roughly the same as a GTX 1060. The GTX 1060 consumes even less power. AMD's perf/watt is once again shit this time around compared to Nvidia, while cost/performance is pretty good in the case of the RX 470.
Yep, and the games you use to test this matter A LOT. Using Metro as your baseline is just horrible. No one does their perf/W testing using the latest APIs, and many of them use TWIMTBP titles. When you see a performance variance of 25% or more depending on the title, you definitely should average many games to determine perf/W.
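One way to do that averaging, sketched with placeholder numbers (the titles and figures below are made up, not taken from any review): take a geometric mean of fps across the whole suite and divide by the average power draw, so no single outlier title dominates the result.

```swift
import Foundation

// Sketch of averaging perf/W across a suite of games instead of one title.
// Game names and numbers are placeholders, not measurements.
let fpsByTitle: [String: Double] = [
    "Title A": 58, "Title B": 71, "Title C": 64, "Title D": 49
]
let avgWatts = 160.0  // assumed average board power over the whole run

// Geometric mean is less swayed by one outlier title than a plain average.
let logSum = fpsByTitle.values.map { log($0) }.reduce(0, +)
let geoMeanFPS = exp(logSum / Double(fpsByTitle.count))
print(String(format: "%.3f fps/W", geoMeanFPS / avgWatts))
```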
 
Yep, and the games you use to test this matter A LOT. Using Metro as your baseline is just horrible. No one does their perf/W testing using the latest APIs, and many of them use TWIMTBP titles. When you see a performance variance of 25% or more depending on the title, you definitely should average many games to determine perf/W.

Whose fault is it that every game doesn't run perfectly on AMD hardware? The only game so far that has shown a huge AMD advantage was DOOM, and after some patches that lead isn't as big anymore. Pascal is more efficient than Polaris; it doesn't matter what you are running. On something like a MacBook Pro, where gaming isn't a huge concern, the more power-efficient GPU will be more desirable.
 
There are some situations where the RX 480 comes close to the GTX 1070 in Doom. The Adored guy kinda showed this in his review. Also, the RX 480 isn't consuming a metric ton more power than the 1060 either. At stock clocks we're talking about 19 watts. The 1070 is only using 45 watts more. Some people were even able to increase the performance of the RX 480 while lowering power consumption. If there are some games the RX 480 does better in, then the performance per watt works out to be the same.

No point in reading further after that. That guy is the biggest AMD fanboy doing reviews, and what he says should be taken with a huge grain of salt.
 
You are using outdated Doom benches to show something AMD should be far ahead in, due to the GCN shader extensions used.

Snabbtest: Doom med Vulkan - Test

Just look at the 1440p numbers or 4K. I did look at Vulkan and Doom lately, you didn't.

And I wonder if you are aware that a card like the RX480 uses as much power when gaming as a GTX1080!

[Chart: average power consumption while gaming]

[Chart: performance per watt at 1920x1080]


And it leaves the question you avoided: where are all the Polaris laptops?
So you pick the low resolution no one uses but disregard 1920x1080, the most ubiquitous resolution of all time?? OK
 
Whose fault is it that every game doesn't run perfectly on AMD hardware? The only game so far that has shown a huge AMD advantage was DOOM, and after some patches that lead isn't as big anymore. Pascal is more efficient than Polaris; it doesn't matter what you are running. On something like a MacBook Pro, where gaming isn't a huge concern, the more power-efficient GPU will be more desirable.

It's not about fault. It's about knowing that different engines and different games have preferences. Vulkan isn't the only API. Native DX12 titles show pretty big gains as well, and different engines like Unigine also show differences. Notice I said native, not bolted-on DX12, which so far seems to offer lower performance across the board when it's attempted.
 
It's not about fault. It's about knowing that different engines and different games have preferences. Vulkan isn't the only API. Native DX12 titles show pretty big gains as well, and different engines like Unigine also show differences. Notice I said native, not bolted-on DX12, which so far seems to offer lower performance across the board when it's attempted.

Again, how is this relevant to a MacBook Pro user? If you buy a MacBook for Windows you're doing it wrong, since Windows doesn't run anywhere near as well on it as OS X does. The discussion you're trying to have is all over the video card section already.
 
So you pick the low resolution no one uses but disregard 1920x1080, the most ubiquitous resolution of all time?? OK

Low resolution? 1440p and 4K are higher resolutions than 1080p. And try checking what resolutions Apple products ship with.

And I showed 1080p via TPU.
 
Low resolution? 1440p and 4K are higher resolutions than 1080p. And try checking what resolutions Apple products ship with.

And I showed 1080p via TPU.
Sorry, brain freeze. Still, the 1080p results show the opposite of the 1440p-and-higher results. Depends on the model; MacBook Airs are usually lower than the rest. I also don't usually use results from a page I have to translate. Why not use the results here?

That's 1080...from what game?
 
Again, how is this relevant to a MacBook Pro user? If you buy a MacBook for Windows you're doing it wrong, since Windows doesn't run anywhere near as well on it as OS X does. The discussion you're trying to have is all over the video card section already.
Since you are basing your findings on Windows games, that's all we have. If you have Mac benches, feel free to share them.
 
You are using outdated Doom benches to show something AMD should be far ahead in, due to the GCN shader extensions used.

Snabbtest: Doom med Vulkan - Test

Just look at the 1440p numbers or 4K. I did look at Vulkan and Doom lately, you didn't.

And I wonder if you are aware that a card like the RX480 uses as much power when gaming as a GTX1080!
You base this on one benchmark, whereas I base it on all the benchmarks. Remember this link? It's a big problem when nearly every website has inconsistent results in its benchmarks, with one website showing different winners for the same games than another.


And it leaves the question you avoided: where are all the Polaris laptops?
I avoided the question because it's a stupid question. I have no idea where the Polaris laptops are, nor do I know where the Pascal laptops are.

Quick Google, here. Hope you like Chinese.
[HP OMEN 15-ax101TX] HP Shadow Elf II Polaris 15.6-inch gaming laptop (i5-6300HQ, 8GB, 1TB, RX 460 4GB GDDR5 discrete graphics, FHD, Win10) [pricing and reviews] - JD.com

Whose fault is it that every game doesn't run perfectly on AMD hardware?
Gameworks' fault.
The only game so far that has shown a huge AMD advantage was DOOM, and after some patches that lead isn't as big anymore.
I'd like a benchmark done by a reputable website before making such a conclusion. Because, as you should be able to tell by now, different websites can reach different conclusions.
 
Well sorry, but I don't care about the big bad Gameworks because I actually play games I like instead of having to pick and choose because of my GPU manufacturer.

Your Doom benchmarks were all done before the fix for Nvidia cards, so not really relevant. Either way, it's literally one game that doesn't even have very demanding graphics. Until we see more actual DX12 and Vulkan games, I don't give a flying fuck. When I buy my GPUs, I buy them so they can perform well right now and also in the future. I don't just buy a card on the hope that some magical technology gets adopted by every game and it works in the future.
 
Well sorry, but I don't care about the big bad Gameworks because I actually play games I like instead of having to pick and choose because of my GPU manufacturer.
That's not the point. With the exception of Doom, whenever a GPU manufacturer gets involved in the production of a game, it suddenly favors that manufacturer's GPUs. For example, Ashes of the Singularity, Hitman, and Deus Ex all perform well on AMD hardware regardless of whether you use the DX11 or DX12 API. The same goes for Gameworks titles, but there are a lot more Gameworks titles than AMD titles.

It really shouldn't work like this, and games should try to do the best for all hardware, but that's the nature of the PC gaming industry today.
Your Doom benchmarks were all done before the fix for Nvidia cards, so not really relevant.
I'd really like to know which driver was the "fix" for Nvidia cards, cause a before and after would be interesting to see. Cause you know, different websites like to have different results.
Either way, it's literally one game that doesn't even have very demanding graphics. Until we see more actual DX12 and Vulkan games, I don't give a flying fuck. When I buy my GPU's, I buy them so they can perform well right now and also in the future. I don't just buy a card on the hope that some magical technology gets adopted by every game and it works in the future.
Newer APIs have been a thing for PC gaming since the Voodoo 1 graphics card was introduced. There's no magic technology. DX11 has been around for too long compared to previous iterations of DirectX, and we're not used to the idea of a new API. Game developers should be making the switch much sooner, but they're reluctant to do so. And graphics cards today last much longer than before, mostly because DX11 has been around since 2009. Before DX11, we were using DX9 since something like 2002-2004.

If you upgrade your GPU often, the here and now makes sense. If you plan to keep it for 5 years, then the future of Vulkan/DX12 makes sense. Also, in the case of Mac OS X, we should be focusing on Vulkan/DX12 since those APIs are more closely related to Metal. I don't know how the adoption of Metal is going on the Mac, but that's the direction Apple wants developers to go.
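For what it's worth, Metal hides the vendor from application code anyway. Here's a minimal macOS sketch (standard Metal framework calls, nothing vendor-specific assumed) that just lists whichever GPUs the system exposes, whether they're AMD, Nvidia, or Intel:

```swift
import Metal

// Minimal sketch: enumerate the Metal devices macOS exposes.
// Which vendor's silicon sits underneath is invisible at this level.
for device in MTLCopyAllDevices() {
    print("GPU: \(device.name), low power: \(device.isLowPower)")
}

// The default device is the one most apps get handed on a MacBook Pro.
if let defaultDevice = MTLCreateSystemDefaultDevice() {
    print("Default device: \(defaultDevice.name)")
}
```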
 