Total War: WARHAMMER Ported to Metal API for Mac Launch

cageymaru

Total War: WARHAMMER has been ported to the Metal API for an April 18th launch on macOS. This was done by the porting wizards at Feral Interactive, who have also brought the game to Linux and released a Vulkan port of Mad Max. The quality of this company's releases has been impeccable, and I can't wait for their next PC title. This is one of the few companies I pay attention to when it comes to porting games, as they have fully embraced low-level APIs.

Thumbs up!

Total War: WARHAMMER brings its deep strategy and high fantasy to the Mac on April 18.

Fight for control of the Old World in a spellbinding game of colossal battles, sophisticated tactics, and relentless conquest. Total War: WARHAMMER comes to Mac in fearsome form using Metal, Apple’s new graphics API.
 
Is this Total War game just like all the others, where the AI blatantly cheats until you decide to rage-quit?
 
Let's see: let's spend a boatload of money on a platform which has no good graphics card available, yet let's ignore the other Unix-like platform that is in need of software and has access to the same hardware and APIs as Windows....
 
That is not supported by Apple themselves as part of any of their offerings; either this will change, or Nvidia is just sticking the middle finger at them to make a point of being excluded from the Apple line :)
Cheers

Nvidia has stated in the past that they have not been able to release drivers without cooperation from Apple. The timing of the driver release was also suspiciously close to Apple announcing that they plan to release a new, fully expandable Mac Pro next year. This is all speculation of course, but it does help professionals who still use Macs, so it seems good all around.
 

Yep, as I said: either it will change, or Nvidia is sticking up the middle finger (they could do drivers if they wanted, even though there's no use for them for now). Neither is known for now, but Apple is still heavily committed to AMD, and there's not a peep on any rumour front about Apple opening the line to Nvidia.
Even the article made it clear when they said:
Nvidia has maintained macOS drivers for older Maxwell-based GPUs even though Apple never shipped them in any Macs (every new Mac starting from the 2013 Mac Pro has used Intel's and AMD's GPUs exclusively).

So, as the article said, they also did drivers for older Maxwell GPUs, and those GPUs were also excluded by Apple.
But maybe Apple is coming to their senses, as they must be hurting by excluding Nvidia from their professional clients.
Cheers
 
Awesome, let me buy a Titan XP/Vista or whatever the flavor of the month is and install it in my trash can Mac....

Seriously, read my original comment again.

I did read your comment, and while I find it amusing, it isn't technically correct. Older upgradeable Macs can still run macOS Sierra and take a Pascal card. You can also use these GPUs via an eGPU or a hackintosh if you want to get adventurous (and don't care about stability).

With that said, I think we all know that no sane person would go this route if they really cared about performance in games or as a workstation. As a professional, you would need proper CUDA support, which usually lags behind regular driver support. And as a gaming enthusiast, you significantly limit your options as to what can even run.
 

Apple does not sell Nvidia cards, nor does it include them in any of its systems.

No respectable developer will make anything for a Mac on the assumption that Nvidia will continue releasing drivers for "unauthorized" hardware.

And talking about that situation, it is clear that Nvidia pissed off Apple (like they did with Microsoft, Sony, and soon Nintendo), and the only reason Nvidia is releasing drivers is that some people prefer to keep upgrading their cheese graters instead of jumping to a PC and having a modern system.

Lastly, stop spreading FUD; hackintoshes can be very, very stable and reliable, in spite of Apple.
 

Nvidia did have job postings a while back indicating that they were hiring software engineers to work on the "next generation of Apple devices," which is obviously very ambiguous. They also have current job postings to further development efforts for Metal compute and OpenCL.

I will be curious to see if Apple will sell Nvidia cards as an option with the next Mac Pro. Of course, we don't truly have a way of knowing Apple's future intentions, but I think it is always in consumers' best interest to have multiple GPU options, regardless of platform. For that reason alone, I hope they do provide a little more variety going forward.

I wouldn't consider the stability comment to be spreading FUD. Hackintoshes can be stable, but the conventional wisdom is that you shouldn't take any significant upgrade without planning and preparation. I have been in that situation myself several times in the past. Anyone who has mucked around with sleepenabler, nvenabler, and other similar things has been there as well.

If you rely on income generated from work done on a Mac, I don't think a hackintosh is necessarily the wisest approach. With that said, I did the initial development of an iOS app on a hackintosh for a personal project. I eventually had problems because I couldn't update Xcode to publish it: the Xcode update required a new macOS update, which the community had not yet figured out how to install in a safe manner.

To circle back to the original article and your initial comment: I own Total War: WARHAMMER from Humble Monthly, and it already works on Linux. Mac is just the latest platform to get support.
 
Nvidia did have job postings a while back indicating that they were hiring software engineers to work on the "next generation of Apple devices," which is obviously very ambiguous. They also have current job postings to further development efforts for Metal compute and OpenCL.

Wonder if it might be related to the fact that they dumped PowerVR.

I will be curious to see if Apple will sell Nvidia cards as an option with the next Mac Pro. Of course, we don't truly have a way of knowing Apple's future intentions, but I think it is always in consumers' best interest to have multiple GPU options, regardless of platform. For that reason alone, I hope they do provide a little more variety going forward.

Options are always good, but Apple is one of those companies that can hold a grudge, and it does look like they have one with Nvidia.

I wouldn't consider the stability comment to be spreading FUD. Hackintoshes can be stable, but the conventional wisdom is that you shouldn't take any significant upgrade without planning and preparation. I have been in that situation myself several times in the past. Anyone who has mucked around with sleepenabler, nvenabler, and other similar things has been there as well.

On that aspect, yes, the update situation is a headache; I suffered through it. But the rest of the time, it was super stable.

If you rely on income generated from work done on a Mac, I don't think a hackintosh is necessarily the wisest approach. With that said, I did the initial development of an iOS app on a hackintosh for a personal project. I eventually had problems because I couldn't update Xcode to publish it: the Xcode update required a new macOS update, which the community had not yet figured out how to install in a safe manner.

That's one of my problems with the fanboyism: Apple has shown very little interest in the pro market for a while now. As a professional, if I need a better system, I would buy a better system, and if that means jumping to Windows or Linux, I would do it. Yet the hardcore fanbois would prefer to go back to pen and paper before doing that.

To circle back to the original article and your initial comment: I own Total War: WARHAMMER from Humble Monthly, and it already works on Linux. Mac is just the latest platform to get support.

Oops, looks like I grabbed the pitchfork a bit too quick!

But like they say, you need practice so you don't get dull... ;)
 
....
And talking about that situation, it is clear that Nvidia pissed off Apple (like they did with Microsoft, Sony, and soon Nintendo), and the only reason Nvidia is releasing drivers is that some people prefer to keep upgrading their cheese graters instead of jumping to a PC and having a modern system.

Lastly, stop spreading FUD; hackintoshes can be very, very stable and reliable, in spite of Apple.

Apple's reason was a bit more ruthless, and possibly an anti-competitive practice: Apple dropped Nvidia because Apple wanted to promote OpenCL and help it gain traction over CUDA.

Cheers
 
To be honest, I side with them there. OpenCL is an open standard; CUDA is just one more of Nvidia's dirty tactics to create a hardware monopoly via proprietary software.
 
An open standard, but they are blocking one competitor's hardware from their OS/PCs/laptops.
Not very open of them in that respect, because CUDA actually performs better in optimised applications, and blocking it is the only way to push OpenCL as far as they are concerned. Albeit it is fair to say OpenCL has many more applications supporting it than CUDA, which is why Nvidia has pretty strong support for OpenCL 1.2.
But people will just buy Windows/Linux machines with Nvidia and CUDA instead, along with the same applications they require, so Apple is mostly hurting themselves.


On a different topic, which I think started this:
If Nvidia were to be welcomed back by Apple, I would expect them to support OpenCL 2.0, but we have not seen anything on that front.
This is something I would have expected Apple to insist upon, and one reason I am a bit wary about whether Nvidia will finally be available on Apple again in the near future.
But we digress, and I accept many will have a different view on whether Nvidia or Apple is at fault with regards to access to that market.
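
As an aside, the version a driver actually exposes is easy to check: every OpenCL implementation reports it as a string via clGetDeviceInfo. Here is a minimal sketch using only the standard OpenCL host API (nothing vendor-specific assumed; error handling mostly omitted):

Code:
#include <CL/cl.h>  // standard OpenCL host API header
#include <cstdio>

int main() {
    // Enumerate OpenCL platforms (typically one per installed vendor driver).
    cl_platform_id platforms[8];
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(8, platforms, &numPlatforms);
    if (numPlatforms > 8) numPlatforms = 8;

    for (cl_uint p = 0; p < numPlatforms; ++p) {
        cl_device_id devices[8];
        cl_uint numDevices = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devices,
                           &numDevices) != CL_SUCCESS)
            continue;  // this platform has no GPU devices
        if (numDevices > 8) numDevices = 8;

        for (cl_uint d = 0; d < numDevices; ++d) {
            char name[256] = {0}, version[256] = {0};
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, nullptr);
            // CL_DEVICE_VERSION reads "OpenCL <major>.<minor> <vendor info>",
            // e.g. "OpenCL 1.2 CUDA" on Nvidia's drivers of this era.
            clGetDeviceInfo(devices[d], CL_DEVICE_VERSION, sizeof(version), version, nullptr);
            printf("%s : %s\n", name, version);
        }
    }
    return 0;
}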
Cheers
 
Sadly, out of two bad scenarios, Apple chose the lesser evil.

Nvidia will never license CUDA, or worse, give it away for free; that's how they roll.

I really don't see why you still mention them.

This is like back in the day when Rambus tried to become the de facto monopoly with their RAM; Intel cut them off as soon as they could, even though Rambus's RAM was better/faster.

Same reason for me to side with Apple, regardless of how good CUDA is.

The problem is, very few people look ahead; they only care about the here and now.
 

CUDA is integral to Nvidia; why should they share it?
There is nothing wrong with developing your own solutions. Otherwise, what about all the supercomputer or larger professional middleware out there that is proprietary?
Do you think Google/Microsoft/etc. share their algorithms and code for search engines and other professional suites in their entirety?
CUDA is "open" enough that third parties can hook into it very well and optimise far more than they can with OpenCL, in most cases.


Cheers
 

Then fine, OpenCL it is.

Why do you have such an obsession with CUDA and, by consequence, Nvidia?

Open and freely available technologies are always better for the consumer, but somehow you miss that.
 

Although, ironically, what currently gives AMD an edge over Nvidia with Vulkan is the use of bespoke shader extensions :)
Nvidia recently completed the first step of their own extensions for Vulkan.
Here is AMD's comment on this:
The GCN architecture contains a lot of functionality in the shader cores which is not currently exposed in current APIs like Vulkan™ or Direct3D® 12. One of the mandates of GPUOpen is to give developers better access to the hardware, and today we’re releasing extensions for these APIs to expose additional GCN features to developers.

Shader Extensions
With those shader extensions, we provide access to wavefront-wide functions, which is an important building block to exploit the SIMD execution model of GPUs. For instance, the use of mbcnt and ballot can replace atomics in various cases, drastically boosting performance. The wavefront-wide instructions also include swizzles, which allow individual lanes to exchange data without going through memory.

In Direct3D, the shader extensions are exposed through the AMD GPU Services (AGS) library. For more information on AGS, visit GPUOpen’s AGS page.

The extension allows you to query the presence of the various extension functions. The functions have to be enabled before you can load a shader that uses them – this guarantees that your shader is compatible with the underlying hardware.

In your code, all you need to do is include amd_ags.h in your c/cpp file, initialize AGS with agsInit, and then check for shader extension support as shown in the code below:
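
(AMD's code snippet didn't survive the quote, so below is a rough reconstruction of that initialize-and-check pattern. amd_ags.h and agsInit are named in AMD's text above; the extension-init entry point, its parameters, and the flag names are assumptions following AMD's AGS headers, and their exact spelling varies between AGS releases, so check the amd_ags.h that ships with your AGS version.)

Code:
// Hypothetical sketch of the AGS pattern AMD describes above (DX11 path).
#include "amd_ags.h"
#include <d3d11.h>

bool EnableWaveIntrinsics(ID3D11Device* device, AGSContext** outContext) {
    // 1. Initialize AGS (later AGS versions also accept an AGSConfiguration*
    //    and an AGSGPUInfo* here -- the three-argument form is assumed).
    if (agsInit(outContext, nullptr, nullptr) != AGS_SUCCESS)
        return false;

    // 2. Ask the driver which shader extensions this GPU/driver supports.
    //    uavSlot is the UAV slot AGS reserves for its intrinsic mechanism.
    unsigned int supported = 0;
    const unsigned int uavSlot = 7;  // illustrative slot choice
    if (agsDriverExtensionsDX11_Init(*outContext, device, uavSlot,
                                     &supported) != AGS_SUCCESS)
        return false;

    // 3. Only load shaders that use ballot/mbcnt if the driver reports them
    //    (flag names assumed from the AGS DX11 extension header).
    return (supported & AGS_DX11_EXTENSION_INTRINSIC_BALLOT) != 0 &&
           (supported & AGS_DX11_EXTENSION_INTRINSIC_MBCOUNT) != 0;
}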

So what both AMD and Nvidia are doing with Vulkan is moving away from "open," and in fact AMD is offering this in Direct3D as well.
So shall we boycott Vulkan, given that the primary performance gain comes from manufacturer-specific extensions?
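
For illustration, this is how an application detects those manufacturer-specific extensions at runtime. A minimal sketch using only core Vulkan calls; VK_AMD_shader_ballot is one of AMD's registered extension names:

Code:
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

// Returns true if the given physical device advertises the named extension.
bool HasDeviceExtension(VkPhysicalDevice gpu, const char* name) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> props(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, props.data());
    for (const VkExtensionProperties& p : props)
        if (std::strcmp(p.extensionName, name) == 0)
            return true;
    return false;
}

// Usage: gate the vendor-specific shader path on the extension's presence,
// and request it in ppEnabledExtensionNames at vkCreateDevice time.
//   if (HasDeviceExtension(gpu, "VK_AMD_shader_ballot")) { ... }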
Cheers
 