Apple announces macOS 15 Sequoia with window tiling, iPhone mirroring, and more

Gaming takes center stage on macOS, pretty cool :)

"For gamers, Apple has announced the second version of its Game Porting Toolkit, to make it easier to bring Windows games to macOS and macOS games to iOS and iPadOS.

Some of the changes also mirror those that Apple announced in the iOS and iPadOS portions of the presentation—including RCS support and expanded Tapback reactions in Messages, a redesigned Calculator app that mirrors the one introduced on the iPad, and the Math Notes feature for typed-out equations in the Notes app. All of Apple's platforms, plus Windows, are also getting a new Passwords app that should be able to replace many standalone password managers like 1Password and Bitwarden."



Source: https://arstechnica.com/gadgets/2024/06/apple-announces-macos-15-sequoia/
 
The changes made to the Game Porting Toolkit are interesting. The difficult part of porting DX12 to Metal or Vulkan is how differently shaders are treated between them.
The Metal Shading Language (MSL) is too simple and does a lot of hand-holding, which is great if you are working with canned engines, but DX12 game engines rarely use the defaults, and they get quite creative with HLSL (High-Level Shader Language).
The Vulkan camp has its translator for HLSL to SPIR-V (Standard Portable Intermediate Representation), but it's also quite janky and requires a lot of manual work with namespaces and syntax.

And let's not even go into GLSL...

Either way, Microsoft has a lot of awesome support and documentation on HLSL, so it should just be a matter of everybody getting the right translation libraries in place and ensuring that their language has an equivalent. Apple has been making changes to Metal for some time now, and with the updates that came along with the M3 there are a lot more 1:1 equivalents, which eases the porting process.
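For what it's worth, here's a rough, illustrative sketch of the gap (not from any real port): the same trivial vertex stage in MSL, with the HLSL spelling in the comments. MSL is basically C++14 with attributes like [[position]] and explicit [[buffer(n)]] bindings, where HLSL leans on semantics like SV_Position and register slots:

```cpp
// Metal Shading Language (a C++14 dialect) - minimal illustrative vertex shader
#include <metal_stdlib>
using namespace metal;

struct VSOut {
    float4 position [[position]]; // HLSL: float4 pos : SV_Position;
    float2 uv;                    // HLSL: float2 uv  : TEXCOORD0;
};

vertex VSOut vs_main(uint vid                   [[vertex_id]],  // HLSL: uint vid : SV_VertexID
                     const device float4 *verts [[buffer(0)]])  // HLSL: StructuredBuffer<float4> verts : register(t0);
{
    VSOut out;
    out.position = verts[vid];
    out.uv = verts[vid].xy * 0.5 + 0.5;
    return out;
}
```

The trivial stuff maps nearly 1:1; it's the creative HLSL (custom root signatures, wave intrinsics, bindless tricks) where MSL's hand-holding gets in the way. On the Vulkan side, the usual route these days is Microsoft's own DXC compiler, which can emit SPIR-V directly (something like `dxc -T vs_6_0 -E vs_main -spirv shader.hlsl -Fo shader.spv`), though the output often still needs the manual cleanup mentioned above.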
 
woohoo window tiling! i went to do that the other day, didn't work obviously, and the person next to me chuckled and went "you're a windows user eh!?"
we do have an apple section though, ya know....
 
Guess all the people who said Mac isn't for gaming are looking pretty stupid now. I think this push for gaming on Mac is probably due to their lagging sales. It's probably because most people stick with Windows.
 
Considering the HiDPI display, and GPU performance equivalent to a mid-tier PC GPU at best, combined with the fact that Apple refuses to support native Vulkan - Macs aren't well suited to gaming.
 
Guess all the people who said Mac isn't for gaming are looking pretty stupid now. I think this push for gaming on Mac is probably due to their lagging sales. It's probably because most people stick with Windows.
I'd say people stick with Windows mainly because it's more ubiquitous and available on cheaper systems. It's not hard to get a $400 laptop; it'll be low-quality, but you'll have it. And that matters in markets where even $100 more could be too much.

With that said, Apple is absolutely seeing gaming as a way to drive sales. The M-series may not have as much graphics power as higher-end dedicated GPUs, but it still delivers a surprising amount of performance for an SoC... and when you know even someone with a base MacBook Air or Mac mini can play a recent game, that offers a lot of potential.

Apple hasn't "solved" gaming, but it's doing much better than it was. If this takes off, it'll be a slow burn where more game devs sign on as they see existing titles doing well enough to recoup the investment. You'll know Apple has made it if you're playing the latest Call of Duty on a Mac.
 
Has anybody played BG3 on an Apple Silicon Mac? How does it do?
 
I'd say people stick with Windows mainly because it's more ubiquitous and available on cheaper systems. It's not hard to get a $400 laptop; it'll be low-quality, but you'll have it. And that matters in markets where even $100 more could be too much.
If we're talking about cheap laptops, then Chromebooks are for that market, but they're not capable. Gaming and legacy applications are what keep people using Windows. Does It ARM? Only about 50% of apps say yes to Apple Silicon. Four years later and it's still at 50%?
With that said, Apple is absolutely seeing gaming as a way to drive sales. The M-series may not have as much graphics power as higher-end dedicated GPUs, but it still delivers a surprising amount of performance for an SoC... and when you know even someone with a base MacBook Air or Mac mini can play a recent game, that offers a lot of potential.
The problem is that to play a recent game on a MacBook Air or Mac mini, you'll generally get worse performance and compatibility compared to a $400 Windows laptop. And if you spend what a MacBook Air or Mac mini costs on a Windows machine, then frame rates are in the hundreds.
Apple hasn't "solved" gaming, but it's doing much better than it was. If this takes off, it'll be a slow burn where more game devs sign on as they see existing titles doing well enough to recoup the investment. You'll know Apple has made it if you're playing the latest Call of Duty on a Mac.
Back when Apple was on PowerPC they had Halo and World of Warcraft, but that doesn't mean they had made it. As long as Apple closes macOS off from being installed on hardware that isn't made by Apple, Apple will always struggle to get games. Apple can make a decent SoC, but against AMD or Nvidia? Even Intel is now beyond what Apple's GPUs can do.
 
I wonder what the 40-core GPU in the M3 Max can do. Should be a lot faster than the 8- or 10-core GPU in the Air or Mini.

Come on, somebody here must have loaded BG3 on one?
 
I wonder what the 40-core GPU in the M3 Max can do. Should be a lot faster than the 8- or 10-core GPU in the Air or Mini.

Come on, somebody here must have loaded BG3 on one?
It depends on the workload. Apple has some specific optimizations in place; for some workloads it can go toe to toe with a 3080, and for others you could swear that your old TI-86 runs circles around it.

Apple GPUs right now require some specific use-case analysis, and the conventional "how does it compare to this" approach doesn't necessarily work, because you can get some huge swings on the bar graphs.

But if your workload falls into Apple's planned use cases, then you come away laughing.
 
I wonder what the 40-core GPU in the M3 Max can do. Should be a lot faster than the 8- or 10-core GPU in the Air or Mini.

Come on, somebody here must have loaded BG3 on one?
This guy with an M3 Max is running 3072x1728 with max settings but no upscaling. He's getting 40 to 60 fps. He may be frame-rate capped at 60 fps, but he's dipping down into the 40s and 50s most of the time.

View: https://youtu.be/ZyzNR0YX01A?si=MhXwPcJyOVJejGQ8

This guy with an RTX 4090 is running 3840x2160 with Ultra and also no upscaling. He's getting 150 to 180 fps.

View: https://youtu.be/qaNFIgXlJ0U?si=9uCh_ObYmI9MbP1U

So the question is, what is the equivalent to an M3 Max in BG3? From what I could find, it's an RTX 3070. This person did a test on an RTX 3070 at 4K Ultra with no upscaling and he was hovering in the 40s. Considering the person above wasn't even running at 4K, this is probably a closer match to the gaming capabilities of the M3 Max.

View: https://youtu.be/NQp9plxGHlQ?t=960
 
This guy with an M3 Max is running 3072x1728 with max settings but no upscaling. He's getting 40 to 60 fps. He may be frame-rate capped at 60 fps, but he's dipping down into the 40s and 50s most of the time.

View: https://youtu.be/ZyzNR0YX01A?si=MhXwPcJyOVJejGQ8

This guy with an RTX 4090 is running 3840x2160 with Ultra and also no upscaling. He's getting 150 to 180 fps.

View: https://youtu.be/qaNFIgXlJ0U?si=9uCh_ObYmI9MbP1U

So the question is, what is the equivalent to an M3 Max in BG3? From what I could find, it's an RTX 3070. This person did a test on an RTX 3070 at 4K Ultra with no upscaling and he was hovering in the 40s. Considering the person above wasn't even running at 4K, this is probably a closer match to the gaming capabilities of the M3 Max.

View: https://youtu.be/NQp9plxGHlQ?t=960


I'd say that is pretty good. It's usable. Could have been worse.

Of course a MacBook with M3 Max isn't very cheap...
 
The problem is that to play a recent game on a MacBook Air or Mac mini, you'll generally get worse performance and compatibility compared to a $400 Windows laptop. And if you spend what a MacBook Air or Mac mini costs on a Windows machine, then frame rates are in the hundreds.
You won’t get worse performance than those $400 machines. Compatibility is another story, of course, but Intel’s low-end CPUs and integrated graphics are still pretty mediocre.

And frame rates in the hundreds? Not really — you’re not going to get 100+ FPS from a $999 laptop in many cases. Now, I still wouldn’t tell someone to buy any Mac with gaming as a primary use case, but Apple also isn’t building them with games as a major focus. They’re fast, slim, quiet, easy to use computers that just happen to play some games decently enough.

Back when Apple was on PowerPC they had Halo and World of Warcraft, but that doesn't mean they had made it. As long as Apple closes macOS off from being installed on hardware that isn't made by Apple, Apple will always struggle to get games. Apple can make a decent SoC, but against AMD or Nvidia? Even Intel is now beyond what Apple's GPUs can do.
Halo didn’t actually make it to the Mac. Microsoft snapped it up to get a blockbuster launch title for the original Xbox.

Intel’s Battlemage integrated GPUs are nice, but remember that it’s just adding ray tracing and other features Apple had last year. I’d like to see how M4 Macs stack up.
 
Guess all the people who said Mac isn't for gaming are looking pretty stupid now. I think this push for gaming on Mac is probably due to their lagging sales. It's probably because most people stick with Windows.

Macs are still a very closed system, which is part of why macOS is so excellent on them. Almost like a console: design the OS for exactly what the hardware is. Windows is designed to run on a huge range of hardware from many different vendors, mix and match.

Mac still isn't for serious gaming. You can play games on it, you'll be able to play more games on it, but it'll never be a serious gaming machine like a PC with Windows. Even Linux struggles, but it has the advantage of running on any config, similar to Windows.

This will be a nice addition and Mac will be able to play a lot more games on it. But, it's not going to be a major player for serious gamers. In my opinion, anyway.
 
I'd say that is pretty good. It's usable. Could have been worse.

Of course a MacBook with M3 Max isn't very cheap...
A MacBook Pro with the M3 Max costs $3,199, and it comes with 36GB of RAM and a 1TB SSD. The people who bought laptops with RTX 4090s are suddenly financial geniuses.
You won’t get worse performance than those $400 machines. Compatibility is another story, of course, but Intel’s low-end CPUs and integrated graphics are still pretty mediocre.
The problem with gaming on Mac is that you'll still be dependent on emulation, since 99% of games ever released run on x86 Windows. Even still, let's compare a cheap $430 laptop with a Ryzen 5500U to an M2 MacBook. Why M2? Because I can't find any videos of someone playing Baldur's Gate 3 on a base M3.

Here is a person running BG3 on an M2 MacBook, and it's getting at best 30 fps with some tweaks. The base configuration is a slideshow, but with FSR turned on the frame rate is a more stable 30 fps. The M3 is likely better, but again, an M3 MacBook Air is like $1,100, and the lack of a fan means it won't sustain that frame rate. The MacBook Pro M3 is $1,600.

View: https://youtu.be/Pa_KLaVcXeo?si=WPvKTFabbc11iCyA

Here a (probably Russian) person is running BG3 on a 5500U, and like the M2 the frame rate is horrible, but when FSR is turned on the frame rate is a stable 30 fps. At some point the person switches to FSR performance mode and the frame rate jumps to 50 fps, but the image gets grainy.

View: https://youtu.be/eVR_cISMpiw?si=XcGpjWWDDOZoARmE
And frame rates in the hundreds? Not really — you’re not going to get 100+ FPS from a $999 laptop in many cases. Now, I still wouldn’t tell someone to buy any Mac with gaming as a primary use case, but Apple also isn’t building them with games as a major focus. They’re fast, slim, quiet, easy to use computers that just happen to play some games decently enough.
Depends on the game, of course. BG3 is going to make $999 laptops grunt hard, but you can also find laptops in that price range with an RTX 4070, which makes them superior to an M3 Max at playing games. Also, even with the best processing power for playing games, the main problem would be the screen. Apple screens are very color accurate, but have the worst latency ever. Cheap Chinese monitors have far better latency than Apple's. Not by a little, either; we're talking something like 30+ ms of latency.
Halo didn’t actually make it to the Mac. Microsoft snapped it up to get a blockbuster launch title for the original Xbox.
Halo was released for the Mac as well as Windows and Xbox. I have played Halo on a Mac.
Intel’s Battlemage integrated GPUs are nice, but remember that it’s just adding ray tracing and other features Apple had last year. I’d like to see how M4 Macs stack up.
Intel Arc-based GPUs have always had ray tracing, and this includes chips like the Core Ultra 7 155H. They beat Apple to ray tracing.
Macs are still a very closed system, which is part of why macOS is so excellent on them. Almost like a console.
You mean like Xbox? That console is failing hard. Being console-like in 2024 is not a boon.
Even Linux struggles, but it has the advantage of running on any config, similar to Windows.
As a Linux gamer I can tell you that is not the case. Online gaming is a challenge for sure, but I don't play many online games besides World of Warcraft.
This will be a nice addition and Mac will be able to play a lot more games on it. But, it's not going to be a major player for serious gamers. In my opinion, anyway.
This is Apple's attempt to stop the bleeding of their user base. Not having x86 hardware is starting to become a hindrance to Mac sales. Why do you think they have M2 iPads in 2024, or why did the $3,500 Vision Pro come with an M2 and not an M3? Better yet, why is the M4 out on iPads but not MacBooks? With Qualcomm about to release their Snapdragon whatever, and AMD's new Ryzen AI chips out next month, you'd think Apple would have the M4 in MacBooks by now. Gaming plays a big part in why people are leaving Mac. Apple thinks that bringing a few games over to macOS is going to solve this problem, but x86 Windows has all the games. Valve fixed this problem by creating Proton, because again, you're not going to convince developers to port games to Linux, which still has a higher market share than macOS according to Steam. Apple is better off taking Proton and adapting it to Mac. Apple is also better off taking Vulkan and implementing it on Mac.
 
A MacBook Pro with the M3 Max costs $3,199, and it comes with 36GB of RAM and a 1TB SSD. The people who bought laptops with RTX 4090s are suddenly financial geniuses.

The problem with gaming on Mac is that you'll still be dependent on emulation, since 99% of games ever released run on x86 Windows. Even still, let's compare a cheap $430 laptop with a Ryzen 5500U to an M2 MacBook. Why M2? Because I can't find any videos of someone playing Baldur's Gate 3 on a base M3.

Here is a person running BG3 on an M2 MacBook, and it's getting at best 30 fps with some tweaks. The base configuration is a slideshow, but with FSR turned on the frame rate is a more stable 30 fps. The M3 is likely better, but again, an M3 MacBook Air is like $1,100, and the lack of a fan means it won't sustain that frame rate. The MacBook Pro M3 is $1,600.

View: https://youtu.be/Pa_KLaVcXeo?si=WPvKTFabbc11iCyA

Here a (probably Russian) person is running BG3 on a 5500U, and like the M2 the frame rate is horrible, but when FSR is turned on the frame rate is a stable 30 fps. At some point the person switches to FSR performance mode and the frame rate jumps to 50 fps, but the image gets grainy.

View: https://youtu.be/eVR_cISMpiw?si=XcGpjWWDDOZoARmE

Depends on the game, of course. BG3 is going to make $999 laptops grunt hard, but you can also find laptops in that price range with an RTX 4070, which makes them superior to an M3 Max at playing games. Also, even with the best processing power for playing games, the main problem would be the screen. Apple screens are very color accurate, but have the worst latency ever. Cheap Chinese monitors have far better latency than Apple's. Not by a little, either; we're talking something like 30+ ms of latency.

Halo was released for the Mac as well as Windows and Xbox. I have played Halo on a Mac.

Intel Arc-based GPUs have always had ray tracing, and this includes chips like the Core Ultra 7 155H. They beat Apple to ray tracing.

You mean like Xbox? That console is failing hard. Being console-like in 2024 is not a boon.

As a Linux gamer I can tell you that is not the case. Online gaming is a challenge for sure, but I don't play many online games besides World of Warcraft.

This is Apple's attempt to stop the bleeding of their user base. Not having x86 hardware is starting to become a hindrance to Mac sales. Why do you think they have M2 iPads in 2024, or why did the $3,500 Vision Pro come with an M2 and not an M3? Better yet, why is the M4 out on iPads but not MacBooks? With Qualcomm about to release their Snapdragon whatever, and AMD's new Ryzen AI chips out next month, you'd think Apple would have the M4 in MacBooks by now. Gaming plays a big part in why people are leaving Mac. Apple thinks that bringing a few games over to macOS is going to solve this problem, but x86 Windows has all the games. Valve fixed this problem by creating Proton, because again, you're not going to convince developers to port games to Linux, which still has a higher market share than macOS according to Steam. Apple is better off taking Proton and adapting it to Mac. Apple is also better off taking Vulkan and implementing it on Mac.


You have it backwards. Having x86 hardware is starting to become a hindrance to everyone else's sales. That's why Qualcomm is about to release their Snapdragon "whatever". It's why Microsoft moved away from x86 for the new Surface Copilot+ machines.

Graphics-intensive gaming is a single, minority use case for computers. It doesn't matter as much as you think it does. Nvidia only ships between 5-10 million consumer GPUs a year - many of which are not used for gaming at all (video editing, AI, etc.). In 2023 Apple shipped more than 20 million Macs and more than 14.8 million iPads.

Of Nvidia's revenue in 2023 Q4 - the strongest quarter for consumer GPUs - only $2.9B came from gaming/consumer GPUs. During that same quarter, Apple posted $89B in revenue. Apple could literally take 100% of Nvidia's gaming GPU revenue and it wouldn't move the needle for them.

In FY 2023 Apple posted $383 billion in revenue (without good gaming support) compared to Nvidia's total revenue of $27 billion, the VAST majority of which was datacenter-driven. Even in the most recent quarters, where Nvidia has shown blockbuster growth (far surpassing last year's revenues and profit), that growth has all come in datacenter/AI.

Gaming doesn't matter.
 
You have it backwards. Having x86 hardware is starting to become a hindrance to everyone else's sales. That's why Qualcomm is about to release their Snapdragon "whatever". It's why Microsoft moved away from x86 for the new Surface Copilot+ machines.
Apple hasn't seen any sales growth going ARM; again, a 34% decline year over year. If we go by web traffic, we can see that macOS has its lowest representation since the introduction of the M1. Speaking of Baldur's Gate 3, the Snapdragon X Elite was said to get 30 fps with Super Resolution according to Microsoft. The problem is that I've proven the game runs on a potato at 30 fps with upscaling. The three-generations-old 5500U can achieve that.

As for why Microsoft is pushing ARM aggressively, there are many reasons. One: when Apple released the M1, it made Microsoft shit bricks. Here we are with AMD and Intel looking like filament light bulbs while Apple looks like they're on LEDs. Microsoft's entire empire depends on x86, which is probably why Microsoft wants to expand away from x86. The problem is that we forget Microsoft beat Apple to ARM with the Surface Pro X, and it was traaash. In the four years since Apple introduced the M1, both AMD and Intel have caught up in power efficiency, which is something Microsoft didn't anticipate. The Snapdragon X Elite is really just too late a response to Apple's M1. Which is really strange because, again, nobody is buying Apple MacBooks. If Intel is even half right about the capabilities of Lunar Lake, I think it's game over for ARM. Technically it's already game over for ARM because of AMD, but like in the PowerPC days, nobody likes to compare against AMD when they're the performance leader. It's always advantageous to compare to Intel when they're the ones with the worse product at the moment. PowerPC beat Intel's P4, but nobody wanted to talk about AMD's Athlon 64. History tends to repeat.
Graphics-intensive gaming is a single, minority use case for computers. It doesn't matter as much as you think it does.
Keep telling yourself that. That's why Apple is again updating their Game Porting Toolkit, which as a reminder is just Apple appropriating CrossOver. Ask yourself why Apple put mesh shader support and ray tracing into their hardware.
Nvidia only ships between 5-10 million consumer GPUs a year - many of which are not used for gaming at all (video editing, AI, etc.). In 2023 Apple shipped more than 20 million Macs and more than 14.8 million iPads.

Of Nvidia's revenue in 2023 Q4 - the strongest quarter for consumer GPUs - only $2.9B came from gaming/consumer GPUs. During that same quarter, Apple posted $89B in revenue. Apple could literally take 100% of Nvidia's gaming GPU revenue and it wouldn't move the needle for them.
Nvidia is an AI company now, at least until the AI market crashes. Before that, Nvidia was a GPU compute company for crypto, until it crashed. For the past several years Nvidia has sold more GPUs for... "productivity" than for gaming. That doesn't change the fact that at their core they make GPUs for gaming. Most of Nvidia's existence was for gaming. The GPU in and of itself was created primarily for gaming, until people found out that GPUs are really good at other things.
Gaming doesn't matter.
And yet here we are with a thread about Apple caring about gaming.
 
You have it backwards. Having x86 hardware is starting to become a hindrance to everyone else's sales.

And yet, considering HiDPI Apple displays, and the fact that Apple Silicon contains a GPU that at best matches a mid-range Nvidia GPU - x64 with a dedicated GPU still reigns supreme when it comes to gaming. I see no evidence that x86 is a hindrance to everyone else's sales in any way whatsoever.

If anything, the 0.85% lead Linux currently has over macOS on Steam is evidence that ARM is not some wonder solution over x64 with a decent dGPU when it comes to gaming.
 
And yet, considering HiDPI Apple displays, and the fact that Apple Silicon contains a GPU that at best matches a mid-range Nvidia GPU - x64 with a dedicated GPU still reigns supreme when it comes to gaming. I see no evidence that x86 is a hindrance to everyone else's sales in any way whatsoever.

If anything, the 0.85% lead Linux currently has over macOS on Steam is evidence that ARM is not some wonder solution over x64 with a decent dGPU when it comes to gaming.
It matches pretty well against the best mobile solutions Nvidia has to offer.
When you have a title supported by both, the M3 lands somewhere between the 4070 and 4080 mobile solutions while drawing a hell of a lot less juice.

That said, a PC with Windows or Linux has a lot more supported games than Apple does. Apple has a crapload of casual games and mobile games, but AAA titles are few and far between; that may be changing, but it will take time. The M1 was the first mildly serious GPU in a MacBook offering in decades; until then your option was an Intel Iris, which wasn't even good when new, let alone a few generations out, and Apple was never on the latest Intel generation.

The M3 GPUs are right on par with the best that anybody is offering in mobile GPUs or APUs, which is good; it's at least console level, which makes it something a publisher could reasonably support now that Apple has added the needed updates to Metal for upscaling, shader support, and ray tracing.

Looking into it, upscaling support is mandatory if you want to run anything UE5; developers aren't building anything right now intended for native resolution. The available hardware doesn't support it; the 4090 barely manages it.
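For the curious, this is roughly what the shader side of Metal ray tracing looks like now; a hedged sketch from memory of the metal_raytracing API (treat the exact names as approximate), but conceptually close enough to DXR's inline ray queries that ports are at least plausible:

```cpp
// Metal Shading Language - minimal ray-cast kernel (sketch; verify the exact
// names against Apple's metal_raytracing header before relying on this)
#include <metal_stdlib>
#include <metal_raytracing>
using namespace metal;
using namespace metal::raytracing;

kernel void cast_rays(primitive_acceleration_structure accel [[buffer(0)]],
                      device float *out_distance             [[buffer(1)]],
                      uint tid                               [[thread_position_in_grid]])
{
    // One primary ray per thread; a real shader would derive this from tid.
    ray r;
    r.origin       = float3(0.0f);
    r.direction    = normalize(float3(0.0f, 0.0f, -1.0f));
    r.min_distance = 0.001f;
    r.max_distance = INFINITY;

    // Inline intersection query, analogous to DXR's RayQuery.
    intersector<triangle_data> isect;
    intersection_result<triangle_data> hit = isect.intersect(r, accel);

    // Record the hit distance, or a sentinel on a miss.
    out_distance[tid] = (hit.type == intersection_type::triangle) ? hit.distance : -1.0f;
}
```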
 
It matches pretty well against the best mobile solutions Nvidia has to offer.
When you have a title supported by both, the M3 lands somewhere between the 4070 and 4080 mobile solutions while drawing a hell of a lot less juice.

The problem is that mobile dGPU solutions are fairly gutted, probably struggling to achieve desktop 4060 8GB levels of performance.

that may be changing, but it will take time.

Not if Apple persists in sticking with Metal as opposed to adopting native Vulkan support. Developers are going to be hesitant to develop for a platform that makes up 1.47% of Steam sales while using an API that's not cross-platform compatible.
 
And yet, considering HiDPI Apple displays, and the fact that Apple Silicon contains a GPU that at best matches a mid-range Nvidia GPU - x64 with a dedicated GPU still reigns supreme when it comes to gaming. I see no evidence that x86 is a hindrance to everyone else's sales in any way whatsoever.

If anything, the 0.85% lead Linux currently has over macOS on Steam is evidence that ARM is not some wonder solution over x64 with a decent dGPU when it comes to gaming.

Because, like Dukenukem, you can't understand that the vast majority of computer users don't play AAA games.

Apple's ARM is a wonder solution over x86 when it comes to pretty much everything else. No other laptop in the world can touch their combination of performance, battery life, lack of heat/power consumption, etc. It gets especially hilarious when you start comparing battery life/heat/etc. against the PC laptops you're talking about that do have powerful GPUs comparable to Apple's.

Companies exist to make money; they aren't a charity. Companies wouldn't be starting to copy Apple and move to ARM if they didn't see a business case for it.
 
It matches pretty well against the best mobile solutions Nvidia has to offer.
When you have a title supported by both, the M3 lands somewhere between the 4070 and 4080 mobile solutions while drawing a hell of a lot less juice.
I've already pointed out that it matches an RTX 3070 when it comes to BG3. The only product that even comes close is the M3 Max, which again costs over $3K. An RTX 4070 laptop is like $1,300.
That said, a PC with Windows or Linux has a lot more supported games than Apple does.
The reason it does is that the primary hardware used on Linux is x86, and Linux also supports Vulkan.
Apple has a crapload of casual games and mobile games, but AAA titles are few and far between; that may be changing, but it will take time.
Until Apple takes gaming seriously, this won't change. Like I said, they need Vulkan, for starters.
The M1 was the first mildly serious GPU in a MacBook offering in decades; until then your option was an Intel Iris, which wasn't even good when new, let alone a few generations out, and Apple was never on the latest Intel generation.
What about the MacBooks with AMD discrete GPUs?
Looking into it, upscaling support is mandatory if you want to run anything UE5; developers aren't building anything right now intended for native resolution. The available hardware doesn't support it; the 4090 barely manages it.
You think RTX 4090s need upscaling for UE5? You have an example of this claim?
It's worse than that; there are different M3 chip levels all labeled "Max". The one in the YouTube video is the 16-core CPU with 40 GPU cores, which starts at $3,349, although that includes 48 GB RAM and a 1 TB disk.
Oh I know, and the price goes to $4K if you switch to the 16" MacBook Pro M3 Max with the 16-core CPU and 40-core GPU. I thought going ARM was supposed to make these products cheaper?
Because, like Dukenukem, you can't understand that the vast majority of computer users don't play AAA games.
According to this (the first thing I found when Googling it), "Globally, there are around 1.86 billion PC gamers". Gamers in general number over 3 billion. So the question is, how many people use Macs worldwide? The answer: "There were over 100 million Mac users worldwide by the beginning of 2023." So to recap, more than a third of the human population plays games, but PC gamers are less than a quarter of it. Assuming these statistics are correct, because they're the first thing I Googled, Mac users in general are a tiny fraction of the number of people playing video games. According to Microsoft there are 1 billion people who use Windows worldwide. The number of Windows users that play video games must be staggering.
Apple's ARM is a wonder solution over x86 when it comes to pretty much everything else. No other laptop in the world can touch their combination of performance, battery life, lack of heat/power consumption, etc. It gets especially hilarious when you start comparing battery life/heat/etc. against the PC laptops you're talking about that do have powerful GPUs comparable to Apple's.
Except for the Asus Zenbook 14 when compared to the M3 MacBook Air.

View: https://youtu.be/oIVSJX8dy2I?si=Zuz-fdBL8_HR7iJo
Companies exist to make money; they aren't a charity. Companies wouldn't be starting to copy Apple and move to ARM if they didn't see a business case for it.
Apple isn't exactly doing well sales-wise, and Qualcomm, who can only make ARM-based SoCs, is trying again to see if they can put a dent in the x86 Windows market with Microsoft's blessing. Qualcomm right now is pumping a lot of money into spreading propaganda about how bad Intel, and sometimes AMD, is compared to their Snapdragon X Elite. Linus Tech Tips released a video that essentially just repeats Qualcomm's marketing benchmarks while comparing against older Intel laptops that aren't even Meteor Lake, by just holding one in their hand. If anyone wonders whether Linus Tech Tips skews their opinion based on sponsorship, the answer is yes.

View: https://youtu.be/C0bEew9dqNs?si=T7BkdOPc5UJQRQar

Again, for comparison, AMD's 7840 series (current for one more month) with the 780M GPU is so much faster at BG3, while the Snapdragon X Elite is reported to do 30 fps.

View: https://youtu.be/rPS5jrvyj30?si=NrngEVs7KtDhXhzX
 
Considering the person above wasn't even running at 4K,
and a 12100F could be slowing things down a bit here.
You have it backwards. Having x86 hardware is starting to become a hindrance to everyone else's sales. That's why Qualcomm is about to release their Snapdragon "whatever". It's why Microsoft moved away from x86 for the new Surface Copilot+ machines.
That's a bit of an artifact of timing; x86 Copilot+ machines are coming, and it would not be a big surprise if Microsoft ends up having some.
In the four years since Apple introduced the M1, both AMD and Intel have caught up in power efficiency, which is something Microsoft didn't anticipate.
Yet to see a source for that? (The video you showed seems to clearly indicate the exact opposite of that statement.)

Which is really strange because, again, nobody is buying Apple MacBooks.
If we looked at what percentage of laptops sold over $1,000 are MacBooks, I doubt it would feel like that's the case.

780M GPU is so much faster at BG3, while the Snapdragon X Elite is reported to do 30 fps.
Lots of under-15-fps moments going on in that video...

I thought going ARM was supposed to make these products cheaper?
Cheaper to make; you still sell them for the maximum you can get based on supply and demand after that.

RTX 4090s need upscaling for UE5? You have an example of this claim?
Always depends on the performance you want (if one does not mind 30-35 fps), but obviously devs do not make games with a 4090 in mind:
[attached charts: minimum fps at 3840x2160]
 
You think RTX 4090s need upscaling for UE5? You have an example of this claim?
Yes, straight from the UE5 guidelines regarding the engine when enabling all the Nanite and Lumen features.
WCCF tech talks about it in some detail here.
https://wccftech.com/unreal-engine-5-cant-run-native-4k-engine-design/

And it is essentially by design: the engine and how it structures things are built under the assumption that upscaling is being used, and if you aren't using it then you will very quickly encounter all sorts of issues.
The existing cards have neither the power nor the VRAM to accomplish the task.

Neither AMD nor Nvidia will give us, the consumer market, anything powerful enough to do the task any time soon. Supposedly the 5090 could be up to the task but will likely be unable to deliver due to the memory layout; at this stage, we all know we aren't getting more memory.
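For anyone who hasn't poked at it, upscaling in UE5 is driven by console variables, so a project opts in with a couple of lines of config. A minimal, hypothetical DefaultEngine.ini sketch (cvar names from the stock UE5 renderer) looks something like this:

```ini
; DefaultEngine.ini (sketch) - render internally at ~67% of output resolution
; and let Temporal Super Resolution reconstruct the final image
[SystemSettings]
r.AntiAliasingMethod=4   ; 4 = Temporal Super Resolution (TSR) in UE5
r.ScreenPercentage=67    ; internal render resolution as a percent of output
```

At 4K output that means the engine is really shading around 2560x1440, which is the assumption those budgets are built on.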
 
Please do not put all your eggs in one basket and use Apple's password manager... (and you should not even be saving passwords and info in browsers either; it's not secure)
 
Because, like Dukenukem, you can't understand that the vast majority of computer users don't play AAA games.

Well, I don't need to, as almost every title I've seen running on Apple Silicon shows anemic performance at best in direct comparison to a desktop PC with x64 and a capable dGPU - AAA or not.

You don't buy $3,000 Apple hardware to game. Period.
 
Yet to see a source for that? (The video you showed seems to clearly indicate the exact opposite of that statement.)
Still not wrong, but do keep in mind that AMD's 8840HS, as well as the 7840HS, is going to be outdated next month by AMD's AI 300 series, which is still not on TSMC's 3nm like Apple. We're not going to be talking about Dragon Range chips for much longer. The rest of this year is going to be interesting for laptop chips.
If we looked at what percentage of laptops sold over $1,000 are MacBooks, I doubt it would feel like that's the case.
I don't understand? Apple has been selling less and less, and less than any other manufacturer.
Lots of under-15-fps moments going on in that video...
It starts to dip two-thirds into the video, but that's because he's lowering the wattage at that point. He drops it to 30 W and eventually 15 W, which is why the frame rate keeps dropping. The whole point of the video was to test which driver works better at which power setting on the ROG Ally. In my opinion the AMD driver is better. If you really want to see the fps drop, then play the game until you reach... Baldur's Gate, the game's namesake city, which is really late in the game. That's when I had to switch to Vulkan and turn on FSR. I had to update my FSR to 2.2 because FSR 1.0 sucked.

Also, another thing to note is that the game was running at 1080p on the ROG Ally, whereas on the M2 MacBook it was set to 1680x945. It gives you an idea of how ideal Apple's hardware is at playing games.
Always depends on the performance you want (if one does not mind 30-35 fps), but obviously devs do not make games with a 4090 in mind:
[attached charts: minimum fps at 3840x2160]
If you want 60 fps then you need upscaling, but otherwise the RTX 4090 runs games without upscaling faster than most of the hardware I described here playing Baldur's Gate 3. It's all about what your standards are, I guess.
Funny he is comparing the Zenbook against the MacBook Air.
I don't get this statement. What should I be comparing to the MacBook Air? AMD's Bulldozer-based chips?
You don't buy $3,000 Apple hardware to game. Period.
Apple would sure like you to.
 
The demographic of people gaming on MacBooks matches the demographic of people gaming on a Threadripper.

You can do it in a pinch, but it’s not something you buy for the express purpose of doing.
 
Still not wrong,
Maybe, but source? The video you gave completely contradicts the claim.

I don't understand? Apple has been selling less and less, and less than any other manufacturer.
In units. In dollars, Apple's market share is much bigger. It's bigger in the US than worldwide, and in the non-cheap devices, especially laptops, in rich markets, Apple could be quite big.

What should I be comparing to the MacBook Air?
That was all discussed; when talking about power efficiency, the MacBook Pro should be used, not the Air (to have a similar battery).
 
In units. In dollars, Apple's market share is much bigger. It's bigger in the US than worldwide, and in the non-cheap devices, especially laptops, in rich markets, Apple could be quite big.
This is hard to prove unless you have a source.
That was all discussed; when talking about power efficiency, the MacBook Pro should be used, not the Air (to have a similar battery).
It's not like I have a choice. You think there are a lot of benchmarks comparing Apple to AMD, let alone Intel? 99% of reviewers run Geekbench and Cinebench and call it a day. Very rarely do they run other benchmarks, and almost nobody runs battery tests. If they do run a battery test, then it's with Cinebench. I found this video where he compares several MacBooks and two Intel Core Ultra 7 155H laptops with 4 hours of Netflix playback. He does show that with Cinebench R24 the 13-inch MacBook ran longer than the 14-inch MacBook Pro. The AMD 8840HS was at 85% after a 30-minute run and had a higher score than the MacBook Pro 14 with the 11-core M3 Pro, which had 81% battery left. What's also interesting is that the one extra core dropped the 12-core variant of the MacBook Pro down to 77%. As for the 4-hour Netflix test, the AMD 8840HS laptop did get down to 74% while the MacBook Pro with the 11-core M3 Pro was at 83%. The MacBook Pro with the 11-core M3 Pro has a 72.4Wh battery while the Asus Zenbook 14 with the 8840HS has a 75Wh battery. In terms of efficiency that's pretty close; it favors Apple when it comes to video playback but not with raw compute. As a reminder, the MacBook Pro 14 M3 Pro is $2K while the 8840HS-powered Zenbook is $800.

As you can see, I had to jump between two videos to get all the needed info, because nobody includes all the benchmark results in just one. Including everything in one review is more of a Gamers Nexus and Hardware Unboxed thing. This won't last long, as next month AMD's new chips will be out.
 
That looks that way because the presentation is a bit backwards and unclear (when talking about efficiency).

The AMD machine lost 26% of a 75 Wh battery and the M3 lost 17% of a 72.4 Wh battery; that's 19.5 Wh vs 12.3 Wh, so the AMD system drew more than 50% more energy during that playback. It could be a lot of things outside the SoC vs. an APU like that, the GPU part obviously, or the OS, but that's a massive gap in one of the easiest things to compete on: video decoding...
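If you want to sanity-check that math, it's just percent-of-battery times battery size; a throwaway snippet:

```cpp
// Back-of-the-envelope energy use for the 4-hour Netflix playback test above
#include <cstdio>

int main() {
    const double amd_wh   = 0.26 * 75.0;  // Zenbook: lost 26% of a 75 Wh battery
    const double apple_wh = 0.17 * 72.4;  // MacBook Pro: lost 17% of a 72.4 Wh battery
    std::printf("AMD: %.1f Wh, Apple: %.1f Wh, AMD used %.0f%% more energy\n",
                amd_wh, apple_wh, (amd_wh / apple_wh - 1.0) * 100.0);
    // Prints: AMD: 19.5 Wh, Apple: 12.3 Wh, AMD used 58% more energy
    return 0;
}
```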

Cinebench would be a test of the physical battery: yes, if you push the laptop to 100% with an all-cores test, battery life will just be battery watt-hours divided by the system's max watts, regardless of efficiency. How much battery is left after an actual render could be interesting; battery left after 30 minutes of a CPU pinned at 100% (without looking at the work done) is not... I mean, what would we be looking at here? What if during those 30 minutes one laptop rendered the Cinebench scene 40 times and the other 50? That seems like trying to look at efficiency (work done per electricity spent) without controlling for the work done.


As you can see, I had to jump between two videos

Well, no one was very clear. Despite a smaller battery, the same percentage left after the same task... that tells you quite well which is more efficient (if we believe it was actually the same task; not sure we can take for granted that resolution, screen size, nits, sound level, and HDR on or off were all made reasonably equal... but maybe it is a serious channel).
 
You mean like Xbox? That console is failing hard. Being console-like in 2024 is not a boon.
No. Like, it's a system where you know exactly what you're developing for. You have a strict hardware set that you're targeting. It's not like a PC that could have an NVIDIA, AMD, Intel, Matrox, whatever video card, an Intel/Realtek/whatever NIC, sound card, etc...

Xbox, PS5, SNES, whatever. It's the same system. PCs can have so many different configs. That's where macOS can shine. It has a specific set of hardware to target. It's not going to shit the bed on a weird video card with funky drivers...
 
The demographic of people gaming on MacBooks matches the demographic of people gaming on a Threadripper.

You can do it in a pinch, but it’s not something you buy for the express purpose of doing.
Pretty much. Yes, Macs CAN run games, but they are not powerful enough to do it very well and, of course, there are no upgrades. You gets what you gets. For the non-gamer laptop market it competes pretty well, but it doesn't compete so well against gamer laptops, and of course on the desktop it is just abysmal from any kind of price/performance perspective. Their hardware offerings just aren't really that appealing to gamers. You CAN game on them, but if gaming is your big use case you probably WOULDN'T.

You can also see that in game availability: there's a reason every single Mac gaming comparison in the last year has been BG3. It is practically the only major new game that has been ported and runs at all decently. There just aren't a lot of bigger games that get ported to the Mac, probably both because the hardware isn't that fast and because the market isn't that big.

Xbox, PS5, SNES, whatever. It's the same system. PCs can have so many different configs. That's where macOS can shine. It has a specific set of hardware to target. It's not going to shit the bed on a weird video card with funky drivers...
Not really, no. In addition to just having far more variance than a console, there's the fact that Apple will change how they do things, with no warning, whenever they want. So you don't get to develop something, target it, and say "This will work on all the Macs out there, no problem." You can make something, test it against all their hardware and software variants, have it working great, release it, and then Apple can suddenly decide to change how shit is done and you have to update.

We saw that happen at work with the M1 Macs. As soon as they came out some faculty bought them, and then immediately started whining about shit that wouldn't work. While most things worked fine under Rosetta, not everything did.

Consoles are different because they are static targets for a generation, which lasts a long time these days. They are a single piece of hardware and software, not a number of related ones; you target them and they remain static. Even when they get a mid-cycle refresh or a new generation, it is often done with strict compatibility built in, where the hardware can be set to a mode that limits itself to function as the previous hardware did. So even if you did something wacky like code to precise CPU timings, it'll still run just as it should.
 
That looks that way because the presentation is a bit backwards and unclear (when talking about efficiency).

The AMD machine lost 26% of a 75 Wh battery and the M3 lost 17% of a 72.4 Wh battery; that's 19.5 Wh vs 12.3 Wh, so the AMD system drew more than 50% more energy during that playback. It could be a lot of things outside the SoC vs. an APU like that, the GPU part obviously, or the OS, but that's a massive gap in one of the easiest things to compete on: video decoding...
Video decoding is probably not using the CPU but whatever media decoding hardware AMD uses. AMD's video decoding is not known to be very good.
Cinebench would be a test of the physical battery: yes, if you push the laptop to 100% with an all-cores test, battery life will just be battery watt-hours divided by the system's max watts, regardless of efficiency. How much battery is left after an actual render could be interesting; battery left after 30 minutes of a CPU pinned at 100% (without looking at the work done) is not... I mean, what would we be looking at here? What if during those 30 minutes one laptop rendered the Cinebench scene 40 times and the other 50? That seems like trying to look at efficiency (work done per electricity spent) without controlling for the work done.
What's important is that when real work is being done, the difference between the AMD x86 and Apple ARM chips is not very big. If you're doing productivity workloads, then you probably wouldn't see a difference in terms of power consumption.
Well, no one was very clear. Despite a smaller battery, the same percentage left after the same task... that tells you quite well which is more efficient (if we believe it was actually the same task; not sure we can take for granted that resolution, screen size, nits, sound level, and HDR on or off were all made reasonably equal... but maybe it is a serious channel).
Which is why I wish Gamers Nexus and Hardware Unboxed did laptop reviews. Most of these laptop reviewers are terrible. It would be nice to know which part of the laptop is consuming power. How's the battery life running DaVinci Resolve or playing a video game? Outside of desktop PCs, the tech reviewers are really bad.
No. Like, it's a system where you know exactly what you're developing for. You have a strict hardware set that you're targeting. It's not like a PC that could have an NVIDIA, AMD, Intel, Matrox, whatever video card, an Intel/Realtek/whatever NIC, sound card, etc...
You think Apple is much better at this? There's no more Matrox, so I don't know why you mentioned them. (They do exist, but they're using AMD Cape Verde GPUs.) On a Windows PC you have AMD and Intel making up x86, and maybe Qualcomm adding ARM for the third time, but nobody expects ARM to take off. Apple, though, spent 15 years with Intel and therefore x86, and has now moved on to ARM, which means developers have two equally relevant CPU architectures to deal with. On the graphics end of things, Apple has used AMD, Nvidia, Intel, and now their own GPUs. To make matters worse, it's not like Apple supports Vulkan, which would make development on Apple's hardware much easier for developers. It's also not like Apple doesn't use Realtek for networking and audio stuff as well. Apple's macOS history doesn't start with the M1; plenty of people using macOS are still on Intel.
Xbox, PS5, SNES, whatever. It's the same system. PCs can have so many different configs. That's where macOS can shine. It has a specific set of hardware to target. It's not going to shit the bed on a weird video card with funky drivers...
Except developers who've worked on Mac have hated it.

View: https://youtu.be/qRQX9fgrI4s?si=N89SyAIBMfOXmbco
 
If you're doing productivity workloads, then you probably wouldn't see a difference in terms of power consumption.

I dunno, I have been busy typing this since I saw your post (on my battery-powered, portable, triple-screen M3 Max, M4 iPad Pro, LG Gram monitor rig) and my battery life is on pace for 10+ hours running both monitors powered from my MacBook.

[attached screenshot: battery readout]
 
I dunno, I have been busy typing this since I saw your post (on my battery-powered, portable, triple-screen M3 Max, M4 iPad Pro, LG Gram monitor rig) and my battery life is on pace for 10+ hours running both monitors powered from my MacBook.

[attached screenshot: battery readout]
That is impressive battery life for displaying three letters for over 10 hours. Some say paradoxical is still displaying those letters on his screen to this day, possibly due to image burn-in.
 