Apple Plans To Overhaul Entire Mac Line With AI-Focused M4 Chips

You could not pay me $10k to use that machine on a daily basis. It is absolutely embarrassing that a $1,600 computer has a 1920x1200 screen with a garbage-tier touchpad and terrible battery life. And don't say "I'd just dock it" - what's the point of a laptop then? My MBP is excellent both on the go AND in a dock.
That display also does 144Hz, because it's a gaming laptop with an RTX 4060. Anyone who's seen a MacBook display in action knows the response time is so bad it's like someone smeared petroleum jelly on it. You can nitpick that laptop all day long, but the reality is that laptop's RTX 4060 has the same amount of RAM as a $1,600 MacBook. You want a better display? This ASUS ROG has a 2560 × 1440 panel that runs at 240Hz for $1,800. Half the RAM of the Lenovo with 1/4 of the storage, but also 4x the RAM of a MacBook and 8x the storage, while again not counting the VRAM of the RTX 4070. It's just sad and pathetic the lengths you'll go to to defend a company that charged you $7k for that 128GB MacBook you bought.
That is garbage to me because I have a MacBook Pro that I often run a mobile triple-screen setup off of, on battery, with an iPad Pro and an LG Gram.

In this configuration, the resolution and screen quality available dwarf the PC's, as does the battery life, even with two other devices drawing power from it. I was able to work the entire day on battery from a hotel bar with this configuration on my last road trip!
You carry 3 screens with you? Do they run off battery?
The comparison with that Lenovo isn't straightforward. The LOQ has a lower resolution (if slightly higher refresh rate) and lower quality display, just one USB-C port (no surprises at the lack of Thunderbolt with AMD), build quality will undoubtedly be worse... you get the idea.
Laptops are never straightforward, and build quality from Apple is subjective when they had keyboard-gate.
Performance will be great for games and some pro apps while plugged in, of course, but it's going to tank while on battery and won't fare as well with some creative apps. I recently used an ASUS ROG Strix Scar 16 and that thing took a speed hit while unplugged (even with performance modes forced on). It also didn't last more than a few hours on battery when doing pedestrian tasks.
This is 2024, where even AMD laptops run several hours on battery. LukeTbk even linked you the info. Not sure what's up with that ASUS ROG Strix Scar you speak of, but as long as you're with AMD, you don't have battery problems. This is like when AMD used to run hot while Intel didn't; after Ryzen the shoe is on the other foot, but it took a while before Intel fanboys agreed that AMD now makes the cooler, less power-hungry CPU.
That kind of RAM and SSD capacity is great for the money, but Lenovo is prioritizing different things than Apple. And while Apple is undoubtedly charging higher profit margins, I know I'd much rather have the MacBook Pro for audiovisual editing (provided I get a config with enough storage, at least) even with those memory and storage deficits.
How much work do you think you can do with 8GB of RAM? What exactly is Apple prioritizing with $1,600 laptops that ship with 8GB of RAM? Other than the elephant in the room where, as Darunion put it, the 8GB is meant to entice users to buy more expensive models. If Apple has to overhaul their entire Mac line, then that tells me things aren't going well for Apple's Mac lineup.
 
You carry 3 screens with you? Do they run off battery?
The screens come from another laptop and a tablet.

How much work do you think you can do with 8GB of RAM?
If you ssh (or similar) into your work computer/server session, infinite? If you write text or HTML, you're fine, and Final Cut/Premiere is able to do quite a bit on 8GB, especially if you do not mind skipping 4K.
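
To illustrate that thin-client idea: a minimal sketch in Python with the paramiko SSH library, where the host, user, and command are made-up placeholders rather than anything from this thread.

```python
# Minimal sketch: run the heavy job on a remote machine over SSH, so the
# laptop barely needs any local RAM. Assumes the paramiko package; the
# host, user, and command are made-up placeholders.
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("work-server.example.com", username="me")

# The build/render runs server-side; only text streams back to the laptop.
_, stdout, stderr = client.exec_command("make -j16 && ./run_tests")
print(stdout.read().decode())
print(stderr.read().decode())

client.close()
```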
 
Laptops are never straightforward, and build quality from Apple is subjective when they had keyboard-gate.
You're still pitting a gaming laptop (and not ASUS' best-built, at that) against a high-end creative workstation. The markets for the two are wildly different!



This is 2024, where even AMD laptops run several hours on battery. LukeTbk even linked you the info. Not sure what's up with that ASUS ROG Strix Scar you speak of, but as long as you're with AMD, you don't have battery problems. This is like when AMD used to run hot while Intel didn't; after Ryzen the shoe is on the other foot, but it took a while before Intel fanboys agreed that AMD now makes the cooler, less power-hungry CPU.
That's... actually not all that great in 2024, especially since the system will throttle back while on battery power (yes, I appreciate the irony of Apple being the one with much better throttling control). If you're finishing an urgent art/video project, you don't want to have to wait ages for renders or risk running out of battery after a couple of hours in the field.

I will say the Strix Scar 16 I used was running a 13th-gen Core i9, so I wasn't expecting the greatest battery life. It's good to know modern Ryzens are faring better than that; I just don't think either is in the same ballpark as the M3 line if you value performance while unplugged.

How much work do you think you can do with 8GB of RAM? What exactly is Apple prioritizing with $1,600 laptops that ship with 8GB of RAM? Other than the elephant in the room where, as Darunion put it, the 8GB is meant to entice users to buy more expensive models. If Apple has to overhaul their entire Mac line, then that tells me things aren't going well for Apple's Mac lineup.
I was referring to the 16-inch MacBook Pro, but with 8GB... again, it's not what I would do, but more than you think. Let's put it this way: I'm on a 27-inch iMac from 2019 with a Core i7 and 8GB of RAM, and it's only in the past year or so that I've really noticed it fighting to cope with a workload that includes Photoshop, Slack, and a ton of browser tabs. There is a dedicated GPU, but it only helps so far.
 
I have a late-2019 Intel-based MBP that I keep in my desk drawer for when I need screenshots for guides and such, and it is certainly more than usable, but the Nov 2020 M1 MBP blows it away in every measurable aspect.

Is that a fact?

I'm still rocking an Intel-based MBP, but I also have an M1 Studio.
 
Is that a fact?

I'm still rocking an Intel-based MBP, but I also have an M1 Studio.
Yep, fact.

It exists now for me to test deployments and create screenshots.
But the same-size M1 it was replaced with is lighter, cooler, and faster, with better battery life and faster storage.

We had some programs that would not work on the Intel Macs, even in Parallels. But on the M1s with Parallels they work fine, so we replaced the fleet and were able to do away with the RDP server for that department.

Administratively, it feels good all around.
They got newer, better, faster;
I got to decommission an expensive and annoying server. Overall a cost-neutral move, and everybody was happier for it.
 
The screens come from another laptop and a tablet.


If you ssh (or similar) into your work computer/server session, infinite? If you write text or HTML, you're fine, and Final Cut/Premiere is able to do quite a bit on 8GB, especially if you do not mind skipping 4K.
I presume the tablet is screen extended over the network? It’s neat in theory, but completely useless when using VPN for work.

The SSH case is valid though, for needing almost no hardware - VPN in, ssh to your real host with access to everything, and off you go. You could almost use an old 2010 netbook for that use case.
 
I presume the tablet is screen extended over the network? It’s neat in theory, but completely useless when using VPN for work.

No, Apple implements things far more intelligently than that. You can use either the network OR a Thunderbolt cable, and it works natively with macOS without any third-party software. My travel setup is incredible for productivity: MacBook 14 in the middle, iPad Pro running in extended desktop mode on the left (straight up a second monitor, color calibrated and high PPI), with an LG USB-C IPS 16-inch display on the right of my MacBook, also in extended desktop mode. All three items fit in basically the same footprint stacked on top of each other and fit neatly in my lightweight travel bag. They all run off the built-in MacBook battery for hours and hours. And if I want to charge the MacBook, it can go indefinitely - while I still have another Thunderbolt 4 port available for other high-speed IO.

It's also amazing because having the tablet serve as a second screen consolidates how many devices I need to bring with me, and lets me seamlessly switch my workflow from the iPad on the plane during takeoff/landing to the larger MacBook setup when I'm at an airport or hotel desk (with everything automatically synced, even my clipboard).

Even better is the fact that I travel with a single tiny GaN charger brick for:

Macbook Pro
iPad Pro
Airpods Max
Airpods Pro
Apple Watch
iPhone

I can charge ALL of these simultaneously with the single GaN charger and the MacBook's ports in the hotel, with zero other charging bricks; just cables.

You simply cannot touch Apple for portable productivity.
 
No, Apple implements things far more intelligently than that. You can use either the network OR a Thunderbolt cable, and it works natively with macOS without any third-party software. My travel setup is incredible for productivity: MacBook 14 in the middle, iPad Pro running in extended desktop mode on the left (straight up a second monitor, color calibrated and high PPI), with an LG USB-C IPS 16-inch display on the right of my MacBook, also in extended desktop mode. All three items fit in basically the same footprint stacked on top of each other and fit neatly in my lightweight travel bag. They all run off the built-in MacBook battery for hours and hours. And if I want to charge the MacBook, it can go indefinitely - while I still have another Thunderbolt 4 port available for other high-speed IO.

It's also amazing because having the tablet serve as a second screen consolidates how many devices I need to bring with me, and lets me seamlessly switch my workflow from the iPad on the plane during takeoff/landing to the larger MacBook setup when I'm at an airport or hotel desk (with everything automatically synced, even my clipboard).

You simply cannot touch Apple for portable productivity.

Can you link that LG monitor? I’d like to check it out, thanks.
 

Thanks. That's sweet. It blows my mind that a laptop can power that screen via USB-C and still have enough juice for hours' worth of work. That's legit cool.

Edit: Link your charger too, please.
 
You're still pitting a gaming laptop (and not ASUS' best-built, at that) against a high-end creative workstation. The markets for the two are wildly different!
Because most laptops with that kind of power are usually made for gaming.
That's... actually not all that great in 2024, especially since the system will throttle back while on battery power (yes, I appreciate the irony of Apple being the one with much better throttling control). If you're finishing an urgent art/video project, you don't want to have to wait ages for renders or risk running out of battery after a couple of hours in the field.
AMD CPUs do not throttle on battery, but I don't know about the RTX GPU. Even if you did make use of AMD's 780M iGPU, which doesn't throttle unplugged, it isn't meant for video editing. The RTX GPU, though, can beat Apple's GPUs by a good amount even unplugged with a 13900HX. Also remember that Intel's Meteor Lake chips work at full speed unplugged too. Again, it's 2024, not 2021.

View: https://youtu.be/6kazbYEp_rM?si=SsdCtqsZVw0RS_Mf
I will say the Strix Scar 16 I used was running a 13th-gen Core i9, so I wasn't expecting the greatest battery life. It's good to know modern Ryzens are faring better than that; I just don't think either is in the same ballpark as the M3 line if you value performance while unplugged.
It's 2024 and AMD has caught up even with Rembrandt from 2 years ago. Apple is still slightly ahead in battery and power consumption, but it's really insignificant. This is mostly due to Apple being on 3nm while AMD is on 5nm and Intel is on 7nm.
I was referring to the 16-inch MacBook Pro, but with 8GB... again, it's not what I would do, but more than you think. Let's put it this way: I'm on a 27-inch iMac from 2019 with a Core i7 and 8GB of RAM, and it's only in the past year or so that I've really noticed it fighting to cope with a workload that includes Photoshop, Slack, and a ton of browser tabs. There is a dedicated GPU, but it only helps so far.
Max Tech did do a video that showed 8GB was significantly slower than 16GB, and especially slower compared to 16GB on a Windows PC. It really depends on what you do, but again, 8GB is just wrong in a machine with a powerful CPU. Not to forget you're thrashing that SSD, which is trying to make up for the lack of RAM. You also can't replace the SSD NAND in them when it inevitably wears out.

View: https://youtu.be/u1dxOI_kYG8?si=2DtKwpFLl5NnnJgg
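
As an aside on the swap-thrashing point: you can watch the paging behavior yourself. Below is a minimal Python sketch using the psutil package; it's just an illustration of how to observe swap pressure, not anything from the video.

```python
# Minimal sketch: watch how hard the machine is leaning on swap, i.e. the
# "SSD trying to make up for the lack of RAM" effect described above.
# Assumes the psutil package is installed.
import time
import psutil

def watch_swap(interval_s: float = 5.0) -> None:
    prev = psutil.swap_memory()
    while True:
        time.sleep(interval_s)
        cur = psutil.swap_memory()
        # sin/sout are cumulative bytes swapped in/out since boot, so the
        # delta over the interval shows how much paging just happened.
        print(f"swap used: {cur.percent:5.1f}%  "
              f"in: {(cur.sin - prev.sin) / 2**20:8.1f} MiB  "
              f"out: {(cur.sout - prev.sout) / 2**20:8.1f} MiB")
        prev = cur

if __name__ == "__main__":
    watch_swap()
```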
 
It's 2024 and AMD has caught up even with Rembrandt from 2 years ago. Apple is still slightly ahead in battery and power consumption, but it's really insignificant. This is mostly due to Apple being on 3nm while AMD is on 5nm and Intel is on 7nm.
The battery life on MacBooks compared to every other Windows laptop is not even remotely "insignificant." The competition is catching up, thankfully, but they're not there yet. However, this will be irrelevant once USB-C charging reaches safe 200+ watt charging capabilities and I can bring a battery pack with me to keep it going all day. Also ... heat. Windows gaming laptops get stupidly hot under load.
 

I really appreciate the real [H] user experiences on how it benefits your workflow (life). You can link legit benchmarks on whatever metric you are interested in, but it's the user experience that really matters. Either it improves your workflow or it doesn't. Seems like you got it.

I get it. Workflow can be whatever you are doing. If your workflow is running benchmarks and metrics, so be it. Hope you can find that machine that does it best for you.
 
I really appreciate the real [H] user experiences on how it benefits your workflow (life). You can link legit benchmarks on whatever metric you are interested in, but it's the user experience that really matters. Either it improves your workflow or it doesn't. Seems like you got it.

I get it. Workflow can be whatever you are doing. If your workflow is running benchmarks and metrics, so be it. Hope you can find that machine that does it best for you.
His list of Apple products is like five grand or more. It's hard for me to believe that he can't have the same productivity with a Surface Pro or other Windows based machine. USB monitors can connect to anything 🤷‍♂️

I didn't see any mention of the actual kind of work he does but going from my desktop to mobile for just about anything is hindered quite a bit simply because of screen size.

Not even taking into account that software I use almost daily doesn't exist on a Mac, I've seen nothing mentioned that would make me want to transition to a MacBook to justify the increased cost.

Also, this thread made me realize that Microsoft has released another Arm based laptop/Surface 🤔
 
His list of Apple products is like five grand or more. It's hard for me to believe that he can't have the same productivity with a Surface Pro or other Windows based machine. USB monitors can connect to anything 🤷‍♂️

I didn't see any mention of the actual kind of work he does but going from my desktop to mobile for just about anything is hindered quite a bit simply because of screen size.

Not even taking into account that software I use almost daily doesn't exist on a Mac, I've seen nothing mentioned that would make me want to transition to a MacBook to justify the increased cost.

Also, this thread made me realize that Microsoft has released another Arm based laptop/Surface 🤔

I'm not being snarky in the slightest. Whether he can get another computer to do whatever he does is irrelevant. I have a suspicion you already know that.

What matters is that the user is very satisfied with the setup and that it allows the user to be more productive (by the user's assessment). Apparently the price tag is acceptable to the user.

Most people would agree with me that the user experience matters much more than the speed of the RAM, the data drive, the screen resolution, or whatever. Of course those have an influence, but ultimately it's the total user experience that matters.

Apple nails that very important metric.
 
I have a late-2019 Intel-based MBP that I keep in my desk drawer for when I need screenshots for guides and such, and it is certainly more than usable, but the Nov 2020 M1 MBP blows it away in every measurable aspect.

Not when it comes to running x86 virtual machines. That's the major reason I still use my 2019 i9 MacBook Pro. And the battery life is enough to carry it from one AC outlet to another :)
 
The SSH case is valid though, for needing almost no hardware - VPN in, ssh to your real host with access to everything, and off you go. You could almost use an old 2010 netbook for that use case.

I do ssh around for actual work, but the problem is that the web browser which I also need is local. And that can be a RAM killer when you have many tabs.
 
AMD CPUs do not throttle on battery, but I don't know about the RTX GPU. Even if you did make use of AMD's 780M iGPU, which doesn't throttle unplugged, it isn't meant for video editing. The RTX GPU, though, can beat Apple's GPUs by a good amount even unplugged with a 13900HX. Also remember that Intel's Meteor Lake chips work at full speed unplugged too. Again, it's 2024, not 2021.
That's an RTX 4080 versus the lower end M3 Max. Now, even the higher-end M3 Max probably won't beat an RTX 4080, but neither result would be shocking. The RTX 4080 by itself can chew more power than the entire MacBook Pro (typically 150W); if Apple could achieve 4080-level performance with its total power draw, it would have a revolution on its hands.

I look at the MacBook Pro as a calculated tradeoff: it's shedding some raw plugged-in GPU performance in exchange for better overall audiovisual editing performance, longer battery life, a thinner chassis, and less noise.


Max Tech did do a video that showed 8GB was significantly slower than 16GB, and especially slower compared to 16GB on a Windows PC. It really depends on what you do, but again, 8GB is just wrong in a machine with a powerful CPU. Not to forget you're thrashing that SSD, which is trying to make up for the lack of RAM. You also can't replace the SSD NAND in them when it inevitably wears out.
I've seen that video. SSD wear isn't really the issue, since even that kind of thrashing isn't going to kill the drive prematurely. Rather, it's just that there's a ceiling to what 8GB of RAM can accomplish, and it is a bottleneck on a machine like this. Like I've mentioned, I think Apple should be using 16GB as standard across the MacBook Pro line; I believe it will when M4 arrives.
 
The battery life on MacBooks compared to every other Windows laptop is not even remotely "insignificant." The competition is catching up, thankfully, but they're not there yet. However, this will be irrelevant once USB-C charging reaches safe 200+ watt charging capabilities and I can bring a battery pack with me to keep it going all day. Also ... heat. Windows gaming laptops get stupidly hot under load.
I can literally work all day on my M1 Pro MBP on battery and not even think about having to look and see what my battery is at.
 
His list of Apple products is like five grand or more. It's hard for me to believe that he can't have the same productivity with a Surface Pro or other Windows based machine. USB monitors can connect to anything 🤷‍♂️

I didn't see any mention of the actual kind of work he does but going from my desktop to mobile for just about anything is hindered quite a bit simply because of screen size.

Not even taking into account that software I use almost daily doesn't exist on a Mac, I've seen nothing mentioned that would make me want to transition to a MacBook to justify the increased cost.

Also, this thread made me realize that Microsoft has released another Arm based laptop/Surface 🤔

I have a feeling you already know this isn't the case, because the examples showing it isn't true are so obvious. For example, if I couldn't seamlessly use my iPad Pro as a second monitor, I would have to purchase and carry ANOTHER LG external monitor to achieve a portable triple-monitor setup (and I'm not giving up my iPad on the plane).

Regarding jumping from desktop to mobile, I agree - it's always hindered by screen size. But more screen real estate is always better, and one of the best things for me is how robust and reliable Thunderbolt 4 docking is on macOS. At work I have desks (on different floors) with identical setups (which also match my home setup). On each of them, a single Thunderbolt 4 cable drives a CalDigit TS4 dock with a Samsung G95NC at 7680x2160 @ 120Hz, an iPad Pro, an SFP 10Gb interface, and a bunch of other USB peripherals like DACs, audio interfaces, software-defined radios, etc. At any time I can get up, unplug a single cable, walk to another desk, plug in, and everything just works. So while obviously running a 57-inch screen is ideal, when I'm portable I still value being able to carry around a triple-monitor setup (with each panel having high color accuracy and a minimum of 2500x1600 resolution) in the same overall size and weight footprint as a single PC behemoth laptop. And obviously, I can work for hours in this configuration where a Windows PC simply could not.

Regarding my other peripherals, you're also missing out on how the Apple ecosystem benefits me. If I forgot my phone in the other room, I can turn off the lights from my watch. If I took off my watch and am lying in bed reading on an iPad, I can do it from there - or from my TV in the living room. This is just one small example, but it extends to how I've built my entire workflow. If I copy something on my iPhone, I can paste it on my Mac. If I'm out running with AirPods Pro in, listening to a podcast, I can seamlessly answer a call and chat while the watch pauses the podcast - then when I get back home, I can transfer it to my bathroom HomePod so I can finish listening to the podcast while I shower (and my workout data is automatically synced from my watch across all of my devices and to Strava). If I want to share my screen, I can wirelessly AirPlay to the TVs in our conference rooms - or do the same with my iPhone or iPad if I left my laptop at home by accident. With 8TB of on-device storage on my MacBook Pro (which a Surface doesn't offer), I can carry my company's entire asset library with me at all times, so I can work regardless of internet connection to the NAS or Dropbox.

And when I get to the hotel after a long day and I'm done with work and exercise, I can enjoy watching movies with the absolute best mobile movie-watching audio experience in the world - Apple Spatial Audio on AirPods Max (which is astonishingly good for movies - and before you question my audio bona fides and say I don't know what good headphones sound like, my bedside rig is a Dan Clark E3 with an RME ADI-2 FS DAC/amp).

Basically, I have a lot going on in my life and am responsible for a lot of people between family and employees. I don't have time to fuck around with science projects and half-baked solutions - I just need my shit to do what I need it to do when I need it to do it. That is why the cost of hardware is completely irrelevant - the value of my time and me being maximally productive pays back the difference between Apple and PC solutions in a couple days, let alone compounded over multiple years. I consider the $7k I spent on my laptop the cheapest money I ever spent because of how it has improved my workflow for both work and play.
 
The battery life on MacBooks compared to every other Windows laptop is not even remotely "insignificant." The competition is catching up, thankfully, but they're not there yet. However, this will be irrelevant once USB-C charging reaches safe 200+ watt charging capabilities and I can bring a battery pack with me to keep it going all day. Also ... heat. Windows gaming laptops get stupidly hot under load.
With Meteor Lake, we see that Max Tech got 4 - 5 hours on a "heavy load" vs 6 - 7 hours on an M3 MacBook Pro. Meteor Lake wouldn't be my first choice for an x86 laptop, but it's not a bad choice. AMD is so much closer that the difference is minutes.


View: https://youtu.be/05joCv0j_Cc?t=279
And I don't have to worry about the MacBook's heat sterilizing me.
At 114C you won't have balls left to be sterilized. At that point it's sex reassignment by Apple. It's a feature.
https://www.techspot.com/news/102227-m3-based-macbook-air-hits-114-degrees-celsius.html

View: https://youtu.be/yXNc9Xv1DoQ?si=0_7a8HkB-D275kEW
 
With Meteor Lake, we see that Max Tech got 4 - 5 hours on a "heavy load" vs 6 - 7 hours on an M3 MacBook Pro. Meteor Lake wouldn't be my first choice for an x86 laptop, but it's not a bad choice. AMD is so much closer that the difference is minutes.
I doubt that's the case for a realistic day's workload (heavy load would be quite niche; who runs all-core tasks all day on battery?). Obviously, if you find a way to make two computers with similar batteries use the same number of watts, you end up with similar battery life. Apple will shine in how much stuff is not heavy for it and how well it does during the idle/near-idle stretches of a usual workday.
 
With Meteor Lake, we see that Max Tech got 4 - 5 hours on a "heavy load" vs 6 - 7 hours on an M3 MacBook Pro. Meteor Lake wouldn't be my first choice for an x86 laptop, but it's not a bad choice. AMD is so much closer that the difference is minutes.


View: https://youtu.be/05joCv0j_Cc?t=279

At 114C you won't have balls left to be sterilized. At that point it's sex reassignment by Apple. It's a feature.
https://www.techspot.com/news/102227-m3-based-macbook-air-hits-114-degrees-celsius.html

View: https://youtu.be/yXNc9Xv1DoQ?si=0_7a8HkB-D275kEW

You can keep talking out of your arse, but it's not going to change reality. 90% of the time, my Mac is literally cold on my lap. Even heavier-load tasks leave it merely warm. Every gaming laptop I've ever used, including recent AMD ones (I try the new ones every single year, as I like to see where they're at), starts getting hot in a very short period of time. They're uncomfortable to use on the go because of their heat. Also, ffs dude, you posted a video of a MacBook Air, a laptop that has no fans and only passive cooling.

I doubt that's the case for a realistic day's workload (heavy load would be quite niche; who runs all-core tasks all day on battery?). Apple will shine in how much stuff is not heavy for it and how well it does during the idle/near-idle stretches of a usual workday.
He can't do anything but use the most extreme situations to try and prove some point. And his point is that he hates Apple products. As someone who frequently uses Apple laptops and the most current cutting-edge Windows laptops, Apple still handily beats the Windows laptops every single time on battery life and heat. Normal battery life. A mix of light, normal, and heavy workloads. I get roughly 10-15 hours out of my laptop before I need to plug it into a portable battery pack (my laptop bag actually hooks up to my portable battery, and I just need to plug my Mac into the port on my laptop bag to charge it).

DukenukemX -> I love the progress AMD is making, and chip makers in general, but they're just not there yet. It's not minutes in difference. This is so monumentally stupid that I don't think even YOU believe the nonsense you're spewing. The difference is you're anti-Apple, whereas I'm pro-technology. I use both Apple and Windows products. I don't hate either one. In fact, I long for the day that Windows laptops finally catch up to Apple not only in power to performance, but also heat in general use. They're not there yet. Period. The day that happens, I may switch to Windows only. For now, it's Windows for desktops and Apple for laptops.
 
I doubt that's the case for a realistic day's workload (heavy load would be quite niche; who runs all-core tasks all day on battery?). Obviously, if you find a way to make two computers with similar batteries use the same number of watts, you end up with similar battery life. Apple will shine in how much stuff is not heavy for it and how well it does during the idle/near-idle stretches of a usual workday.
You can believe all you want but tests were done and results were shown. Apple is not alone in having a laptop that can run all day on battery. Though, that really depends on what you really do. I've shown examples where an M1 can die within 2 hours playing video games.
You can keep talking out of your arse, but it's not going to change reality. 90% of the time, my Mac is literally cold on my lap.
Is it an M3 MacBook Air? You mean to say that other reviewers are stupid and must have your MacBook?
Also, ffs dude, you posted a video of a MacBook Air, a laptop that has no fans and only passive cooling.
Then what's the problem? You said it doesn't get hot, and it clearly does. What you need to understand is that the game has changed for Apple. It's not the M1 anymore, because Apple now has to compete against faster x86 laptops. How do you think Apple gets their chips to run faster? They're going to generate more heat, which shouldn't shock anyone.
He can't do anything but use the most extreme situations to try and prove some point. And his point is that he hates Apple products. As someone who frequently uses Apple laptops and the most current cutting-edge Windows laptops, Apple still handily beats the Windows laptops every single time on battery life and heat. Normal battery life. A mix of light, normal, and heavy workloads. I get roughly 10-15 hours out of my laptop before I need to plug it into a portable battery pack (my laptop bag actually hooks up to my portable battery, and I just need to plug my Mac into the port on my laptop bag to charge it).
Like I said, this is the same problem as when AMD's Ryzen beat Intel in power consumption and heat. People's minds were stuck on FX vs Sandy Bridge, but again, times change. It's no longer Apple's M1 vs AMD's and Intel's chips from that era.
DukenukemX -> I love the progress AMD is making, and chip makers in general, but they're just not there yet. It's not minutes in difference. This is so monumentally stupid that I don't think even YOU believe the nonsense you're spewing.
Results are gonna result.

View: https://youtu.be/GnT7GhuurlY?si=qpKSfrjcni__oqD3
The difference is you're anti-Apple, whereas I'm pro-technology.
You're behind the times for someone who's pro-technology.
I use both Apple and Windows products. I don't hate either one.
I hate both because I'm a Linux guy.
In fact, I long for the day that Windows laptops finally catch up to Apple not only in power to performance, but also heat in general use. They're not there yet. Period. The day that happens, I may switch to Windows only. For now, it's Windows for desktops and Apple for laptops.
Literally a quick Google.
https://www.anandtech.com/show/1891...yzen-9-7940hs-tested-with-rtx-4070-graphics/5

How about 7.5 hours of video playback?

Over 7 hours using Microsoft Office.
 
You can believe all you want but tests were done and results were shown. Apple is not alone in having a laptop that can run all day on battery. Though, that really depends on what you really do. I've shown examples where an M1 can die within 2 hours playing video games.

Is it an M3 MacBook Air? You mean to say that other reviewers are stupid and must have your MacBook?

Then what's the problem? You said it doesn't get hot, and it clearly does. What you need to understand is that the game has changed for Apple. It's not the M1 anymore, because Apple now has to compete against faster x86 laptops. How do you think Apple gets their chips to run faster? They're going to generate more heat, which shouldn't shock anyone.

Like I said, this is the same problem as when AMD's Ryzen beat Intel in power consumption and heat. People's minds were stuck on FX vs Sandy Bridge, but again, times change. It's no longer Apple's M1 vs AMD's and Intel's chips from that era.

Results are gonna result.

View: https://youtu.be/GnT7GhuurlY?si=qpKSfrjcni__oqD3

You're behind the times for someone who's pro-technology.

I hate both because I'm a Linux guy.

Literally a quick Google.
https://www.anandtech.com/show/1891...yzen-9-7940hs-tested-with-rtx-4070-graphics/5

How about 7.5 hours of video playback?

Over 7 hours using Microsoft Office.

Have you ever truly put a current MacBook through its paces as your daily driver? What about a top-tier Windows laptop? Let me ask you something important: what's your current go-to computer? Do you own a laptop, and if so, which one? I'm not stuck in the past; I actively engage with the latest technology to inform my choice for a daily driver. As for Windows laptops, they're just not hitting the mark yet. I've had hands-on experience with the very latest in Windows laptop tech year after year, including models sporting the newest Intel/AMD CPUs and AMD/NVIDIA GPUs, like the 4070s, 4080s, and 4090s, alongside various AMD and Intel CPU iterations. I keep waiting for something comparable as a daily driver, and I keep being disappointed. The build quality of most Windows laptops is also a problem. While you keep sharing this information with me, I doubt you've tested any of the stuff you keep going on about. I have. Because I need a laptop to make a living. And I'm still waiting for a Windows laptop that's comparable in battery life, heat, and build quality. MacBooks also still absolutely shit on Windows laptops when it comes to speakers. For some reason laptop manufacturers can't figure that one out.

By the way, I sometimes watch Hardware Canucks, but that video you linked is clearly a sponsored ad for Omen laptops, which are very shitty laptops. But more importantly, who on earth uses their laptop at 50% brightness? Are you high? Did you even look at the benchmarks you posted? The benchmarks are all at 50% brightness. Nobody does that. Talk about fudging results to make them look better. I always max out my brightness. It's pretty apparent you just do a 'quick Google' and copy and paste the first thing you see without actually looking at it.

And you're a Linux user? I guess that explains a lot. What laptop hardware are you running Linux on?
 
But more importantly, who on earth uses their laptop at 50% brightness? Nobody does that. Talk about fudging results to make them look better. I always max out my brightness.
I am not much of a laptop user, but wouldn't a brightness level that works well outside be too bright inside? The laptop I am using to type this right now is almost at its minimum brightness, since there isn't strong light around. Using 50% is probably an easy setting for everyone to reproduce and a common use case, giving a realistic real-world battery life.
 
I am not much of a laptop user, but wouldn't a brightness level that works well outside be too bright inside? The laptop I am using to type this right now is almost at its minimum brightness, since there isn't strong light around. Using 50% is probably an easy setting for everyone to reproduce and a common use case, giving a realistic real-world battery life.
Adjusting the brightness in a very dark room or when lights are off might seem practical. However, using half brightness as a benchmark for a performance CPU touted to be more battery-efficient is nonsensical. Relying on such a metric implies resorting to tricks like dimming the screen to extend battery life, which shouldn't be necessary for regular usage. Personally, I prefer using my laptop at full brightness unless the environment demands otherwise, typically keeping it at 70-80 percent even in darker settings. And that's not for saving battery, that's so I don't melt my eye sockets when the lights are off. Realistic battery life shouldn't hinge on compromising the user experience; it should reflect normal usage patterns. If a laptop's battery life heavily relies on dimming the screen, it indicates poor performance at full brightness, requiring users to make sacrifices. I just want to use it normally like I do when it's plugged in, like I can do with my Mac laptop.
 
Do you own a laptop, and if so, which one?
Gotta be more specific, as I have over a dozen laptops. Don't ask. My main laptop is an HP with a Ryzen 5 5600U.
The build quality of most Windows laptops is also a problem.
This is a serious problem when dealing with Apple fans, because when the argument is lost they refer to build quality. Like Apple has a track record of good build quality.
But more importantly, who on earth uses their laptop at 50% brightness? Are you high?
I tend to run my laptop brightness as low as possible because my blue eyes are really sensitive to bright light. Unless I'm outside in the sun, where I turn it up higher.
The benchmarks are all at 50% brightness. Nobody does that. Talk about fudging results to make them look better. I always max out my brightness. It's pretty apparent you just do a 'quick Google' and copy and paste the first thing you see without actually looking at it.
I think you need to have your eyes checked if you max out brightness all the time. Some of these screens can get really bright. Also, who's to say what brightness setting other reviewers use for comparison? As Aurelius said before, it's not a straightforward comparison because of the nature of laptops.
And you're a Linux user? I guess that explains a lot. What laptop hardware are you running Linux on?
Again you gotta be more specific as I have over a dozen laptops and yea they all run Linux. Some are even old Macs.
 
I was reading about Apple's plan to run local machine learning partially from storage, and the first thought I had was "Apple will do anything to enshittify the specs of their low end." It's a logical approach to all this AI nonsense, but it's just... Apple gotta Apple.

8GB Macs are why "Starting From" pricing is complete bullshit.
 
Seems Apple will release their M4s this year. All Apple products will use the M4.


View: https://youtu.be/Pc5i0zx6CqM?si=o-GVGdyAxqbaHoqu

Apple is likely to repeat the M3 strategy where it upgrades some models this year and others the next, but that's not surprising... it needs to deal with both production volume and the challenges of building very high-end chips like the M4 Ultra/Extreme.

I do like what Apple is doing, though; it's finally getting the M series on a yearly cadence that's both more competitive and more predictable. If I weren't pining for an upgrade soon I'd hold out until the fall for a MacBook Pro M4.
 
I was reading about Apple's plan to run local machine learning partially from storage, and the first thought I had was "Apple will do anything to enshittify the specs of their low end." It's a logical approach to all this AI nonsense, but it's just... Apple gotta Apple.

8GB Macs are why "Starting From" pricing is complete bullshit.
Actually, there have been some really solid advances in PyTorch that let LLMs use a combination of local storage and RAM (both system and VRAM) as locations for running the models. Obviously there are speed differences, but changes in the LLMs themselves have decreased those dramatically.
The US can keep banning AI cards to China and set limits on VRAM and transfer speeds and all the other jazz and they just keep finding ways to make up for it.
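
For a concrete flavor of what that spillover looks like in practice, here is a minimal Python sketch using Hugging Face transformers/accelerate-style disk offload; the model id is an arbitrary placeholder, and this is just one way to do it, not necessarily what Apple or PyTorch upstream uses.

```python
# Minimal sketch of RAM + disk spillover when loading a large model.
# Assumes the transformers and accelerate packages; the model id is an
# arbitrary placeholder, not a recommendation.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # halves the in-memory footprint
    device_map="auto",          # fill GPU VRAM first, then system RAM...
    offload_folder="offload",   # ...and spill whatever is left to local storage
)

prompt = "Apple's M4 lineup is"
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```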
 
Actually, there have been some really solid advances in PyTorch that let LLMs use a combination of local storage and RAM (both system and VRAM) as locations for running the models. Obviously there are speed differences, but changes in the LLMs themselves have decreased those dramatically.
The US can keep banning AI cards to China and set limits on VRAM and transfer speeds and all the other jazz and they just keep finding ways to make up for it.
I really do like the idea of using storage for this, especially given how fast storage is getting. I love the potential uses of all this, but I really want to see it local.

If a smart home ever becomes something other than a burden, making it work offline would be fantastic.
 
I really do like the idea of using storage for this, especially given how fast storage is getting. I love the potential uses of all this, but I really want to see it local.

If a smart home ever becomes something other than a burden, making it work offline would be fantastic.
The thing is, most AI models, once generated, are relatively small; you only really need the big numbers when training or running the much larger models. Neither is something you are likely to see at the consumer level, but the fact that you can spill over to storage means that even the MBPs with only 128GB of RAM can run those big 1TB-class models that can easily exist.
 