Apple's M2 looks like a beast.

I would say they have embraced and extended ARM so as to make it "not ARM" anymore though. It's their own thing. Much like MacOS and any suggestion with regards to its roots.

"Evolution of ARM"? I suppose, but ARM that nobody but Apple can have.
 
I would say they have embraced and extended ARM so as to make it "not ARM" anymore though. It's their own thing. Much like MacOS and any suggestion with regards to its roots.

"Evolution of ARM"? I suppose, but ARM that nobody but Apple can have.
There’s lots of custom ARM silicon out there that is unique to the customer; Apple has simply taken the customizations to a new level. But Alphabet, Meta, Amazon, and a handful of others are doing it; it’s one of the many ARM license types they sell.
 
There’s lots of custom ARM silicon out there that is unique to the customer; Apple has simply taken the customizations to a new level. But Alphabet, Meta, Amazon, and a handful of others are doing it; it’s one of the many ARM license types they sell.
And, like Apple, likely you can't buy those CPUs either.
 
Still better than the rest. The only thing that's gone down, really, is bloatware. So do a clean install, which you should do with any laptop, anyway.

They did get in trouble for preinstalled spyware a few times.

Nothing like getting caught once, saying it was a mistake and promising it won't happen again, and then getting caught again and again...

I wonder if we are at the stage where this stuff is embedded in the EFI and reinstalls itself upon OS wipe.

I have a difficult time trusting a Chinese company with the design of devices I use for information.

Also, almost every place I have worked has been a Dell shop, and I have always been fond of Latitude laptops. Maybe less so in the modern "everything is an ultrabook" era, but still, I wouldn't necessarily place Lenovo ahead of them if we only compare enterprise products to enterprise products.

If I had to buy a modern business laptop, it would likely be a Latitude, even though they are not as great as they once were. I still have two of the old thick ones at home for my personal use. They are truly from the golden age of laptops.


I wish I could still buy a new laptop with an even halfway decent keyboard. Regardless of which brand of machine they are on, Apple, Lenovo, Dell or HP, the modern thin, flat chiclet/island laptop keyboards are all absolute shit to type on.
 
They did get in trouble for preinstalled spyware a few times.

Nothing like getting caught once, saying it was a mistake and promising it won't happen again, and then getting caught again and again...

I wonder if we are at the stage where this stuff is embedded in the EFI and reinstalls itself upon OS wipe.

I should have read my own link.

In August, Lenovo again got caught installing unwanted and non-removable crapware into part of the BIOS reserved for custom drivers.

Apparently they were putting malware in the driver space of the BIOS already back in 2015...

It's amazing to me anyone still buys anything Lenovo.
 
Apparently they were putting malware in the driver space of the BIOS already back in 2015...

Thanks for finding this. The link to the removal tools is this: https://support.lenovo.com/us/en/solutions/ps500002 <--- search for "LSE"; there's one for desktop and one for laptop.

I ran it on both my laptops and neither of them had it. On the one hand, it's skeezy that they did it. On the other hand, they got caught, and there's a fix. How many other companies just haven't got caught?
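For context on how this kind of thing survives an OS wipe: Lenovo's LSE used ACPI's Windows Platform Binary Table (WPBT), a firmware table that tells Windows to execute a vendor-supplied binary straight out of the BIOS/UEFI on every boot. From a Linux live USB you can at least see whether your firmware exposes a WPBT entry. A minimal sketch (the `has_wpbt` helper and its default sysfs path are my illustration, not part of Lenovo's tool):

```python
import os

def has_wpbt(tables_dir="/sys/firmware/acpi/tables"):
    """Return True if the firmware exposes a Windows Platform Binary
    Table (WPBT) -- the ACPI table Windows consults to run a
    vendor-supplied binary from firmware on every boot."""
    try:
        return "WPBT" in os.listdir(tables_dir)
    except FileNotFoundError:
        return False  # no ACPI sysfs (VM, non-Linux, etc.)

if __name__ == "__main__":
    print("WPBT present:", has_wpbt())
```

Presence of a WPBT entry isn't proof of anything malicious on its own (some vendors use it for legitimate drivers), but it tells you the firmware has a channel into the OS that survives a clean install.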
 
Thanks for finding this. The link to the removal tools is this: https://support.lenovo.com/us/en/solutions/ps500002 <--- search for "LSE"; there's one for desktop and one for laptop.

I ran it on both my laptops and neither of them had it. On the one hand, it's skeezy that they did it. On the other hand, they got caught, and there's a fix. How many other companies just haven't got caught?

Who knows, but also who knows what else they are doing? After all, they were caught twice before and promised it was a mistake and wouldn't happen again. Maybe getting caught just helped them get better at being sneaky?

The Chinese connection is automatically a source of distrust, at least for me. Of course, other nations who are concerned about the CIA could probably say the same about American vendors after some of the backdoors Cisco was caught with.
 
Apparently they were putting malware in the driver space of the BIOS already back in 2015...

It's amazing to me anyone still buys anything Lenovo.
I don't understand it either. Superfish alone should've been a sufficient deterrent, but you've probably noticed it's quite common for individuals to defend corporations when they're guilty of shitty behavior.
 
Most of Apple's lineup is made in China, and I'm not sure Apple is a safer software company than Microsoft is.

The only way to ensure you've got real control is probably to put together a Gentoo rig and then set up a separate machine as a router that sniffs every single packet and logs everything.
 
Most of Apple's lineup is made in China, and I'm not sure Apple is a safer software company than Microsoft is.

The only way to ensure you've got real control is probably to put together a Gentoo rig and then set up a separate machine as a router that sniffs every single packet and logs everything.
Mind you, Apple isn't insisting on collecting at least some telemetry, or threatening to put app ads in your OS.

As it is, there's a big difference between your laptop being made in China versus being made by a Chinese company. Apple, Dell, HP and other brands have manufacturing in mainland China, but you don't see them being accused of this kind of skullduggery. Being based in China doesn't automatically mean a company is a CCP ally or otherwise up to no good, but there's no evidence that US brands are as fishy as Chinese makes sometimes are.
 
They did get in trouble for preinstalled spyware a few times.

Nothing like getting caught once, saying it was a mistake and promising it won't happen again, and then getting caught again and again...

I wonder if we are at the stage where this stuff is embedded in the EFI and reinstalls itself upon OS wipe.

I have a difficult time trusting a Chinese company with the design of devices I use for information.

Also, almost every place I have worked has been a Dell shop, and I have always been fond of Latitude laptops. Maybe less so in the modern "everything is an ultrabook" era, but still, I wouldn't necessarily place Lenovo ahead of them if we only compare enterprise products to enterprise products.

If I had to buy a modern business laptop, it would likely be a Latitude, even though they are not as great as they once were. I still have two of the old thick ones at home for my personal use. They are truly from the golden age of laptops.


I wish I could still buy a new laptop with an even halfway decent keyboard. Regardless of which brand of machine they are on, Apple, Lenovo, Dell or HP, the modern thin, flat chiclet/island laptop keyboards are all absolute shit to type on.
90% of my laptop fleet is currently Latitudes, the bulk of them 3500s or 3520s at this point. They are solid little units and pretty easy to work on. I just wish they would ditch the 4.5mm barrel plug and move them all over to USB-C like they have with the 5000 and 7000 series.
Lenovo's years of trying to race Acer to the bottom are starting to really show on their consumer products.
 
Most of Apple's lineup is made in China, and I'm not sure Apple is a safer software company than Microsoft is.

The only way to ensure you've got real control is probably to put together a Gentoo rig and then set up a separate machine as a router that sniffs every single packet and logs everything.

While that is true, there is a big difference between "made in" and "designed in".

While possible, it is a lot harder to sneak untoward stuff in during the manufacturing process, especially when there are samples that are periodically tested by the designer against specs.

I'm much more concerned about brands where the design function resides in China. Then there is no external quality control or accountability.
 
90% of my laptop fleet is currently Latitudes, the bulk of them 3500s or 3520s at this point. They are solid little units and pretty easy to work on. I just wish they would ditch the 4.5mm barrel plug and move them all over to USB-C like they have with the 5000 and 7000 series.

I wish I had one of those.

Our IT guy at my current job for some reason has an infatuation with the prosumer grade XPS series.

My current one, an XPS 9510 (I think; don't quote me on the model number), is pretty good. It has an Nvidia GPU in it which will never get used, which is a waste, but it is OK. My previous 2018-era model was an absolute DOG. It shipped with a spinning hard drive, and no amount of upgrading RAM and switching to SSD could help it. I presume there was something wrong with it.
 
I wish I had one of those.

Our IT guy at my current job for some reason has an infatuation with the prosumer grade XPS series.

My current one, an XPS 9510 (I think; don't quote me on the model number), is pretty good. It has an Nvidia GPU in it which will never get used, which is a waste, but it is OK. My previous 2018-era model was an absolute DOG. It shipped with a spinning hard drive, and no amount of upgrading RAM and switching to SSD could help it. I presume there was something wrong with it.
Their XPS lineup is... fine, as long as you never intend to call Dell for support and only carry the standard 1-year warranty. XPS units are one of those models that many 2nd-party distributors sell in bulk packages when they need to move Dell's surplus prosumer parts. I just hate dealing with them; any deal they can give, a decent Dell rep can match, and it's easier to deal with Dell directly than a middleman.
 
Seriously, let's get some 3rd party benchmarks. I'm sick of these "look, one line is higher!!!" graphs that are like something North Korea would publish about their economy.
This is honestly one of the best interpretations of manufacturer-supplied benchmarks I've ever read. Hats off to you!
 
I'm much more concerned about brands where the design function resides in China. Then there is no external quality control or accountability.

I've had Lenovos and Apples for years, and yeah, while Lenovo support is not as good as Apple, their hardware, especially at the top end of mobile, is as good as Apple's, sometimes better, and a lot cheaper. Also there are plenty of people who can work on Lenovos after the warranty expires. I know, that's true for Apple, too, but it's easier to find, and also cheaper.

As far as INFOSEC is concerned, it's a wash. There's [H]ard|ness in understanding and remembering hardware and software company failures and shens, but seriously, how many of us are still so [H]ard we compile our own software? Even overclockers today be like, "AMD magic go better turn on now, pweese."

I'm all for Buy 'Murica! but when it comes to computers, there's no avoiding China, in one way or another. And China will solder extra chips into ethernet jacks if they think they can get away with it, no matter who designed it.

Lenovo has a lot of defense contracts, and has to at least pretend that they're not a shifty Chinesium-grade builder. I can understand wanting to stay away from GPD, AYN, Minis Forum, etc. I'm not being a fanboy by not lumping Lenovo right in there with them.

Just don't buy into the Apple protects its customers mentality. That's just really good marketing.
 
Apple has a perpetual architectural license, so they're allowed to both use and modify ARM technology to their specification. M(1,2) chips are simply an extension of their ARM strategy, which includes the chips designed for their phone and tablet lineups.
The Apple M chips' advantage, and the only thing that confuses people about their performance, is the Media Engine that Apple put into them. The Media Engine is a heavy-duty video encoder/decoder that consumes very little power, and it is honestly very impressive considering what it can do. It's why Apple says it's like an RTX 3080, if all you did with an RTX 3080 was video editing. It's also why the M1 looks extremely power efficient: most tests were done purely on video playback and editing. This is why anyone buying an Apple M1 should be focusing on video editing and playback as the main reason for buying one, as the CPU and GPU aren't as capable compared to x86 counterparts. As I've shown, it's not much better in battery life when the CPU and GPU are mainly used.

The problem is that Apple, again, is not like Intel, AMD, and Nvidia, who make their hardware from scratch. Apple CPUs depend on ARM, which is currently a dysfunctional company that was nearly bought up by Nvidia. Their GPU is a bastardized PowerVR: Apple poached Imagination engineers and then later had to pay licensing fees and get additional help from Imagination for Apple's GPUs. Some serious work has gone into the M1, but it's again around the Media Engine. Apple has an uphill battle against the x86 market, one that I can't see them winning. Even the ARM market may ramp up, as Apple has made Qualcomm and Nvidia interested in investing R&D into the ARM SoC market. The M2 looks better, but honestly that may not be enough for what is coming out in the market later this year.
 
They'll just do Apple stuff, wait for AMD and Nvidia to dump tons of money into ARM, and then license their stuff.
Or they develop such a massive patent portfolio around ARM that it strangles AMD and Nvidia out of the gate.
 
It's right there in the terms and conditions, they collect data, including non-anonymous data, to provide support to their app developers and advertisers.
Apple also gives you control over that data, including during the setup process. Microsoft's approach is "will you let us share a ton of data, or just some data?"
 
It is so funny to watch some people attach their whole identities to defending some random technology like x86. Some of the logical contortions are amazing to watch, like "The M1 is only fast because of the video encoder, it's 9999x slower than x86 on everything else besides video encoding!!!" "ARM is a horrible dysfunctional company with awful technology that nobody really wants so who cares if Apple uses it!!!!" Shit like that is just blatantly hilarious. I don't get it, I mean it's just some random technology and none of us had anything to do with developing it. Why the need to get so emotionally invested in some random technology that isn't even owned by a single corporation?

But I also notice that the people who like to white knight for technologies are almost never the ones who actually pony up for the latest and greatest of either technology, because they don't actually need a fast computer for what they do most (post online all day) and don't have any skin in the game. They don't even have a fast x86 OR a fast ARM computer! They just like to pontificate. On the flip side, the people who tend to actually use their computers for work tend to have the good stuff, and often both (like me!) - because it's just a tool. Just buy whatever is best, and the market will sort itself out.

Or, I guess, they could just keep buying the low end of one technology and try to convince people who have the high end of both what's better? What an odd behavior.
 
I've had Lenovos and Apples for years, and yeah, while Lenovo support is not as good as Apple, their hardware, especially at the top end of mobile, is as good as Apple's, sometimes better, and a lot cheaper. Also there are plenty of people who can work on Lenovos after the warranty expires. I know, that's true for Apple, too, but it's easier to find, and also cheaper.

As far as INFOSEC is concerned, it's a wash. There's [H]ard|ness in understanding and remembering hardware and software company failures and shens, but seriously, how many of us are still so [H]ard we compile our own software? Even overclockers today be like, "AMD magic go better turn on now, pweese."

I'm all for Buy 'Murica! but when it comes to computers, there's no avoiding China, in one way or another. And China will solder extra chips into ethernet jacks if they think they can get away with it, no matter who designed it.

Lenovo has a lot of defense contracts, and has to at least pretend that they're not a shifty Chinesium-grade builder. I can understand wanting to stay away from GPD, AYN, Minis Forum, etc. I'm not being a fanboy by not lumping Lenovo right in there with them.

Just don't buy into the Apple protects its customers mentality. That's just really good marketing.
Lenovo does not have defense contracts. They’ve been put on a permanent shit list since 2015.
 
Lenovo does not have defense contracts. They’ve been put on a permanent shit list since 2015.

I am not sure about this but what I will say is that tons and tons of government employees and contractors use them. I am at TechConnect Innovation Summit down in D.C. right now, which is a trade show that is based around innovative technology and selling to the DoD, NSF, NASA, DoE, etc and a lot of those guys are using Thinkpads to run their presentations off of. I see a ton of them here.
 
I am not sure about this but what I will say is that tons and tons of government employees and contractors use them. I am at TechConnect Innovation Summit down in D.C. right now, which is a trade show that is based around innovative technology and selling to the DoD, NSF, NASA, DoE, etc and a lot of those guys are using Thinkpads to run their presentations off of. I see a ton of them here.
Oh I’m aware. Doesn’t change the fact that anyone using them and purchasing them is a complete numpty.
 
Oh I’m aware. Doesn’t change the fact that anyone using them and purchasing them is a complete numpty.
Their business lineup is fine hardware and generally bloat-free, it's their consumer side and support that has gone to shit.
But if you are fixing everything in house and reimaging on arrival then really neither of those are significant issues.
 
As much as I admire what Apple is doing, going their own way, doing their own thing, I simply do not like proprietary standards. Never have, never will.
I don't think this was so much Apple trying to strangle choice as just deciding that it wants off the x86 rollercoaster. The company has seen this script play out before: company X ties its business to third-party product Y, gets burned when Y either stumbles or loses interest in X. The best way to avoid the boom-and-bust cycle of Intel and AMD is to not participate in the first place.

I'm always reminded of the Flash saga. I still remember 2010, when seemingly every major tech company outside of Apple chained itself to Flash. BlackBerry based the PlayBook's OS around it; Palm saw it as a major selling point for webOS phones; Google even tried to paint Apple as a horrible monster because it... dared to suggest that you shouldn't design the web around a poorly designed plugin you don't control. And sure enough, those companies paid the price. BlackBerry and Palm got nowhere as Flash was lousy on mobile, while Google not only unwound Chrome's built-in Flash support but spent years weaning people off of it.

The lesson isn't that everyone should follow Apple's lead; of course not. But if you're going to rely on someone else's technology, you either have to embrace a true standard or accept that you'll deal with the ups and downs of someone else's product. Dell, HP, Lenovo and others can choose between AMD and Intel, but they still have to cross their fingers and hope that x86 won't treat them badly.
 
They'll just do Apple stuff, wait for AMD and Nvidia to dump tons of money into ARM, and then license their stuff.
This is something I predicted would happen. At some point Nvidia and Qualcomm will release killer ARM SoCs because, unlike Apple, they have a much larger market to sell these to, and therefore a much larger reason to invest R&D. The iPhone market is very large, but the M-based hardware isn't for those devices. Not to forget, x86 ain't going nowhere. I've tried to make Raspberry Pis work and I love the idea, but when I need something x86 to work it just isn't gonna happen. Especially because when I tried to make the Raspberry Pi work, the GPU drivers weren't finished, and installing Ubuntu 64-bit on my Pi 3 is not a good idea unless you want to wait 5 minutes to boot the OS. I just found cheaper NUCs for the same price as a Pi with far more capabilities. ARM just needs so much more software support for me to even consider using it beyond novelty stuff. Apple decided to make this their primary CPU architecture for their computers.

Or they develop such a massive patent portfolio around ARM that it strangles AMD and Nvidia out the gate
Apple could only do that if Apple owned ARM. AMD and Nvidia, though, can certainly punish Apple with their GPU licensing. Apple is licensing Imagination's PowerVR stuff, so I can see certain technologies that AMD and Nvidia have that Apple might want to license.

It is so funny to watch some people attach their whole identities to defending some random technology like x86. Some of the logical contortions are amazing to watch, like "The M1 is only fast because of the video encoder, it's 9999x slower than x86 on everything else besides video encoding!!!" "ARM is a horrible dysfunctional company with awful technology that nobody really wants so who cares if Apple uses it!!!!" Shit like that is just blatantly hilarious. I don't get it, I mean it's just some random technology and none of us had anything to do with developing it. Why the need to get so emotionally invested in some random technology that isn't even owned by a single corporation?
I'm sure you'll never get a chance to explain why any of what I've said is hilarious.
But I also notice that the people who like to white knight for technologies are almost never the ones who actually pony up for the latest and greatest of either technology, because they don't actually need a fast computer for what they do most (post online all day) and don't have any skin in the game. They don't even have a fast x86 OR a fast ARM computer! They just like to pontificate. On the flip side, the people who tend to actually use their computers for work tend to have the good stuff, and often both (like me!) - because it's just a tool. Just buy whatever is best, and the market will sort itself out.
You need to own something to say something about it? I really doubt anyone who owns a BMW knows anything about the car beyond gas goes in, car goes vroom. This applies to Apple M1 owners, as I've proven them wrong as a non-M1 owner. The only way this makes sense is if you take apart your products and put them under an X-ray to see how the internals work. Either that or you develop code for the hardware. If you're not doing either, then why does owning the hardware matter?
Or, I guess, they could just keep buying the low end of one technology and try to convince people who have the high end of both what's better? What an odd behavior.
Aren't you the guy that only appears when Apple is being called out?
 
This is something I predicted would happen. At some point Nvidia and Qualcomm will release killer ARM SoCs because, unlike Apple, they have a much larger market to sell these to, and therefore a much larger reason to invest R&D. The iPhone market is very large, but the M-based hardware isn't for those devices. Not to forget, x86 ain't going nowhere. I've tried to make Raspberry Pis work and I love the idea, but when I need something x86 to work it just isn't gonna happen. Especially because when I tried to make the Raspberry Pi work, the GPU drivers weren't finished, and installing Ubuntu 64-bit on my Pi 3 is not a good idea unless you want to wait 5 minutes to boot the OS. I just found cheaper NUCs for the same price as a Pi with far more capabilities. ARM just needs so much more software support for me to even consider using it beyond novelty stuff. Apple decided to make this their primary CPU architecture for their computers.


Apple could only do that if Apple owned ARM. AMD and Nvidia, though, can certainly punish Apple with their GPU licensing. Apple is licensing Imagination's PowerVR stuff, so I can see certain technologies that AMD and Nvidia have that Apple might want to license.


I'm sure you'll never get a chance to explain why any of what I've said is hilarious.

You need to own something to say something about it? I really doubt anyone who owns a BMW knows anything about the car beyond gas goes in, car goes vroom. This applies to Apple M1 owners, as I've proven them wrong as a non-M1 owner. The only way this makes sense is if you take apart your products and put them under an X-ray to see how the internals work. Either that or you develop code for the hardware. If you're not doing either, then why does owning the hardware matter?

Aren't you the guy that only appears when Apple is being called out?
Apple's R&D budget was $20 bil a few years back. Nvidia's is only $5.25 bil as of this year.

I can assure you that Apple is far ahead of anyone else when it comes to their ARM SoC. No one will be catching up at least in the next decade.
 
I don't think this was so much Apple trying to strangle choice as just deciding that it wants off the x86 rollercoaster. The company has seen this script play out before: company X ties its business to third-party product Y, gets burned when Y either stumbles or loses interest in X. The best way to avoid the boom-and-bust cycle of Intel and AMD is to not participate in the first place.
Apple wants to not depend on anyone for their products. Remember the fiasco where Apple didn't want to pay Qualcomm license fees for their radio hardware? Of all companies, they had Intel make it. That's not because it's better, because the Intel radio was inferior to the Qualcomm one, but because it's cheaper for Apple. I'm sure Apple's M1 is far cheaper for Apple in the long run compared to buying Intel x86 CPUs.

A lot of us forget the history of how CPU architectures work themselves out of the market: once you're dominant, why even bother to invest in R&D? This is how MIPS and PowerPC died; both companies thought they didn't need to invest in R&D and that nobody was going to leave them for fear of having to rewrite all their software. ARM is already borderline in this situation, as they're broke and were nearly bought by Nvidia. They clearly need to raise existing license fees, but they can't because contracts keep the prices low.

x86 is unique in that AMD has none of Intel's designs unless they want to pull out the old 286 handbook they still have from the 1980s. So AMD has to fight back against Intel, and the same goes for Intel. The only thing was that back in the 2000s Intel blocked AMD from making money off their superior Athlon FX chips and basically starved them of R&D funding. Then AMD released the Bulldozer architecture, and that was a flop they had to deal with for several years, especially after Sandy Bridge was released, where the difference in performance was dramatic to the point that AMD just couldn't climb out of the hole that Intel dug for them. Intel has basically been using Sandy Bridge hardware since its release, because why redesign new hardware when you can go tick-tock with your CPU design? With Intel it's a lot of ticks, but whatever.

ARM is not a stable market by any means, and I'm willing to bet that from this point forward any innovation for ARM will have to come from Apple or Qualcomm themselves. Think AltiVec, but Apple-exclusive. Even now with the M1 products you can't just take existing AArch64 code and be done. Coders need to handle Write Xor Execute (W^X) for existing ARM code to work on the M1. This wasn't as strict on iOS devices, but it means anyone working on M1 hardware has to follow these rules to get existing ARM code working on it. That means Apple has already divided ARM code with the M1, something I see Apple doing far more in their future ARM-based products, which could turn ARM from a universal standard into an Apple ARM standard. ARM technically doesn't have to compete with anyone, as they just sell licenses. Nearly every mobile device uses ARM, so there's very little incentive for ARM to improve. That's why Nvidia probably wanted to buy ARM: so they could fund improvements instead of going it alone and hoping people pay Nvidia a license fee for their tech. If Nvidia owned ARM then you're not buying Nvidia's tech but ARM's tech, even though it's really Nvidia's. I could easily see Nvidia making it attractive for Apple to buy designs that include Nvidia GPU tech.
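On the Write Xor Execute point: under W^X, a page holding JIT-emitted code may be writable or executable, but never both at once, so an emitter has to flip page protections between the two states. Here is a rough, portable sketch of that discipline using plain POSIX mmap/mprotect via ctypes; note that macOS on Apple Silicon actually uses MAP_JIT mappings and pthread_jit_write_protect_np, which is stricter than this generic version:

```python
import ctypes
import mmap

libc = ctypes.CDLL(None, use_errno=True)
PAGE = mmap.PAGESIZE

# "W" state: one anonymous page mapped read+write, the state a JIT
# is in while emitting machine code into the buffer.
buf = mmap.mmap(-1, PAGE, prot=mmap.PROT_READ | mmap.PROT_WRITE)
buf.write(b"\xd6" * 16)  # stand-in bytes for emitted instructions

# "X" state: flip the same page to read+execute before running it.
# Under W^X the page is never writable and executable at the same time.
addr = ctypes.addressof(ctypes.c_char.from_buffer(buf))
rc = libc.mprotect(ctypes.c_void_p(addr), ctypes.c_size_t(PAGE),
                   mmap.PROT_READ | mmap.PROT_EXEC)

print("mprotect to R+X returned", rc)
```

To modify the code again, the emitter would mprotect the page back to read+write first; forgetting that round trip is exactly the kind of porting work the post is describing.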
 
Apple could only do that if Apple owned ARM. AMD and Nvidia, though, can certainly punish Apple with their GPU licensing. Apple is licensing Imagination's PowerVR stuff, so I can see certain technologies that AMD and Nvidia have that Apple might want to license.
Apple can't patent anything from the stock ARM ISA, but they can and do patent their customizations to it; they also successfully file a large number of patents on the SoCs themselves.
 
I could easily see Nvidia making it attractive for Apple to buy designs that include Nvidia GPU tech.
I firmly believe this was their goal with ARM: after the purchase they could have made their proprietary GPU tech the global standard over the course of 4 years. Yeah, AMD may have a lock on the consoles for the foreseeable future, but imagine if Nvidia were the de facto choice on every ARM-based Chromebook, Android device, etc.
Over the course of 4 years or so, Nvidia simply takes the overwhelmingly dominant position through hardware attrition. Mobile gaming, regardless of your personal take on it, is a force to be reckoned with, and the games are getting better and more graphically demanding. And yeah, ports happen in all directions, mobile to desktop to console and all possible combinations in between, and by pure market dominance it would force Sony and Microsoft to put them in the running for console refreshes, because it would be cheaper for developers to port between the different markets.
Nvidia would have been very good to ARM, but it would have been very bad for consumer choice over the next 5+ years.
But ARM is now in a very tricky place for sure.
 
It is so funny to watch some people attach their whole identities to defending some random technology like x86. Some of the logical contortions are amazing to watch, like "The M1 is only fast because of the video encoder, it's 9999x slower than x86 on everything else besides video encoding!!!" "ARM is a horrible dysfunctional company with awful technology that nobody really wants so who cares if Apple uses it!!!!" Shit like that is just blatantly hilarious. I don't get it, I mean it's just some random technology and none of us had anything to do with developing it. Why the need to get so emotionally invested in some random technology that isn't even owned by a single corporation?

But I also notice that the people who like to white knight for technologies are almost never the ones who actually pony up for the latest and greatest of either technology, because they don't actually need a fast computer for what they do most (post online all day) and don't have any skin in the game. They don't even have a fast x86 OR a fast ARM computer! They just like to pontificate. On the flip side, the people who tend to actually use their computers for work tend to have the good stuff, and often both (like me!) - because it's just a tool. Just buy whatever is best, and the market will sort itself out.

Or, I guess, they could just keep buying the low end of one technology and try to convince people who have the high end of both what's better? What an odd behavior.


I'm not tied to any particular technology. I just don't believe Apple's propaganda.

They have a LONG history of convincing their cult that their products are "better" when all they really are is shinier and priced higher.

It's going to take some real hard proof, not magic marketing numbers, to convince me otherwise. Similar to any claim that comes from the North Korean, Chinese, or Russian governments, any claim that comes from Apple is less than worthless.

I treat Apple the same as I treat any other person or company in the world. If you are going to tell me anything at all, bring incontrovertible proof, or I won't believe you. Once there are real cross-platform benchmarks that run apples-to-apples comparisons of M1/M2 chips against existing high-end x86 chips, without cherry-picking workloads or low-wattage laptop chips to compare them to, then I'll take them seriously, but until then anything and everything Apple releases is just shiny cult-like fluff to me.

I absolutely and automatically treat everything and everyone in the world that challenges my existing knowledge as untrue, until such time as firm evidence is provided to back up the claims. It doesn't matter who you are: Apple, Intel, Nvidia, AMD... I always assume they are lying to me unless I see unbiased hard proof.

...and Apple has never dabbled in unbiased hard proof as long as the company has existed. Their track record is to obfuscate, throw in marketing numbers and proclaim it "magical", and dumb-asses keep falling for it time and time again. Even Intel and Nvidia at their worst (and they are some really shitty companies in this regard) have done better on the evidence front than Apple.
 
Eh, I always find it odd that people will blast Apple when all it's really doing is marketing its products from a certain angle. It's just that Apple makes the whole package, not only the hardware, and can market more unique features (however well they work) than Android and Windows vendors. Plenty of other brands use hyperbolic language — Lenovo claims its Legion laptops have "ultimate performance" and "uncompromised battery life" when you know damn well they make sacrifices in both areas.

Besides, focusing on the "cult" label is odd when Apple sells hundreds of millions of devices every year. Now, while there aren't detailed studies of brand devotion among those buyers, it's highly doubtful that most of them are cultish. They may be fans, they may buy the next iPhone because it's 'comfortable' to them, but they don't hang Steve Jobs posters on the wall or throw a fit if forced to use Android or Windows. They're everyday consumers, the same sort of people who would otherwise buy a Galaxy S22 or XPS 13. We're long, long past the point where Apple is catering to a small but rabid fan base, and its marketing is definitely not the only thing keeping it alive.
 
Years back I was on the Apple hate train, mostly because of the shift after Jobs' return in the mid/late '90s, when they slowly abandoned power users and stopped dominating on the hardware side. However, in the past number of years I've grown to like them again, mostly because of the quality and length of the support period on their software side. I can buy an iPhone and expect it to last a good 4-6 years easy. Same goes for any of their products. I've also been slowly shifting over to doing most of my work on OSX. I get a nice-looking interface, decent integration across devices, and, importantly, a CLI that works closer to Linux and isn't trash like PowerShell on Windows. The older I get, the more I appreciate a well-put-together hardware and software setup for the more basic tasks out there. I still have a monster PC for gaming, but honestly, that's all I use Windows and my PC for anymore. Everything else has shifted to the Mac mini on my desk.
 
Once there are real cross-platform benchmarks that run apples-to-apples comparisons of M1/M2 chips against existing high-end x86 chips, without cherry-picking workloads or low-wattage laptop chips to compare them to, then I'll take them seriously, but until then anything and everything Apple releases is just shiny cult-like fluff to me.
So what do you think of the SPECint 2017 numbers for the Apple M1 vs. the Ryzen 5950X and Intel 11th gen?

https://www.anandtech.com/show/16252/mac-mini-apple-m1-tested/4
 
So what do you think of the SPECint 2017 numbers for the Apple M1 vs. the Ryzen 5950X and Intel 11th gen?
Not sure what we should think about them?



28.85 in SPECint 2017 on the 4+4 core M1 is around 91% of a 6 full-core Ryzen 5 3600, or 114% of a 15 W 4800U. Very impressive power-envelope-wise, I think, but not that special?

Float is significantly more impressive than int, getting close to a 3700X.
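The relative figures above can be sanity-checked with a quick ratio calculation. Only the M1 score of 28.85 is quoted directly; the Ryzen 3600 and 4800U scores below are back-derived from the "91%" and "114%" comparisons, so treat them as rough approximations, not official SPEC results:

```python
# Back-of-envelope check of the relative SPECint 2017 figures quoted above.
# The M1 score comes from the AnandTech review; the x86 scores are approximate
# values implied by the "91%" and "114%" ratios, not measured results.
m1_specint = 28.85

approx_scores = {
    "Ryzen 5 3600 (6C/12T)": m1_specint / 0.91,  # M1 is ~91% of this chip
    "Ryzen 7 4800U (15 W)": m1_specint / 1.14,   # M1 is ~114% of this chip
}

for chip, score in approx_scores.items():
    ratio = m1_specint / score
    print(f"M1 vs {chip}: approx. score {score:.1f}, M1 at {ratio:.0%}")
```

This implies a Ryzen 3600 score somewhere around 31-32 and a 4800U score around 25, which is the basis for the "impressive per watt, but not otherworldly" reading.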
 
28.85 in SPECint 2017 on the 4+4 core M1 is around 91% of a 6 full-core Ryzen 5 3600, or 114% of a 15 W 4800U. Very impressive power-envelope-wise, I think, but not that special?

Float is significantly more impressive than int, getting close to a 3700X.

Yeah. Does anyone have a chart of energy usage for each test? I thought I saw one but can't find it. It would be interesting to compare how much energy the M1 uses on each test vs. the competition, alongside the scores.

Edit: also interesting that single-thread performance often competes with or exceeds the AMD 5950X in the 2006 suite. In the 2017 suite it seems to more often match the mobile chips, and exceeds the AMD 5950X mainly in GCC compilation.
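For the energy comparison being asked about, the usual approach is energy = average package power × runtime, then score per joule. A minimal sketch of that arithmetic, where every wattage, runtime, and x86 score below is a made-up placeholder (only the M1 SPEC score is from the thread), not a measured value:

```python
# Hypothetical perf-per-energy comparison: energy (J) = avg power (W) * runtime (s).
# All power/runtime figures and the x86 score are illustrative placeholders,
# NOT measurements; only the 28.85 M1 SPECint score appears in the review.
runs = {
    "M1 (placeholder 15 W)":    {"power_w": 15.0, "runtime_s": 1200, "score": 28.85},
    "x86 (placeholder 65 W)":   {"power_w": 65.0, "runtime_s": 1000, "score": 31.7},
}

for name, r in runs.items():
    energy_j = r["power_w"] * r["runtime_s"]          # joules consumed for the run
    per_kj = r["score"] / energy_j * 1000              # score delivered per kilojoule
    print(f"{name}: {energy_j / 1000:.1f} kJ, {per_kj:.2f} score/kJ")
```

With real per-test power traces, the same ratio would show how much of the M1's appeal is efficiency rather than raw throughput.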
 