Two years after launch Windows 11 adoption is still waaaay behind Windows 10

polonyc2 (Fully [H]) · Joined Oct 25, 2004 · Messages: 25,859
Windows 10 still utterly dominates the desktop PC operating system market, despite Windows 11 having been around for 2 years now...and there's not much sign of this changing any time soon

the latest figures suggest people, as individuals and as a business, really aren't into Windows 11 at all...with the numbers now updated for September 2023, we can see the global uptake of Microsoft's latest operating system has been stagnant since April 2023, and there doesn't seem to be any indication that this will change

Not only that, but the market share paints a dismal picture: 71% for Win10, just 24% for Win11, and 3% for Win7 (with the rest being XP, 8.1, and so on)...the figure for Windows 10 has barely changed in twelve months and it's only the decline of Windows 7 that has benefited the newest version...

https://www.pcgamer.com/two-years-a...1-adoption-is-still-waaaay-behind-windows-10/
 
I am moving as many of my employees as possible to MacOS. I am tired of Microsoft's BS.

MacOS is a shit show as well, just in a different way. Everything in MacOS is fine, if all you want to do is do things "the Apple Way" and not stray from the happy path at all ever.

Though the biggest problem with MacOS is not the software, it is the hardware.

I really wish there were a big push to go Desktop Linux.

Linux on the desktop has been ready for 10 years. Only reason it doesn't work is because it requires critical mass. You need enough other people out there running LibreOffice so your documents are fully compatible. I feel like that will sadly never happen.
 
Windows 10 still utterly dominates the desktop PC operating system market, despite Windows 11 having been around for 2 years now...and there's not much sign of this changing any time soon

the latest figures suggest people, as individuals and as a business, really aren't into Windows 11 at all...with the numbers now updated for September 2023, we can see the global uptake of Microsoft's latest operating system has been stagnant since April 2023, and there doesn't seem to be any indication that this will change

Not only that, but the market share paints a dismal picture: 71% for Win10, just 24% for Win11, and 3% for Win7 (with the rest being XP, 8.1, and so on)...the figure for Windows 10 has barely changed in twelve months and it's only the decline of Windows 7 that has benefited the newest version...

https://www.pcgamer.com/two-years-a...1-adoption-is-still-waaaay-behind-windows-10/

I can only speak to myself, not the market as a whole.

At work we are still mostly Win10. As machines get retired and new machines depending on Win11 drivers/scheduler are introduced Windows 11 is bit by bit making its way into the office, but it is slow going. We are maybe 10-15% windows 11 at this point.

For my home purposes, my take is that 99% of what I dislike about Windows 11 I already disliked about Windows 10. You know, the whole ecosystem of bloat, and cloud integration rather than treating my machine as an independent local machine. I didn't like Cortana, and I bet I won't like the new AI shit they are now trying to force on us either.

But I guess my take is, there is really no reason to hold off on "upgrading" to Windows 11, because what I hate about Windows 11 was already bad in Windows 10. I'm not avoiding anything I dislike about Windows 11 by not upgrading.

But on the flip side, there simply is no reason to "upgrade" to Windows 11. Windows 11 doesn't do anything I want to do that Windows 10 doesn't already do. There is no big new feature that makes me want to "upgrade". They have given me absolutely no reason to spend the time and make the switch. As far as I am concerned, Windows 11 is just a facelifted Windows 10.

I wish I didn't have to run Windows 10. I would have preferred just a facelifted Windows 7 without all of the bloat, ecosystem crap, AI nonsense and cloud integration that came with Windows 10 and newer, but Win 7 went EOL and it is a very bad idea to keep using an EOL OS. The same will happen with Windows 10 in 2025. When it does, I will need to decide if I "upgrade" to Windows 11, or if I just ditch Windows altogether.
 
Since I do technical support for my family, I happily tell them to install Windows 11, then I put my hands up when they have problems. "Sorry, I don't know how to fix that in Windows 11. Have you tried googling the issue?"

I'll probably switch to Windows 11 once I do an entire platform upgrade (e.g. move to AM5) or if my boot drive implodes (no, I don't care to maintain backups on my personal devices).
 
But I guess my take is, there is really no reason to hold off on "upgrading" to Windows 11, because what I hate about Windows 11 was already bad in Windows 10. I'm not avoiding anything I dislike about Windows 11 by not upgrading.

But on the flip side, there simply is no reason to "upgrade" to Windows 11. Windows 11 doesn't do anything I want to do that Windows 10 doesn't already do. There is no big new feature that makes me want to "upgrade". They have given me absolutely no reason to spend the time and make the switch. As far as I am concerned, Windows 11 is just a facelifted Windows 10.

there is 1 major reason to hold off...most likely Windows 12 will be released in the next 2 years...so people can continue to use W10 and then jump directly into W12
 
MacOS is a shit show as well, just in a different way. Everything in MacOS is fine, if all you want to do is do things "the Apple Way" and not stray from the happy path at all ever.
As opposed to the Windows way or the Linux way? macOS has a full blown terminal. You can do whatever you want.
Though the biggest problem with MacOS is not the software, it is the hardware.
In what way does any piece of Apple hardware prevent an office worker from getting their work done? It is far less limited than buying a Chromebook.

I expect you'll say something about upgradeability/repairability (in fact I know you will, not the first time around). But frankly in corporate that doesn't matter at all. You get Apple care for 3 years. You let Apple be the one to fix all hardware faults. Then you get another machine at the end of the service cycle. Same as literally any other HP/Dell or whoever service contract in an office environment. Massive corporate and education doesn't repair or upgrade. Not when you're deploying hardware year round.

The vast majority of all service calls will always be software and not hardware related. So if you can "fix" the majority of possible software issues simply by having general users switch to macOS, then it's a huge win for hours spent by helpdesk professionals answering borderline stupid requests.

Frankly a non-repairable, non-upgradeable machine that is reliable (which Apple systems generally are) will require significantly less time to manage over its 3 year life cycle than any competing Windows machine. Which is the point. And there is far less in there for the user to mess up or for software otherwise to go wrong.
I really wish there were a big push to go Desktop Linux.

Linux on the desktop has been ready for 10 years. Only reason it doesn't work is because it requires critical mass. You need enough other people out there running LibreOffice so your documents are fully compatible. I feel like that will sadly never happen.
Yeah, general users will never go there. So if you love living in fantasy land, then you can feel free. But anyone who thinks "Linux is ready" is living in a bubble. It is taking Valve custom hardware and tens of thousands of development hours to basically build a "Linux device" with high enough guardrails and software optimization that a "normie" can use it. And it's only possible because there is precisely 1 hardware combination. All of the upgrades on that device are just storage.

In essence for Linux to go mainstream, those software things you're complaining about in macOS will need to come to Linux. Someone will have to curate hardware combinations and software optimizations. Someone will have to make those guardrails. A team will have to push updates that don't have the possibility of breaking everything. If you hate being compared to Apple then it will need to be a ChromeOSification or Androidification of Linux. And frankly up to this point no one has been willing to put in the development resources to do all of this stuff. Linux is a massive fractured mess. ChromeOS is likely the closest thing able to fulfill the role of a general Linux desktop for general users. I personally loathe that option over Windows and macOS. I mean at that point may as well just install spyware onto your Windows desktop for similar functionality (which ironically Windows also has).
 
As opposed to the Windows way or the Linux way? macOS has a full blown terminal. You can do whatever you want.

In what way does any piece of Apple hardware prevent an office worker from getting their work done? It is far less limited than buying a Chromebook.

I expect you'll say something about upgradeability/repairability (in fact I know you will, not the first time around). But frankly in corporate that doesn't matter at all. You get Apple care for 3 years. You let Apple be the one to fix all hardware faults. Then you get another machine at the end of the service cycle. Same as literally any other HP/Dell or whoever service contract in an office environment. Massive corporate and education doesn't repair or upgrade. Not when you're deploying hardware year round.

Frankly a non-repairable, non-upgradeable machine that is reliable (which Apple systems generally are) will require significantly less time to manage over its 3 year life cycle than any competing Windows machine. Which is the point. And there is far less in there for the user to mess up or for software otherwise to go wrong.

Yeah, general users will never go there. So if you love living in fantasy land, then you can feel free. But anyone who thinks "Linux is ready" is living in a bubble. It is taking Valve custom hardware and tens of thousands of development hours to basically build a "Linux device" with high enough guardrails and software optimization that a "normie" can use it. And it's only possible because there is precisely 1 hardware combination. All of the upgrades on that device are just storage.

In essence for Linux to go mainstream, those software things you're complaining about in macOS will need to come to Linux. Someone will have to curate hardware combinations and software optimizations. Someone will have to make those guardrails. A team will have to push updates that don't have the possibility of breaking everything. If you hate being compared to Apple then it will need to be a ChromeOSification or Androidification of Linux. And frankly up to this point no one has been willing to put in the development resources to do all of this stuff. Linux is a massive fractured mess. ChromeOS is likely the closest thing able to fulfill the role of a general Linux desktop for general users. I personally loathe that option over Windows and macOS. I mean at that point may as well just install spyware onto your Windows desktop for similar functionality.
Apple also does corporate leasing now, which gives Apple Care over the duration of the device lease, makes procurement really easy, and normalizes a good amount of IT expenditures.
Pair that with Apple business manager and your choice of Jamf or Intune for MDM management and it's easy, roll that in there with iPhones.
Then you can easily get a certificate generated in Business Manager and Entra (formerly Azure AD), for cross-enrollment, which then lets you manage Apple Accounts that are using your domain, for software licensing and distribution at a user level and not a device one.

Linux at a desktop level is great IF you have an IT department that can support it, but most businesses don't exactly have that sort of manpower on hand, and Linux is not exactly as fire-and-forget on the desktop as it is for servers. If you are using contracted IT or on-call IT services, getting them out to fix a minor issue easily comes in at multiple hundred dollars, and any cost savings the Linux environment gives very quickly vanish. Not to mention it adds a great deal of complexity for new hardware setups, because you are very likely dealing with translation layers and virtual machines for many things, which potentially adds complexity to software licenses and support contracts.

Linux for servers is awesome.
Linux for desktops needs a lot of background tech to make it seamless with business tools that aren't exactly cheap and when it goes wrong your options are very slim on who can fix it.

But really, I'll be honest: Linux for the desktop has never been better, and it's ChromeOS. ChromeOS is what Linux on the desktop was always going to end up looking like. 15 years ago Linux Desktop proper had its best shot at going mainstream with Ubuntu, but the community decided that having one major distro take over was a bad thing; they sabotaged it from the inside, forked it to high hell, and split resources, killing any chance it had. And that's the end of that.
 
Maybe I did not read the article correctly, but is adoption particularly low compared to where 2000, XP, Vista, 8, or 10 stood two years in?

In the past the OS often changed only when the PC changed, and as time goes on PC lifetimes keep going up: a 10-year-old 4th-gen Core i5 with 16 gigs of RAM and an SSD holds up far better today than a 1993 computer did in 2003. The same goes for computers bought in the Win10 era, 6700K and up....

And with Win10 and Win11 being almost the same thing, I feel Win10 has gotten most of the upgrades as of now (in September 2023, 22H2 was still on Win10). Figures closer to the end of 2025 will be more telling (the "bought a new computer in the Windows 11 era but still went for 10 instead" figure).
 
Apple also does corporate leasing now, which gives Apple Care over the duration of the device lease, makes procurement really easy, and normalizes a good amount of IT expenditures.
Pair that with Apple business manager and your choice of Jamf or Intune for MDM management and it's easy, roll that in there with iPhones.
Then you can easily get a certificate generated in Business Manager and Entra (formerly Azure AD), for cross-enrollment, which then lets you manage Apple Accounts that are using your domain, for software licensing and distribution at a user level and not a device one.

Linux at a desktop level is great IF you have an IT department that can support it, but most businesses don't exactly have that sort of manpower on hand, and Linux is not exactly as fire-and-forget on the desktop as it is for servers. If you are using contracted IT or on-call IT services, getting them out to fix a minor issue easily comes in at multiple hundred dollars, and any cost savings the Linux environment gives very quickly vanish. Not to mention it adds a great deal of complexity for new hardware setups, because you are very likely dealing with translation layers and virtual machines for many things, which potentially adds complexity to software licenses and support contracts.

Linux for servers is awesome.
Linux for desktops needs a lot of background tech to make it seamless with business tools that aren't exactly cheap and when it goes wrong your options are very slim on who can fix it.

I don't understand the perceived difficulty as to why it would take more manpower to maintain?

The first time, as you figure out some of the tools needed to - for instance - manage AD interactivity it might be a little tricky, but once you have a working image configured you just write that to every machine and you are golden. You don't even need to build it for every configuration. All the drivers you need are in the kernel, and there is no license or key to keep track of, so the same image can literally be used for every machine regardless of hardware (provided it is compatible, which these days is a much smaller issue than in the past.)
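That imaging workflow is easy to sketch. The block below uses made-up file names and a scratch file standing in for the target disk so it is safe to run; in practice the `of=` of the second step would point at the machine's block device (e.g. `/dev/sdX`):

```shell
# Build a stand-in "golden image" (in real life this is your configured install).
dd if=/dev/urandom of=golden.img bs=1M count=4 status=none

# Per-machine deployment: write the same image to the target disk.
# target.img substitutes for /dev/sdX here so the sketch is harmless.
dd if=golden.img of=target.img bs=1M conv=fsync status=none

# Verify the copy is bit-identical before shipping the machine to a desk.
cmp golden.img target.img && echo "image deployed OK"
```

Because there is no per-machine license or key, the only per-machine work left after this is the hostname and domain join.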

The biggest weakness when it comes to business and desktop Linux - IMHO - is the lack of a good Exchange/Outlook equivalent.

Thunderbird is great and all, but it can't really replace Outlook for typical use.

That, and the aforementioned critical mass of Libre (or other open) office document formats, because they are never going to be perfectly compatible with MS Office, and people tend to have poor tolerance for their documents rendering differently on different machines.
 
Windows 10 still utterly dominates the desktop PC operating system market, despite Windows 11 having been around for 2 years now...and there's not much sign of this changing any time soon

the latest figures suggest people, as individuals and as a business, really aren't into Windows 11 at all...with the numbers now updated for September 2023, we can see the global uptake of Microsoft's latest operating system has been stagnant since April 2023, and there doesn't seem to be any indication that this will change

Many businesses use LTSC releases on their company machines. To date, there is still no LTSC release of Windows 11 (no, the Enterprise IoT "LTSC" version is not the same as a regular LTSC release). Supposedly this is coming in the somewhat near future and will be based on 23H2. That will definitely change things.

Also, you can't ignore the fact that this is really the first Windows version in a very long time with strict official requirements. Most people aren't inclined or knowledgeable enough to use a requirements bypass. There are probably many out there who would upgrade to Windows 11 if they could.
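For what it's worth, the bypass itself isn't exotic. Microsoft has documented a registry value that lets the upgrade proceed on an officially unsupported CPU (TPM 1.2 and other checks still apply); this is a sketch of that commonly cited method, run from an elevated command prompt before launching Windows 11 setup:

```shell
rem Allow the Windows 11 upgrade on a CPU that fails the official check.
rem (Documented by Microsoft; does NOT bypass the TPM 1.2 requirement.)
reg add "HKLM\SYSTEM\Setup\MoSetup" /v AllowUpgradesWithUnsupportedTPMOrCPU /t REG_DWORD /d 1 /f
```

But the point stands: the average user is never going to do this, so those machines stay on Windows 10.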

it's only the decline of Windows 7 that has benefited the newest version...

It's not as if people are upgrading their Windows 7 computers to Windows 11. Maybe they are buying new computers, but I doubt that they are necessarily doing that because of the OS.

Most people just use whatever OS comes with their computer and don't care.
 
If they finally add never combine, show titles, and quick launch for the taskbar, then I might not be 100% against Win11. Besides 4 out of 5 computers I own are not even compatible with it. So unless Microsoft wants to pay me to upgrade computers perfectly suitable for what I'm using them for, it's not gonna happen.
 
MacOS is a shit show as well, just in a different way. Everything in MacOS is fine, if all you want to do is do things "the Apple Way" and not stray from the happy path at all ever.
No, for 95% of business users most of their work is done in a browser anyway. My engineers are literally the only employees that use actual applications and require linux and windows. Logistics, sales, support, etc all live in cloud-based services. 3PL, EDI, Google Docs, gmail, Notion, Dropbox, Asana, etc.

For this use case, MacOS is measurably and objectively superior. More secure, fewer viruses, fewer ways for dumb employees to infect themselves.


Though the biggest problem with MacOS is not the software, it is the hardware.
Wrong. All of the problems in our office come from Dell XPS machines. Our engineers have top-specced 15in XPS and they are garbage. Thunderbolt never works right, chargers die, ports die, mainboards die, fan is constantly on full blast, etc.

Mac hardware is reliable, silent, gets better battery life, and is more expandable with more I/O. If there ever was a problem (which has happened exactly once so far) I can go down the street and have an Applecare replacement in 30min, tell the employee to run Time Machine, and be back up and running without even needing IT.
 
The new workstations I ordered for the office came pre-downgraded to Windows 10 Pro. It is one thing a few of the OEMs get right when it comes to business PCs. They were relatively bloat free as well.

Between legacy software that barely functions in Windows 10 (it does not function in Windows 11 at all) and a user base that struggled with the transition from Windows 7 to Windows 10, I'm not looking forward to the end of support for Windows 10. At least our hardware can run Windows 11 now without resorting to workarounds when the time comes.
 
I don't understand the perceived difficulty as to why it would take more manpower to maintain?

The first time, as you figure out some of the tools needed to - for instance - manage AD interactivity it might be a little tricky, but once you have a working image configured you just write that to every machine and you are golden. You don't even need to build it for every configuration. All the drivers you need are in the kernel, and there is no license or key to keep track of, so the same image can literally be used for every machine regardless of hardware (provided it is compatible, which these days is a much smaller issue than in the past.)

The biggest weakness when it comes to business and desktop Linux - IMHO - is the lack of a good Exchange/Outlook equivalent.

Thunderbird is great and all, but it can't really replace Outlook for typical use.

That, and the aforementioned critical mass of Libre (or other open) office document formats, because they are never going to be perfectly compatible with MS Office, and people tend to have poor tolerance for their documents rendering differently on different machines.
But who is figuring out that management setup? Who is maintaining it, creating those images, and updating them? Who is dealing with local user permissions and on call every time a user needs to do something elevated, or does each user have root access? Who is there to figure out why Foxit Reader is displaying weird-ass characters in that PDF document?
Unless you are running 50 or more employees then having an IT department in-house doesn't really pan out, so you are contracting out. Techs out there who are good with Linux are few, and expensive. You can find just about anybody with a Google search of your area for a dozen techs who can work on a Windows system or a Mac-based system where there is a well-documented and standardized back end, that is pretty universally compatible, easy to troubleshoot and has abundant diagnostic options.
And then for Linux as you said there is a lack of many good business suite options out there, Exchange/Outlook is just the tip of it, PowerBI is massive, Dynamics, just about any Accounting software suite, etc so now you have to deal with Wine or a VM or an RDP server. That's more backend stuff that adds complexity and more places for things to go wrong which means more calls which adds downtime, frustration, and costs.

So unless you reach a critical tipping point where you have an in-house IT department that you can trust, the cost/benefit ratio doesn't exist.

One of the most stressful jobs I had was working at a consulting firm that was called in to clean up "IT Terrorists" which is what we affectionately called the Managers who bake job security into their systems with unnecessary complexity and non-standard things and they were always "Linux guys". Ooof glad I don't do that anymore.
 
No, for 95% of business users most of their work is done in a browser anyway. My engineers are literally the only employees that use actual applications and require linux and windows. Logistics, sales, support, etc all live in cloud-based services. 3PL, EDI, Google Docs, gmail, Notion, Dropbox, Asana, etc.

For this use case, MacOS is measurably and objectively superior. More secure, fewer viruses, fewer ways for dumb employees to infect themselves.



Wrong. All of the problems in our office come from Dell XPS machines. Our engineers have top-specced 15in XPS and they are garbage. Thunderbolt never works right, chargers die, ports die, mainboards die, fan is constantly on full blast, etc.

Mac hardware is reliable, silent, gets better battery life, and is more expandable with more I/O. If there ever was a problem (which has happened exactly once so far) I can go down the street and have an Applecare replacement in 30min, tell the employee to run Time Machine, and be back up and running without even needing IT.
I prefer the G series over the XPS ones, yeah you have to really shmooze up to your rep to get them to release them into the enterprise channel but they work far better than that XPS trash.

But yeah for most mobile needs you are looking at Macbooks or the newer ultrabooks and they are pretty much cost-comparable so it comes down to how much of your use case fits on a Mac.
 
Because Win 11 is Sh!t. Win 10 was also Sh!t. You know what wasn't Sh!t? Windows 7. Windows 7 did what you needed without all the telemetry and ads and cortana and microsoft store BS.

Windows 7 was what you called a real "operating system". It was the software interface that facilitated the end user to control their own hardware and software resources as they wish.
 
Windows 7 was what you called a real "operating system". It was the software interface that facilitated the end user to control their own hardware and software resources as they wish.
Yeah, but users don't want that any more, desktop and laptop sales go down every year for a reason while iOS, Chrome, and Android increase every year.
Most users don't want to fiddle with things, they are "afraid to break it", they want to turn it on, use it, get that hot new app, take, edit, and upload their photos, do their work, and then play their games.
The majority of users out there want to fiddle with as little as possible, and most business/enterprise environments spend extra time and resources to lock those things down from the people on the machines because we don't want them to have access to anything that could mess anything up.

At this stage, we are the car guys who want to get under the hood and turn the 3-angle valve grind into a 5-angle one, but everybody else doesn't give a rats ass what a valve even is as long as it starts and gets them from A to B without needing to be in the shop every other week.
 
I feel the Windows 6 era (Vista/7) is getting a bit of revisionist history; the talk about it versus Windows 5 (2000/XP) was quite similar to what the 10.x era now gets versus 6.1.
 
I feel the Windows 6 era (Vista/7) is getting a bit of revisionist history; the talk about it versus Windows 5 (2000/XP) was quite similar to what the 10.x era now gets versus 6.1.
I just held on until stuff got near EOL. There's no doubt 2k/XP were way more lightweight than 7. The problem I ran into was the 32bit 4gb memory limits. GPUs started eating into your available memory so eventually I had to switch. By then 7 was patched up quite a bit and in good shape.

I am ready to get to 11 but was waiting for more stuff to make use of DirectStorage. The slow adoption of that tech has been pretty disappointing.
 
Windows 7 was what you called a real "operating system". It was the software interface that facilitated the end user to control their own hardware and software resources as they wish.

Yep, that's what people don't get.

Prior versions of Windows were operating systems.

Windows 10 and 11 are operating systems - yes - but they also include forced ecosystems.

To be fair to Microsoft they are just a follower in this regard, with Apple and Google having led the way for 15 years, but as the saying used to go (in 2010 or so):

"I want my phone to be more like my computer, NOT my computer to be more like my phone."

That battle seems like it has been completely lost at this point.

I don't want an ecosystem. I don't want bloat that comes with the OS. I don't want any cloud integration. I don't want anything on my computer to link to a server on the WAN unless I explicitly configure it and tell it to. I want all logins on my local PC to be local only Logins, and nothing to ever be linked to an account anywhere else than on my local machine.

I want my computer to not reach out to anything outside the local machine on either the LAN or the WAN unless I tell it to do so myself. No automated connections to anything ever. No network scans to "find" other devices, etc. etc.

If I want to communicate with a printer at 10.0.1.17, I will set that printer up with that IP, and then I will manually type that IP in on my computer in order to connect with the printer and print. I don't want anything going looking for anything on the network.
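On the Linux/macOS side that kind of manual, no-discovery setup is a one-liner with CUPS; a sketch, with the queue name invented and the 10.0.1.17 address taken from the example above:

```shell
# Add a print queue pointed directly at a fixed IP -- no discovery,
# no network scanning, just the address you configured yourself.
lpadmin -p office-printer -E -v ipp://10.0.1.17/ipp/print -m everywhere
```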

I want my PC to be a local machine monolith only, which I can reach out to the internet on if I want to, but where this is never done without my explicit intent.

No background tasks (unless I manually enabled them), no dialing home to the mothership or any other server (unless I manually told it to, or enabled automation to do so), no data collection, no telemetry, and no bloatware, no forcing new features or applications on me. If I want an AI chat prompt (I don't) I'll download an AI chat prompt. Don't force shit on me. If I want software, I will install it myself, otherwise it shouldn't be there, and it should never change on its own without my consent.
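Windows can actually be pushed a long way toward that "nothing talks unless I say so" posture with its own firewall; a sketch using netsh from an elevated prompt (the browser path is just an example, and some OS components may still get through):

```shell
rem Block all outbound traffic by default, on every network profile.
netsh advfirewall set allprofiles firewallpolicy blockinbound,blockoutbound

rem Then explicitly allow only the programs you choose, one by one.
netsh advfirewall firewall add rule name="Allow Firefox out" dir=out action=allow program="C:\Program Files\Mozilla Firefox\firefox.exe"
```

It doesn't get rid of the bloat itself, but it does kill the default chatter.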

I don't want voice activated, AI assistant anything.

I shouldn't need some weirdo Intel CPU with a shit ton of E cores to handle background tasks. Those background tasks shouldn't exist in the first place unless I intentionally enable them, and then I know what I want running and what I don't and can budget system resources for them myself.

It is MY computer, and I need to be in control of it 100% of the time. Otherwise I don't want it.

The above is the only experience I will ever be happy with. Both Windows 10 and Windows 11 utterly fail in this regard. Windows 7 was the last proper OS Microsoft made.
 
Windows 10 and 11 are operating systems - yes - but they also include forced ecosystems.

Even in Windows 11, you can still use a Local Account and there is nothing that requires you to ever actually use the Microsoft Store. How does that qualify as a "forced ecosystem"? New features are introduced but you're not forced to use them. It takes just a few clicks to disable things like OneDrive on a new install and then it never bugs you ever again.

Going all the way back to the XP era you still had things like Internet Explorer, Outlook Express, and Windows Media Player being updated and introducing new features on a regular basis. Microsoft Security Essentials (predecessor to Windows Defender) was introduced during the XP era. Also don't forget Microsoft trying to push MSN Explorer (their AOL competitor) during the XP era, as well as the "Windows Live" suite of apps and services. "Bloat", software bundling, and what is now referred to as "cloud computing", has been going on for 20+ years.
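And disabling OneDrive isn't just a few clicks; it can even be removed outright. A sketch assuming winget is available (Microsoft.OneDrive is the package id the client is commonly published under):

```shell
rem Remove the bundled OneDrive client entirely rather than just disabling it.
winget uninstall Microsoft.OneDrive
```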
 
Even in Windows 11, you can still use a Local Account and there is nothing that requires you to ever actually use the Microsoft Store. How does that qualify as a "forced ecosystem"? New features are introduced but you're not forced to use them. It takes just a few clicks to disable things like OneDrive on a new install and then it never bugs you ever again.

- Many things are installed by default (which should never happen)
- You shouldn't have to play whack-a-mole getting rid of junk that you didn't want in the first place. Everything should be opt in, not opt out.
- Many things can't even be uninstalled. At least not easily without jumping through hoops to get them to disappear in the first place.
- You need to use various tricks during the install process to get Win11 to install without a microsoft account. The old "just don't connect to the internet" approach doesn't work anymore.
- Leaving stuff on your computer and "just not using it" is not an acceptable option. It needs to not be there in the first place.
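For the record, the "trick" in question is at least short. The widely documented method (which worked on installers current as of this thread): at the first out-of-box-experience screen, press Shift+F10 for a command prompt and run the built-in script below; after the reboot, setup offers an "I don't have internet" path and a local account.

```shell
rem Run at the Windows 11 OOBE screen (Shift+F10 opens a command prompt).
rem Reboots into an OOBE that allows skipping network / Microsoft account setup.
OOBE\BYPASSNRO
```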
 
- Many things are installed by default (which should never happen)
- You shouldn't have to play whack-a-mole getting rid of junk that you didn't want in the first place. Everything should be opt in, not opt out.
- Many things can't even be uninstalled. At least not easily without jumping through hoops to get them to disappear in the first place.
- You need to use various tricks during the install process to get Win11 to install without a microsoft account. The old "just don't connect to the internet" approach doesn't work anymore.
- Leaving stuff on your computer and "just not using it" is not an acceptable option. It needs to not be there in the first place.

This... 100%
 
- Many things are installed by default (which should never happen)
- You shouldn't have to play whack-a-mole getting rid of junk that you didn't want in the first place. Everything should be opt in, not opt out.
- Many things can't even be uninstalled, at least not easily, without jumping through hoops to get them to disappear in the first place.
- Leaving stuff on your computer and "just not using it" is not an acceptable option. It needs to not be there in the first place.
Were you saying the same things about Internet Explorer, Windows Media Player, and Outlook Express 20+ years ago? Is it really any different now?

- You need to use various tricks during the install process to get Win11 to install without a Microsoft account. The old "just don't connect to the internet" approach doesn't work anymore.
"various tricks"? It takes 3 clicks to use the Domain Join option. Using the Domain Join option is even easier than disconnecting from the internet during install IMO.

Ironically the post I linked to was originally a reply to one of your posts from a year and a half ago saying that Local Accounts were going away. Well... guess what didn't happen?
 
They have given me absolutely no reason to spend the time and make the switch. As far as I am concerned, Windows 11 is just a facelifted Windows 10.
Supposedly Alder/Raptor Lake will work better on Win11 because the scheduler knows the difference between P and E cores, while Microsoft deliberately didn't add that functionality to Win10. I haven't bothered to look for any corroboration or debunking; I have an Alder Lake mini PC that I bought this spring that came preinstalled with Win11, and the 12600K desktop I built last year has it for the same reason. I've passed on upgrading my 4800HS laptop and 5800X desktops, though.
 
I'm not surprised in the least. I personally like win11 for the most part and prefer it over 10. But that's me.

Reality is they introduced a very firm hardware barrier and didn't really change that much when you get down to it. Very little incentive to switch. I waited for that hardware barrier, previous system was a 6600k and there's no good reason to bypass the TPM requirement.
 
Yep, that's what people don't get.

Prior versions of Windows were operating systems.

Windows 10 and 11 are operating systems - yes - but they also include forced ecosystems.

To be fair to Microsoft they are just a follower in this regard, with Apple and Google having led the way for 15 years, but as the saying used to go (in 2010 or so):

"I want my phone to be more like my computer, NOT my computer to be more like my phone."
The part you don't seem to understand, is that especially in the business world the things that are wanted are diametrically opposed to what you want.

They want efficiency to go up, meaning reliability, uptime, and the least amount of higher-level intervention, and they want cost to go down.

Frankly while walking around I realized that if Apple ever released an AppleTV with 128GB of storage space that could run iPadOS apps, then basically 95% of office workers could do literally all of their office tasks off of a $200 device that requires nearly zero management. Simply install all office apps, everything else is web based. KB/M is Bluetooth. There is nothing that has to be saved locally. Permissions could be user based off of iCloud login info. Even sysadmin work doesn't require any computing power to do.

There are fewer and fewer advantages to having a significantly more complex system that is "free to do anything" but takes an inordinate number of hours to train office drones to use. Most of the world's primary computing device is their phone. I've been using an iPad Pro 12.9" as my primary computing device for a month as a bit of an experiment and found that, other than deep photo and video editing, there isn't much else I needed another machine for. And frankly, if a few snags with accessing external drives etc. (that is to say, app/file management) were ironed out, I would likely be able to do most of that work on my iPad too.

The other half of these more complex systems is that they cost more and require more resources to run. Ironically, iPadOS is likely the lightest-weight OS you can find on a device right now given its complexity (yes, I'm aware of ultra-lightweight Linux devices like the Raspberry Pi, but they aren't really running full desktop versions of Linux like a Steam Deck). And it "easily" survives with 4GB of RAM and a very low-power SoC. Getting more of the complexity of "a computer" onto what is essentially a big "phone" would just make battery life worse and effective computing power go down, for negligible gains in "moar control!" that serve nobody but elitists. That isn't to say that I wouldn't want to do a few more things in iOS that I cannot easily do yet. I'm just saying that for general computing tasks and office workers, they really don't need more.
That battle seems like it has been completely lost at this point.

I don't want an ecosystem. I don't want bloat that comes with the OS. I don't want any cloud integration. I don't want anything on my computer to link to a server on the WAN unless I explicitly configure it and tell it to. I want all logins on my local PC to be local only Logins, and nothing to ever be linked to an account anywhere else than on my local machine.

I want my computer to not reach out to anything outside the local machine on either the LAN or the WAN unless I tell it to do so myself. No automated connections to anything ever. No network scans to "find" other devices, etc. etc.

If I want to communicate with a printer at 10.0.1.17, I will set that printer up with that IP, and then I will manually type that IP in on my computer in order to connect with the printer and print. I don't want anything going looking for anything on the network.

I want my PC to be a local machine monolith only, which I can reach out to the internet on if I want to, but where this is never done without my explicit intent.

No background tasks (unless I manually enabled them), no dialing home to the mothership or any other server (unless I manually told it to, or enabled automation to do so), no data collection, no telemetry, and no bloatware, no forcing new features or applications on me. If I want an AI chat prompt (I don't) I'll download an AI chat prompt. Don't force shit on me. If I want software, I will install it myself, otherwise it shouldn't be there, and it should never change on its own without my consent.

I don't want voice activated, AI assistant anything.

I shouldn't need some weirdo Intel CPU with a shit ton of E cores to handle background tasks. Those background tasks shouldn't exist in the first place unless I intentionally enable them, and then I know what I want running and what I don't and can budget system resources for them myself.

It is MY computer, and I need to be in control of it 100% of the time. Otherwise I don't want it.

The above is the only experience I will ever be happy with. Both Windows 10 and Windows 11 utterly fail in this regard. Windows 7 was the last proper OS Microsoft made.
Great. That works for you, but most people don't want to need a CS degree to manage their personal machines. You already have what you want with Linux. But the rest of the world wants to do other jobs and other things with their time than manage their personal computers. I have a job I want to do, and spending even 15 minutes on a computer that isn't part of my job is too much time. I don't think everyone is happy with where Windows 10/11 landed with a lot of this stuff, hence this thread. But it certainly leans more to this side than not.
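For anyone who does want the locked-down behavior described in the quoted post, some of it can be approximated with built-in Windows settings. A rough, incomplete sketch from an elevated Command Prompt (the registry value, service name, and netsh syntax below are the documented ones, but note that on Pro and Home editions the telemetry setting only reduces diagnostic data to "Required" rather than disabling it):

```shell
:: Minimize diagnostic data (fully off is only honored on
:: Enterprise/Education; Pro and Home still send "Required" data).
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\DataCollection" /v AllowTelemetry /t REG_DWORD /d 0 /f

:: Stop and disable the Connected User Experiences and Telemetry service.
sc stop DiagTrack
sc config DiagTrack start= disabled

:: Default-deny all outbound traffic; every app then needs an explicit
:: allow rule before it can touch the network.
netsh advfirewall set allprofiles firewallpolicy blockinbound,blockoutbound
```

The outbound-block firewall policy is aggressive: after applying it, every application that needs network access must be given an explicit allow rule, which is essentially the explicit-intent model the post asks for.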
 
How does that qualify as a "forced ecosystem"? New features are introduced but you're not forced to use them.

I have an HP monitor. Both Windows 10 and 11, without my permission, repeatedly install the HP Omen gaming app that opens every time I play a game and interrupts what I'm doing. I don't even have the USB connected for the extra ports, and this garbage still gets foisted onto my computer. It pops up 2-3x a week, and I have to uninstall it again. Seems pretty forced to me.
 
I have an HP monitor. Both Windows 10 and 11, without my permission, repeatedly install the HP Omen gaming app that opens every time I play a game and interrupts what I'm doing. I don't even have the USB connected for the extra ports, and this garbage still gets foisted onto my computer. It pops up 2-3x a week, and I have to uninstall it again. Seems pretty forced to me.

Sounds like Windows is just trying to install a monitor driver and HP made a pretty terrible bloated one. That's really on HP, not MS. In 99% of cases installing a monitor driver would be a good thing. From what you say, it sounds like you are causing the problem to repeat itself by uninstalling it every time. Instead of uninstalling it, go into the app and disable it.
 
Sounds like Windows is just trying to install a monitor driver and HP made a pretty terrible bloated one. That's really on HP, not MS. In 99% of cases installing a monitor driver would be a good thing. From what you say, it sounds like you are causing the problem to repeat itself by uninstalling it every time. Instead of uninstalling it, go into the app and disable it.

It's not a driver, it's an app. It's a game optimizer. It installs through the app store silently because there is no option to stop Windows from doing that. Telling the app not to run on startup doesn't work, because it's hooked into DX and turns itself on if you launch a 3D program. It doesn't ask me if I want it or not.
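If anyone wants to fight back against this particular one, a hedged sketch for an elevated PowerShell prompt. The `*Omen*` wildcard is a guess at the Store package name, so list the installed packages first; the device-metadata registry value is the documented switch that stops Windows fetching hardware companion apps from the network:

```shell
# Elevated PowerShell. List candidates first, then remove the offender:
Get-AppxPackage *Omen*
Get-AppxPackage *Omen* | Remove-AppxPackage

# Documented value that stops Windows downloading device metadata (and the
# companion apps tied to it) from the network; takes effect after a reboot.
reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Device Metadata" /v PreventDeviceMetadataFromNetwork /t REG_DWORD /d 1 /f
```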
 
The part you don't seem to understand, is that especially in the business world the things that are wanted are diametrically opposed to what you want.

They want efficiency to go up, meaning reliability, uptime, and the least amount of higher-level intervention, and they want cost to go down.

Frankly while walking around I realized that if Apple ever released an AppleTV with 128GB of storage space that could run iPadOS apps, then basically 95% of office workers could do literally all of their office tasks off of a $200 device that requires nearly zero management. Simply install all office apps, everything else is web based. KB/M is Bluetooth. There is nothing that has to be saved locally. Permissions could be user based off of iCloud login info. Even sysadmin work doesn't require any computing power to do.

There are fewer and fewer advantages to having a significantly more complex system that is "free to do anything" but takes an inordinate number of hours to train office drones to use. Most of the world's primary computing device is their phone. I've been using an iPad Pro 12.9" as my primary computing device for a month as a bit of an experiment and found that, other than deep photo and video editing, there isn't much else I needed another machine for. And frankly, if a few snags with accessing external drives etc. (that is to say, app/file management) were ironed out, I would likely be able to do most of that work on my iPad too.

The other half of these more complex systems is that they cost more and require more resources to run. Ironically, iPadOS is likely the lightest-weight OS you can find on a device right now given its complexity (yes, I'm aware of ultra-lightweight Linux devices like the Raspberry Pi, but they aren't really running full desktop versions of Linux like a Steam Deck). And it "easily" survives with 4GB of RAM and a very low-power SoC. Getting more of the complexity of "a computer" onto what is essentially a big "phone" would just make battery life worse and effective computing power go down, for negligible gains in "moar control!" that serve nobody but elitists. That isn't to say that I wouldn't want to do a few more things in iOS that I cannot easily do yet. I'm just saying that for general computing tasks and office workers, they really don't need more.

Great. That works for you, but most people don't want to need a CS degree to manage their personal machines. You already have what you want with Linux. But the rest of the world wants to do other jobs and other things with their time than manage their personal computers. I have a job I want to do, and spending even 15 minutes on a computer that isn't part of my job is too much time. I don't think everyone is happy with where Windows 10/11 landed with a lot of this stuff, hence this thread. But it certainly leans more to this side than not.
Or consider this, iPad with the Logitech keyboard case and a USB-C docking station.
Apple Business Manager has tie-ins for Google Workspace and Azure, so that when you sit down and sign in, it just maps your profile from the MDM and automatically configures email, applications, and the works.
So on the device enrollment, you can specify the base must-haves for the devices, and then you can customize it further based on specific users to enable or disable other applications.

It's not a $200 device but closer to $650 Canadian, plus monitor and accessories.

Add on security certificates with an always-on VPN configuration and now your employees are 100% mobile; even if they are using a shitty hotspot at a crappy motel, the data is relatively secure. Since everything syncs against online file storage, nothing is technically stored on the devices. Use a Buffalo NAS locally as the intermediary between the users and the cloud storage provider of choice: when in the building you aren't chewing up data with immediate file syncs, and when they are out of the building none of your data is leaving the building.

But it's why a lot of schools are ditching Chromebooks and going back to Apple Education: the management and setup are vastly better, and the privacy impact is way better on Apple than it is on Google.

And Apple provides very simple instructions they keep updated with pictures on how to do all that, as well as has agents who can hand hold you all the way through it.
 
my biggest issue with 11 is the task bar being completely useless, so I won't be switching any time soon.
I can't say I have noticed a difference in functionality exactly, but I sort of miss being able to move it. I kept it on top so I could more easily move between Apple and Microsoft environments, because the muscle memory would go to the same general places when I thought "Oh, I need to launch this." But aside from that, I can't say anything is overly different.
 
Fucking HP. That is a driver thing: the bastards have an update utility that runs in the background, and when it checks for updates it will auto-add the app as part of the drivers, as if it were missing.
MSI and Gigabyte recently got called out for the same garbage on their motherboard firmware updaters and adding their software stuff.
Same garbage canned firmware update tool, same garbage behavior.
It's not quite at the level of rootkit, but it's so F'ing close that it might as well be.

Asus uses it too, and I think they one-upped everybody by not implementing HTTPS on their hard-coded web links to the update server, so hackers could spoof it.

But yeah, that software runs in the kernel as it is part of the drivers or firmware... Shit like that drives me crazy; I kept finding the Asus AI Suite installed and couldn't figure out why.
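For the curious: the mechanism boards like these use is typically the ACPI WPBT (Windows Platform Binary Table), which lets the firmware hand Windows a binary that the session manager runs at boot. A quick, hedged check for whether your board is doing this, since Windows is known to drop the payload at a fixed path:

```shell
:: From a Command Prompt. If the firmware supplies a WPBT binary, Windows
:: writes it to System32 as wpbbin.exe before executing it, so its presence
:: is a strong hint that the board is injecting software.
if exist C:\Windows\System32\wpbbin.exe (
    echo WPBT binary present: firmware is injecting software at boot
) else (
    echo No wpbbin.exe found
)
```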
 