Windows 11 May Not Run on Early Ryzen, Threadripper, Skylake-X, or Any Pre-2016 Intel PC

People use Windows/MacOS/ChromeOS/Android/iOS because it's what comes on the equipment they purchase.

You are welcome to try to make convincing arguments otherwise though.
 
The fantasy is that Windows PC buyers will switch to Linux in a fit of rage and revel in all the control and openness they've been missing.
The idea that all those corporate users are going to revel in all that control and openness is amazingly foolish. Many of them just know enough to do their job and neither want, nor care for, those things. I'm not saying that's a good thing, just that it's what it is.
 
As we see, Chrome OS has already surpassed macOS for the quarter. Not exactly the desktop Linux we were hoping for, but it does technically run a Linux kernel.
The irony is that Chrome OS is succeeding precisely because it does away with, well, Linuxness. It's not very flexible at all; your choices for native apps are more limited than ever; it's focused on simplicity over control. Richard Stallman hates it... all the more reason to use it, then.

The issue, of course, is that some here are convinced that a surge in Linux-ass Linux (that is, with the full complexity and capability) is right around the corner when the entirety of the conventional Linux home PC experience fits into that 1.2 percent "other" group. That's what I call a pipe dream. Windows 11 might move the needle a bit, but I'd expect more users switching to Mac than to Linux, if there's a significant shift at all.
 
The idea that all those corporate users are going to revel in all that control and openness is amazingly foolish. Many of them just know enough to do their job and neither want, nor care for, those things. I'm not saying that's a good thing, just that it's what it is.
In a sense, it is. Too many tech enthusiasts have this weird notion that everyone must be highly tech literate, and that you're an abject failure if you don't root your phone or spend weekends building PCs. Sorry, but that's not how the real world works. People often have higher priorities than tinkering with gadgets; you can be a brilliant person who just decided to pour that energy into medicine (or law, or...) instead of tech.

It's a triumph if someone can afford to not care about the tech they're using, because that means the developers did their job — it's easy and reliable enough that you don't have to think about it. While it can help to be reasonably well-informed, you shouldn't need to dive deep just to get your work done.
 
The difference between cloud storage and ftp is kinda like the difference between floppy disks and pci-e.
Wrong answer. Next!
You mean, besides one being remote hardware and the other being a protocol?
You haven't seen ftp being installed on remote hardware? It works fundamentally the same, it's just the software that's different.
Nope, because most people don't CARE. As for Linux, they're free to support/not support what they wish.
What's the point of TPM then? If it's useful then I'd imagine Linux would also require it, but last I checked Linux is better at security so what's TPM for?
There was still some outcry when they dropped i386 support from the mainline kernel, but they did it for the same reason that Microsoft is now:
This is not the same as requiring TPM and you know that.
Support costs a @#%!^ load of money. If you limit what is officially supported, then you don't have to test/fix/secure against esoteric whack-ass bugs from a 12-year-old chip, especially since the home market is a ~small~ part of their core revenue stream. On the Linux side, that was DONATED time and energy that could be placed elsewhere - on Microsoft's side, yeah, it's dollars - dollars that they don't get a ton of from users (when's the last time you paid full price for a Windows license?), but DO get from corporations (enterprise use cases pay a LOT). Corporations, however, tend to be on a 5-year refresh cycle... which means that Skylake is coming off right now, and by release, Kaby Lake will be too. Exactly on time for the next release of Windows. They support the stuff that the enterprise world will want them to support.
Sounds like you're making excuses for Microsoft's decisions to not support hardware that's relatively new. What is the benefit to TPM?
Here's the thing - Apple does this today. If you can't verify the signature on a downloaded binary, it won't RUN. They do it with a certificate check against an online repo. You can override it, easily enough - but for most people, it just cuts down on security breaches. How many times have you seen Apple use it as a DRM method (for software, mind you, not talking about movies/audio, which isn't entirely in their control)?
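Conceptually, that gate is nothing exotic: check a detached signature against a trusted public key before letting the binary run. Here's a minimal Python sketch of the idea using the cryptography library - just an illustration, not Apple's actual Gatekeeper/notarization flow, and the filenames and key handling are made up:

```python
# Simplified illustration of "verify before run" - NOT Apple's actual mechanism.
# Filenames, signature layout, and key handling here are hypothetical.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey
from cryptography.exceptions import InvalidSignature

def allowed_to_run(binary_path: str, sig_path: str, pubkey_bytes: bytes) -> bool:
    """Return True only if the binary's detached signature verifies against the trusted key."""
    public_key = Ed25519PublicKey.from_public_bytes(pubkey_bytes)  # 32-byte raw Ed25519 key
    with open(binary_path, "rb") as f:
        binary = f.read()
    with open(sig_path, "rb") as f:
        signature = f.read()
    try:
        public_key.verify(signature, binary)
        return True
    except InvalidSignature:
        return False

# The OS-level version adds revocation checks against an online service and an
# easy user override, but the core gate is the same signature check.
```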
You calling me an Apple user? You saying I'm a stupid end user who doesn't know how my computer works? What's the point of them needing TPM?
One's a protocol, the other a conceptual idea. Also, one is manual, the other is integrated into the FS in various forms. rsync would be a better comparison.
They're both conceptual ideas, the only difference is that ftp is also a protocol. It's fundamentally the same but cloud storage sounds mysterious and original when it's really not.
Windows will not control the market forever "just because". They haven't really been innovating and, like Intel, they could easily lose out to a competent competitor.

As we see, Chrome OS has already surpassed macOS for the quarter. Not exactly the desktop Linux we were hoping for, but it does technically run a Linux kernel.
This is unfortunately true but Linus seems behind it.
 
For all the "year of the Linux desktop" people, it's already come and it is called ChromeOS. The community as a whole could have had it with Ubuntu a decade ago, but in fear of one distro ruling them all they forked it to all hell, divided their resources, and the dream crumbled. So that left Google to pick up the pieces, and they will run with it. ChromeOS is most school districts' go-to device because they are cheap, easy to manage, "secure", and abundant. And as more students grow up using those devices they will start to purchase them on their own as they leave for university and so on. Once Nvidia and Steam get their stuff out for the ChromeOS platform, that will be a big deal for the platform as a whole and make it a far more attractive solution for a lot of users going forward.
 
And as more students grow up using those devices they will start to purchase them on their own as they leave for university and so on.

I have never heard of a kid who was eager to purchase a computer for themselves that was similar to the one they were forced to use during school.
 
I have never heard of a kid who was eager to purchase a computer for themselves that was similar to the one they were forced to use during school.
Kids who grow up in Mac districts overwhelmingly purchase Macs; kids who grow up in PC districts overwhelmingly purchase PCs. There is a statistical history to this. We've now been using Chrome OS for 5 years for most of our kids, and most of our graduates at this stage do own their own Chromebooks, as the ones they purchased individually are infinitely nicer than the bulk models I bring in for the schools. But the fact that over half our grade 11s and 12s have gone out and purchased their own generally means the trend is continuing and will continue to do so.
The simple rule of "hook them while they're young" is a very real thing, and it is why most software companies basically give K-12 access to their software for free or a very small fee, because countless studies show that when they leave they will use what they know.
 
Kids who grow up in Mac districts overwhelmingly purchase Macs; kids who grow up in PC districts overwhelmingly purchase PCs. There is a statistical history to this. We've now been using Chrome OS for 5 years for most of our kids, and most of our graduates at this stage do own their own Chromebooks, as the ones they purchased individually are infinitely nicer than the bulk models I bring in for the schools. But the fact that over half our grade 11s and 12s have gone out and purchased their own generally means the trend is continuing and will continue to do so.
This is very true. Given that Chromebooks have near regular-PC pricing, it's probably the main reason for Chromebook sales (because, IMHO, they make no sense otherwise).

And while I don't have a stat (10 to 1, 100 to 1?), it doesn't necessarily explain "what I see" as the huge disparity between 365 users and G Suite. But maybe I'm reading too much into "Chromebook" and the assumption of G Suite there.
 
In my district, the well-off kids buy Macs as a status symbol, the not-so-well-off buy PCs, and the low-income families get Chromebooks or donated machines.
edit: they all use G Suite and other online pages.
 
This is very true. Given that Chromebooks have near regular-PC pricing, it's probably the main reason for Chromebook sales (because, IMHO, they make no sense otherwise).

And while I don't have a stat (10 to 1, 100 to 1?), it doesn't necessarily explain "what I see" as the huge disparity between 365 users and G Suite. But maybe I'm reading too much into "Chromebook" and the assumption of G Suite there.
That's the thing: regardless of OS platform, most kids in high school and university at this point are using G Suite, and most kids don't know about and can't be bothered to care about the differences. They know ChromeOS works out of the gate with G Suite and Google credentials, which work with their Android phone, so they go with what they know works in their price range for what they do. I will tell you right now that the bulk of the kids here don't game on PCs; they game on consoles and their phones, so gaming performance isn't really even a thought for them when purchasing a laptop. They care about screens, size, and the general outside aesthetics, and there are a lot of thin, sleek Chromebooks out there that have good screens and don't break the bank.
 
NOPE, Microsoft pulled the app that checks this. They are now backtracking to allow even older CPUs for the install.
They got pushback from techs and users for requiring this for installs.
 
Windows will not control the market forever "just because". They haven't really been innovating and, like Intel, they could easily lose out to a competent competitor.

That is certainly true, but breaking into operating systems is a little bit more difficult due to the barriers of entry.

You have to convince a ton of developers to write their software for your platform. Sure, it is easier than ever to port and recompile for different platforms, but still we see an unwillingness of developers to take risks on new operating systems until they are established.

This is the same problem Microsoft had with the Windows Phone. Android and iOS already had dominant amounts of applications, but Windows Phone was lacking in this regard.
 
Ok I guess you are trolling or deeper into the Kool Aid than I thought....

Well, for the data storage component of cloud services, or in other words Cloud Storage, he really does have a point.

It's still just storing your data on someone else's computer. The cloud model is often more insidious, though: on by default unless you know to turn it off, and it automatically harvests information about you when you use it and sells it on.

There is much, much more to cloud services though. Spinning up VM's or dedicated servers in the cloud to handle an entire platform is the norm these days. It's a little sad if you ask me. I liked knowing that when I used a service, I was using that service, and not AWS indirectly, but these days if you use anything online, chances are it is on AWS, Azure, etc. No one wants to run their own servers anymore, and it is a crying shame.
 
Well, for the data storage component of cloud services, or in other words Cloud Storage, he really does have a point.

It's still just storing your data on someone else's computer. The cloud model is often more insidious, though: on by default unless you know to turn it off, and it automatically harvests information about you when you use it and sells it on.

There is much, much more to cloud services though. Spinning up VM's or dedicated servers in the cloud to handle an entire platform is the norm these days. It's a little sad if you ask me. I liked knowing that when I used a service, I was using that service, and not AWS indirectly, but these days if you use anything online, chances are it is on AWS, Azure, etc. No one wants to run their own servers anymore, and it is a crying shame.
It's just so cheap to use their services and financially much easier to account for. You purchase your own server, say $10,000, but then you need the networking, internet, and security infrastructure to deal with it, plus your backup and disaster recovery software and hardware, so suddenly your $10,000 server is $50,000. Big whoop, that's nothing really, but now you need somebody knowledgeable to maintain and administer that server and its backups and your networking and security. If you are lucky that is just one guy, but it's probably two if not three, so now you are looking at some $200k in annual wages to support that $50,000 in hardware. Meanwhile, you could instead spend $500 a month with AWS or Azure, which over your 5-year plan ends up costing less than the hardware/software licensing for that server, while also saving you the support staff needed to manage that hardware, replacing it with some one-time setup fees; maintenance can then easily be farmed out to a third party on a simple contract basis for remote support, which probably comes in at a flat $20,000 per year. So what you have managed to do is get the same service running, with a much smaller set of depreciating assets and a smaller payroll, which just looks awesome on financial sheets, especially for startups or small businesses where having dedicated tech support access is often not feasible.

I mean, this is also why we see so many poorly configured AWS instances: to save themselves the $20,000 they google "how to configure "this" in Ubuntu LTS X.X", then follow some guide they find online without knowing the pre or post procedures to properly secure the environment. And as there aren't really any repercussions for leaking your customers' data, there isn't any strong financial incentive not to in most cases. Really, the number of leaks is actually much larger, but they lack the staff and procedures to even detect that they have been hacked, let alone determine the extent to which they were breached.
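To put that rough math side by side, here's a quick back-of-envelope sketch in Python - the figures are just the illustrative round numbers from above, not real quotes, and one-time setup fees are ignored:

```python
# Back-of-envelope 5-year cost comparison using the illustrative numbers above.
# All figures are hypothetical round numbers from the post, not real quotes.

YEARS = 5

# On-prem: $10k server grows to ~$50k with networking/backup/DR gear,
# plus roughly $200k/year in wages for the people who run it.
onprem_hardware = 50_000
onprem_wages = 200_000 * YEARS
onprem_total = onprem_hardware + onprem_wages

# Cloud: ~$500/month in AWS/Azure spend, plus a ~$20k/year remote-support contract.
cloud_hosting = 500 * 12 * YEARS
cloud_support = 20_000 * YEARS
cloud_total = cloud_hosting + cloud_support

print(f"On-prem over {YEARS} years: ${onprem_total:,}")  # $1,050,000
print(f"Cloud over {YEARS} years:   ${cloud_total:,}")   # $130,000
```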
 
It's just so cheap to use their services and financially much easier to account for. You purchase your own server, say $10,000, but then you need the networking, internet, and security infrastructure to deal with it, plus your backup and disaster recovery software and hardware, so suddenly your $10,000 server is $50,000. Big whoop, that's nothing really, but now you need somebody knowledgeable to maintain and administer that server and its backups and your networking and security. If you are lucky that is just one guy, but it's probably two if not three, so now you are looking at some $200k in annual wages to support that $50,000 in hardware. Meanwhile, you could instead spend $500 a month with AWS or Azure, which over your 5-year plan ends up costing less than the hardware/software licensing for that server, while also saving you the support staff needed to manage that hardware, replacing it with some one-time setup fees; maintenance can then easily be farmed out to a third party on a simple contract basis for remote support, which probably comes in at a flat $20,000 per year. So what you have managed to do is get the same service running, with a much smaller set of depreciating assets and a smaller payroll, which just looks awesome on financial sheets, especially for startups or small businesses where having dedicated tech support access is often not feasible.

I mean, this is also why we see so many poorly configured AWS instances: to save themselves the $20,000 they google "how to configure "this" in Ubuntu LTS X.X", then follow some guide they find online without knowing the pre or post procedures to properly secure the environment. And as there aren't really any repercussions for leaking your customers' data, there isn't any strong financial incentive not to in most cases. Really, the number of leaks is actually much larger, but they lack the staff and procedures to even detect that they have been hacked, let alone determine the extent to which they were breached.


Yeah, I know the IT field is generally not a fan of government intervention, but I feel we could definitely use some strong data security regulation.

That's exactly my thought process. Sure, "putting it in the cloud" saves some money, but you still need to spend the wages, as you NEED someone competent to manage your cloud instance! Almost without exception, cloud contracts explicitly stipulate the security is on you, not them.

There are too many ignorant or just plain reckless executives in decision making positions regarding this stuff.
 
Yeah, I know the IT field is generally not a fan of government intervention, but I feel we could definitely use some strong data security regulation.

That's exactly my thought process. Sure, "putting it in the cloud" saves some money, but you still need to spend the wages, as you NEED someone competent to manage your cloud instance! Almost without exception, cloud contracts explicitly stipulate the security is on you, not them.

There are too many ignorant or just plain reckless executives in decision making positions regarding this stuff.
The biggest issue I deal with on the tech side is staying current on security trends and requirements; the best-practices stuff from as recently as 4 years ago is absolute garbage today, and the stuff I am doing now will probably be useless in another 4 years. At this stage, running a firewall capable of application filtering and SSL decryption is an absolute necessity if you have any real hope of running an externally accessible web service with any degree of security, but the number of appliances out there just doing simple port forwarding is atrocious. And very few organizations are doing filtering and shaping on outgoing traffic. The most common attack vector I deal with doesn't come from PCs, it comes from phones: somebody gets a virus on their phone or tablet when they are at home or traveling, that device then gets brought back to the office where they connect it to the wifi, and then that device launches the attack. And god forbid that device has a data plan, because then it scrapes the network internally while using the device's data plan to bypass the whole protection system and transmit the best bits back over that. So now my firewall's rules on incoming traffic are pretty slim, maybe 60 or so individual rules; outgoing, though, closer to 100; and internal another 120 or so, to ensure that devices are only communicating with one another on the specified ports, with the specified protocols, using the specified applications, from the specified user groups.
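In spirit, the internal side boils down to default-deny plus an explicit allow-list keyed on source group, destination, protocol, and port. A toy Python sketch of that idea (the groups, ports, and rules here are made-up examples, not a real firewall config):

```python
# Toy illustration of internal "default deny" filtering: a flow is only allowed
# if an explicit rule matches its source group, destination group, protocol, and port.
# Groups, ports, and rules here are hypothetical examples, not a real policy.

ALLOW_RULES = [
    # (source group, destination group, protocol, port)
    ("staff-wifi", "print-servers", "tcp", 9100),  # printing only
    ("staff-wifi", "file-servers",  "tcp", 445),   # SMB shares
    ("byod-wifi",  "internet",      "tcp", 443),   # personal devices get HTTPS out, nothing internal
]

def flow_allowed(src_group: str, dst_group: str, proto: str, port: int) -> bool:
    """Default deny: anything not explicitly listed is dropped."""
    return (src_group, dst_group, proto, port) in ALLOW_RULES

print(flow_allowed("byod-wifi", "file-servers", "tcp", 445))   # False -> dropped
print(flow_allowed("staff-wifi", "file-servers", "tcp", 445))  # True  -> allowed
```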

The complexity of the networking stuff now actually makes me excited for what NVidia is bringing to the table here with their recent announcements because it is quickly becoming unwieldy to try to manage it all solo like I have been doing for the past decade.

The security part of things is why I really prefer Azure to AWS. Azure's preferred method is not direct access; instead, you operate a VPN from your datacenter to your cloud instance, do all your traffic filtering there, and then route the traffic back over the VPN to your Azure services. Those requirements obviously make it more complex, which is why AWS is so widely used: the barrier to entry is ridiculously low.
 
I have to be honest, the TPM thing makes sense to me. When 10 goes EOL, basically all the hardware not compatible will be well beyond EOL anyway.
In enterprise for Windows 11, sure, but in consumer space, not so much.
Considering all of the systems still running Sandy Bridge and Ivy Bridge-era equipment, which is around a decade old now and still going strong, I don't see why Kaby Lake and Zen 1-era systems would be any more end-of-life, since they would still be less than a decade old by 2025; in fact, they would be considered even less obsolete these days, at least going purely off of the CPU requirement itself and not the TPM requirement.

Do what you gotta do. :borg:
 
In enterprise for Windows 11, sure, but in consumer space, not so much.
Considering all of the systems still running Sandy Bridge and Ivy Bridge-era equipment, which is around a decade old now and still going strong, I don't see why Kaby Lake and Zen 1-era systems would be any more end-of-life, since they would still be less than a decade old by 2025; in fact, they would be considered even less obsolete these days, at least going purely off of the CPU requirement itself and not the TPM requirement.

Do what you gotta do. :borg:
For the consumer space, 1.2 is adequate, and it's easy to meet those requirements as it's been on all Intel platforms since 2008. Remember with Windows 98 the prompt window that would go something like "Hey, your system is still running FAT, let's upgrade you to FAT32, here's why that is a great thing for you to do"? Well, for people upgrading to 11 they just need a similar prompt: "Hey, we see you're not using BitLocker, let's secure your system, here's why this is a great thing for you to do." Then on all new deployments have it as part of the standard configuration that OEMs abide by, have it as part of the default configuration for any systems getting a clean install, and call it done. For Enterprise, LTSB, and Education licensing, make 2.0 a requirement, as it does have some tangible improvements over 1.2 that organizations licensing through VLSC can actually take advantage of. Then some time down the road, when 11.3 or whatever the third iteration or service pack or whatever they are calling it comes out, make 2.0 the requirement there as well, as by that stage the people still running 1.2-only hardware should be pretty minor and probably actively looking for an upgrade anyway. But I think forcing 2.0 as a requirement for the upgrade process is a bit extreme.
 
Linux still isn't going to make significant inroads until it's aimed at the mainstream — not an enthusiast's idea of the mainstream, the actual mainstream.

The fantasy is that Windows PC buyers will switch to Linux in a fit of rage and revel in all the control and openness they've been missing. The reality? Unless the major desktop platforms truly go off the rails (and they haven't), people will either keep using Windows or (among well-heeled customers) switch to Macs. Real, everyday people are just interested in systems that accomplish common tasks reliably and quickly; they don't care that Windows is wasting some screen space or that macOS looks a bit more like iOS this year. And they certainly aren't interested in Linux ideology.
No, Linux isn't going to become as popular as Windows unless Microsoft shoot themselves in the head.

The only reason for the popularity of Windows is the fact that it's on the device when you buy it; most actually hate it and use it under duress.
 
The only reason for the popularity of Windows is the fact that it's on the device when you buy it; most actually hate it and use it under duress.
I can't think of a single one of my non-techy friends or co-workers who use Windows and hate it; it generally does what they want it to do and they tend not to even think about it much....
 
No, Linux isn't going to become as popular as Windows unless Microsoft shoot themselves in the head.

The only reason for the popularity of Windows is the fact that it's on the device when you buy it; most actually hate it and use it under duress.
You're thinking too much about it; 99% of the users who use Windows don't care what the system is actually running. They turn it on, they run their software, they turn it off; it looks how they are used to it looking, things work the way they expect them to work, and they move on with their lives. They don't think about it. For all the "big" UI changes that are being shown, at the end of the day they are pretty superficial, and nothing there is a drastic leap from anything we have seen to date.
 
Not totally true. Many people complain when they have an important document open and Windows decides to update and restart out of nowhere.

That is something in Microsoft's power to fix. Maybe with a new file system and update process. But they don't care.

What about trying to search for files in the file explorer? It has never worked once. It will take 10 minutes and find nothing (even when the file exists and you typed it correctly).

People just live with broken features because they don't think there is an alternative.
 
Not totally true. Many people complain when they have an important document open and Windows decides to update and restart out of nowhere.

That is something in Microsoft's power to fix. Maybe with a new file system and update process. But they don't care.

What about trying to search for files in the file explorer? It has never worked once. It will take 10 minutes and find nothing (even when the file exists and you typed it correctly).

People just live with broken features because they don't think there is an alternative.
I can't say those have been a problem for the last decade. Search works fine; I have hundreds of users who have no clue where they are saving anything at any point and live out of File Explorer's search function, and if that stopped working it would be chaos. I have also never seen a Windows PC just shut down out of nowhere unless the system straight-up failed; Windows updates generate prompts, and even in edge cases where the update is scheduled to happen outside usage times and you are using the machine, it generates a popup that asks the user if it is OK to reboot and, if not, when it can be rescheduled. Honestly, I see more search failures on macOS than I do in Windows 10, but the process for rebuilding the search database on macOS is clearly documented and only takes a few minutes, so it's easy to walk users through or just remote into their system and do it for them.
 
Windows will not control the market forever "just because". They haven't really been innovating and, like Intel, they could easily lose out to a competent competitor.

As we see, Chrome OS has already surpassed macOS for the quarter. Not exactly the desktop Linux we were hoping for, but it does technically run a Linux kernel.


https://arstechnica.com/gadgets/202...-desktop-operating-system-isnt-macos-anymore/

Steam is coming to Chrome OS later this year in Q3, which could be huge.

https://chromeunboxed.com/borealis-a-k-a-steam-will-live-in-the-chrome-os-settings-menu/

That really is Windows' last holdout for desktop enthusiasts. With Valve investing so much in Proton, improving it on Linux and also bringing it to Chrome OS (and possibly the SteamPal handheld), well, Windows' days are numbered.

Yes, businesses will likely stay on Windows for Office and certain proprietary software, but honestly the ship is going down.

Personally wouldn't run Chrome OS as a daily driver, since I need development tools, but for many people that is enough to use the web and cloud-based web apps. And soon they can game as well.

Except that Chrome OS is not a true desktop OS so.........
 
In enterprise for Windows 11, sure, but in consumer space, not so much.
Considering all of the systems still running Sandy Bridge and Ivy Bridge-era equipment, which is around a decade old now and still going strong, I don't see why Kaby Lake and Zen 1-era systems would be any more end-of-life, since they would still be less than a decade old by 2025; in fact, they would be considered even less obsolete these days, at least going purely off of the CPU requirement itself and not the TPM requirement.

Do what you gotta do. :borg:
You won't get much argument out of me on that. A Sandy Bridge i7 can check email, surf Facebook, and run MS Word with the best of them, which covers something like 95% of PC user needs. But I believe the industry-standard EOL for consumer desktop hardware is 5-6 years and 3-5 for laptops. Even Nvidia/AMD generally only support devices for 5 years, and that is longer than most hardware manufacturers.
 
For those of us in the audience not familiar with this line of reasoning, could you please define what makes something a "True" desktop OS?

I can tell you it is not a locked down web browser pretending to be an OS with app store only stuff. Oh, and you must be online to use it, normally. A true Desktop OS is Amiga, OS/2, Linux, Windows, Mac OS and other such stuff which, as you can see, is built with a proper usage model in mind.
 
Not totally true. Many people complain when they have an important document open and Windows decides to update and restart out of nowhere.

Windows 10 goes out of its way to notify you before it restarts. The people who prefer to keep the same Microsoft Word document open for 6 months without saving it will simply have to adapt to the reality of regular security updates. The glory days of going years without doing updates are fortunately over... Android devices, iOS devices, ChromeOS devices, and even macOS devices also restart to install updates... just in case you thought it was a Windows issue...

What about trying to search for files in the file explorer? It has never worked once. It will take 10 minutes and find nothing (even when the file exists and you typed it correctly).

My search works fine, despite searching simultaneously across all of my local drives as well as over a dozen mapped network drives located on several different file servers. Not sure why yours is broken. User error? My guess is that you probably haven't actually done a Windows search in years, just like you clearly formed your opinion about Windows 11 before it was even announced. But nevermind that this is a Windows 11 thread, please give us 10 more pages beating-off to Linux fantasies.

People just live with broken features because they don't think there is an alternative.

Actually most people "live with" their current computer because it works fine, not because they are oppressed and have not yet been shown the light of Linux.
 
You're thinking too much about it; 99% of the users who use Windows don't care what the system is actually running. They turn it on, they run their software, they turn it off; it looks how they are used to it looking, things work the way they expect them to work, and they move on with their lives. They don't think about it. For all the "big" UI changes that are being shown, at the end of the day they are pretty superficial, and nothing there is a drastic leap from anything we have seen to date.
No, I'm not; I deal with the public and their Windows issues - believe me, they care. Contrary to right-wing belief, 100% of the masses aren't stupid or ignorant.
 
Pretty sure it means Windows.
For one, it means no manipulative tactics forcing people onto online cloud-based accounts. The other feature that defines a desktop OS is no forced touch UI elements at all. The new Settings panel taking the place of the Control Panel, something Microsoft are relentlessly forcing on their users, is an example of a touch UI making poor use of screen real estate for desktop users.

There's also the implication that uploading of personal metadata is in no way taking place on behalf of the OS itself; such features are strictly opt-in only.
 
But nevermind that this is a Windows 11 thread, please give us 10 more pages beating-off to Linux fantasies.
You're funny. I'll give you that.

Honestly, I'm not anti-Windows. I would like to see evolution on the desktop, including Windows, but Windows 11 seems like a patch update with some UI changes and not anything innovative.

I think it would be more interesting if there was real competition, whether that is from Apple or Linux, or other things like Chrome OS. I just want to see things move forward and it seems Microsoft is in a similar position to Intel a few years ago (except there is no AMD in the OS-space right now).
 
I see a lot of people here upset that there are a lot of us suggesting a switch to Linux. So here are some photos to support that.

[attached screenshots]
Also Rick uses Debian.
[attached screenshot]
 
Wrong answer. Next!

You haven't seen ftp being installed on remote hardware? It works fundamentally the same, it's just the software that's different.
Uh, no. FTP is a protocol with put/pull, and that's... really it. Cloud storage, in the concepts being discussed here, has locking, sharing, multiple access points simultaneously, active access while stored remotely, remote execution, etc. There's a drastic difference in terms of ~business outcome~, which is why cloud storage drives billions of dollars in revenue, and FTP is just a tool included in every OS. Doesn't matter as much for the home user, at least not the same way, but business activities cannot be simply replaced with FTP - saving a file remotely is different from collaboration tools. Show me how to have 15 people editing an Excel file simultaneously with FTP - I'm curious - because that's a business outcome. Same for source code with git and GitHub (which is a different variety of cloud storage).
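To make the "put/pull and that's really it" point concrete, here's roughly everything FTP gives you, as a minimal Python sketch using the standard library's ftplib - the host, credentials, and filenames are placeholders:

```python
from ftplib import FTP

# Roughly the entire FTP feature set: connect, push a file, pull a file.
# Host, credentials, and filenames below are placeholders, not a real server.
with FTP("ftp.example.com") as ftp:
    ftp.login(user="demo", passwd="demo")

    # "put" - upload a local file
    with open("report.xlsx", "rb") as f:
        ftp.storbinary("STOR report.xlsx", f)

    # "pull" - download it back
    with open("report_copy.xlsx", "wb") as f:
        ftp.retrbinary("RETR report.xlsx", f.write)

# There is no locking, no change notification, no simultaneous editing,
# no sharing permissions - anything like that has to be bolted on elsewhere,
# which is exactly the gap cloud storage products fill.
```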
What's the point of TPM then? If it's useful then I'd imagine Linux would also require it, but last I checked Linux is better at security so what's TPM for?
They choose to make it optional. They can, and do, use it if you want to and it's present. Microsoft made a different choice. Not saying it's the right choice, but I understand the choice and what they're trying to accomplish (at least publicly).
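On a typical Linux box, for instance, you can just check whether the kernel has exposed a TPM and use it if it's there - nothing forces you to have one. A rough sketch (the sysfs/dev paths are the usual kernel interfaces, but treat this as illustrative):

```python
from pathlib import Path

def tpm_present() -> bool:
    """Return True if the kernel has exposed a TPM device (illustrative check)."""
    # The kernel's TPM driver normally exposes devices under /sys/class/tpm
    # (e.g. /sys/class/tpm/tpm0) and character devices like /dev/tpmrm0.
    return any(Path("/sys/class/tpm").glob("tpm*")) or Path("/dev/tpmrm0").exists()

if tpm_present():
    print("TPM found - tooling that can use it (e.g. for disk encryption) may opt in")
else:
    print("No TPM - disk encryption and the rest of the stack still work without it")
```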
This is not the same as requiring TPM and you know that.
And you know that FTP is not the same as cloud enabled filesystems or storage. Dropping support for older architectures for financial reasons is what places that are driven by revenue do. 5 year refresh cycle - Kaby Lake came out in 2016. No business revenue behind it, and consumer revenue is smaller than business (at least those who care - like us - who get their licenses far under MSRP).
Sounds like you're making excuses for Microsoft's decisions to not support hardware that's relatively new. What is the benefit to TPM?
I've already explained this, but I'll try further in my next line:
You calling me an Apple user? You saying I'm a stupid end user who doesn't know how my computer works? What's the point of them needing TPM?
See above. Same reason - separate processor with limited API interaction (at the hardware level) is more secure than a general purpose application that is queried, even at Ring 0, because a writable application can be corrupted - a read only (or limited access) device that just holds keys is more reliable. And no, I'm pointing out that the majority of users won't care, or notice a difference - because they ~won't~. There's a reason that key-exchange systems are still used today - and why most root CAs (the true roots, not the intermediates that provide signing responses) are run the way they are.

Question - you might understand this better if you can answer this: What state is a Root CA kept at? (These are the root CA's that are listed in MSFT and Linux root Cert stores and distributed with the OSes or browsers (Firefox, for one, and I know Chrome was looking at their own root store)).
They're both conceptual ideas, the only difference is that ftp is also a protocol. It's fundamentally the same but cloud storage sounds mysterious and original when it's really not.

This is unfortunately true but Linus seems behind it.

Uh, no. Not quite. One is leveraged into business outcomes and drives revenue - the other is a tool. I sell enterprise IT. No one even thinks about FTP, outside of basic log offload/backup offload from small tools (see: VMware NSX-T offloading config backups to SFTP, other vmware tools, etc that do the same. Small text files. Also support uploads). But people pay literal millions of dollars for OneDrive, Google Drive, DropBox, etc.

I'll watch Linus' video later, but I'm betting from seeing others like it from him (and others) that it's arguing that it's delivering an outcome. Linux is a tool. ChromeOS is sold as an outcome and a result - it just uses Linux to get there.
 
No, I'm not; I deal with the public and their Windows issues - believe me, they care. Contrary to right-wing belief, 100% of the masses aren't stupid or ignorant.
No one said ignorant or stupid - if it runs their programs, doesn't run stuff they don't want (especially malware), and does so at a decent rate and performance, mission done.
 
Well, for the data storage component of cloud services, or in other words Cloud Storage, he really does have a point.

It's still just storing your data on someone else's computer. The cloud model is often more insidious, though: on by default unless you know to turn it off, and it automatically harvests information about you when you use it and sells it on.

There is much, much more to cloud services though. Spinning up VM's or dedicated servers in the cloud to handle an entire platform is the norm these days. It's a little sad if you ask me. I liked knowing that when I used a service, I was using that service, and not AWS indirectly, but these days if you use anything online, chances are it is on AWS, Azure, etc. No one wants to run their own servers anymore, and it is a crying shame.
Business outcomes - tie it straight to revenue, even COGS. Servers and apps deliver a business outcome. Cloud services do so too, and do so very easily. :p That's the problem - why run a DC when you don't need to? Folks do, sure - but lots don't. I sell enterprise IT for on-prem workloads, and even I can see that the majority of the future is cloud-built. Too much overhead if you don't need on-prem.
 