Windows 11 leak reveals new UI, Start menu, and more (UPDATE - added source for Windows 10 retirement date)

Well not to ruin the surprise, but Linux requires a password entry for privilege escalation (in fact, you could say MS stole UAC from Linux).

This is actually a good thing, and one of the main reasons Windows Vista+ was more secure.
 
XP was a huge leap forward.

Essentially we got the protected virtual memory previously only used in NT releases in a consumer OS. It sucked a tiny bit of performance out of the equation, but it made bluescreens caused by non-driver software a thing of the past. Programs could still crash themselves, but they could no longer trample on memory belonging to other programs or the operating system itself, which made it immensely more stable than its consumer predecessors.
Too bad Windows XP was just Windows 2000 plus features Microsoft withheld from 2000. Some people got DX9 working on 2000 as well.
Vista was another huge leap forward.

I know it was really popular to hate on it at the time, but UAC finally brought Unix-style user management and control to Windows, and was huge for increasing the overall security of the operating system. It was a little rough at first, though, as programs designed for XP and earlier often tried to write in places they were no longer allowed, causing UAC to constantly ask for Admin privileges, but once 3rd-party software was updated to place data in the proper user folder locations, it worked very well.
UAC was such a huge leap that I disable it now whenever I can. In Linux you don't have this problem because the only time something asks for root is when something is trying to make changes to the system. Most apps shouldn't need root and luckily they don't.
Honestly, Microsoft's three biggest releases in recent memory - IMHO - were:

- Windows 95 - Brought true 32-bit pre-emptive multitasking and DirectX
- Windows XP - Brought protected virtual memory
- Windows Vista - Brought UAC
I agree with 95 and XP, but Vista is the worst Windows OS, with maybe Windows 8 in second place. Microsoft removed hardware audio acceleration, which made no sense since a lot of sound cards had that feature and suddenly became useless. Vista is the first Windows OS to thrash hard drives, which drove me mad because the hard disk was always working. Vista is when Microsoft started their tick-tock of making a shit OS and then making a less shit OS. Vista: shit. Windows 7 is Vista but less shit. Windows 8: shit. Windows 10 is Windows 8 but less shit. There's a reason I say Microsoft was due for a mistake and Windows 11 would continue Microsoft's every-other-OS-is-shit pattern. It doesn't look shit now, but honestly the OS is far from finished. Microsoft is going to implement something controversial, because of course they will.
 
Much as I dislike UI changes, especially when they seem like they are just fluff, it's not the end of the world. We should be used to it by now with MS, and well, everyone else.
At best this will be wasting a bit of time to figure out how to get the UI looking and working the way you want it to. Or, at least close to how you want it to. At worst, there is always shell replacements, kludgy as that can be.
Annoying, but not world ending.
 
I thought they streamlined UAC with 7 and it’s still there in 10
It is still there, but unless you run legacy software not designed with UAC in mind, it's really unobtrusive these days. It only pesters you about things it should pester you about.

And that's not really because Windows 7, 8, or 10 improved much about it, but more because most of us are predominantly running software designed with UAC in mind these days.
 
Cyberreality is being hyperbolic, but XP was... bad.

I was doing ISP tech support for much of XP's original lifecycle, and when malware like Blaster hit, there was a huge wave of calls from people with infected XP systems. Malware in general was a concern until Microsoft finally clamped down with software updates.

Blaster and other viruses/worms/trojans of similar design are benign compared to broken, shitty Windows 10 updates. Blaster at its worst makes your system unstable or crashes it, because it's basically aiming to turn your computer into a zombie on a botnet. Shitty broken Windows 10 updates trash your computer, sometimes so badly that all you can do is reinstall. Big difference. These updates are not optional either, like they were in the past. I'd much rather be removing Blaster-type viruses than dealing with the bullshit that is Windows 10.

Windows 10 may be better protected than XP from viruses, but as security has stepped up on the OS side, so has it on the black-hat side. Viruses today are orders of magnitude more dangerous than back in the XP era. There may be fewer infections, but those that do happen are of the worst kind, like cryptographic ransomware, where there's almost nothing you can do to fix it but completely flatten everything and start over. Getting people to keep backups is like pulling teeth, and it takes an event like that for them to take it seriously.
 
Much as I dislike UI changes, especially when they seem like they are just fluff, it's not the end of the world. We should be used to it by now with MS, and well, everyone else.
At best this will be wasting a bit of time to figure out how to get the UI looking and working the way you want it to. Or, at least close to how you want it to. At worst, there is always shell replacements, kludgy as that can be.
Annoying, but not world ending.
Problem I have with the new UI is that it's so jarring. Windows has always flowed from left to right. And then there's the fact that the Start menu takes up 75% of your screen when you open it.
 
Problem I have with the new UI is that it's so jarring. Windows has always flowed from left to right. And then there's the fact that the Start menu takes up 75% of your screen when you open it.
I get it, but we don't know how much control we are going to have over that quite yet. We have some pics and some speculation. While expecting the worst is probably not out of line when it comes to MS and UI changes, I am willing to wait a little bit before I start the yelling.
 
I don't know what you are talking about. Laptops are somewhere around 66-70% of all PC sales... Touch screens are awesome on laptops once you get used to them. I cannot even tell you how many times I have had to clean my MacBook Pro screen because my instinct had switched to using the screen on a laptop. It really is just better once you get used to it.
I don't know what you are talking about. I buy infinitely more laptops than I buy PCs, since I've never purchased a pre-built in my entire life, and I'm on my fifth laptop at least. Still, the time I've spent on laptops in total amounts to less than 0.01% of the time I've spent on a PC.
Also, a lot of people use laptops as portable workstations, meaning they slap them on a docking station at work, then on a docking station at home, and use them with external displays, keyboards, and mice all the time.

As for touchscreens, no and no. If you want to use the thing for the same things as a phone, maybe. For any kind of work or gaming it's a big hell-no. The touchscreen on my workplace portable PC has been broken for five years and it's never bothered me the least bit, because it is stupid. The only thing you get are smudges on the screen, which are especially great when you are trying to read it in the sunlight.
 
Depends on the game. Visual novels are actually really good for touch screens, played some on my 2-in-1 and it works well (since you are basically just clicking the screen and reading text, it is better than using a mouse).
 
XP wasn't bad at all and was probably Microsoft's best OS ever. It's just that XP lasted far too long, and Microsoft was trying to push people onto Vista and Windows 7 to the point where Microsoft started to intentionally ignore security fixes on XP. Also, Microsoft was ill-prepared for the popularity of XP and the amount of attention it would bring in terms of viruses and malware. XP is bad by today's standards, but back in the early 2000s it was amazing.
XP was good from a UI and infrastructure perspective, but from a security perspective it was terrible even at the time. Remember, Blaster surfaced in 2003... that's just two years into XP's lifecycle. Even if Vista had followed Microsoft's relatively quick revision schedule, it wouldn't have been ready.

A lot of it stemmed from bad OS policy. Letting people do virtually anything on a machine without asking for a password, so long as they're an admin (which is the default)? Yeah, that's bad. Making security-related updates entirely optional, and releasing them relatively infrequently? That's pretty lousy. And then there's ActiveX... gee, let's give web plugin technology direct access to your system, what could go wrong?

I got my first Mac in 2002... and OS X was already regularly asking for your admin password and otherwise locking down key parts of the system. You think XP was great security-wise only because you've been isolated from other experiences for so long.
 
UAC's biggest weakness is that anyone could just click OK and it went away; the exception was a non-admin user on a domain. The (real) admin, aka 'root', should NEVER be signed in, and only that password should be used to acknowledge UAC prompts. UAC is easily bypassed by malware because of this. *NIX does it right.
 
XP was good from a UI and infrastructure perspective, but from a security perspective it was terrible even at the time. Remember, Blaster surfaced in 2003... that's just two years into XP's lifecycle. Even if Vista had followed Microsoft's relatively quick revision schedule, it wouldn't have been ready.
XP was good for a number of reasons, again based in that time period. It used the same NT kernel as Windows 2000, which at the time was a leap forward in stability and security. It was compatible with Windows 95/98 drivers, and even allowed a lot of DOS stuff to still work. It carried a lot of the good ideas from Windows ME over into XP. Yeah, security sucked because of a lot of that backwards compatibility, but that's the price you pay for usability.
A lot of it stemmed from bad OS policy. Letting people do virtually anything on a machine without asking for a password, so long as they're an admin (which is the default)? Yeah, that's bad. Making security-related updates entirely optional, and releasing them relatively infrequently? That's pretty lousy. And then there's ActiveX... gee, let's give web plugin technology direct access to your system, what could go wrong?
ActiveX was stupid but the rest is just Microsoft assuming the user had a brain. I don't like how Windows 10 forces updates which is why I disable it. I'll update when I feel like it. I like having admin access because I don't want to deal with permissions. For a guy like me, that's no big deal but most people are going to download a lot of crap that will infect their machines and cause problems.
I got my first Mac in 2002... and OS X was already regularly asking for your admin password and otherwise locking down key parts of the system. You think XP was great security-wise only because you've been isolated from other experiences for so long.
If I had used a Mac in 2002, I'd have been extremely limited in what I could do compared to an XP machine of that same era. If you were a Mac user back in 2002, then I'm sorry you missed out on a lot of great things. I use Linux today and it works the same way, but I'm not missing out on anything like Mac users are. Windows XP is horrible by today's standards, but we aren't using XP today. Windows 10 by Linux standards is behind, as Microsoft needs to do a lot of the things Linux does, like a repository system, which I hear they're doing, and updates that don't automatically install and reboot your system. Linux in most cases doesn't require a reboot when updates are applied. Linux also updates all software, which Windows is still struggling to accomplish. Mac OSX is behind by Linux standards as well. Besides not having a repository system and not updating all software, you don't have Vulkan support, which is a big deal in my opinion. Mac isn't as secure as Linux, isn't open source, and isn't usable on other hardware. Windows isn't open source either, but at least I can install Windows on hardware besides Microsoft's.
 
UAC's biggest weakness is anyone could just click OK and it went away, exception was a non admin user on a domain.
That's not how my installs work. I set up the system with an admin account, then I add the standard account; forget to install Zoom/VLC/Office/etc. and it prompts for the admin password. I was cloning systems on Thursday and just had to rename the machines; it prompted for the admin PW every time.
 
That's not how my installs work. I set up the system with an admin account, then I add the standard account; forget to install Zoom/VLC/Office/etc. and it prompts for the admin password. I was cloning systems on Thursday and just had to rename the machines; it prompted for the admin PW every time.
You can do that with Windows... it's just not the norm, as it's not the default.
With *nix OSes, using the admin account as a regular account or giving user accounts full admin privileges is never the standard setup.
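To make that concrete, here's a sketch of the typical *nix setup on a Debian/Ubuntu-family system (the username is a placeholder, and on Fedora/RHEL the admin group is `wheel` instead of `sudo`):

```shell
# Your day-to-day account is unprivileged; you escalate per action.
# Add a user to the sudo group (Debian/Ubuntu convention):
sudo usermod -aG sudo alice   # "alice" is a placeholder username

# Privileged actions now prompt for alice's OWN password, not root's:
sudo apt update

# The root account itself can stay locked so nobody signs in as it:
sudo passwd -l root
```

The point is the same one UAC aims at: the default account can't silently change the system, and escalation is an explicit, per-action event.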
 
I use Linux as my primary OS and I find Wine to be a disaster. Almost nothing works in it, and it is brutally painful to set up.
have you tried bottles ?

https://flathub.org/apps/details/com.usebottles.bottles

https://github.com/bottlesdevs/Bottles




Not just for productivity software.



Lots of options there. I believe the nature of Bottles in combination with Flatpak isolates Wine better than just running vanilla Wine and having it shit up the rest of your Linux install.

maybe give it a whirl.
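If anyone wants to try it, the Flathub listing above installs with the usual Flatpak commands (assuming Flatpak itself is already set up on your distro; the app ID comes from the link above):

```shell
# Add the Flathub remote if it isn't configured yet:
flatpak remote-add --if-not-exists flathub https://flathub.org/repo/flathub.repo

# Install Bottles from Flathub:
flatpak install -y flathub com.usebottles.bottles

# Launch it:
flatpak run com.usebottles.bottles
```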
 
Windows 11 doesn't seem to be much of an upgrade from 10. I just feel like they took what was supposed to be 10X and newer builds and renamed it to 11. Perhaps they will move back to the original licensing system to make more $$$. They did give 10 away for free lol

Vista was great, but my first experience with it was on a brand new PC. I still think that Vista had the best user interface out of every version of Windows. 7 kind of went back to the XP look to intrigue the XP holdouts, but 7 is also where all the dumbing down of the UI and low-level function started happening.
I totally agree with this. I had bought a brand new MacBook Core Duo or Core 2 Duo at the time and Vista ran great in Boot Camp. My desktop had a Core 2 Quad and again Vista ran great. I might have had a few driver issues (only because manufacturers didn't update their drivers early enough), but otherwise it was my favorite OS looks-wise and still is.

Vista just required good hardware and for most people, that was not the case. Most people had like 512MB RAM when 1GB-2GB would have been preferred.
 
XP was good for a number of reasons, again based in that time period. It used the same NT kernel as Windows 2000, which at the time was a leap forward in stability and security. It was compatible with Windows 95/98 drivers, and even allowed a lot of DOS stuff to still work. It carried a lot of the good ideas from Windows ME over into XP. Yeah, security sucked because of a lot of that backwards compatibility, but that's the price you pay for usability.
Eh... while I definitely appreciated the backwards compatibility at the time, in hindsight Microsoft could have done more to balance that with security.


ActiveX was stupid but the rest is just Microsoft assuming the user had a brain. I don't like how Windows 10 forces updates which is why I disable it. I'll update when I feel like it. I like having admin access because I don't want to deal with permissions. For a guy like me, that's no big deal but most people are going to download a lot of crap that will infect their machines and cause problems.
Nah, I don't think Microsoft was assuming people "had a brain." It was incorrectly assuming that everyone is a tech enthusiast who understands the importance of timely security updates and OS permissions. And that's not how the real world works. A doctor is a very intelligent person, but I wouldn't expect them to actively maintain their computer as well as people in a forum like this. Automating and mandating certain things makes sure people can focus on the things they really care about.
If I had used a Mac in 2002, I'd have been extremely limited in what I could do compared to an XP machine of that same era. If you were a Mac user back in 2002, then I'm sorry you missed out on a lot of great things. I use Linux today and it works the same way, but I'm not missing out on anything like Mac users are. Windows XP is horrible by today's standards, but we aren't using XP today. Windows 10 by Linux standards is behind, as Microsoft needs to do a lot of the things Linux does, like a repository system, which I hear they're doing, and updates that don't automatically install and reboot your system. Linux in most cases doesn't require a reboot when updates are applied. Linux also updates all software, which Windows is still struggling to accomplish. Mac OSX is behind by Linux standards as well. Besides not having a repository system and not updating all software, you don't have Vulkan support, which is a big deal in my opinion. Mac isn't as secure as Linux, isn't open source, and isn't usable on other hardware. Windows isn't open source either, but at least I can install Windows on hardware besides Microsoft's.
Nah, not really. Using a Mac in 2002 wasn't so hot as far as browsers go (Safari didn't show up until January 2003, and Chrome didn't reach any platform until 2008), but I wasn't really hurting for software or functionality outside of games. I'd also dispute the claims that Linux is more secure than macOS (where's your source for this?), and the Mac is currently sitting in a weird middle ground where some apps do have a central repository for seamless updates thanks to the Mac App Store. Vulkan? It'd definitely be nice, but Metal fills the gap in some cases.

I got into Macs because the laptop hardware was, for the longest time, just overall better. Windows laptops circa 2002 were generally hot garbage, especially at the low end... they were enough to make me switch to a PowerBook G4 after buying a supposedly decent Toshiba just months earlier. And both the hardware and software were better in virtually every way, at least for my purposes.
 
I don't want to be prompted to use or sign up for a Microsoft account whenever I install Windows or create an account.

I don't want Cortana.

I don't want Microsoft suite of software.

I don't want any of the features of my operating system to reside in the cloud (like parental controls which used to be built in locally in Win7, but now require two cloud accounts to use)

I don't want Microsoft's pre-installed browser.

I don't want Microsoft's pre-installed OneDrive

I don't want Xbox integration, especially not on a work machine. That is just embarrassing.

I don't want their TV and Movies system nor Groove Audio service.

I don't want phone style apps like camera, clock, calendar, etc.

I don't want various keyloggers or telemetry to be sent anywhere. I don't want it to contact any other system on the LAN or WAN without me explicitly telling it to. If I plug it into a network, there should not be a single packet coming in or leaving the machine unless I manually and intentionally initiate it. (Or intentionally set up my own automation.)

I don't want a pre-installed Microsoft store.

I don't want News and Interests popping up on my menu bar one day without asking for it.

The list goes on and on and on.

I feel like I am playing whack-a-mole to get rid of Microsoft garbage I don't want on a regular basis and it is very frustrating.

For crying out loud, at least let me say "no to everything" in a convenient way that sticks across 6 month updates.

I want just the operating system. I may CHOOSE to install other Microsoft software (like the Office Suite) but that should be a choice. Ideally no other software or applications should be installed by default, and they DEFINITELY should be able to be removed, and not just hidden, but the binaries gone from my system so that they don't reinstall themselves when anyone creates an account.

Some of these things can be removed with registry and other tweaks, but these are unreliable and some can't be removed at all.

This is the ecosystem I don't want that is getting forced on me.

what about LTSC?
 
have you tried bottles ?

https://flathub.org/apps/details/com.usebottles.bottles

https://github.com/bottlesdevs/Bottles



Not just for productivity software.


Lots of options there. I believe the nature of Bottles in combination with Flatpak isolates Wine better than just running vanilla Wine and having it shit up the rest of your Linux install.

maybe give it a whirl.

Installed the *.deb file and it will not open.
 
Nah, not really. Using a Mac in 2002 wasn't so hot as far as browsers go (Safari didn't show up until January 2003, and Chrome didn't reach any platform until 2008), but I wasn't really hurting for software or functionality outside of games.
Browsers being hot in 2002? As far as I remember, you either used Internet Explorer or you didn't use the internet. It wasn't until Firefox came along that we got features like a pop-up blocker and addons, which in my opinion remade the internet. I don't get the obsession with Chrome, as Chrome not only didn't do anything unique but has practically tried to kill ad blockers. Also, the G3 and G4 processors were nothing compared to the Athlon 64s of that era. You missed out on something great if you used Mac OSX only in 2002.
I'd also dispute the claims that Linux is more secure than macOS (where's your source for this?),
Ask Apple, as they use Linux for their servers. What do you think Apple uses for servers? It isn't Mac OSX, that's for certain. Microsoft, Apple, everyone uses Linux for their servers. That's like Coca-Cola not using their own Dasani water brand in their soft drinks because they know it's too shitty. What does that tell you about Microsoft Windows and Mac OSX compared to Linux, when they don't even use these OSes to secure their own server data?
and the Mac is currently sitting in a weird middle ground where some apps do have a central repository for seamless updates thanks to the Mac App Store.
The same goes for Windows, but it isn't something all app developers can use freely. On Linux I can use a PPA for Dolphin or Kodi, created by the developers, to ensure I always get the latest updates. You can go a bit further and use Arch Linux, which updates everything all the time. Arch is like Linux but on speed.
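For anyone who hasn't used one, the PPA workflow on Ubuntu-family distros looks roughly like this; the PPA and package names below are placeholders for illustration, not the actual Dolphin or Kodi ones:

```shell
# Add the project's PPA (placeholder name for illustration):
sudo add-apt-repository ppa:example-team/example-app
sudo apt update

# Install it; future updates now arrive through the normal updater:
sudo apt install example-app
```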
Vulkan? It'd definitely be nice, but Metal fills the gap in some cases.
Metal creates a divide among developers because Apple wants them to learn how to code for their platform and their platform only. So developers have a few choices: learn Metal as well as Vulkan and DX12; learn Metal only, because nobody has time for all of that; or avoid Metal entirely by using a wrapper like MoltenVK, which slightly decreases performance. It makes no sense to have Metal when Vulkan is open and Apple could contribute the changes they want, like everyone else does.

The creation of Metal is just a method to pressure developers, because end users aren't exactly dealing with the API situation directly. Apple is trying to pull a Sony PS3 by making something complicated in hopes that developers will only develop for their platform, because nobody has the resources to learn everything like Vulkan and DX12. Apple thinks they have enough market share that developers will have to learn Metal and hopefully no other API. It's another dick move by Apple that doesn't help the API situation whatsoever, and just hurts end users in that they either don't get software or get inferior software.
 
I thought "10" was the last moniker they were going to use? Are they doing this because Apple switched to 11? o___O
My theory: Microsoft just realized this was an untenable promise. At a certain point, Windows 10 was going to seem old both from technical and branding perspectives — were you really going to be using Windows 10 in 2040? Microsoft is realizing that there will still be major milestones, and that it's worth bumping the version number up a notch now and then.
 
Yeah, the Vulkan thing really upsets me. It's a great API, with broad industry support, and Apple refuses to implement it. It's the Apple way or the highway.

You want to code 3D graphics, you need to use their graphics API, their programming language, you have to use macOS on an Apple computer, the code needs to be signed by Apple to run. It sucks.

From my Linux machine, I can export code for Windows, Linux, Android, cross-platform for WebGL, etc. all from the same code just clicking another button.

But if I want macOS or iOS support, I need to buy a whole other computer, pay for a developer account, potentially use a different language and API, etc. No wonder there are not many games for Mac. They are actively hostile to game developers and to open ecosystems in general.

As much as I don't really like Windows anymore, Microsoft does do developers a solid and they provide tools and APIs mostly for free, and allow things like cross-compiling (for example exporting a Win32 app from Linux).

If Apple somehow "won" on the desktop, it would be one of the worst things to happen. Though, at least they do respect privacy and (AFAIK) aren't putting key-loggers in their OS or other BS Microsoft seems to like.
 
My theory: Microsoft just realized this was an untenable promise. At a certain point, Windows 10 was going to seem old both from technical and branding perspectives — were you really going to be using Windows 10 in 2040? Microsoft is realizing that there will still be major milestones, and that it's worth bumping the version number up a notch now and then.
They could just call it "Windows" :eek:
 
Browsers being hot in 2002? As far as I remember, you either used Internet Explorer or you didn't use the internet. It wasn't until Firefox came along that we got features like a pop-up blocker and addons, which in my opinion remade the internet. I don't get the obsession with Chrome, as Chrome not only didn't do anything unique but has practically tried to kill ad blockers. Also, the G3 and G4 processors were nothing compared to the Athlon 64s of that era. You missed out on something great if you used Mac OSX only in 2002.

Netscape Navigator Gold! ;-)
 
If Apple somehow "won" on the desktop, it would be one of the worst things to happen.

I guarantee that won't happen. The average age of a small-business desktop PC is probably over 10 years at this point. Tons of people are still milking 15-year-old Core 2-era PCs, some even older. The whole idea of buying an Apple PC where you basically use it for a few years and then toss it in the trash is fundamentally incompatible with the way businesses work in the real world.

Of course, that isn't a problem if you're a hipster with too much money who chooses their personal computer based on the same criteria a woman uses to choose her earrings.
 
Well it depends what industry. Some places I've worked they basically standardized on Mac (since artists and designers tend to like them) and then me as a programmer got stuck using an iMac.

It was fine, I can use whatever OS no problem, and most of the stuff I do is all cross-platform tools anyway, but I really just can't get down with Apple.
 
I dunno, part of me expects power users and enthusiasts to ditch the default Start menu and run a better shell entirely after installing Windows.
I know I'm going to be running Open Shell (if it works on 11) and re-enable the Quick Launch.
 
I don't know what you are talking about. I buy infinitely more laptops than I buy PCs, since I've never purchased a pre-built in my entire life, and I'm on my fifth laptop at least. Still, the time I've spent on laptops in total amounts to less than 0.01% of the time I've spent on a PC.
Also, a lot of people use laptops as portable workstations, meaning they slap them on a docking station at work, then on a docking station at home, and use them with external displays, keyboards, and mice all the time.

As for touchscreens, no and no. If you want to use the thing for the same things as a phone, maybe. For any kind of work or gaming it's a big hell-no. The touchscreen on my workplace portable PC has been broken for five years and it's never bothered me the least bit, because it is stupid. The only thing you get are smudges on the screen, which are especially great when you are trying to read it in the sunlight.
I get it, you are the embodiment of the power user and [H]ardcore stud. The first computer I built for myself was an Epson, so believe me, I have been doing this for a long time too and I understand the feeling. But you have to understand that a laptop is a PC, and the very vast majority of systems are for work and/or office, web browsing, email, Facebook, Zoom, etc. Janet from accounting does not need a custom-built, overclocked, water-cooled PC. She needs a Dell i3 with a big monitor and her third keyboard replaced, because nobody is cleaning that thing. But post-COVID, if it has not happened already, she is most likely going to get a laptop with a docking station. The vast majority of users are Janets, and most companies are in the same boat as the Janets. As more things become web-based, tech like this is going to become more prevalent, because it works and genuinely provides a boost to productivity in the settings it was designed for (the number of which is growing rapidly). Most of us in the 1% of users that make up the power-user category have not had much use for it. I thought it was worthless until I became a student again and the value became very evident to me.

Ultimately, I do not agree with a TON of things Microsoft has done or changed. Probably the majority of changes they have made I did not agree with at the time, or still do not agree with. But you can hardly blame Microsoft for adapting to the market forces being placed on them. For better or worse, this is where the market is moving.
 
UAC's biggest weakness is anyone could just click OK and it went away, exception was a non admin user on a domain.
The domain part is not true, non-admin users are required to provide admin credentials to UAC even when not on a domain.
That's not how my installs work. I set up the system with an admin account, then I add the standard account; forget to install Zoom/VLC/Office/etc. and it prompts for the admin password.
The Administrator account is disabled by default, you cannot enable it during install, only afterwards using obscure NET USER commands or going like 8 levels deep into the classic Control Panel. That means you need to create at least one user account which is admin and gets only the OK prompt from UAC (you can later delete this account or drop to standard user).
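For the record, the NET USER route is short, even if it's not discoverable; a sketch of the usual commands from an elevated PowerShell:

```shell
# Enable the built-in Administrator account (disabled by default):
net user Administrator /active:yes

# Set a password for it (you'll be prompted; don't leave it blank):
net user Administrator *

# Disable it again when you're done:
net user Administrator /active:no
```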
 
The Administrator account is disabled by default, you cannot enable it during install, only afterwards using obscure NET USER commands or going like 8 levels deep into the classic Control Panel. That means you need to create at least one user account which is admin and gets only the OK prompt from UAC (you can later delete this account or drop to standard user).
lol, I don't need a piss-poor explanation of how it works, thanks. You can enable the "Administrator" account via Computer Management (hint: right-click on the Start menu) really easily; it's not "obscure NET USER commands or going like 8 levels deep."
 