Why Windows Vista Sucked

FrgMstr

While we try to stay away from bringing up bad memories, Terry Crowley over at Hackernoon has a very good article entitled What Really Happened with Vista. Before you start self-cutting again, Terry's take on this is much more specific than any I have read in the past, and much more technical. While this story is a decade old, his writing sheds light on many issues that still need to be considered today when it comes to our hardware and software.

This is also not investigative journalism. I have not done a deep series of interviews with key parties. This is my point of view based on what I experienced during that period and learned afterwards. Although I was in the Office organization at the time, I worked closely with many Windows partners and so was very aware of what was happening in the Windows organization.
 
Vista is 95% the same as Windows 7, so I really disagree that it sucked. It was just a bit before its time, with some poorly tuned features like an over-aggressive UAC. It was the first viable 64-bit version of Windows, and by the time Windows 7 arrived everyone I know had already switched to Vista 64 as their primary gaming OS, with only a few still dual-booting XP.
 
I can sum it up in four words: Because it's not Linux.

Or if you prefer: Because it's not Mac.
Or if you're a sadist: Because it's not XP.
 
This is what I remember from when it was initially released. Keep in mind most of this was fixed later.

#1 File copying was slow. Anyone remember this?

#2 Video drivers for Vista sucked. Performance was noticeably worse.

#3 Driver compatibility issues. Lots of hardware wouldn't work because nobody updated their drivers for Vista.

#4 Hardware sound acceleration was removed, which meant if you had a hardware sound card it just became useless. Also, Creative charged money for their Alchemy software. The dicks.

By Windows 7 they had patched things so basic operations like file copying were faster. Video drivers matured to the point where Windows XP and Vista performance was nearly equal. Hardware that wouldn't work in Vista was eventually replaced with newer hardware that had working Vista drivers, and therefore working Windows 7 drivers. Nobody has sound cards with 3D sound acceleration anymore; they were phased out.
 
Interesting article, though I'll have to read most of it later. It doesn't look to me like it's saying that Vista sucked, and I notice that one of the article's main points is also one of the common reasons given for why Vista did not suck, but was just presented to a technologically naive and unprepared market:

Microsoft badly misjudged the underlying trends in computer hardware, in particular the right turn that occurred in 2003 to the trend of rapid improvements in single-threaded processor speed and matching improvements in other core elements of the PC. Vista was planned for and built for hardware that did not exist. This was bad for desktops, worse for laptops and disastrous for mobile.
 
Vista is 95% the same as Windows 7, so I really disagree that it sucked. It was just a bit before its time, with some poorly tuned features like an over-aggressive UAC. It was the first viable 64-bit version of Windows, and by the time Windows 7 arrived everyone I know had already switched to Vista 64 as their primary gaming OS, with only a few still dual-booting XP.
UAC wasn't that aggressive; people just forget that most programs at the time were poorly coded for XP, where even your instant messenger needed admin access, etc. Poorly designed programs led to UAC popping up for everything, and that, coupled with the instability or unavailability of drivers, led to most of the complaints.
 
I agree with most of the sentiments above. It wasn't necessarily that Vista "sucked" but more that it was ahead of its time. The hardware wasn't there yet and the driver support wasn't there yet. Since they changed the kernel so much from XP, there was a very large number of incompatibilities with software and hardware. I think Windows 7 was so well received because, like someone else said, they had already fixed these issues by then. It was virtually the same as Vista with only minor tweaks.
 
#3 Driver compatibility issues. Lots of hardware wouldn't work because nobody updated their drivers for Vista.
By Windows 7 they had patched things so basic operations like file copying were faster. Video drivers matured to the point where Windows XP and Vista performance was nearly equal.

One important thing to remember is that Vista basically enabled the transition from 32-bit to 64-bit. It wasn't just a matter of drivers having time to mature. With XP, 32-bit was the only version that was relevant. Hardware makers initially wanted to make only 32-bit Vista drivers for their hardware, similar to how most only had 32-bit XP drivers. Microsoft decided that the only way a driver could get WHQL certified was if hardware makers made both a 32-bit AND a 64-bit version of their Vista drivers. That basically forced 64-bit drivers to be made, and ultimately created a swath of stable 64-bit drivers by the time Win7 arrived. Windows 7 would have been a catastrophe if it had not been able to build upon the 64-bit driver foundation that Vista created.

#4 Hardware sound acceleration was removed, which meant if you had a hardware sound card it just became useless.

EAX was "removed", but OpenAL was created, which was better in just about every way. I'm still not sure, to this day, why game developers never really adopted OpenAL, but I don't think Microsoft can be blamed for this. Also, hardware processing in cards like the X-Fi is still used for features like CMSS-3D (up-mixing stereo sources into surround, and down-mixing surround sources into stereo).
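
As a point of reference (my own sketch, not from the thread): the OpenAL 1.1 API that replaced the EAX-style hardware path is a plain software interface, so mixing happens in the library or driver rather than on dedicated DSP hardware. A minimal playback example, assuming the OpenAL SDK headers and library are installed, looks roughly like this:

```cpp
// Minimal OpenAL 1.1 playback sketch: open the default device, upload one
// second of a 440 Hz tone, and play it. Error handling omitted for brevity.
#include <AL/al.h>
#include <AL/alc.h>
#include <cmath>
#include <vector>

int main() {
    ALCdevice* device = alcOpenDevice(nullptr);            // default output device
    ALCcontext* context = alcCreateContext(device, nullptr);
    alcMakeContextCurrent(context);

    // Synthesize 1 second of a 440 Hz sine wave: 16-bit mono at 44.1 kHz.
    std::vector<short> samples(44100);
    for (std::size_t i = 0; i < samples.size(); ++i)
        samples[i] = static_cast<short>(
            32000 * std::sin(2.0 * 3.14159265358979 * 440.0 * i / 44100.0));

    ALuint buffer = 0, source = 0;
    alGenBuffers(1, &buffer);
    alBufferData(buffer, AL_FORMAT_MONO16, samples.data(),
                 static_cast<ALsizei>(samples.size() * sizeof(short)), 44100);
    alGenSources(1, &source);
    alSourcei(source, AL_BUFFER, static_cast<ALint>(buffer));
    alSourcePlay(source);                                  // mixed in software/driver

    ALint state = AL_PLAYING;                              // crude wait for completion
    while (state == AL_PLAYING)
        alGetSourcei(source, AL_SOURCE_STATE, &state);

    alDeleteSources(1, &source);
    alDeleteBuffers(1, &buffer);
    alcMakeContextCurrent(nullptr);
    alcDestroyContext(context);
    alcCloseDevice(device);
    return 0;
}
```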

UAC wasn't that aggressive

From an IT or power-user perspective, I would agree. But from the perspective of a standard user coming from XP, it was a difficult transition, and it ultimately became one of the main complaints about Vista. If that many people complain, to the point where it's hurting the reputation of the OS itself, then yes, it's too aggressive. Thankfully people got used to it eventually.
 
UAC wasn't that aggressive; people just forget that most programs at the time were poorly coded for XP, where even your instant messenger needed admin access, etc. Poorly designed programs led to UAC popping up for everything, and that, coupled with the instability or unavailability of drivers, led to most of the complaints.

It was aggressive until a person figured out how to tone it down or disable it. After my first fresh Vista installation (and again in recent years when I re-installed Vista for fun), UAC prevented me from installing the programs I use - no permission prompt or simple warning, just outright denying installation with no hint on how to override the block and install anyway.

At the time of Vista's release, I hadn't ever encountered something like that, because PC functionality had always been about doing absolutely everything the user thinks to do. Encountering something in the OS that would not allow me to do something was a concept I hadn't yet developed, and I couldn't figure out what was going on. I think there wasn't even a UAC popup explaining what was happening; certain software just wouldn't run, or would fail silently while an asterisk sound played, with no indication of what it meant.

Searching up solutions online was also a concept I hadn't fully developed at that time, because I had always learned everything through experience or by asking friends. I recall being frustrated with UAC's effect on Vista until I looked the issue up online and found the solution. Then I just completely shut off UAC, and it's been fully disabled on my PCs ever since - if I run an app on my PC, it's because I know why I'm running it and want it to run, and will have it run one way or the other lol.
 
It may have been bad initially but after being patched up and manufacturers releasing drivers for hardware, Vista worked well and was almost the same thing as Windows 7.
I don't think I've ever seen a fully patched-up version of Vista in the wild; it always broke before that ever happened.
 
(and again in recent years when I re-installed Vista for fun),

Is this a cry for help?

If I run an app on my PC, it's because I know why I'm running it and want it to run, and will have it run one way or the other lol.

It's not really about protecting the PC from you per se. It's about actions taken on your behalf without your knowledge. Like when you run an installer for a chat program that wants admin access so it can install a keylogger in the background... under XP it just worked. With UAC, it at least presents what the program is asking for, access-wise, letting YOU be the one to make the call. You still get to be a bonehead and install that p0rn downloader from warezsite#3204823 if you want, or that chat program from majorcompany.

I will agree, though, that their implementation of it just blew blue donkey balls...
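
For the curious, the plumbing behind those prompts is at least queryable: installers declare requireAdministrator in their application manifest, which is what triggers the consent dialog. A minimal sketch, using the standard Win32 token query that has existed since Vista, of how a program can check whether it is currently running elevated:

```cpp
// Hedged sketch: query the current process token for its UAC elevation state.
// Uses only documented Win32 APIs (OpenProcessToken / GetTokenInformation).
#include <windows.h>
#include <stdio.h>

int main() {
    HANDLE token = NULL;
    TOKEN_ELEVATION elevation = {0};
    DWORD returned = 0;
    BOOL elevated = FALSE;

    if (OpenProcessToken(GetCurrentProcess(), TOKEN_QUERY, &token)) {
        if (GetTokenInformation(token, TokenElevation, &elevation,
                                sizeof(elevation), &returned))
            elevated = elevation.TokenIsElevated;   // nonzero past a UAC prompt
        CloseHandle(token);
    }
    printf("Running %s\n", elevated ? "elevated (past a UAC prompt)"
                                    : "as a standard user");
    return 0;
}
```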
 
It sucked up until SP1...then it was fine after that, but the damage was done.
 
Note: the author is a programmer and not a technician. My experience is that many programmers are in many ways technically illiterate. I found that a properly configured Vista machine was far more stable and robust than an XP machine. Vista was by and large Windows 7 First Edition. I was blown away the first time a sound driver crashed in the middle of a game and I only had to Ctrl-Alt-Del to Task Manager, end the game, and restart it without rebooting Vista. I have had far fewer problems with Vista than XP.

In 2006 I went to work for a company that was having major computer problems. I upgraded (not replaced) all the computers, moved them from XP to Vista, and all the problems went away.

If you tried to run Vista without enough memory you would have problems. If you tried to run Vista with cheap hardware and poorly written drivers you would have problems. If you had problems with Vista, then your Computer Hardware Kung Fu was very weak...
 
I used Vista from the time it was released until the time Win 7 rolled out. It had a few issues here and there, but I don't remember it being any more painful than XP or the early days of Win7 for that matter.
 
80% driver issues - vendor fault mostly
10% UAC - just turn it off
9% slow file copy - which did indeed suck
1% misc BS

Overall though it was much better than XP, even 64-bit XP.
 
I enjoyed Vista Ultimate 64-bit more than XP Pro 64-bit... especially since the Wi-Fi handling/configuration was far and away better in Vista than it ever was in XP. Vista ended my despair of having to continually disconnect/reconnect my USB wireless adapter or reboot numerous times a day just to get it to sync with my router.

The most notable negative memories I have of Vista were not with Vista itself, but with a lot of the add-in hardware manufacturers refusing to publish Vista drivers for existing hardware - though they were sure on the ball with drivers for their shiny new versions.

They blatantly used an OS as a mechanism for forced obsolescence...and I haven't bought any hardware from many of the offenders since I was on XP.

Shitty part is that this exact crap continues today: Intel being another culprit with their "Win10 only" requirement for their latest chipsets and processors. Not that it directly affects me, since I am loving Win10 and not looking back... but I disagree with the practice, especially since I work in a corporate IT department with active revision control in place for things like OSes and certain software applications. It forces companies to extend the field life of computing assets, risking potentially costly break/fix incidents, or to foot a humongous bill validating the latest hardware and OS to ensure proper end-user functionality with existing applications (many of which may not even operate as needed with Win10).
 
article said:
The bet on C# and managed code was poorly motivated and poorly executed. This failure in particular can be laid directly on Bill Gates and his fruitless Holy Grail effort to create a universal storage and universal canvas applications infrastructure. This had especially long-running consequences.

Fully disagree. .NET was the answer to Java. And while it isn't nearly as powerful as C++ in terms of raw speed (especially with random access over large memory), it is very easy to write code for, and you can develop an application quicker and with fewer bugs than in C++.

Also, the libraries are very generic and template-driven, like the STL; this limits their efficiency and requires specialty libraries for things like faster database access. Home-rolling your own specialty classes for specialty algorithms (heap sort/bin sort, etc.), memory access, or complex data structures (B-Trees) isn't nearly as easy as in C++.

But .NET has made some good progress in performance, and I even see DX games written in .NET (although I still think C++ is better in terms of performance). And I find .NET apps to be more stable and cleaner overall.
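
To make the home-rolling point concrete, here is a toy example of the kind of thing being described (my own illustration, not from the post): a hand-written, in-place heap sort in C++, with full control over layout and no allocations at all:

```cpp
// Toy hand-rolled heap sort over a raw array: the kind of low-level control
// the post argues is more natural in C++ than in managed .NET code.
#include <cstddef>
#include <utility>

// Restore the max-heap property for the subtree rooted at `root`;
// `end` is the exclusive boundary of the heap within the array.
void sift_down(int* a, std::size_t root, std::size_t end) {
    while (2 * root + 1 < end) {
        std::size_t child = 2 * root + 1;
        if (child + 1 < end && a[child] < a[child + 1]) ++child;  // larger child
        if (a[root] >= a[child]) return;
        std::swap(a[root], a[child]);
        root = child;
    }
}

void heap_sort(int* a, std::size_t n) {
    if (n < 2) return;
    for (std::size_t i = n / 2; i-- > 0; )   // heapify bottom-up
        sift_down(a, i, n);
    for (std::size_t end = n - 1; end > 0; --end) {
        std::swap(a[0], a[end]);             // move current max to its final slot
        sift_down(a, 0, end);                // re-heapify the shrunken prefix
    }
}
```

In .NET you would typically reach for the built-in generic collections and Sort() instead, trading that control for convenience.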
 
80% driver issues - vendor fault mostly
10% UAC - just turn it off
9% slow file copy - which did indeed suck
1% misc BS

Overall though it was much better than XP, even 64-bit XP.

The driver issue was described in the article. Ultimately it was Microsoft's fault for somewhat unnecessarily moving the driver layer outside of the kernel, and for doing so late in development (which gave vendors no lead time), which caused a lot of compatibility problems.

It seems the bulk of the issues were ambitious goals set by the Windows team with no real reason behind them. They also misjudged where hardware trends were heading at the time, and kept moving parts of the OS out of the kernel, which resulted in a bloated 'middleware' layer that wasn't there before. And because they overestimated the hardware's power at the time of release, this caused a whole lot of overhead in processing.

That was a pretty good article and a good morning read while I sipped my coffee and pretended to work.
 
As far as end users were concerned, the lack of memory was probably the main sticking point, IIRC. I was working the service desk back then, and a large number of tickets dealt with users complaining about sluggish performance. The machines were HPs from '02-'04, mostly equipped with 256MB of RAM. Not an issue at all with XP, but Vista presented some problems. Hell, my Dell with 512MB of RAM had issues. Admittedly, my company was partly to blame for this one: we got a bit too overeager to stay current, and thought the recommended hardware requirements for Vista were a bit rubbish. If I'm not mistaken, Vista even refused to install if it detected less than 512MB, but there was a way to get around it.

There were some users that even tried to use the ReadyBoost feature. Anyone remember that? Microsoft really should have clarified just exactly what the purpose of that was. You had people sticking in cheap 512MB thumb drives into their PC, and expecting their RAM to double. Always got a kick out of that one.

UAC was rough, too. I don't think many people understand how much that affected tech illiterate users.
 
Fully disagree. .NET was the answer to Java. And while it isn't nearly as powerful as C++ in terms of raw speed (especially with random access over large memory), it is very easy to write code for, and you can develop an application quicker and with fewer bugs than in C++.

Also, the libraries are very generic and template-driven, like the STL; this limits their efficiency and requires specialty libraries for things like faster database access. Home-rolling your own specialty classes for specialty algorithms (heap sort/bin sort, etc.), memory access, or complex data structures (B-Trees) isn't nearly as easy as in C++.

But .NET has made some good progress in performance, and I even see DX games written in .NET (although I still think C++ is better in terms of performance). And I find .NET apps to be more stable and cleaner overall.

The poor-execution part the author referred to was the reliance on C#'s garbage collection. It seems the engineers leaned on it too heavily, didn't optimize their code, and underestimated the massive overhead that this blanket reliance on automatic garbage collection imposed on the hardware of the day.

Stack modules written in C# on top of each other, in addition to the kernel, and you get a compounding increase in performance cost that the hardware was not yet up to handling. It was noted that WinFS was dropped because it actually performed much slower than the pre-existing desktop search, since it ran on top of everything else rather than being built in.
 
Vista is 95% the same as Windows 7, so I really disagree that it sucked. It was just a bit before its time, with some poorly tuned features like an over-aggressive UAC. It was the first viable 64-bit version of Windows, and by the time Windows 7 arrived everyone I know had already switched to Vista 64 as their primary gaming OS, with only a few still dual-booting XP.
I might have fallen into that 5% then.

My new laptop back in the day came with Vista, but for some reason I had games that would refuse to exit properly, and required a reboot every time I closed the game down.

After about a week I said F that and went back to XP. Never touched Vista again before Windows 7 came along.

So Vista, along with the first version of 98, was one of the two operating systems that I used for less than a week in total before reverting to an older version. I never used the later versions of Vista, or 98 SE, as a result.

I also used ME, but that OS was a POS too; I kept using it because I had no choice, then moved to XP and never looked back.

I used Win 8.1 for a little longer, but again, due to game issues, I never used it seriously until 10.
 
The poor-execution part the author referred to was the reliance on C#'s garbage collection. It seems the engineers leaned on it too heavily, didn't optimize their code, and underestimated the massive overhead that this blanket reliance on automatic garbage collection imposed on the hardware of the day.

Stack modules written in C# on top of each other, in addition to the kernel, and you get a compounding increase in performance cost that the hardware was not yet up to handling. It was noted that WinFS was dropped because it actually performed much slower than the pre-existing desktop search, since it ran on top of everything else rather than being built in.

This pretty much sums up my opinion on Vista:

Programmers new to these types of environments (and virtually all the 100’s of developers working in Windows on the project at this time were) tend to take a casual overall attitude to memory use. But memory use, whether automatic or manual, is resource use and a casual attitude to resource use results in bloated code that requires more memory to run. In fact, even being “highly productive” (generating lots of code) is not necessarily the best outcome for a project (“if I had more time I would have written a shorter letter”). Using more resources was part of the value system at the time since it reinforced how important a large rich client was to the computing experience (vs. a “thin client” like a simple application exposed through a web page). The Windows team providing updates on Longhorn would brag about how many new APIs they had written.

This is one thing I don't like about modern software development. The 'don't optimize' mantra was mainly aimed at micro-optimizations, but today people take it literally and just don't optimize at all. I can't tell you how many managers I've had who thought along the lines of, well, people will have faster machines with more memory. C# is nice in that you don't need to worry about memory management as much (and a lot of younger people coming out of college don't seem to understand the need to free up memory), but it does lead to a cavalier attitude.
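
To make the "cavalier attitude" concrete, here is a toy C++ illustration (my own, not from the article): both functions build the same list, but the first generates avoidable allocation-and-copy traffic on every iteration, which is exactly the kind of cost that compounds when layered across a large codebase:

```cpp
// Same result, different resource use: the first loop allocates a fresh
// string and copies it every pass; the second reserves once and moves.
#include <string>
#include <vector>

std::vector<std::string> lines_naive(int n) {
    std::vector<std::string> out;
    for (int i = 0; i < n; ++i) {
        std::string s = "row " + std::to_string(i);   // new heap allocations each pass
        out.push_back(s);                             // plus a copy into the vector
    }                                                 // vector also regrows repeatedly
    return out;
}

std::vector<std::string> lines_leaner(int n) {
    std::vector<std::string> out;
    out.reserve(n);                                   // one up-front allocation
    for (int i = 0; i < n; ++i)
        out.emplace_back("row " + std::to_string(i)); // move temporary straight in
    return out;
}
```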
 
I remember the first 6 or so months after Vista came out, customers of mine were complaining in droves that their printers (mostly HP) had no driver support for Vista and the company was not going to make a driver at all for Vista. More or less forced people to have to buy new printers that did support Vista. Lots and lots of angry business customers who had to buy 2-10+ new printers all at once.
 
I don't think I've ever seen a fully patched-up version of Vista in the wild; it always broke before that ever happened.

Nowadays - or yesterdays - people could just download a Vista SP2 ISO. Still, I think I used a single installation of Vista from its launch until Windows 7 was released without encountering an issue.


Is this a cry for help?

I had gone too long without immersing myself in the gorgeous UI of Vista. Afterwards, I just installed a Vista theme for Windows 7, and there's also one for Windows 10 that I'm tempted to install.


It's not really about protecting the PC from you per se. It's about actions taken on your behalf without your knowledge. Like when you run an installer for a chat program that wants admin access so it can install a keylogger in the background... under XP it just worked. With UAC, it at least presents what the program is asking for, access-wise, letting YOU be the one to make the call. You still get to be a bonehead and install that p0rn downloader from warezsite#3204823 if you want, or that chat program from majorcompany.

I have never seen UAC request permission for anything other than the file I've explicitly double-clicked to install, though. Also, I get 99.99% of my downloads from the well-known companies or developers who made the software themselves.


They blatantly used an OS as a mechanism for forced obsolescence...and I haven't bought any hardware from many of the offenders since I was on XP.

Shitty part is that this exact crap continues today: Intel being another culprit with their "Win10 only" requirement for their latest chipsets and processors. Not that it directly affects me, since I am loving Win10 and not looking back... but I disagree with the practice, especially since I work in a corporate IT department with active revision control in place for things like OSes and certain software applications. It forces companies to extend the field life of computing assets, risking potentially costly break/fix incidents, or to foot a humongous bill validating the latest hardware and OS to ensure proper end-user functionality with existing applications (many of which may not even operate as needed with Win10).

The newer-CPU restriction in Windows 7 and 8 affects Windows Update functionality, but I think you can still update those OSes on newer CPUs if you download updates from WSUS, or create a custom Windows 7 ISO with all updates already slipstreamed.
 
This is one thing I don't like about modern software development. The 'don't optimize' mantra was mainly aimed at micro-optimizations, but today people take it literally and just don't optimize at all. I can't tell you how many managers I've had who thought along the lines of, well, people will have faster machines with more memory. C# is nice in that you don't need to worry about memory management as much (and a lot of younger people coming out of college don't seem to understand the need to free up memory), but it does lead to a cavalier attitude.

A lot of this is caused by the top-down, business-requirements-driven approach.

Our in-house development team works like this, and business requirements and deadlines lead to a lot of unoptimized code. There is usually only enough time to get it to a simple working state before the deadline, and this leads to heaps of code running on top of other code that isn't optimized.

Very rarely are we in a slow enough state for us to optimize anything.

I would also agree on the young-coders bit. I'm glad I went through my undergrad at a point when the university required learning C before Java, which led me to be more conscious of memory allocation/deallocation, rather than how they have it now, where they introduce Java - with all the fancy memory allocation/deallocation done for you - first, and C++ later. I like to do things the manual way, and sometimes prefer to inline C code with my C++ code, because that's what was driven into my brain by learning C as my first language.
 
I can sum it up in four words: Because it's not Linux.

Or if you prefer: Because it's not Mac.
Or if you're a sadist: Because it's not XP.
Or if you were a beta tester: Because it's not Longhorn.
Too many people were crushed by the number of features that didn't make it to the final version.
 
I had a lot fewer issues with Vista at launch than I had with XP, and for both launches I had built a new PC. Beyond a few annoyances (that were fixed with SP1), it was the best and most stable OS launch I had been a part of up to that time, and it served me well for 7 years.
 
I remember the first 6 or so months after Vista came out, customers of mine were complaining in droves that their printers (mostly HP) had no driver support for Vista and the company was not going to make a driver at all for Vista. More or less forced people to have to buy new printers that did support Vista. Lots and lots of angry business customers who had to buy 2-10+ new printers all at once.
This is pretty much what I was going to say.

I actually ran Vista until SP1 came out for Windows 7. I thought it was much more stable than XP for the most part. That said, I was a field tech at the time, and I can't count how many small businesses and self-employed peeps I had to help with printer issues. Most times they just had to buy a new one, because HP refused to update their drivers for Vista and left many users out in the cold.

The only other issues with Vista were peeps who upgraded their existing machines instead of wiping and installing it clean.

I still have my Vista C: drive image somewhere around here....
 
I used Vista from the time it was released until the time Win 7 rolled out. It had a few issues here and there, but I don't remember it being any more painful than XP or the early days of Win7 for that matter.

I had almost zero problems with Vista, but then again I bought it about six months after it was released.
 
Wasn't Vista supposed to have a lot of new features that never made it to release? One I remember was a completely new file system to replace NTFS (WinFS, I think) that would have allowed google-fast searching of all your local files.
 
This was an interesting read.

My own Vista x64 experience was fine. I installed it on a new machine and had no old peripherals that I needed to use. Aside from a driver issue with a video capture card (which I resolved eventually) I never had a blue screen.
 
In my opinion Vista sucked because you had OEMs putting out systems with 512MB-1GB of RAM. My system at the time had an X2 4400+ and 4GB of RAM, and I didn't mind Vista. There were stability issues, but things were fine after SP1.
 
To all those folks who still stand by the position that "Vista is 95% of what Windows 7 is..." you just haven't really been paying attention, or apparently haven't used Windows 7 and Vista for long, extended periods of time. :D

I mean, I get where the attitude and the opinion come from, but geez, it's quite a bit more than a 5% difference between the two OSes, really.

That's all I'm gonna say on it, since I already know where this thread is headed, like so many others. :D
 
To all those folks who still stand by the position that "Vista is 95% of what Windows 7 is..." you just haven't really been paying attention, or apparently haven't used Windows 7 and Vista for long, extended periods of time. :D

I mean, I get where the attitude and the opinion come from, but geez, it's quite a bit more than a 5% difference between the two OSes, really.

That's all I'm gonna say on it, since I already know where this thread is headed, like so many others. :D

OK, then what in your opinion constitutes the biggest differences between the two operating systems?

Oh my bad. "That's all I'm gonna say on it" is pretty convenient after laying down a remark that can't really be substantiated.
 
Eventually, after a couple of years of bug fixes, Vista was nearly the same as Win7.

But initially it was really bad. It wasn't just the bad drivers people blame on OEMs.

Microsoft also altered the memory mapping of video memory, which caused issues with some games running out of memory and crashing.

So initially people were right to avoid it like the plague. But eventually, once the bugs were fixed, I'm not sure what significant differences still existed between Vista and 7.
 
2GB of RAM and a fast HDD were all Vista needed to be fine. I thought it was a great OS; it just had a few kinks to work out.
 
Vista is 95% the same as Windows 7, so I really disagree that it sucked. It was just a bit before its time, with some poorly tuned features like an over-aggressive UAC. It was the first viable 64-bit version of Windows, and by the time Windows 7 arrived everyone I know had already switched to Vista 64 as their primary gaming OS, with only a few still dual-booting XP.

It was a solid OS, especially the 64-bit version after SP1. The big problem was that driver support and such took too long to arrive - it wasn't much different with XP's launch. Second, there was too much really old hardware around that manufacturers wanted to sell off, which was too weak to run Vista, and on top of that it came bundled with crapware. To make things worse, the "Vista Capable" labels misrepresented system performance to end users. In my experience, with a clean install there were no issues or performance problems; most of it was hearsay. My switch to 7 was actually prompted by a failed hard drive, which pushed me to just go with the new OS. But like you mentioned, 7 is just a more polished Vista. Vista SP2 was pretty much the same as 7 and very stable. After I switched to Vista 64 SP1 I didn't keep my XP drive for very long and never looked back. All the bad talk was too much for Microsoft to shake, though; at least they sort of listened back then with 7, unlike with Win8 and 10.
 
I used Vista from the time it was released until the time Win 7 rolled out. It had a few issues here and there, but I don't remember it being any more painful than XP or the early days of Win7 for that matter.
This was my experience as well. I had very few issues with Vista on my main rig, and what issues I did have were nothing major that I couldn't fix or work around with some Google help. I was very happy with Windows Vista (and later skipped Windows 8 completely); by the time Windows 7 rolled out, Vista worked perfectly for me, but Windows 7 looked nice so I upgraded anyway lol

My first computer I ever owned came with Windows ME lol. That was a train wreck of an OS, but it did force me to learn the basics at 13 - things like how to re-install an OS many times, and how to properly back things up to make an OS wipe as quick as possible :p
 