In Defense of Vista

Obviously a user should buy hardware and software which work with each other, but that wasn't what I was talking about at all.

I was talking about who is at fault for the hardware incompatibility in the first place, which would be both the hardware and OS vendor. People in this thread were attempting to excuse these incompatibilities by placing the fault entirely on hardware vendors, while really the issue is the fault of both parties.
 
If a video card does not work in Vista, how is it MS's fault that the hardware vendor, who has access to the OS long before release, didn't write proper drivers for it? We saw this with NVIDIA and Vista: NVIDIA had access to Vista while it was in its beta stages to write drivers, and they still could not get it working right.
 
If Vista were so great it wouldn't need a thread defending it :p
 
If Vista were so great it wouldn't need a thread defending it :p
I would say that's more the fault of MS's piss poor marketing attempts rather than the actual product itself.

Generally speaking, users don't like change. Compound that with the fact that the XP GUI worked (and still works for many users) and that you are unable to use the old GUI in Vista or 7, and you get the situation we got. Now, finally, after years of the new interface, they release 7, which is pretty close to Vista's GUI, and people are finally warming up to it.

If MS had wanted Vista to "work" right out of the gate, they would have "blended" the GUIs, or shipped with the XP GUI and an option to swap to Vista's. For many users, the GUI is the OS, and MS just came along and yanked the rug out from underneath these users.

( for the sake of argument, I'm ignoring the damage UAC did as well )
 
I was talking about who is at fault for the hardware incompatibility in the first place, which would be both the hardware and OS vendor. People in this thread were attempting to excuse these incompatibilities by placing the fault entirely on hardware vendors, while really the issue is the fault of both parties.
But... hardware doesn't just plug into the OS natively. You've got to have a driver to interface between the two.

And that's the point. Microsoft can't just include every driver that's ever existed, because then they'd have drivers that won't work with the new driver model in the first place. And I've already given you an example of why it wouldn't be feasible for Microsoft to verify every driver.

If Microsoft would've kept the old driver model in XP, we would've never had the improvements we just saw. That's the other thing.
Would you rather have a more stable and secure OS, or let hardware vendors hold progress back forever?

Say I want to make a media system that I can sell to folks to install in their vehicles. Is it Toyota's job to make sure their vehicles can take my system? Or is it MY job to MY customers to make sure it works with the vehicle?

I would say that's more the fault of MS's piss poor marketing attempts rather than the actual product itself.
What marketing attempts? I guess that could be part of it, lack thereof.
I think what hurt Vista more than anything was just the FUD generated by Apple and the Tech Media. Then that was just repeated by people that used it as an excuse not to change.
 
What marketing attempts? I guess that could be part of it, lack thereof.
I think what hurt Vista more than anything was just the FUD generated by Apple and the Tech Media. Then that was just repeated by people that used it as an excuse not to change.
Most myths have a grain of truth, and the same is true here. If there hadn't been something to harp on, then the tech media wouldn't have had any basis for harping on Vista.

My honest belief is that it was a combination of the huge GUI change and UAC which sank Vista.
 
i think vista was just put out to make win7 look amazing.lol...but i had zero problems with vista
 
i think vista was just put out to make win7 look amazing.lol...but i had zero problems with vista
Vista was supposed to be '7'; they just ran out of time. It was a massive undertaking, and five years in they couldn't delay the OS any longer. They should have, though, given the bad publicity.
 
Vista was supposed to be '7'; they just ran out of time. It was a massive undertaking, and five years in they couldn't delay the OS any longer. They should have, though, given the bad publicity.

Wouldn't have made any difference. (Again-Mojave) Windows 7 didn't magically change much. Vista was not a bad OS.
 
How could you guys defend Vista? Seriously! What have you been smoking?
The fact that we had to wait until SP1 was released to enjoy Vista was crap! You pay your hard-earned money for an OS and then have to wait months until MS puts out a service pack so it's usable? That's unacceptable, IMHO! $300.00 for an OS, then, oops, wait a sec, let's wait x amount of time before we're able to use it with half the bugs worked out.
XP had far fewer bugs than Vista did upon release, not to mention XP was far faster than Vista, and XP is older technology! Look at Windows 7: 7 is faster than Vista and XP. That's the way Microsoft should release a new OS, faster and better than the previous version.
Vista is nothing more than a cousin of Windows ME, AKA a joke.

Don't get me wrong, Vista with SP2 is better than Vista with no SP or with SP1, but even Vista with SP2 is still slower than XP and uses more memory than XP. Windows 7 is basically Windows Vista with the bugs worked out and a few GUI changes.

Vista was a joke; face it and move on. The fact that someone would post a topic defending Vista makes me want to puke.
 
XP had far fewer bugs than Vista did upon release,
No, it didn't. It sounds like you didn't use XP until SP2. XP at release was incompatible with everything and a miserable experience. People complained then that it was slow and buggy, too. Please stop trolling.
 
I'm not trolling. I ran every version of XP before it was released; by the RC2 release I was able to play all my games and use all my apps. The only major issue I remember was security; I remember this being the reason Whirlpool refused to upgrade from 98 to XP. SP1 fixed half of the security issues in XP, and with SP2 the OS was flawless.
Vista is now at SP2 and still has major bugs, is still slow, and is far from complete. So if we're going to compare XP and Vista at SP2, XP wins hands down.

No, it didn't. It sounds like you didn't use XP until SP2. XP at release was incompatible with everything and a miserable experience. People complained then that it was slow and buggy, too. Please stop trolling.
 
A simple reading of these forums will show that every single point you made is false. I'm not wasting any more time on you.
 
Another Vista basher... Well, to undertaker... I'll respond to the actual statements you've made; however, if you can't formulate an opinion or argument yourself, then you'll utterly fail here (I'm talking about your link).

Idiot users quoting idiot sources are one of the big reasons there's so much misinformation about Vista in the first place. I'm not saying you're either [yet], but you need to be using links to back up statements you make, not to create them.

The fact that we had to wait until SP1 was released to enjoy Vista was crap! You pay your hard-earned money for an OS and then have to wait months until MS puts out a service pack so it's usable? That's unacceptable, IMHO! $300.00 for an OS, then, oops, wait a sec, let's wait x amount of time before we're able to use it with half the bugs worked out.
I'll ask you the same direct question I ask everyone else who makes this stupid statement (and then runs off, never to be seen again in the thread, for lack of any actual facts): What did SP1 magically fix in Vista?
Yeah, it fixed a few minor UAC and networking bugs, but what did it automagically fix that takes Vista from crap to enjoyable in one SP?

XP had far fewer bugs than Vista did upon release, not to mention XP was far faster than Vista, and XP is older technology!
By that logic, DOS is superior to XP because it runs faster. That must mean XP is crap!!!!

You can't expect to run new OSes on old hardware; it's as simple as that. Also, Vista broke the mold for "testing": XP gets slower over time, while Vista gets faster, thanks to disk indexing and Superfetch. Thus, testing them both "out of the box", as has been the case, will obviously skew results. Anyone who has used both XP and Vista on the same hardware will acknowledge that Vista runs faster than XP once it's matured a little and learned your habits.
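The idea behind Superfetch described above can be sketched as a toy model: watch which programs are launched most often, then keep the hottest ones resident in otherwise-idle RAM so their next launch is served from memory instead of disk. This is an invented illustration of the concept only (the `ToyPrefetcher` class and its names are made up for the example), not Microsoft's actual implementation:

```python
# Toy sketch of the Superfetch concept: track launch frequency and
# keep the most frequently used "apps" preloaded. NOT real Windows code.
from collections import Counter

class ToyPrefetcher:
    def __init__(self, capacity=2):
        self.capacity = capacity  # how many apps fit in "spare" RAM
        self.usage = Counter()    # launch history per app
        self.cache = set()        # apps currently preloaded

    def record_launch(self, app):
        self.usage[app] += 1
        # Keep the N most frequently launched apps resident.
        self.cache = {a for a, _ in self.usage.most_common(self.capacity)}

    def launch_is_fast(self, app):
        # A cached launch would come from RAM (fast); otherwise disk (slow).
        return app in self.cache

prefetcher = ToyPrefetcher(capacity=2)
for app in ["browser", "editor", "browser", "mail", "browser", "editor"]:
    prefetcher.record_launch(app)

print(prefetcher.launch_is_fast("browser"))  # True: used often, so preloaded
print(prefetcher.launch_is_fast("mail"))     # False: rarely used, loads from disk
```

This also shows why out-of-the-box benchmarks skew against such a design: the cache is empty until the system has observed some usage history.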

Look at Windows 7: 7 is faster than Vista and XP. That's the way Microsoft should release a new OS, faster and better than the previous version.
System Requirements for Vista:
Processor 1 GHz
Memory 1 GB
Graphics card DirectX 9.0 capable and WDDM 1.0 driver support
Graphics memory 128 MB
HDD capacity 40 GB
HDD free space 15 GB
Other drives DVD-ROM

System Requirements for 7:
Processor 1 GHz
Memory 1 GB
Graphics card DirectX 9.0 capable and WDDM 1.0 driver support
Graphics memory 128 MB
HDD capacity 40 GB
HDD free space 15 GB
Other drives DVD-ROM

HOLY COW, THEY'RE THE SAME!!!!!!!!!!
 
I'm staring at this thread like...wow...honestly, I think it was all covered within the first 3 pages.

One moron screamed, "ZOMG Vista eats all mah RAMZ" and all the other morons heard him and didn't like Vista :D
 
If Microsoft would've kept the old driver model in XP, we would've never had the improvements we just saw. That's the other thing.
Would you rather have a more stable and secure OS, or let hardware vendors hold progress back forever?
I'm not taking a position against the new driver model. I think it's almost certainly a big improvement. I've been quite explicit about my position, and until now have not commented at all on whether I like the change or not; please don't read into my posts any more than what I have written in them.

I think the new driver model is ultimately good, and yet I am still able to recognize that it had an adverse effect on hardware compatibility, at least for now. This is the fault of both MS (who changed the interface) and hardware vendors (who were slow or unwilling to support the new interface for certain devices). I object to the posts in this thread that put the entirety of the blame on the hardware vendors, since both parties were responsible for causing the incompatibilities.
 
Sorry, haven't had a chance to read the whole thread. . . but here's a quick take on it:

If there hadn't been a Vista and the crapfest that ensued due to the innovations it tried to introduce (new driver model, etc.), then guess what, Windows 7 would have introduced the crapfest and everyone would be complaining about how Windows 7 was "rushed" or how it is "terrible."

Vista suffered because it was different (some would say "innovative") and the OEMs failed to have drivers and support ready for it even after MS delayed Vista's launch as much as it reasonably could. The hardware manufacturers had beta builds and all the time in the world to create decent drivers, yet just couldn't seem to get it done. At some point, MS needed to just ship the thing and force hardware manufacturers to get off their duffs and release decent drivers.

So, I have to roll my eyes when I hear so many folks condescendingly say that MS "did it right" with Windows 7 and that Windows 7 is what Vista should have been if MS hadn't somehow messed it up. Well, the only reason Windows 7 has been received so well and seems so stable and well-behaved out of the box is because Vista paved the way for it with all of its growing pains (due to the new driver model, etc.). If not for Vista doing so, Windows 7 would have launched to the same craptacular reception that its predecessor received.
 
Vista performed like a dog when it was first released. Specifically, I'm speaking of network streaming and the poor performance when copying files between disks. Going from XP to Vista felt like a downgrade in those respects when it was first released. I do have a laptop that came with Vista 64; it never really gave me any problems, but it never really wowed me either. I think that was the problem with Vista: it didn't bring an earth-shattering increase in usability, performance, or functionality and wasn't a killer app by any means, which any OS update should be.
 
If there hadn't been a Vista and the crapfest that ensued due to the innovations it tried to introduce (new driver model, etc.), then guess what, Windows 7 would have introduced the crapfest and everyone would be complaining about how Windows 7 was "rushed" or how it is "terrible."

Vista suffered because it was different (some would say "innovative") and the OEMs failed to have drivers and support ready for it even after MS delayed Vista's launch as much as it reasonably could. The hardware manufacturers had beta builds and all the time in the world to create decent drivers, yet just couldn't seem to get it done. At some point, MS needed to just ship the thing and force hardware manufacturers to get off their duffs and release decent drivers.

So, I have to roll my eyes when I hear so many folks condescendingly say that MS "did it right" with Windows 7 and that Windows 7 is what Vista should have been if MS hadn't somehow messed it up. Well, the only reason Windows 7 has been received so well and seems so stable and well-behaved out of the box is because Vista paved the way for it with all of its growing pains (due to the new driver model, etc.). If not for Vista doing so, Windows 7 would have launched to the same craptacular reception that its predecessor received.

I agree 100%.

I am using two different computers in my home: one with Win 7 Pro x64 and one with Vista Ultimate x64 (I waited until SP1 to upgrade from XP). I used Vista exclusively for more than a year before switching to 7 on my main rig. I really don't notice much difference between the two other than the Start menu and Aero Peek; the new ClearType calibration is nice, though. I do wish Vista had it.
 
This is the fault of both MS (who changed the interface) and hardware vendors (who were slow or unwilling to support the new interface for certain devices).
Technically, sure.
But both you and I know Microsoft had to make those changes to progress forward. We all know new technology requires changing the old. So it's to be expected.
That's why I'm still squarely placing the blame on hardware vendors for not getting their act together.

If there hadn't been a Vista and the crapfest that ensued due to the innovations it tried to introduce (new driver model, etc.), then guess what, Windows 7 would have introduced the crapfest and everyone would be complaining about how Windows 7 was "rushed" or how it is "terrible."
That's my opinion as well. It really didn't matter if Microsoft released Windows 7 two years ago: it would've had the same reception Vista did.
Microsoft had to pave the way for improvements, and the initial change-shock is what a lot of people did not like.

IMO, in two more versions of Windows we'll get another change-shock. It's coming. I think Microsoft sticking to a three-year release cycle will help with that, though.
 
Technically, sure.
But both you and I know Microsoft had to make those changes to progress forward. We all know new technology requires changing the old. So it's to be expected.
That's why I'm still squarely placing the blame on hardware vendors for not getting their act together.
Let me see if I understand your position correctly: you agree that MS is partly responsible for the incompatibilities, and yet because you like the change overall, you now deny the fact you previously acknowledged?
 
Let me see if I understand your position correctly: you agree that MS is partly responsible for the incompatibilities, and yet because you like the change overall, you now deny the fact you previously acknowledged?

I think the fact is that, yes, every party involved in the whole Vista debacle is at least partly responsible - it's just that some are WAY more responsible than others; namely, all of the vendors who had more than ample time to get their shit together, but didn't.
 
You can't expect to run new OSes on old hardware; it's as simple as that.

Linux runs on any Pentium-class machine, even with as little as 64 MB of RAM. And it's as modern as any other OS out there.

It's just flat-out ridiculous that any OS, even a modern one, needs a gig of RAM all by itself just to run well. And all that hard drive space, for what? So you can have glassy windows and prefetch that's nearly worthless? I boot into a working desktop on my netbook using only 60 MB of RAM and 5 GB of total HD space WITH added software, and I have all the functionality anyone could ever need except modern gaming. And it's far faster than Vista could ever be.

That brings me to why Vista is a turd: it offered no benefits worth the switch from XP. That's why so few made the switch. It makes no sense to drop a ton of money on an OS that actually drops your gaming performance, which, BTW, was the main reason a lot of the tech crowd skipped Vista.

It's the same reason a lot of people hate Norton Anti-Virus: it's a damn pig on resources. The common problem is that it's unnecessary. Where are the benefits of this excessive RAM and hard drive usage?

MS should consider offering two different desktop modes in future releases of Windows. The problem now is that there's no choice: if you want the latest from MS, go get more memory and make sure you have plenty of hard drive space. Sure, memory and HD space are more affordable now, but where is the wonderful benefit of using up these extra resources?
 
but where is the wonderful benefit of using up these extra resources?
Added security, faster response time? For all your fancy "glassy windows", you'll notice that they'll never tear. EVER. It's impossible, because of the rearchitecting of how the window management is done. Compare that to Linux, with tearing galore and where if you kill X, all your programs crash. Real "modern" OS there.
 
Linux runs on any Pentium-class machine, even with as little as 64 MB of RAM. And it's as modern as any other OS out there.

It's just flat-out ridiculous that any OS, even a modern one, needs a gig of RAM all by itself just to run well. And all that hard drive space, for what? So you can have glassy windows and prefetch that's nearly worthless? I boot into a working desktop on my netbook using only 60 MB of RAM and 5 GB of total HD space WITH added software, and I have all the functionality anyone could ever need except modern gaming. And it's far faster than Vista could ever be.

That brings me to why Vista is a turd: it offered no benefits worth the switch from XP. That's why so few made the switch. It makes no sense to drop a ton of money on an OS that actually drops your gaming performance, which, BTW, was the main reason a lot of the tech crowd skipped Vista.

It's the same reason a lot of people hate Norton Anti-Virus: it's a damn pig on resources. The common problem is that it's unnecessary. Where are the benefits of this excessive RAM and hard drive usage?
This is an oversimplification at best.
MS should consider offering two different desktop modes in future releases of Windows. The problem now is that there's no choice: if you want the latest from MS, go get more memory and make sure you have plenty of hard drive space. Sure, memory and HD space are more affordable now, but where is the wonderful benefit of using up these extra resources?
What modes would these be? What purpose would each serve? How should this happen? Why?
 
Added security, faster response time? For all your fancy "glassy windows", you'll notice that they'll never tear. EVER. It's impossible, because of the rearchitecting of how the window management is done. Compare that to Linux, with tearing galore and where if you kill X, all your programs crash. Real "modern" OS there.

When was the last time you used Linux? Five years ago? Besides, there are lots of people who don't like the glossy crap, so we don't enable it, or don't use it.

I only brought up Linux to make the point that an OS doesn't have to be a fat pig to do what you need it to do. And BTW, I'd wager the security of Linux over Vista any day. And it doesn't take a gig of RAM to do it, either.

What modes would these be? What purpose would each serve? How should this happen? Why?

For one, it would allow installation on lower-end machines. It could also save extra RAM and CPU cycles for games. And it would be nice to have a choice.
 
Let me see if I understand your position correctly: you agree that MS is partly responsible for the incompatibilities, and yet because you like the change overall, you now deny the fact you previously acknowledged?
They're "at fault" for the same reasons the FCC is "at fault" for cutting lots of old analog televisions dead.
Yes, both Microsoft and the FCC are "at fault" for these things, technically.
Is it for the better and future development for the consumer? YES. Is it the responsibility of the hardware manufacturers to produce more modern hardware for the consumer? YES.

Except the difference here is that Microsoft didn't even cut anyone dead. This'll only affect people who upgrade, and people who upgrade generally know enough to check hardware compatibility before doing so.

Linux runs on any Pentium-class machine, even with as little as 64 MB of RAM. And it's as modern as any other OS out there.
It's great for folks who love wasting time manually configuring their drivers.

I tried out Ubuntu again just two weeks ago, and after a few hours of configuring the darn thing I thought to myself again, "Why am I doing this? Windows works just fine, what advantage is this giving me? I can have any given Windows machine up and running by now."
I was installing it as a dual boot just to check it out again, to see if anything had improved. Nope, same typical Linux nonsense I've always had to screw with.

Note: it's not that I CAN'T or DON'T KNOW how to configure Ubuntu; it's that it's a major PITA to do it. Windows literally "just works". I installed 7 on my HTPC (lots of third-party hardware in there, more so than any regular desktop, due to the multimedia capabilities), and literally EVERYTHING had a driver and worked out of the box. I can't get Ubuntu to work out of the box on a freaking simple system; no way in hell it'd work out of the box on an HTPC. Even my tuner was installed and working out of the box.

It's just flat-out ridiculous that any OS, even a modern one, needs a gig of RAM all by itself just to run well. And all that hard drive space, for what? So you can have glassy windows and prefetch that's nearly worthless? I boot into a working desktop on my netbook using only 60 MB of RAM and 5 GB of total HD space WITH added software, and I have all the functionality anyone could ever need except modern gaming. And it's far faster than Vista could ever be.
By your logic, DOS is superior to Linux because it boots faster, uses less RAM, and takes less space :rolleyes:

RAM: Complaining about RAM in Vista shows your utter ignorance. Go read up on what Superfetch is and STFU. Feel free to buy 10GB of RAM and only use 1GB of it. Have fun with that. I'd rather my OS actually use it.
Disk space: driver support is already mentioned, plus the fact that I get a great set of working programs right out of the box: Media Player, BitLocker, GPOs, Media Center, presentation tools, multitouch, mobility tools, input recognition, voice recognition, etc. Have fun wasting your time downloading all that stuff for Linux; it's all right here and already done for me.

Your Linux also cannot launch Photoshop CS4 64 bit in two seconds, either. "Superior" memory management, my A** :rolleyes:

That brings me to why Vista is a turd: it offered no benefits worth the switch from XP.
GPU offloading, new and vastly improved memory management, new HAL enhancements, new and much better PnP support, VASTLY improved Power Management, new hardware model, VASTLY improved security, new management tools....
And those are just the kernel changes I can think of ATM. Yeah, it offered no benefits :rolleyes:

It's the same reason why a lot of people hate Norton Anti-Virus. It's a damn pig on resources. The mutual problem is, it's unnecessary to do so. Where are the benefits of this excessive RAM and hard drive usage?
People hated Norton because it was a pig on resources while competing products offered more features with fewer resources. It's a no-brainer.
Windows uses more resources than Linux but provides a crap-load more features than Linux as well, which more than makes up for it.

It's like a truck vs. a car. A car uses fewer resources, sure; that doesn't make it superior. My truck uses more resources: it's big, it's bloated, it's heavy, but I get a ton more functionality out of it than out of a car. Have fun replacing your transmission after pulling a 30ft boat with your car, if you can even get out of the driveway with it!

Added security, faster response time? For all your fancy "glassy windows", you'll notice that they'll never tear. EVER. It's impossible, because of the rearchitecting of how the window management is done. Compare that to Linux, with tearing galore and where if you kill X, all your programs crash. Real "modern" OS there.
Good example. The new GPU utilization and the GUI itself are simply awesome. They make the whole Windows experience run much smoother than the antiquated image-based crap we had in XP and Linux.
Let's not even go into the multitouch capabilities (or lack thereof) in Linux.

Linux has its place (servers). Linux fanboys like our friend here have been proclaiming that Linux will take over the desktop for years now. Microsoft even handed you guys a chance on a golden platter: XP security was terrible, Microsoft went over double its release cycle, and there were 7 years between OSes. The Linux community came out with Ubuntu and told us it'd take over. Here it is 6 years later, and it hasn't done a darn thing. The people already using another distro migrated over to it. Where's the surge of Linux desktops we were promised?


I will say this: the one defeat Microsoft suffered was Vista on netbooks. Vista was just too heavy for them, so Linux became the de facto standard. But now that 7 can run efficiently on them, consumers have quickly abandoned Linux on the netbook as well. Why do you think that is?


I only brought up Linux to make the point that an OS doesn't have to be a fat pig to do what you need it to do. And BTW, I'd wager the security of Linux over Vista any day. And it doesn't take a gig of RAM to do it, either.
http://www.internetnews.com/security/article.php/3667201
Note the date: this is from 2007, and both XP AND Vista were included in these results. If the report had covered only Vista and/or Windows 7, the outcome for Linux would look even worse:
Windows averaged 21 days to fix an issue (39 total vulnerabilities). Red Hat averaged 58 days on 208 vulnerabilities.
Not only do you have more vulnerabilities to patch, you're taking longer to patch them.

Yep, that's some superior security right there :rolleyes:
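Back-of-the-envelope on the figures quoted above, under the crude assumption (mine, not the article's) that every vulnerability took the vendor's average time to fix:

```python
# Rough "vulnerability-days of exposure" from the quoted 2007 figures.
# Crude assumption: each vulnerability took the average time to fix.
windows_vulns, windows_avg_days = 39, 21
redhat_vulns, redhat_avg_days = 208, 58

windows_exposure = windows_vulns * windows_avg_days
redhat_exposure = redhat_vulns * redhat_avg_days

print(windows_exposure)                              # 819 vuln-days
print(redhat_exposure)                               # 12064 vuln-days
print(round(redhat_exposure / windows_exposure, 1))  # roughly 14.7x
```

By that rough measure, the Red Hat exposure window in the report works out more than an order of magnitude larger, though averages like these hide severity and counting-methodology differences.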
 
For one, it would allow installation on lower-end machines. It could also save extra RAM and CPU cycles for games. And it would be nice to have a choice.
I'll take that a step further and say corporate desktops would appreciate a cut-down version tremendously. My users don't need Aero; it's not going to make them any more productive.
 
I'll take that a step further and say corporate desktops would appreciate a cut-down version tremendously. My users don't need Aero; it's not going to make them any more productive.

Disable it.

Not rocket science.
 
Is this really going to turn into a Linux vs. Windows debate? I dual-boot Ubuntu on my netbook and my main machine, and while Ubuntu runs fine on either, programs crash a hell of a lot more in Linux than in any Windows installation I've ever had. I can barely run Firefox, as it loves to freeze up (turns greyscale, darkens, becomes unresponsive, and I have to force-quit it), and while Chrome is better, it still doesn't have a lot of the features I get to enjoy in the Windows version. Sure, Linux is nice and secure, but so is my Windows install. As for window decorations, all the fancy Compiz stuff is pretty nice but tears like crazy, as someone already mentioned. I mean, I like both the Linux and Windows (Aero) GUIs, but I'll take the one that doesn't tear like crazy.
 
Fair enough. I actually meant that my users would benefit most from a simple interface, one that doesn't change every couple of years with no benefit to their workflow. 2k-ish.

Why not just stay on 2000, then?
Business is a different world from home use... If it works, there's very little reason to upgrade in a business environment.

I'm not opposed at all to folks running XP in businesses anymore; it's just a different monster. I've personally opted to keep my company on XP right now as well. It's just not worth the hassle to upgrade OSes when all the business applications function well on XP. Note: it's a hassle upgrading ANY OS, be it OS X, Ubuntu, anything. Change is change, and it's a hassle in the workplace.


Edit: Also, if your machine can run Aero, there's no reason to disable it (unless you just don't like the look). Offloading the UI to the GPU has the system running more efficiently than the old image-based UI did.
 
2k is no longer receiving security updates, which is where XP will be in a couple of years. At that point, we won't have a choice. Because I don't like waiting until I'm forced to migrate, we're starting the migration from XP to 7 soon.

Which is why I'd like a 2k-like interface for 7.
 
It's great for folks who love wasting time manually configuring their drivers.

"Why am I doing this? Windows works just fine, what advantage is this giving me? I can have any given Windows machine up and running by now."

It's not that I CAN'T or DON'T KNOW how to configure Ubuntu; it's that it's a major PITA to do it. Windows literally "just works". I installed 7 on my HTPC (lots of third-party hardware in there, more so than any regular desktop, due to the multimedia capabilities), and literally EVERYTHING had a driver and worked out of the box. I can't get Ubuntu to work out of the box on a freaking simple system; no way in hell it'd work out of the box on an HTPC. Even my tuner was installed and working out of the box.

I just installed Linux Mint last week. It took 20 minutes. Number of problems: 0. Number of drivers I needed: 0. Number of crashes: 0. The TV tuner, scanner, and printer worked right out of the box as well.

Learn how to use a computer. Also, I have had to deal with driver and crashing issues in Vista; it took SP1 to make the damn thing usable.

By your logic, DOS is superior to Linux because it boots faster, uses less RAM, and takes less space :rolleyes:

The difference here, dumbass, is that I get functionality out of the resources I use, not prefetch for shit I might use twice a month. And I'd rather have the extra hard drive space for movies and music instead of OS bloat.

RAM: Complaining about RAM in Vista shows your utter ignorance. Go read up on what Superfetch is and STFU. Feel free to buy 10GB of RAM and only use 1GB of it. Have fun with that. I'd rather my OS actually use it.

I know what it is: a waste of resources. If I want something preloaded, I can do that by choice now. The difference here is that you have MS holding your hand to do it.


Disk space: driver support is already mentioned, plus the fact that I get a great set of working programs right out of the box: Media Player, BitLocker, GPOs, Media Center, presentation tools, multitouch, mobility tools, input recognition, voice recognition, etc. Have fun wasting your time downloading all that stuff for Linux; it's all right here and already done for me.

I thought you said you used Ubuntu? If you had, you would know that ALL that stuff you mentioned gets loaded with equivalents at installation. Hell, the live CD has all that.

Your Linux also cannot launch Photoshop CS4 64 bit in two seconds, either. "Superior" memory management, my A** :rolleyes:

Wow. Duh George. I don't know how to preload a program myself. So I need Vista to do it for me. Again, this is only a benefit of convenience. Any modern OS can do this in their own way.
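For what it's worth, the "preload a program by choice" point is doable with nothing more than a file read on Linux: reading a binary once pulls its pages into the kernel's page cache, which is roughly the manual version of what Superfetch automates. A minimal sketch (`/bin/ls` is just a stand-in for whatever program you want preloaded):

```shell
# Manually warm the page cache for an application binary.
# Reading the file once causes the kernel to cache its pages,
# so the next launch of the program avoids most disk I/O.
cat /bin/ls > /dev/null

# The cached data is visible in the kernel's memory stats:
grep -E '^(Cached|MemFree)' /proc/meminfo
```

The trade-off is exactly the one argued above: Superfetch does this automatically based on observed usage, while doing it by hand gives you full control over what gets cached.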

VASTLY improved Power Management

Are you just spouting random shit? It's been shown by countless tests that Vista's power management is worse than XP's.


Linux fanboys like our friend here have been proclaiming that Linux will take over the desktop for years now.

I only brought up Linux to say that you don't need a pig OS to be fully functional with modern hardware. Other people including you are the ones who are bringing Linux into the argument.

You can also thank Linux for Windows 7 being better. MS recognized the threat and started to practically give XP away for free. It's also the only reason XP is still supported today. I would also bet that's why they tuned up Windows 7 to run better on less resources.

I will say this: the one defeat Microsoft suffered was in Vista with netbooks. Vista was just too heavy for them, so Linux became the de facto standard. But now that 7 can run efficiently on them, consumers have quickly abandoned Linux on the netbook as well. Why do you think that is?

Because MS is practically giving away the OS on netbooks. Not to mention, Asus and others dropped the ball by using neutered versions of Linux on early netbooks. This made zero practical sense. They could have used an Xfce desktop with a standard distro and everyone would have been able to use it just fine. Hell, with a little time, you can make a Linux desktop indistinguishable from a Windows one.

Not only do you have more vulnerabilities you're needing to patch, you're taking longer to patch them.

Yep, that's some superior security right there :rolleyes:

Program vulnerabilities are not the responsibility of the OS. Almost all of these vulnerabilities get discovered and fixed before anyone actually exploits them. And the reason you don't see this more often with Windows is because it's closed source. Anyone with programming knowledge knows this point is just silly.

Nope. You won't know about a major Windows vulnerability until a large portion of the population gets infected with a worm or becomes part of a botnet.
 
Yes, both Microsoft and the hardware vendors are "at fault" for these things, technically.
OK, it sounds like we're in agreement.


RAM: Complaining about RAM in Vista shows your utter ignorance.
Memory caching aside, MS states the minimum RAM for Vista is 8 times that of XP (64 MB -> 512 MB). Let's not pretend that this wasn't a substantial increase.

To be clear, I'm not against increasing RAM requirements. I am, however, able to recognize that, in and of itself, a higher requirement is something to be avoided. I don't pretend the bad doesn't exist simply because the benefits outweigh it.
 
Memory caching aside, MS states the minimum RAM for Vista is 8 times that of XP (64 MB -> 512 MB). Let's not pretend that this wasn't a substantial increase.

To be clear, I'm not against increasing RAM requirements. I am, however, able to recognize that, in and of itself, a higher requirement is something to be avoided. I don't pretend the bad doesn't exist simply because the benefits outweigh it.
To that end, higher-capacity memory was much more easily produced by the time Vista arrived than when XP first arrived, so the cost per megabyte was much lower. I'd argue for making use of that. I'd also argue that general usability under Vista with 512 MB was much better than XP was with 64 MB.
 
Memory caching aside, MS states the minimum RAM for Vista is 8 times that of XP (64 MB -> 512 MB). Let's not pretend that this wasn't a substantial increase.

XP 2001
Vista 2007

512 MB of memory in 2007 was probably cheaper than 64 MB in 2001. Vista wasn't a good candidate for upgrading older hardware, but its hardware requirements for 2007 were nowhere near as onerous as some made them out to be.
 
Absolutely; cost-wise it's probably cheaper to meet the minimum memory requirements of Vista now than it was to meet those of XP when it came out.

But let's not kid ourselves, Vista did require substantially more memory than XP.
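The arithmetic both sides are gesturing at is simple enough to spell out. The minimums are Microsoft's published figures (64 MB for XP, 512 MB for Vista); the per-MB prices below are purely hypothetical placeholders to illustrate how a falling price can offset a rising requirement, not real market data:

```shell
xp_min_mb=64
vista_min_mb=512

# The requirement grew 8x:
echo "requirement ratio: $(( vista_min_mb / xp_min_mb ))x"

# Hypothetical prices in tenths of a cent per MB (NOT real data),
# chosen only to sketch the "RAM got cheaper faster" argument:
price_2001=150   # assume ~15 cents/MB in 2001
price_2007=10    # assume ~1 cent/MB in 2007
echo "XP minimum, 2001 (hypothetical):    $(( xp_min_mb * price_2001 )) tenth-cents"
echo "Vista minimum, 2007 (hypothetical): $(( vista_min_mb * price_2007 )) tenth-cents"
```

Under those assumed prices, meeting Vista's 8x-larger minimum would actually cost less than meeting XP's did at launch, which is the point being made above; the requirement increase itself is still real.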
 