Rate my partition scheme

nilepez said:
Imaging is fine if you're reinstalling a lot, but otherwise wouldn't it be better to slipstream XP with all the patches up to now, and maybe even add the latest patches for your MB and video card?

Partition size will not affect performance.

And the idea of a ghost image is to get your system, with all your installed software, back up and running in a quarter of the time it takes to format, update, and reinstall all the programs you use.

Once the image is done, go install the newer updates - it will still be a LOT faster than a fresh format.
 
MrGuvernment said:
Partition size will not affect performance.

And the idea of a ghost image is to get your system, with all your installed software, back up and running in a quarter of the time it takes to format, update, and reinstall all the programs you use.

Once the image is done, go install the newer updates - it will still be a LOT faster than a fresh format.

Why wouldn't it matter? Let's say that I have a 500 GB drive that's 50% full, and over time my game of choice, whatever it may be, has files scattered from byte 1 to the very last part that's filled. Would it not take longer to load that game, since the head has to travel from roughly one end of the disk to the middle? Whereas if it were on, say, a 25 GB partition, it'd have to move across, at most, a much smaller area.

I don't see how that can't matter, unless you've got a defragger that lets you specify a lot of sort parameters (certainly beyond anything Norton or any post-Nuts-and-Bolts defragger I've used).
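
Here's the rough back-of-the-envelope I'm picturing (the seek model and the numbers below are assumptions for illustration, not measurements):

```python
# Crude comparison of worst-case head travel: a 25 GB partition vs. the
# used half of a 500 GB drive. Assumes seek time grows roughly linearly
# with the fraction of the platter the head has to cross -- a
# simplification, but enough to show the scale of the difference.
FULL_STROKE_MS = 15.0    # assumed full-stroke seek time
TRACK_TO_TRACK_MS = 1.0  # assumed minimum (track-to-track) seek time

def worst_case_seek_ms(span_gb, disk_gb=500.0):
    fraction = span_gb / disk_gb
    return TRACK_TO_TRACK_MS + (FULL_STROKE_MS - TRACK_TO_TRACK_MS) * fraction

print(worst_case_seek_ms(25))   # ~1.7 ms: worst case inside a 25 GB partition
print(worst_case_seek_ms(250))  # ~8.0 ms: worst case across the used half of the disk
```

Crude, I know, but that's why I'd expect a small dedicated partition to keep worst-case seeks down.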
 
nilepez said:
Why wouldn't it matter? Let's say that I have a 500 GB drive that's 50% full, and over time my game of choice, whatever it may be, has files scattered from byte 1 to the very last part that's filled. Would it not take longer to load that game, since the head has to travel from roughly one end of the disk to the middle? Whereas if it were on, say, a 25 GB partition, it'd have to move across, at most, a much smaller area.

I don't see how that can't matter, unless you've got a defragger that lets you specify a lot of sort parameters (certainly beyond anything Norton or any post-Nuts-and-Bolts defragger I've used).
I believe you are over-thinking the problem, making it larger than it is. Even if you simply regularly schedule the built-in Windows defragger, you will eventually wind up with your most used files closer together near the beginning of the partition. Of course, this will include system files, and it will take several defrags and regular disk use for things to get to a more optimal setting, but it will happen. Don't trust programs out there that say they can fix your problems in one or two sittings with a few passes of their app. Just like with any technology, regular maintenance and nominal management are the best way to keep things running their best. Your partition size is going to factor in very little, except that partitions smaller than ~40 GB are going to experience more overhead with little benefit for things like the indexer (which, apparently, most people turn off anyway).

In my opinion, if you are going to have lots of programs you use, the sweet-spot size for a partition is roughly ~60 GB, which allows for the OS, programs, SysVol, and plenty of room for temp files that are used for installation and maintenance (like defragging). I then run a regularly scheduled disk cleanup to make sure temp files, leftover installation files, and other miscellaneous unused transient data are removed from the file system, so the regularly scheduled defrag doesn't have to factor in superfluous data while rearranging files. Since it's all automated (and free, using tools already present in the OS), there is very little management overhead outside of me checking to make sure things run on schedule. Other than that, I typically remove programs I no longer use to avoid having them take up space.

My profile's My Documents is located on a separate physical disk (also something that can be set in the OS), which allows me to defrag that on a separate interval from the system drive, and leaves less maintenance overhead for the system drive during its own scheduled processes. Added to the fact that I can now take my Docs drive to other machines without much problem (in the case of system failure), the system drive is now optimized for running the OS and programs without having to factor in my over 200 GB worth of data in My Docs. All of this is done using just the features present in the OS, and without breaking things up into a bunch of partitions.
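
If you want to see what that automation boils down to, here's a minimal sketch of the weekly job I mean -- it assumes you've already saved a Disk Cleanup profile once with cleanmgr /sageset:1, and you'd schedule the script itself with the built-in Task Scheduler:

```python
# Weekly maintenance sketch: run Disk Cleanup with a previously saved
# settings profile, then defragment the system drive. Assumes profile 1
# was configured beforehand via "cleanmgr /sageset:1"; schedule this
# script with Task Scheduler to keep it hands-off.
import subprocess

# Disk Cleanup, using saved settings profile #1
subprocess.run(["cleanmgr", "/sagerun:1"], check=True)

# Defragment C: and print a verbose report when finished
subprocess.run(["defrag", "C:", "-v"], check=True)
```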

If I need to, I can just reinstall using an image created with the base OS and most used programs. I can update the image at my leisure to include OS and software updates, so my time spent after initial install to get running like before is minimal. No need to repartition disks, no need to go changing loads of environment variables, no registry tweaking, and only a few program installs. If I wanted to go a step further, I could probably roll a few of my non-imaged programs into installers (.msi files) and stick them on a DVD, which would drastically reduce reinstall time (programs would be installed with updates already applied and many settings pre-configured). Of course, these things are done using third-party apps (Ghost and WinInstall), but I can essentially get a whole system (sans My Docs) rolled into the space of two DVDs, complete with my preferred settings and software configurations. Keeping a backed-up copy of my Docs drive allows me to worry less about my critical personal data.

What I'm getting at is that more complexity is not going to solve the problem of reinstall time, nor is it going to provide you with increased performance. It does not offer more security (in a single-user environment), and doesn't really offer much beyond the complexity itself (which is okay for l33t-cred, I guess). I'm not telling you not to do it, but I am saying that to really solve questions of performance, fault tolerance, disaster recovery, and security, the answers are not always going to come from adding more complexity or more layers. From a performance perspective, increasing the number of variables increases the chance for failure. From a security perspective, more complexity often amounts to obfuscation and little else. From a disaster recovery perspective, the amount of time to install the OS is trivial compared to getting the OS and programs all working properly over the span of three (or even two) hard drives instead of one.

You need to ask yourself: are all of the extra steps something you have examined, evaluated, and come to a conclusion on yourself, or are they based on the word of others, who based their word on others, and so on? It is much better to learn how to evaluate for yourself and get the best answer for your system specifically, even if you base that evaluation on similar system setups with similar usage. Not everyone's setup is going to be the same-- I rarely play any games, and my list of games is two or three titles, tops. However, I run a lot of network utilities, I run Services for Unix (and Cygwin), I have development IDEs, I run virtual machines, and I tinker with a few multimedia programs. My requirements are going to be different from those of someone who has a couple dozen game titles on their system, with an office suite and web browsers being just about the only other software. My girlfriend's workstation is different from mine in many ways, but similar in that she has a lot of web development and web content creation software (freelance designer/developer/host), which means it is very utilitarian and not very recreational. While she needs the same fault tolerance and regular file maintenance coverage I have, she doesn't need the extra allotted space I require for my virtual machines, which means her drive configuration is slightly different from mine, though still similar in that the docs and OS/Apps are on separate drives. A heavy gamer friend of mine has OS/Apps on one drive, page file on a second drive, and docs on a third drive-- he claims it works well for him, though we've never tested changing the page file location.

I've tried to explain it in more than one way this time. Is what I'm saying making sense?
 
Give the OS a good 30 Gigs. Don't break up the rest. Making a 100Gb partition for your movies sounds great now, but that's what, 20 DVD rips maybe. Breaking up the 270 into partitions doesn't get you anything. Keep it all one big partition and control it with your file structure.
 
general said:
Give the OS a good 30 Gigs. Don't break up the rest. Making a 100Gb partition for your movies sounds great now, but that's what, 20 DVD rips maybe. Breaking up the 270 into partitions doesn't get you anything. Keep it all one big partition and control it with your file structure.

I don't think that DVDs are a good example of what I'm talking about. If you're actually storing your DVDs on your hard drive, they're not likely to change... ever. If that were me, I'd probably move them to the end of my partition, just like I do with all my albums: defrag once and never mess with them again (unless you add, re-rip, and/or delete an album). Performance isn't really an issue, since most movies are made up of very few files that are accessed once each over the course of 90 to 200 minutes.


GreNME said:
I believe you are over-thinking the problem, making it larger than it is. Even if you simply regularly schedule the built-in Windows defragger, you will eventually wind up with your most used files closer together near the beginning of the partition. Of course, this will include system files, and it will take several defrags and regular disk use for things to get to a more optimal setting, but it will happen. Don't trust programs out there that say they can fix your problems in one or two sittings with a few passes of their app. Just like with any technology, regular maintenance and nominal management are the best way to keep things running their best. Your partition size is going to factor in very little, except that partitions smaller than ~40 GB are going to experience more overhead with little benefit for things like the indexer (which, apparently, most people turn off anyway).

If your most commonly used files vary, this strategy seems like it will fail. Furthermore, I haven't found anything that says XP's defragger actually does any sorting at all. The commercial version of Diskeeper seems to make that claim, but even then, unless it keeps all applications stored contiguously, the problem persists.

For example, let's say I play Half-Life and I play it a lot, but I really suck at Half-Life, so while the executable and certain startup files are all near each other, everything past level 1 is not, because I've never gotten past level 1 (or 2... pick your level). The rest of the data files for subsequent levels could be anywhere on the drive, including all the way at the end of the data... maybe that's 300 GB away.

The other advantage, as I've already noted, is that defragging a smaller partition is significantly faster than defragging a 500 GB partition. The fact that it moves the most frequently used files up front may cause longer defrags, simply because usage patterns change. I know my C: drive gets fragmented very quickly.

I'm not terribly interested in the Windows reinstall procedures. I'd rather use a relatively up-to-date slipstreamed disc. And God willing, I'll some day get a disc set up that doesn't install Outlook Express or Windows Messenger (the last one wasn't supposed to, but it did anyway :( ).

In the end, I'm certain that my OS will be on its own partition this time. I'll move Program Files and My Documents to another partition (which may or may not be on the same physical drive) and the pagefile onto a separate physical drive.

I will likely put most programs on one partition, though I might put games elsewhere, since I don't install and remove games as often as apps.
 
I recommend keeping a "high-performance" partition close to the perimeter of your drive's platters. While the rotational speed remains constant, the linear (tangential) velocity at any given point on the platter is proportional to its radius.

You'll be thanking yourself for the precious seconds you'll save!
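
For what it's worth, the math is just v = ωr. A quick sketch with assumed (but plausible) 3.5" platter radii shows the scale of the outer-versus-inner difference:

```python
# Linear velocity at a point on the platter is v = omega * r, so at a
# constant spindle speed the outer tracks pass under the head faster
# than the inner ones. The radii below are assumptions, not specs.
import math

rpm = 7200
omega = rpm * 2 * math.pi / 60       # angular velocity in rad/s
outer_r, inner_r = 0.046, 0.020      # metres: rough outer/inner track radii

print(omega * outer_r)   # ~34.7 m/s at the outer edge
print(omega * inner_r)   # ~15.1 m/s near the spindle
# Assuming roughly constant linear bit density, sequential throughput
# scales with radius, so the outer band is ~2.3x faster:
print(outer_r / inner_r)
```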
 
Games work fine after a reinstall, but make sure you export the registry info for them first; otherwise they're not going to work.
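
For example, something along these lines before you wipe -- the key path and backup location are purely hypothetical, so substitute whatever key your game actually stores its settings under:

```python
# Export a game's registry settings to a .reg file before reformatting,
# so they can be restored afterwards with "reg import" (or by
# double-clicking the file). HKCU\Software\ExampleGame is a hypothetical
# key, and C:\backup is assumed to exist already.
import subprocess

subprocess.run(
    ["reg", "export", r"HKCU\Software\ExampleGame", r"C:\backup\ExampleGame.reg"],
    check=True,
)
```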
 
nilepez said:
If your most commonly used files vary, this strategy seems like it will fail. Furthermore, I haven't found anything that says XP's defragger actually does any sorting at all. The commercial version of Diskeeper seems to make that claim, but even then, unless it keeps all applications stored contiguously, the problem persists.
Just so you know, the built-in defrag in Windows is basically Diskeeper without the commercial interface (and scheduling plugins and some other proprietary stuff). It won't do some things the commercial version can, like moving locked system files or the page file, but it will move files that aren't locked in memory just fine. Try to test it some time: get the disk nice and fragmented by installing huge apps and placing large files on the system drive, run them once or twice, then let them sit for a couple of weeks. Remove the large program and the extraneous files, then run a system cleanup and the built-in defrag. Use the computer like normal, and do the cleanup/defrag again the following week. You will begin to notice the tasks you perform most regularly getting snappier. No, it won't add fifteen more frames per second to whatever game you play, but you will probably notice shorter load times-- if the game is installed on the system drive.
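
If you'd rather have numbers than a feeling, the built-in tool will hand you a fragmentation report in analysis-only mode. A quick wrapper like this (just a sketch), run before and after the cleanup/defrag cycle, will show the difference:

```python
# Run the built-in defragmenter in analysis-only mode and print its
# fragmentation report, so the before/after of the cleanup-then-defrag
# routine described above can be compared with actual numbers.
import subprocess

result = subprocess.run(["defrag", "C:", "-a", "-v"],
                        capture_output=True, text=True)
print(result.stdout)
```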

For example, let's say I play Half-Life and I play it a lot, but I really suck at Half-Life, so while the executable and certain startup files are all near each other, everything past level 1 is not, because I've never gotten past level 1 (or 2... pick your level). The rest of the data files for subsequent levels could be anywhere on the drive, including all the way at the end of the data... maybe that's 300 GB away.
If you have 300 gigabytes on your system drive, you might want to examine why you have so much space taken up by programs and system files. ;)

All you're trying to do is find holes in what I'm telling you, yet you aren't applying that same scrutiny to having the program on a separate drive. What if you have Half-Life on D: while the system drive is C:? Then, while playing Half-Life, you are having to spin the C: drive back up every time a system file that isn't already in memory needs loading alongside the game data on D:, which will happen far more often than you'll ever reach that hard-to-attain level in the game once the cleanup/defrag processes are regularly maintaining the system. In fact, by keeping it all on one disk you would be incurring fewer disk reads over one spindle instead of more reads spread over two separate disks, you would be conserving power output from your PSU, you'd be extending the life of your drives (steady disk use is easier on the moving parts than up-and-down use), and you'd spend less time dealing with system down-time due to hardware problems (and thus save a few bucks).

Remember: a program on a separate drive from your system drive still has to access the system drive to load the program. You are not reducing disk activity, you are increasing it.

The other advantage, as I've already noted, is that defragging a smaller partition is significantly faster than defragging a 500 GB partition. The fact that it moves the most frequently used files up front may cause longer defrags, simply because usage patterns change. I know my C: drive gets fragmented very quickly.
You are not thinking logically here. A small partition has less of a need to be defragged than a large one (depending on the contents). Your argument is very much akin to claiming that FAT32 is better than NTFS because FAT32 doesn't have the journaling overhead (which is a flawed argument, and very incorrect on anything larger than 10-20 GB). The only way to get a smaller system partition on a new drive is to actually partition off a smaller section, at which point you either have unused space or yet another partition that causes the heads to move all over the platters anyway-- thus doing the exact opposite of what the smaller system partition was intended for in the first place. Allow me to illuminate:

<cue MSPaint>
[Attached image: disk1.jpg]


Hard disk with a single-partition system drive. Yes, the heads need to move when something is to be accessed, which is why regular cleanup/defrag keeps things relatively close in proximity. Even when things are not very close, the single file system on the drive is simple enough for the computer to find the files in relatively short order (of course, cache sizes and seek times help with this). On drives larger than 80 GB, I strongly suggest using the Indexing service (or, even better, Windows Desktop Search) to index the drive and keep track of file locations. This provides long-term performance benefits and usually lengthens the life span of the drive.

[Attached image: disk2.jpg]


Hard disk with the system partition first, then other partitions following. However, when saving programs on partition 2 or files on partition 3, you need to perform read-writes to the disk...
[Attached image: disk3.jpg]


... causing much disk thrashing and significantly more movement of the read-write heads, and all the associated trouble I mentioned earlier.

Another thing to note is that, unlike my little MSPaint drawings, your HDD is a series of platters stacked together, not one single platter. Additionally, partitions are not striped across this stack of platters; they are laid out (somewhat) sequentially, which means even more head movement. So keep in mind that-- unless you don't use the rest of the drive the small system partition resides on-- you are multiplying the stress on the moving parts, not just adding to it.
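
To put some made-up numbers behind the pictures: sectors are laid out from the outer edge inward, so a second or third partition physically sits farther in, and bouncing between it and the system partition means sweeping the heads across everything in between. A deliberately simplified single-platter model (real drives use zoned recording across several platters, so treat this as illustration only):

```python
# Toy model: map a partition's position on the disk (in GB from the
# start) to an approximate radial band on the platter. Capacity is
# treated as linear in radius purely for simplicity.
DISK_GB = 500.0
OUTER_R_MM, INNER_R_MM = 46.0, 20.0   # assumed usable radius band

def radius_at(offset_gb):
    """Approximate radius (mm) of the track holding this offset."""
    fraction = offset_gb / DISK_GB            # 0.0 = outer edge, 1.0 = spindle
    return OUTER_R_MM - fraction * (OUTER_R_MM - INNER_R_MM)

# A 30 GB system partition at the front, with a data partition behind it:
print(radius_at(0), radius_at(30))    # system band: ~46.0 mm down to ~44.4 mm
print(radius_at(30), radius_at(500))  # data band: ~44.4 mm down to ~20.0 mm
# Jumping between the two partitions means crossing that whole span.
```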

Look, I really don't care what you want to do with your own system. You obviously had your mind made up before this thread existed; you were simply looking for approval and not really an honest technical critique of your own partitioning scheme. You have your reasons for doing things your way, and you have every right to do so. However, I am letting you know that, in the future, you may want to take more things into consideration when planning out your disk drive schema than the approbation of people on internet forums or earning 'l33t p01nts' with enthusiast friends. The questions you should be focusing on are "what am I trying to accomplish?" and "what are all of the levels of technical consideration?" when trying to decide how to lay things out. You are disregarding or questioning the need for such considerations on what seems like very little evidence of in-depth examination. In the end, it really doesn't matter enough to make any significant difference anyway. You won't see a write-up on Tom's Hardware showing charts and graphs of frames-per-second comparisons or Sandra scores.

Besides: it's your system. Enjoy it. Do your own thing. That's supposed to be part of the fun.
 
GreNME, if only I had read this thread more than 2 years ago!

Could a write up like this be made into a sticky?
 
You're forgetting our good friend the registry: every time you reformat, Windows is going to think that no programs are installed. You will either have to perform a repair/reinstall for each program or just pray that it works correctly without its registry entries. So basically this isn't going to save you much, if any, time in the end.

You will have to reinstall programs, but at least he won't lose any data from his other partition. Also, he won't have to back up everything. I have a "Windows only" partition and it has saved my ass more than once when I had to reinstall Windows.

5 GB seems small for a Windows partition. Mine's 30 GB. I hope you never plan to do an XP-to-Vista upgrade, as Vista uses about 11 GB. You'll have to reformat the HD and reset the partitions.
 
The Windows-only partition works: relink My Docs and such to another partition, and if Windows gets hosed, you reinstall without losing important info.
 