Paging file

Phoenix86 said:
Now you guys are thinking on the right track... This is also why so many factors affect this (to induce paging): things like startup programs, services, and physical RAM can affect when a system pages. I really think this will have to be done in a very strict environment, esp. the amt. of RAM... Using our normal machines to run a test won't fly; we would have to make a 'test' OS load for this. I'll post more after lunch.
My only problem with a 'test' OS is that it would have to have the exact same memory management subroutines as Windows to be an accurate test of Windows memory management.

Or are you suggesting a stripped-down install of just the OS and test components? I would tend to agree in that case.
 
Fark_Maniac said:
By hand...I could output to a vcr and record that way...I have no other way..
Won't work. Too many variables where error can seep in, even if every participant has the absolute best intentions and is being as honest as possible.
 
By test OS I mean, not our main rigs' OS load. A clean XP install, patched, no 'extras' loaded matching VINs on engine and body... -er, you know what I mean. ;)
 
I look around me. I see things. Things like...empty yogurt containers. CD spindles. A shirt on the floor. Lots of papers.

You know what? I don't think I'm in a laboratory. I don't think very many people here are in a laboratory. I don't think you're in a laboratory.

So why are you insisting on laboratory standards?

Let's look at this another way...
There is a whole spectrum of computers, processors, memory, video cards...everything. And there are a lot of users here. That means "a large sample size". Probability tells us that there will be a bell-curve array of possibilities, ranging from "turning off the page file didn't help" to "holy guacamole, I can't believe how much faster it is." And there will be a bell-curve distribution of people's preconceived notions, ranging from "turning off the page file has no positive effect" to "turning off the page file helps everything." Since that curve is centered on "I don't really have a feeling for it one way or another", if we can get a large enough sample, we can eliminate the effects of preconceived notions.

Since we're really interested in the mean of the bell curve for performance (specifically, does it help the majority of users), we can dispense with rigorous laboratory standards. Besides, I don't care if disabling the page file helps a stripped-down operating system running synthetic benchmarks. I care about how it will affect MY machine, and any machines I might come in contact with. If you're going to insist on rigid laboratory standards, I can't do that, and neither can most of the people here. So:
It is not possible
It is not necessary.
If you are going to insist on such standards, you are not going to find them here. Any future posts bemoaning the inexactness of our methodology are pointless...we KNOW we can't do laboratory work at home. If that is the case, thanks for the discussion, but you might as well move along; there's nothing to see here. I for one WOULD like to find out if eliminating the page file has some positive effect IN A NON-RIGID, NON-LABORATORY, ANECDOTAL, WORD-OF-MOUTH WAY. We're obviously never going to satisfy your requirements...does that mean we shouldn't try? You seem to be saying yes; very well, good day to you.

Anybody want to try to figure out a way we can do this? Since we've decided that it is not going to be a rigid scientific endeavor, we can use things like stopwatches, we don't need to verify our results through a peer-review process, and really, all we need to do now is decide on a series of tests. First, we need parameters.
We need data on:
frames per second
load times
times for memory-intensive applications
and one I'd like to see...boot times

We need applications or scripts that are easily available, common, and free.
So, in summation...anyone who wants to help...help. Anyone who wants better standards...tough.
 
I don't mean a clean room lab environment.

Just a fresh OS load, patched, with no: ad-ware running, BS startup items, etc. I mean it's not that a 'used' OS load isn't good, just that it isn't good for comparing against others. You want boot times, for example, and why not... However, if I have AIM, AOL, AIMster, Kazaa, AV, firewall, spyware, etc. it's going to boot slower than someone who maintains their OS, or even better a clean load.

I don't think that's too much, maybe for some, but I have an older HDD I could format and use just for this. I'd bet most of us have an older drive laying around, or one we could 'come up with' for short term use.
 
Well, if you did it in percentages, it wouldn't matter what you load...if it's booting 10% faster with the pagefile off, that's more informative if it's done on machines "as they are".
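Just to sketch out the percentages idea: the math is trivial, and if everyone normalizes their own numbers this way, machines with totally different loads become comparable. A minimal sketch in Python (the 50 s / 45 s timings are made-up numbers, purely for illustration):

```python
def percent_change(before_s: float, after_s: float) -> float:
    """Percent speedup: positive means the 'after' run was faster."""
    return (before_s - after_s) / before_s * 100.0

# e.g. a boot of 50 s with the pagefile on vs. 45 s with it off
print(round(percent_change(50.0, 45.0), 1))  # prints 10.0
```

Each tester posts only the percentage, not the raw seconds, so a slow machine and a fast one contribute on equal footing.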
 
Sorry, if you boys want a freshly formatted machine...I'm out. The public result doesn't matter enough to me to spend my time rebuilding my machine to the way I want it. I'll run any test, but my machine is tested "as-is"...and I'm willing to bet that's how a decent amount of people will come to this subject.
 
O[H]-Zone said:
I look around me. I see things. Things like...empty yogurt containers. CD spindles. A shirt on the floor. Lots of papers.

You know what? I don't think I'm in a laboratory. I don't think very many people here are in a laboratory. I don't think you're in a laboratory.

So why are you insisting on laboratory standards?
You know what? Just because you don't have a setup for testing does not mean I do not. Please do not project your own inadequacies on me, because it's insulting.

O[H]-Zone said:
Well, if you did it in percentages, it wouldn't matter what you load...if it's booting 10% faster with the pagefile off, that's more informative if it's done on machines "as they are".
How are you going to be able to show with adequate objectivity that it is, indeed, performing 10% faster?

And to be honest, boot time means little. My machine may take two or three more seconds to boot than another, but it can still perform better once booted. And that is something that can apply no matter what OS is running.

Phoenix, I agree with you totally about having a clean install.
 
O[H]-Zone said:
Well, if you did it in percentages, it wouldn't matter what you load...if it's booting 10% faster with the pagefile off, that's more informative if it's done on machines "as they are".
The cleaner the test is, the more believable it is. There is no doubt this will be done to 'used' OSes, but it's not a good place to test. In fact, that is WHY labs are what they are.

Fark_Maniac, it doesn't have to be set up the 'way you like it'; in fact, that's why I would want a clean load. We all install different software, and do different tweaks from the mundane to dramatic. That's why the clean load: Windows+patches+testing apps+tools=go. No need to install: Office (unless it's a tested app), AV (shouldn't be on the internet at all), firewall (same reason), <insert alt. browser here>, etc... Shouldn't take more than an hour or so of keyboard time to set up, maybe 2-3 hours total machine time.
 
Phoenix86 said:
Fark_Maniac, it doesn't have to be set up the 'way you like it'; in fact, that's why I would want a clean load. We all install different software, and do different tweaks from the mundane to dramatic. That's why the clean load: Windows+patches+testing apps+tools=go. No need to install: Office (unless it's a tested app), AV (shouldn't be on the internet at all), firewall (same reason), <insert alt. browser here>, etc... Shouldn't take more than an hour or so of keyboard time to set up, maybe 2-3 hours total machine time.
You didn't get my drift. Formatting and all is easy...though after the test, I don't want to go through the hassle of getting my machine back to the way I like it for my purposes. It takes days for me to get it back to its current state. She runs super lean now. I don't mess with it unless it is absolutely necessary.
 
Phoenix86 said:
...

Fark_Maniac, it doesn't have to be set up the 'way you like it'; in fact, that's why I would want a clean load. We all install different software, and do different tweaks from the mundane to dramatic. That's why the clean load: Windows+patches+testing apps+tools=go. No need to install: Office (unless it's a tested app), AV (shouldn't be on the internet at all), firewall (same reason), <insert alt. browser here>, etc... Shouldn't take more than an hour or so of keyboard time to set up, maybe 2-3 hours total machine time.

I think he meant that he didn't want to wipe his OS to do testing and then have to do what it takes to return it to how he likes it after the testing is over. Maybe he doesn't have a spare drive.

EDIT: Damn. A bit slow, but at least I was right :p
 
Ahh, got it, that's why I said to do this on a old drive.

Although that doesn't mean we don't want to test on it, just that a 'cleaner' test is better accepted. It would still be nice to compare the same thing on 'real world' loads as well. But as GreNME points out, some people want a more definitive answer, and a clean OS would better satisfy people like him.

I ain't gonna blow away my main OS for a test either. :) That rig has been running for over 2 years on the same load.
 
Also, just brainstorming...wouldn't a slower hard drive produce a larger gap? Aren't laptop HDs only 4500 rpm? What about booting off a USB HD...USB 1.0. I once saw my old boss run/install NT4 on a Zip disk...data transfer speeds were horrible. Tests could also incorporate where the pagefile is located...i.e., secondary drive (rpm and cache could play a role), system drive, etc.
 
Testing on slower drives would make the difference more noticeable. Remember, older doesn't have to mean slower...

But yes, whatever test we come up with would be good for showing the difference in: 2-drive setups, RAID setups, performance differences between a PF on the OS's drive/partition and a slower drive (i.e. it may be better to keep the PF on the drive with the OS if there is a large enough speed difference between the main and secondary drive)...basically, any PF option could be tested, once/if we find a good method.
 
GreNME said:
You know what? Just because you don't have a setup for testing does not mean I do not. Please do not project your own inadequacies on me, because it's insulting.


How are you going to be able to show with adequate objectivity that it is, indeed, performing 10% faster?

And to be honest, boot time means little. My machine may take two or three more seconds to boot than another, but it can still perform better once booted. And that is something that can apply no matter what OS is running.

Phoenix, I agree with you totally about having a clean install.
As I mentioned...if our methodology is not up to your standards...good day to you sir.
 
O[H]-Zone, you're the one making accusations, not me. The standards being hashed out are not mine alone, but a consensus we can all agree on. If you plan on continually being objectionable towards everything I say, then go ahead and take your ball and bat and go home. There's no need for being a jerk about it.
 
djnes said:
someone PLEASE explain to me how this is hurting my performance!!!!!!!!

Sure thing.

First, let's consider your game loading times. In normal operation the memory manager will attempt to free up RAM by writing less-used pages to disk. So your game load consists of reads and writes: reading game data and writing unused pages.

Now if you don't have a page file, you'll see that there's a lot less I/O, and levels will load faster.

Sounds good so far, doesn't it? Here's the problem - the OS just dumped a lot of pages that it may need when you exit the game. If there's a page file, they can be directly addressed and fetched, without going through all the overhead of the file system.

If you use your computer as a high-end Xbox, this may not be a problem for you. But if you've got a bunch of apps running, a page file can make the difference between a nice system and a complete POS.

Try this:

Open a browser window, open an explorer window, maybe another window or two. Load your game, and play a couple of levels. Quit the game, and now activate those windows.

Try timing it with and without a page file.

Without a page file, you'll load the game faster, but be much slower until everything loads up again. With a page file, the game may load a bit slower, but your system will be responsive as soon as you exit.

Besides, what do you think happens when you have no page file and you run out of memory? Is a stable computer worth a few extra seconds of load time?
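The stopwatch test described above could be scripted so everyone measures the same way. A rough sketch in Python - the demo command is a placeholder, and rebooting between the pagefile-on and pagefile-off passes (so the file cache doesn't skew the numbers) is left to the tester:

```python
import statistics
import subprocess
import sys
import time

def time_launch(cmd, runs=3):
    """Median wall-clock seconds to run a command to completion."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True)  # raises if the command fails
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

# Demo with a trivial command; a real test would point this at the game
# or app being measured, once with the pagefile on and once with it off.
print(f"{time_launch([sys.executable, '-c', 'pass']):.3f} s")
```

Taking the median of a few runs keeps one fluke (an AV scan kicking in, say) from wrecking a tester's number.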
 
Pigster, there are a few flaws in your analysis.

Here's the problem - the OS just dumped a lot of pages that it may need when you exit the game. If there's a page file, they can be directly addressed and fetched, without going through all the overhead of the file system.
When you have no page file, paging still happens. The data HAS to be paged somewhere; it can't go into la-la land, otherwise the program would crash. What you're doing when you remove the page file is taking the HDD out of the VM pool, leaving only RAM - not disabling paging...

Without a page file, you'll load the game faster, but be much slower until everything loads up again. With a page file, the game may load a bit slower, but your system will be responsive as soon as you exit.
Again, not true, because paging does still happen. In fact, quite the opposite - this is one of the benefits of no page file. Because the program you're switching to has its memory paged in RAM and not on the HDD, it loads faster.

You have to remember why the page file was invented; it's not a caching tool, as the HDD is the slowest component in the system. A cache is ALWAYS faster than the device it's a cache for. Cache on a processor is faster than RAM; cache on a HDD is faster than the HDD itself. So why was it created? So you can do more with your machine than the limits of your RAM allow. Think about older computers that didn't have a GB of RAM - they had 16 or 32 MB. A single program would take half of that, and Windows the other half. OK, where is the multitasking environment...?

In order to do that, you would create a swap file; this would give you 'virtual memory' to work with, increasing your capacity. VM=RAM+a chunk of the HDD. When people started sizing this file, the norm was 1.5-2 times your RAM. Fast forward to today, and we have systems with 1GB of RAM, and well, now you DO have *enough RAM to do everything* in RAM, and you don't need the HDD to supplement the VM pool; RAM is enough.

Besides, what do you think happens when you have no page file and you run out of memory? Is a stable computer worth a few extra seconds of load time?
Obviously, you don't know what happens... Are you ready for this?
<suspenseful music>
You get an out of memory error message.
</suspenseful music>

Yep, that's it... You click OK, close some programs, and go about your merry way. It's the same thing that happens when you run out of memory while you do have a page file.

OK, so I put some **s around "enough RAM to do everything" - why? Because anyone who does this should know their memory requirements BEFORE removing the PF. That's why it's common discussion @ the 1GB mark; generally that is enough RAM to run 90% of most people's applications, games included. However, if you have not tested, and don't know the requirements of your machine, that's your own fault. I sure as hell didn't tell anyone to do it that way.

No one is making blanket statements that everyone should remove their PF. djnes and I have said this several times.
 
Uh, I only bothered to read the first two pages of this thread, but somewhere around there I got an idea (maybe it was already proposed - shoot me). Someone said that having no page file actually improved loading times (meaning decreased the time it takes). But then there's the other thing: some software/games force you to actually have a page file. They may not need a whole lot, so why not go in the middle between no page file and a limited amount of page file? Would 128 MB static for the page file sound good enough? Perhaps 64 MB? Much would be handled in RAM, speeding up performance. So wouldn't that provide the 'median-all-round' experience?

Inputs or comments to prove me wrong, go ahead.

-J.
 
Oh, I should add that either Kyle or someone else should do an Official [H]ard Paging File Lab Test and provide results for boot-up times, game loading, and various popular 3D imaging/graphical software across various hardware (especially hard drives), and provide an actual write-up covering the truths and myths of the paging file. Opinions suck; they just mislead people. I just like having facts to go by. Well, the only fact I've heard was that some programs require a page file. Everything else is all user-blasphemy-varied.

-J.
 
Phoenix86 said:
Pigster, there are a few flaws in your analysis.

Oh, I'm sure of that :p Let me answer your first objection now, and I'll answer the rest tonight.

When you have no page file, paging still happens. The data HAS to be paged somewhere; it can't go into la-la land, otherwise the program would crash. What you're doing when you remove the page file is taking the HDD out of the VM pool, leaving only RAM - not disabling paging...

I agree that the data has to go somewhere. If you look at http://support.microsoft.com/default.aspx?scid=kb;en-us;Q99768 you'll see the interesting observation

If Windows NT requires more space in RAM, it must be able to swap out code and data to either the paging file or the original executable file.

You see, if the OS requests some pages of memory, it will swap unused pages to the paging file to free up ram - but if there is no paging file, it will dump them and reload from the exe as needed.

This process of opening the exe via the file system is MUCH slower than simply fetching a 4K page via memory mapped i/o.
 
Has anyone ever thought about putting the pagefile (PF) on a RAM disk? This seems like it'd be the best of both worlds. I figure the people that profess they've got enough RAM to not need a PF would probably be the same people that could pull off having a RAM drive to have the PF reside on.
 
BobSutan said:
Has anyone ever thought about putting the pagefile (PF) on a RAM disk? This seems like it'd be the best of both worlds. I figure the people that profess they've got enough RAM to not need a PF would probably be the same people that could pull off having a RAM drive to have the PF reside on.
Actually, any accounts I've seen or heard from about moving it to the RAM drive have yielded the same results as having it on the hard drive. If Phoenix, SJC, and I can come up with a workable method, I don't see why trying the RAM drive configuration couldn't be tested too.

GeForceX said:
Oh, I should add that either Kyle or someone else should do an Official [H]ard Paging File Lab Test and provide results for boot-up times, game loading, and various popular 3D imaging/graphical software across various hardware (especially hard drives), and provide an actual write-up covering the truths and myths of the paging file.
Once again, boot times mean nothing to system performance. If one machine boots a couple seconds slower than another with the same hardware, it is doing so because it is loading more things at startup. System performance is not going to be better if you gain an extra couple seconds on bootup.
 
pigster,

Always check the bottom of KB articles; sometimes the info is still relevant for later OSes, but this one is WAY old...
The information in this article applies to:
Microsoft Windows NT Server 3.1
Microsoft Windows NT Workstation 3.1


You see, if the OS requests some pages of memory, it will swap unused pages to the paging file to free up ram - but if there is no paging file, it will dump them and reload from the exe as needed.
It will not 'dump them,' otherwise the program being paged would crash, which doesn't happen. You seem to be missing the point that removing the page file doesn't stop paging; it just changes WHERE it pages to - directly to RAM, since the HDD isn't an option. There are always pages in RAM; this just adds more (uses more RAM) because you have removed a section of the pool, namely pagefile.sys.

Here's the thing you're missing: the VMM, Virtual Memory Management. That entire subsystem manages memory, and a large chunk of that is paging. Paging is the term used for moving pages of memory between RAM and HDD, in an effort to keep as much RAM available as possible for the foreground application(s).

So VM consists of two things, RAM+HDD=VM. The VM pool is still used on a system with no page file, it just consists of only RAM. Now, if you have enough RAM to run all applications at once, why page anything to the disk? Simple answer, you don't...

BobSutan, by removing the HDD from the VM pool, you ARE paging memory in RAM; there is no need to go through the overhead of a RAM drive. Disabling the page file does the same thing without a RAM drive. This is why the OS doesn't freak out when you remove it - it simply runs the exact same memory management system, except its pool is only RAM instead of RAM+HDD.
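The RAM+HDD=VM relationship is just arithmetic, and a toy model makes the trade-off concrete. A minimal sketch in Python (the sizes are illustrative, not measured from any real box):

```python
def vm_pool_mb(ram_mb: int, pagefile_mb: int) -> int:
    """Toy model of the virtual memory pool: physical RAM plus pagefile."""
    return ram_mb + pagefile_mb

# 1 GB of RAM with a 1.5x pagefile, vs. the same box with the PF removed
with_pf = vm_pool_mb(1024, 1536)
without_pf = vm_pool_mb(1024, 0)
print(with_pf, without_pf)  # prints 2560 1024
```

Removing the pagefile doesn't turn paging off; it just shrinks the pool to the first term, which is fine only if everything you run fits inside it.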

edit:
Once again, boot times mean nothing to system performance.
Boot times, sure, we are not looking to create the Michael's Computers 7.7 boot time. Load times, however, are a different story. When I double-click an app/game, if it loads faster, that's a good thing, as that's "me-time" spent waiting for the app to load. Booting a machine, by comparison, doesn't happen nearly as often (most are run 24x7), and I am more willing to hit the power button and not sit down until it's ready to log in.

If Phoenix, SJC, and I can come up with a workable method, I don't see why trying the RAM drive configuration couldn't be tested too.
Sure, like I said earlier, if we can come up with a method it would work to test any page file configuration. However, I also think this would be very similar to having no PF, since the end result is all VM paging happens in RAM.
 
Well, I have to report that disabling the page file, for my setup at least, is not an option. I tried it, and it was much less stable. I'm not surprised though; my machine is a dinosaur:
Abit KT7A-RAID
Athlon 1.33Ghz Thunderchicken
512MB PC133 RAM
6 X 160GB Seagate HD's
GeForce 4 Ti4600 VIVO
Hercules GameTheater XP Sound card
2 X NIC
IEEE-1394 Card
MSI TV@nywhere tuner
I set it back to a static 1GB of pagefile, and it's much more stable. I don't think 512MB RAM is quite enough to cover my needs. Just my result...
 
When you say 'unstable' what does that mean?

Did you size your memory requirements first?

I'm guessing no, to size your memory do the following.

Use your machine a lot. Open a lot of various programs, as many as you would under the most extreme usage. Open several apps and each one of your games (games separately). After a while you will have established the 'max' RAM usage for your requirements (can and will fluctuate over time/when you use different software). Now, open task manager and look at your Commit Charge (K) peak. This is the most VM you used in all your operations. On the machine I'm working on now, it's 600MB. Then compare that with your Physical Memory (K) Total (total RAM in the machine). If the Total Physical Memory is significantly over (this is where you make your own decision about 'what is enough') then you should be able to remove the PF.
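The Commit Charge check described above boils down to a single comparison. A hedged sketch in Python - the 1.25 safety margin is my own placeholder, since 'what is enough' headroom is still your call:

```python
def can_drop_pagefile(peak_commit_kb: int, total_ram_kb: int,
                      headroom: float = 1.25) -> bool:
    """True if the observed peak commit charge, padded by a safety
    margin, still fits entirely in physical RAM."""
    return peak_commit_kb * headroom <= total_ram_kb

# 600 MB peak commit on a 1 GB box: fits with room to spare
print(can_drop_pagefile(600 * 1024, 1024 * 1024))  # prints True
# The same peak on a 512 MB box: keep the pagefile
print(can_drop_pagefile(600 * 1024, 512 * 1024))   # prints False
```

The numbers plug straight in from Task Manager: Commit Charge (K) Peak and Physical Memory (K) Total, both already in KB.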

What is enough? I don't know; that depends on how you use the system. I used to run sans PF on this PC (work machine, BTW), but I hit the 'out of memory' message one too many times and set up a PF - I kinda had to. I only have 512 MB RAM in this laptop; if I trimmed down my applications, it was OK, but now I need to run 2 more apps full time. So my usage requirements changed...

However, even if you guess low, the worst thing I have *EVER* seen is 'out of memory' errors. Remember, this is my work PC; I can't have it crashing in the middle of the day because it's 'unstable.'

Dinosaur, hah, that certainly has nothing to do with it... This machine is an IBM T-20, a 600MHz PIII with 512 SDRAM. The first machine I set up with no PF was a K6-300 with 512 RAM, iirc. Age has nothing to do with it...
 
Boot times, sure, we are not looking to create the Michael's Computers 7.7 boot time. Load times, however, are a different story.
Load times would count as performance, so you'll get no argument from me there. I already know that you realize boot time != load time for programs.

The trick here, though, is whether you have them set to load (partially) at boot time or not. For instance, MS Office, Mozilla, Avant Browser, AIM, MSN Messenger, and a whole list of other programs normally default to having part of their programs load to memory at startup, so that they get faster load times once the machine is up. :)
 
This was taken from a page at :
http://www.tweakxp.com/display.aspx?id=123963

Here's how to boost your RAM if you have unused space on your hard disk. It will use space on your hard disk as if it were RAM. It isn't dangerous for your computer and will not make it unstable; it will only transform disk space into 'RAM'. NOTE: You can always revert this tweak if you need the extra space on your computer.

1. Go to the Advanced tab of your System Properties (right-click the My Computer icon and choose Properties), click the Performance Settings button, click the Advanced tab, and then click the virtual memory Change button.

2. Choose the hard disk you want to use if you have more than one, then choose custom size and type the number of megabytes of your disk you want to use as RAM. The maximum amount per hard disk is 4096.

3. Click the set button and click OK. You should be prompted to restart your system.

4. If your computer doesn't run faster, or if you need the hard disk space, just put a smaller number in the box or simply click No paging file.

This tweak dramatically improved my computer's speed, and I noticed it even though I have 1 GB of RAM.
 
My computer is a pure gaming machine. All I want to know is, what can I do to the page file to make it so that it boots faster into Windows XP Home Edition?

By the way, I have Athlon 64 3200 CPU with 1 GB of Crucial RAM.

Also, when I shutdown Windows XP, how can I make it shutdown faster?
Thanks for any answers.
 
Qwestman said:
My computer is a pure gaming machine. All I want to know is, what can I do to the page file to make it so that it boots faster into Windows XP Home Edition?

By the way, I have Athlon 64 3200 CPU with 1 GB of Crucial RAM.

Also, when I shutdown Windows XP, how can I make it shutdown faster?
Thanks for any answers.
GAH! Do you people not understand? Startup and shutdown speeds have little to nothing to do with your game performance! You could have the fastest boot time in the world and a near-instantaneous shutdown, and your actual gaming performance could still suck ass. All boot time reflects is what is loading at startup, why it's loading, and what network connections, if any, are being initialized.
 
GreNME: What's with your overwhelming need to prove yourself right? What's with this holier-than-thou attitude? Only someone with as much of a god-complex as you seem to have would consider accepting your "challenge". Heh, reminds me of grade school... "I dare you to do this! I dare you to do that!". You're obviously not in grade school, so how about you stop acting like it?

Thanks.
 
Well, gee... with such a well-thought-out and valid point-by-point analysis of the issue like that, I have finally seen the light.

Oh, I'm sorry, were you trying to say something other than throwing insults?
 
So, you're saying my posts hurt you?

Sorry about that.

Now, when you can come up with a reasonable and intelligent argument, come on back and say hi. Or maybe you want to take part in the little test that Phoenix, SJC, and I are attempting to figure out? If you think you have something to input besides mindless insults, feel free to jump on in. The more the merrier.
 
bluehorizon said:
GreNME: What's with your overwhelming need to prove yourself right? What's with this holier-than-thou attitude? Only someone with as much of a god-complex as you seem to have would consider accepting your "challenge". Heh, reminds me of grade school... "I dare you to do this! I dare you to do that!". You're obviously not in grade school, so how about you stop acting like it?

Thanks.

How about you stop trying to incite a flamefest?
If you have nothing to add to the thread, I suggest you not post in it at all.
 
Well, I took all this good information, plus my post from Tweak XP, and tried:

1 -40 GB H/D with WIN/XP PRO OS
= total space taken up 1.67 GB



1 - 80 GB H/D with software utilities, MP3s, games, misc. stuff, and pagefile set to 4096
= total space taken up 10.5 GB


Of course I have 1GB of Ram

Before the P/F was set at 4096, DOOM 3 ran in the 40s (I had it set to 2000, initial and max).

Now with the P/F set at 4096, DOOM 3 runs at around 55-56, back and forth.

And this is with a BFG 5900XT, flashed with an underclocked 5950U BIOS...

so it works for me.

Plus I defragged everything with O&O Defrag before setting it to 4096.

There is no way, after trying everything here in this thread, that I would run without a P/F... and the truth is, I don't think people are setting their P/F anywhere near big enough.
 
O[H]-Zone said:
Well, I have to report that disabling the page file, for my setup at least, is not an option. I tried it, and it was much less stable. I'm not surprised though; my machine is a dinosaur:
Abit KT7A-RAID
Athlon 1.33Ghz Thunderchicken
512MB PC133 RAM
6 X 160GB Seagate HD's
GeForce 4 Ti4600 VIVO
Hercules GameTheater XP Sound card
2 X NIC
IEEE-1394 Card
MSI TV@nywhere tuner
I set it back to a static 1GB of pagefile, and it's much more stable. I don't think 512MB RAM is quite enough to cover my needs. Just my result...

Whoa there!!! Don't knock the T-Bird buddy ;) I've got almost the exact same system as you (only diff is 1GB of RAM, an ATI 9500Pro, and SB Live!).
 
Oh, I'm not knocking it...this old girl can do amazing things. I have it set up as a gaming/video rig...I can do cable, S-video and composite Video in through the TV card, S-Video and composite in/out through the video card, and DV by firewire. Sony DVD burner, and the GameTheater XP and it's a true multimedia rig. It's just old...I need dual opteron/dual 6800 GT SLI goodness!
 
You know what? Just because you don't have a setup for testing does not mean I do not. Please do not project your own inadequacies on me, because it's insulting.

Inadequacies? Is that really how you feel? Please re-read his post. I hope you were just spun up and didn't really mean to word it that way (for your sake as much as those around you).
...

I have managed to read this whole thread (despite the irritating interactions...) HOPING to see several people say "sample size counts." Why not run the test on both clean and dirty installs? People USE their machines. I don't care what the difference is off a clean install... is that with or without OS updates, btw? How come? (Regardless of which you choose.)

Please, please understand that MOST people would gain more from a "stopwatch feast of dramatic scale" where people timed the running of apps (a chunk of them: "open your word processor and run this...", "open your two favorite games and...") with and without the PF. Sample size WILL allow this to give you USABLE data if you only had 20ish+ people do this. I'd be willing. We'll all post our data, dump it into Excel, and TADA... a REAL answer to how the PF changes performance. We'd have to post hardware and OS data and "other running apps" stuff as well to best sort it all, but this seems much easier than the lab approach... and more importantly, useful, as it addresses the question "Will running without a PF help my XX system run faster? I mostly do XX."
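Pooling everyone's stopwatch numbers is a few lines with the statistics module. A sketch with made-up figures - each entry is one hypothetical tester's % change in load time with the PF off (positive = faster), purely for illustration:

```python
import statistics

# Hypothetical reports from eight testers - illustrative numbers only
reports = [4.0, -1.5, 0.0, 2.5, 3.0, -0.5, 1.0, 2.0]

mean = statistics.mean(reports)
spread = statistics.stdev(reports)
print(f"mean change: {mean:.2f}% +/- {spread:.2f}% (n={len(reports)})")
```

With the hardware/OS columns added, the same data could be sliced by RAM size or drive speed to answer the "my XX system" question directly.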

cheers and good night.

sharp
 
sharp said:
Inadequacies? Is that really how you feel? Please re-read his post. I hope you were just spun up and didn't really mean to word it that way (for your sake as much as for those around you).
Actually, it was a reply done with the same amount of sarcasm as the original statement. It's a sad state of affairs on the [H] if everyone switched to "completely literal" mode for everything except flame wars.

In other words, chill out. I read it. It was a ridiculous claim, since I was not and am not making the requirements. In fact, in case you haven't been reading what I've said, you would see that I'm going with a consensus specifically to avoid having it be a situation set forth by me.

I have managed to read this whole thread (despite the irritating interactions...) HOPING to see several people say "sample size counts." Why not run the test on clean and dirty installs? People USE their machines. I don't care what the difference is off a clean install... is that with or without OS updates btw? How come? (regardless of which you choose.)
Actually, in this case, sample size doesn't count. Either not having a page file speeds up page faults and program access/load times, or it does not. This isn't a social sciences study, nor is it a study that will be dealing with a sizeable margin of error. Either there will be a performance increase, a performance decrease, or no change (the three possibilities I already laid out).

In addition, this is in no way an "official" study performed for any company or organization. In my own testing, it has been done on numerous configurations, including "dirty" installs with lots of programs on them. It's been done on completely clean installs, too. There has never been an increase, and when the amount of memory is 512 MB or lower, there is a decrease in performance (due to running out of memory, leading to the alerts Phoenix mentioned).

Please, please understand that MOST people would gain more from a "stopwatch feast of dramatic scale" where people timed the running of apps (a chunk of them: "open your word processor and run this...", "open your two favorite games and...") with and without the PF.
I'm sure they would, because it would be visually engaging. In the case of software, however, a stopwatch does not deliver the hard numbers and timing precision needed to adequately gauge load and performance times.

The problem with a stopwatch is that it requires a human hand to operate it, and with that hand come all the errors that human reflexes and reaction time introduce. In essence, a stopwatch would increase the margin of error by adding variables that we cannot account for.
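This human-reflex problem is exactly why app timings are better captured in software. As an illustration (a sketch, not the agreed test plan), a monotonic clock wrapped around the launch removes reaction time entirely; the command being timed below is just a placeholder:

```python
import subprocess
import sys
import time

def time_launch(cmd, runs=5):
    """Run a command several times and return the average wall-clock
    duration, so a single outlier doesn't dominate the result."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()      # monotonic, sub-millisecond clock
        subprocess.run(cmd, check=True)  # no human reaction time involved
        samples.append(time.perf_counter() - start)
    return sum(samples) / len(samples)

# Placeholder workload: a do-nothing interpreter start. Substitute the
# real application you want to compare with and without a page file.
avg = time_launch([sys.executable, "-c", "pass"])
print(f"average launch time: {avg:.3f}s")
```

Every participant running the same script gets timings with the same (tiny) measurement overhead, which is the point: the remaining variation comes from the machines, not from thumbs.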

sample size WILL allow this to give you USEABLE data if you only had 20ish+ people do this.
No, if you introduce stopwatch testing, the margin of error shoots way up, and the number of tested machines would have to jump into the hundreds just to compensate for the sampling error and keep the confidence interval reasonable. In pure scientific terms, it's already a shaky test as it is, with people who have already formed opinions collecting the data. However, I'm giving the participants the benefit of the doubt and trusting that a consensus-led set of parameters, built to satisfy everyone's perceived needs, will (hopefully) make up for the distance between participants and their opinions. All it would take, to be honest, is one person fudging numbers and the whole comparison is shot to hell. I am willing to trust anyone who wants to do this enough not to fudge the numbers.
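The "jump into the hundreds" claim follows from the standard sample-size formula: the required number of samples grows with the square of the per-measurement noise. A quick sketch with invented numbers (0.1 s of spread for software timing versus 0.5 s once stopwatch reaction error is added; these are assumptions for illustration, not measurements from this thread):

```python
import math

def required_sample_size(sigma, margin, z=1.96):
    """Samples needed so the 95% confidence interval for the mean
    lands within +/- `margin`, given per-measurement std dev `sigma`."""
    return math.ceil((z * sigma / margin) ** 2)

# Quintupling the noise multiplies the required sample size by 25.
print(required_sample_size(sigma=0.1, margin=0.05))  # software timing -> 16
print(required_sample_size(sigma=0.5, margin=0.05))  # stopwatch timing -> 385
```

Twenty-ish volunteers covers the first case; the second really does land in the hundreds.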

We'll all post our data, dump it into excel and TADA... a REAL answer to how the PF changes performance ....
You mean an answer to how the PF affects performance across multiple configurations, which isn't required to be a single "REAL answer" and is far more likely to be a gradient. The disagreement here is over which direction the gradient actually runs.

we'd have to post hardware and OS data and "other running apps" stuff as well to best sort it all... but this seems much easier than the lab approach... and more importantly, useful, as it addresses the question "Will running without a PF help my XX system run faster? I mostly do XX."
Easier != more accurate. More visually engaging != more accurate. And the only way to ensure a true answer for whatever x is would be to run x alongside the test simultaneously. There are a few problems here, as well:
  • If x is a game, the result will not mean all games work better one way or the other. It will mean the exact game used in the test does. Games are written too differently, even when they use similar (or the same) engines.
  • More programs outside of games are written to specifically take advantage of a page file than there exist programs to specifically take advantage of a lack of page file. Most programs are page file agnostic, because there is often no way for a developer to determine page file size or even the existence of one before release.
  • No two programs are going to be exactly alike.
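On the "no way for a developer to determine page file size" point: before release a developer can't know what any given user's page file will look like, though a program can query it at run time. A rough, hedged sketch of such a query (the Windows branch uses the Win32 `GlobalMemoryStatusEx` call and approximates page-file size as the commit limit minus physical RAM; the fallback reads `/proc/meminfo` and is Linux-only):

```python
import ctypes
import sys

def swap_total_bytes():
    """Best-effort check for page file / swap size in bytes.
    Returns None if it can't be determined on this platform."""
    if sys.platform == "win32":
        class MEMORYSTATUSEX(ctypes.Structure):
            _fields_ = [("dwLength", ctypes.c_ulong),
                        ("dwMemoryLoad", ctypes.c_ulong),
                        ("ullTotalPhys", ctypes.c_ulonglong),
                        ("ullAvailPhys", ctypes.c_ulonglong),
                        ("ullTotalPageFile", ctypes.c_ulonglong),
                        ("ullAvailPageFile", ctypes.c_ulonglong),
                        ("ullTotalVirtual", ctypes.c_ulonglong),
                        ("ullAvailVirtual", ctypes.c_ulonglong),
                        ("ullAvailExtendedVirtual", ctypes.c_ulonglong)]
        stat = MEMORYSTATUSEX()
        stat.dwLength = ctypes.sizeof(stat)
        ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat))
        # TotalPageFile is the commit limit (RAM + page files), so
        # subtracting physical RAM approximates the page file alone.
        return stat.ullTotalPageFile - stat.ullTotalPhys
    try:
        with open("/proc/meminfo") as f:
            for line in f:
                if line.startswith("SwapTotal:"):
                    return int(line.split()[1]) * 1024  # value is in kB
    except OSError:
        pass
    return None
```

A zero result means the page file (or swap) is effectively disabled; most applications never bother with this check, which is why they end up page file agnostic.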

I can understand your enthusiasm for wanting a possible final say in the matter, but there are a few flaws in the approach you suggest that could poison the results. To be perfectly honest, a more accurate test would need documentation from Microsoft themselves (which may or may not agree with our findings) and a purpose-built test bench written specifically for the job. I'm cool with doing both cooperatively, as long as there are people who can help out (there's no way I'd have time to write a bench for it on my own... I'm not completely sure where to start). Lacking that, we need to come up with reasonable, controllable alternatives. Adding variables with less control is not the answer.
 