8800GTS 512MB -- Windows XP or Win 7?

I was wondering if there are specific W7-era titles that run poorly on a modern OS vs. W7.

Like Crysis is a famous one that runs best on XP.

Velocity Ultra runs perfectly on Win7, but is a stuttery mess on Win10.
 
I don't know about Win7 since I skipped it, but sometime between Vista and 10 it became impossible to play Deus Ex: Invisible War unless you installed it on an external hard drive. Only true sickos like myself would ever run into that.
 
Also Fallout 3 becomes unstable if you run it on a CPU with more than two cores and on a version of Windows newer than Vista.
 
Most of it was tied up with DRM of some kind or half-baked hardware support from both directions. Whether it was store-bought systems that could barely launch a newer Windows, or Nvidia allegedly overextending themselves to the point that nobody but "Gamers" would touch their products.
 
Yeah, no. Never happened.

Windows 7 runs like trash on anything less than 2 GB for 32-bit and 4 GB for 64-bit. For almost the entire lifespan of Windows 7, Windows Update had a bug that would consume all available system memory on low-memory systems and keep the machine locked up in a fever-pitch disk-swapping routine for literally days.

https://social.technet.microsoft.co...fd1a5b4/windows-update-scan-high-memory-usage
https://www.sevenforums.com/perform...high-memory-usage-svchost-windows-update.html
https://woshub.com/fix-high-memory-usage-by-svchost-exe-wuauserv/

Windows 7 also consumes over twenty times the disk space that even an SP3 install of XP does. That grows larger still as Windows Updates are downloaded and stored automatically on the machine, along with volume shadow copies. Heaven forbid you have a bug that causes Windows to automatically generate crash dumps, which can eat up gigabytes. XP-era machines can and did experience complete exhaustion of drive space, resulting in even slower machines. I've even had machines designed for Windows 7 have 100 GB drives completely eaten up by Windows memory dumps.

tl;dr, you're not getting better performance from a bloated OS on 256 MB of RAM. You must have had a fever dream one time years ago. There's a reason that a whole generation of kids grew up hating Asus EeePCs that had shitty Atom processors and 1 GB of RAM trying to run Windows 7. If you forced them to use a single core processor with 256 MB of RAM on Windows 7, you'd have a nationwide mutiny on your hands.



Windows 7 is literally Vista with a spit shine. There is no major difference between the two that would make 7 run better with less memory. I've built literally thousands of machines with both operating systems; they behave exactly the same on low memory: they run like garbage. You aren't doing any more than notepad.exe and one 90s game on 512 MB of RAM on either. Forget about opening an internet browser. Windows 7 definitely in no way has better resource management than XP did.

Windows XP SP3 had the same minimum memory requirement as RTM: 64 MB. It would also run just as badly on 64 MB.

This tracks with my anecdotal experience trying to repurpose my old Athlon 64 4000+ system w/ 2GB RAM to give to my bro, who was too cheap to buy a PC, several years back when 7 was still current. I had only ever run XP on it, and maybe dual-booted XP 64-bit and then Vista on it for a short period just to test them out, and thought maybe it could run better on Windows 7 64-bit since it was a 64-bit CPU. God no... Win7 ran like complete ass on it, lol. I seem to remember Win7 hogging all available RAM on it like that as well, and kept wondering why it was so resource intensive. 2GB of RAM on XP was gravy and was like running 32GB today; always enough resources to do whatever you want, and it would generally fly.

But since XP was EOL'd at the time, I gave it to him running Win7 still, and also gave him a 1GB USB drive to help out with the "ReadyBoost" feature, which uses faster flash storage as a disk cache to supplement the HDD, and that actually did help noticeably.

Anyways to OP, I see you already went with XP, which is the right choice if you want to limit yourself to only one OS for this hardware. But I would personally partition the HDD or add another HDD and dual boot XP and Vista just for the DX10 games/features if you like.
 
Yeah, Windows XP is fine too. If it stays disconnected from the internet you will be fine.

I played this last summer. There is a fix for that multicore CPU issue, which I ran into myself. It's just a one-line edit to a config file. I played Fallout 1, 2, 3, then New Vegas. I even restarted Fallout 4, but haven't kept playing it again; I played that previously, my first Fallout game.


As far as performance of XP/7 goes, that was from my own experience. I ran dual-boot XP and 7, and only bothered doing so for a couple months. Windows 7 was faster. I'm sure it depends on both the CPU and the RAM; with some combinations, Windows 7 results in a better experience. Of course it will run like shit (if at all) on something too old.
 
Ja, I've never had an issue running Fallout 3 on Windows 10.
 

Well yeah, that's true. If you edit a .ini file to force the game to only use two cores, then it's perfectly stable in the newer versions of Windows too. Going down rabbit holes for fixes like that, though, inevitably causes me to spend almost as much time tweaking stuff as I spend playing games. I spent about two days straight fiddling with Cyberpunk 2077 in order to get it running on my Athlon II PC. Sometimes it's nice to just be able to start up games and immediately begin playing and enjoying them. And period-appropriate hardware and software can do a lot to keep things from getting overly fiddly. Especially if you have less-than-computer-savvy friends that game on your PCs as well, like I do.
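For anyone searching for it later: the commonly circulated version of that .ini fix is adding a thread cap under [General] in Fallout.ini (usually in Documents\My Games\Fallout3). I'm going from memory here, so double-check the exact lines against a current guide before editing:

```ini
[General]
; cap the engine at two hardware threads -- the usual fix for
; the multicore instability on newer Windows versions
iNumHWThreads=2
; often recommended alongside the thread cap
bUseThreadedAI=1
```

Make a backup of the .ini first, since the game can regenerate or reject a malformed file.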
 
Retro from like 2009? That's not retro that's just obsolete. Also, the last time Nvidia raw dogged consumers. So if you want to relive Nvidia gouging consumers the first time you must be a fan boi lol.
 
Yesterday's shitpost of the day is brought to you by SpongeBob. 🎉👏😎
 
There is a program called StartAllBack that you can run on Windows 11 that makes it look like older OSes. That would give you better security along with the "retro" look.
You could also run Linux and maybe take inspiration from the RetroPie stuff.
 