REALLY bad video performance...

Nazo

Ok, my sister has a fairly old PC and is trying to get at least a bit more out of some games. The videocard was the first priority since she had only an old Radeon 9600XT. She's still on AGP though, so the options were pretty limited. Between her budget and a bit of searching around, the best I could come up with was the ATI HD 3850. Now, I don't expect this card to just blow games away, but surely it should show a marked improvement over a 9600XT... It does not. In fact, framerates are suspiciously close to the same range as with the 9600XT. Bear in mind that initial testing was on World of Warcraft of all things. I decided I needed a comparison I was more directly familiar with (I don't touch WoW), so I put Left 4 Dead on there and tried it, with similar results. Framerates tend toward the single digits most of the time, with the occasional jerk.

She tells me that before we reinstalled the system, the 9600XT was actually doing at least a bit better. That makes me think it may be a driver thing. Along the way, though, I upgraded a couple of parts. She had only 1GiB of memory, so I wanted to give her my old 2GiB. Her motherboard was having trouble with my old memory, so I figured that while I was at it I'd switch out the boards too. Both are nForce 3, and my old board is just a little bit better overall (much more overclockable especially, though with the none-too-great ambient temperature there, no overclocking is taking place -- that should at least mean more leeway when not overclocked.) Even the CPU is incredibly similar: she had an Athlon 64 3500+, and the board I switched in had a 3700+. Same exact clock (and basically the same core), just twice the L2, which realistically probably doesn't help in gaming but certainly can't hurt. The new board also has better cooling, which I'm sure doesn't hurt under those conditions either, though it should have no effect on performance since there's currently no overclocking.

I've tried tweaking the AGP settings. Namely, I turned off stuff like fast writes (though I personally never had problems with fast writes with my previous ATI card on that board.) I tweaked XP a bit for better performance and even tried disabling swapping so the harddrive should be less of a bottleneck once loading is complete (and yes, you can get away with this with 2GB of memory in normal gaming; most things short of CAD work don't run into problems. Of course, it helps a bit that I keep things pretty light on my PCs, with even the shell switched to something more efficient, but that's nowhere near the scale of the difference I'm seeing.) Now, I don't know EXACTLY where the 3850 stands, and I assume it's far slower than my old X850XT-PE, which is the next best comparison I have, but it should not be in single-digit framerates.

Any ideas? I'm pretty sure I got much better performance out of this same basic hardware with other videocards. In fact, while I don't have a direct comparison here, I'm pretty sure I got more out of my old 9600 Pro way back when (hey, it ran Doom 3 surprisingly well for such a low-end card, and I'm pretty sure I still had that overclocked mobile Barton at the time) than she is getting out of her 9600XT and this HD 3850 right now. I haven't had time for really thorough testing, but something definitely appears to be wrong.
 
Your video card is seriously crippled by the slow single-core CPU, and your standards back then were a lot lower than you remember now. There's no easy way around that bottleneck, but you may be shocked at how cheap a DDR2-800 motherboard, dual-core CPU, 2GB of DDR2-800 RAM, and a 4670 are if you cruise the combo deals at Newegg.
 
Your video card is seriously crippled by the slow single-core CPU
No. That CPU can handle, for example, BioShock quite well at very high quality settings. (And yes, BioShock can be played on an X850XT-PE. SM 2.0 support is already in the engine; they just disabled it and didn't bother compiling the shaders for it. It actually looks extremely good even in SM 2.0 on an X850XT-PE.) It drove me crazy then, and it still bugs me today, how people just assumed that once dual core came out, single core suddenly sucked. (Similarly, it bugs me that people assume quad core is better simply because it's even more cores when, in a way, it's actually worse: few games can take much advantage of dual core and almost none can truly use quad core, so you're either spending the same amount for a slower CPU just to have more cores, or spending more to get the same performance...)

That CPU was no poor performer then, and even today it's not necessarily anything to sneeze at. It may hold back a videocard's maximum capabilities a bit, but it won't be the limiting factor in a situation like this, and I'm sure it can still manage many modern games. And we're not even talking about the latest and greatest here, but games that are meant to be quite backwards compatible. No, I assure you: while the CPU may keep the card from hitting some really high triple-digit number at its best, it is NOT what is making it run in the single digits.

Actually, in most things I've seen very little difference between my old 3700+ and my current E6600, mostly because the CPU just wasn't holding me back. I'll admit I had the 3700+ overclocked a bit (2.4GHz was my limit -- mine is one of those San Diegos that run extremely cool but won't overclock worth anything no matter how much voltage you feed them), but as I said, she won't be running the likes of BioShock, and I don't think 200MHz makes THAT much difference in such games anyway.

and your standards back then were a lot lower than you remember now.
No. My standards were probably higher then, in fact, now that I play fewer PC games and more of the sorts where graphics matter less. My standards are also far more flexible than most PC gamers', because I tend to play more old games than new ones. For example, my favorite FPS games are things like Deus Ex (the original, on the original Unreal/UT engine, not the crappy sequel built on the Unreal 2 engine and designed to please the console crowd), and I'm no stranger even to DOS games. Right now my standards run more to "better-looking 2D graphics than EGA quality" and "better-looking 3D graphics than seven-poly Gouraud-shaded models" than to "highest possible graphics settings with 8x FSAA and 16x AF at 1920x1080."

Also, you're forgetting something rather important: it's not just my standards in question here -- in fact, my standards don't matter at all here. Her standards are based around a crappy MMO running on a videocard that was horribly outdated the day she got it. The lowest quality settings are all she has known for at least a good year now. Why on earth would her standards suddenly jump that much? Besides, I've been gaming practically all my life; I know enough about PC gaming to know when to try different settings. Ok, I'll admit I didn't think to mention that I used the lowest possible settings (no FSAA/AF/etc., minimum draw distance/quality/texture size, and so on), but I kind of thought that was a given, since obviously the first thing to test in a situation like this is the settings.

Oh, and I looked over the L4D requirements before trying it as a comparison test. In my experience the 3700+ -- or let's say 3500+, since the extra L2 probably does only a little here -- compares to at least a 3.2GHz P4, probably better. The videocard also easily exceeds the minimum requirements: even her previous card was a 9600XT versus the required 9600 vanilla (and that's assuming the 3850 isn't any better than the 9600XT, which I doubt.) I doubt that when they posted those minimum requirements they meant "minimum needed to play the game at 5 FPS on the lowest possible quality settings." Perhaps my standards are too high because I consider a game unplayable at such framerates. WoW's requirements are a different story: her system now exceeds the stated minimums by roughly double. The memory and videocard exceed even the recommendations, and the CPU only falls short because they recommend a dual core (and I'll bet that 3700+ would outperform the lowest-end dual-core processors in any real-life game. Sure, most benchmarks might say otherwise, but I still think real life counts more. Heck, one of their suggestions is the Pentium D, and I'll take an Athlon 64 over a P4 any day, even a dual-core one.) 7-9 FPS with the lowest possible quality settings and every requirement far exceeded is unacceptable.

There's no easy way around that bottleneck, but you may be shocked at how cheap a DDR2-800 motherboard, dual-core CPU, 2GB of DDR2-800 RAM, and a 4670 are if you cruise the combo deals at Newegg.
I already know how cheap it all is; it's not as if it never occurred to me to check that myself... She just can't afford it at the moment. Regardless, as I said, the hardware itself isn't the problem here. Games like WoW don't exactly tax such systems very hard. I know because I ran WoW on this same hardware, just with a different videocard (I believe I had a GeForce 6800 NU back then with all the pipes unlocked, and the game's quality settings absolutely maxed out -- though I'll allow that its visuals have probably improved some since then), and neither the CPU nor the memory held me back in any visible way. (Of course, maybe I care less about triple-digit framerates; I tend to prefer vsync when framerates go that high, as tearing detracts from gameplay, for me at least.)
 
Nazo, did you load BIOS defaults on the new board and perform a clean Windows install?
 
Nazo, did you load BIOS defaults on the new board
No, I set it to the best settings. I don't overclock the AGP bus or anything like that, if that's what you're thinking; I've always thought that was a pretty bad idea for basically no real gain. Right now the memory is at stock (other than setting a 1T command rate), since I can't remember the timings I worked out after long testing. I see no value in completely resetting the BIOS, as that's a lot I'd just have to change right back. If there's some particular setting you suggest changing, I'm listening, but I've already tried disabling AGP fast writes both in SmartGart and in the BIOS just in case this was somehow related (though I must say I never had trouble with fast writes on my X850XT-PE, despite all the claims people make about it causing problems.) Oh, and ATM there is no overclocking. The base FSB is 200MHz, so there's no worry over whether AGP/PCI is locked -- I know this particular board could go way up to around 315 or so, but I don't dare overclock with her ambient temperatures.

and perform a clean Windows install?
Yeah, this is actually part of the problem: she says her old card (the 9600XT) ran better before I reinstalled Windows. It's a very clean install though. I haven't even had time to do much tweaking. The only things I've done to Windows since this problem showed up were the tweak to keep the kernel in memory instead of paging it, and disabling paging altogether in the hopes of less swapping. (I think her harddrive is holding her back some. It's pretty old, though it is 7200 RPM UDMA133; maximum bandwidth is fine, but I suspect seek latency is an issue. I don't think the harddrive is the culprit here though. For starters, as I said, it ran better for her before, and it still runs pretty badly even once everything is fully loaded and harddrive access has all but completely stopped.)
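(For anyone curious exactly which tweak I mean: it's the standard DisablePagingExecutive DWORD under Memory Management. I made the change in regedit, but here's a minimal sketch of the same thing in Python just to be concrete -- the key and value are real, the script itself is only an illustration:)

```python
import winreg  # this module is "_winreg" on the Python 2.x installs typical of the XP era

# The "keep the kernel in memory" tweak mentioned above: setting this DWORD
# to 1 tells XP not to page the kernel and drivers out to disk.
KEY_PATH = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                    winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "DisablePagingExecutive", 0, winreg.REG_DWORD, 1)
    # Takes effect after a reboot. Disabling the page file itself I did
    # through System Properties, not here.
```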

Anyway, later today I mean to stick my X850XT-PE in her system just to verify what I already suspect. It's useless for her since it can't handle heat well at all, but it will be handy for proving what is going on (even if I'm not sure what to do about it yet.)
 
I ask because whenever I change the motherboard, add more RAM, and replace the videocard, I always load BIOS defaults (or optimized defaults) -- I've actually never had an AGP performance/stability issue from doing it -- and right after that, a clean Windows install slipstreamed to the latest service pack, plus chipset, NIC, and audio drivers. Since the card you're using is not so new, I suggest installing a not-so-recent driver version like Catalyst 8.12.

good luck and let us know how it goes :D
 
A 3850 should crush a 9600; it's four generations newer. You should be getting double if not triple the framerate. Enable fast writes -- XP only had a problem with them before Service Pack 1. Use the drivers that came with the 3850, as they're probably the only ones that will work worth a shit on it. Disabling the swap file does not mean better performance 100% of the time. In fact, it helps less than 50% of the time, and only during certain actions.
 
Well, as I stated, it's not the video card; I've completely eliminated it from the equation now. I couldn't get over there yesterday (or most of today), but when I finally did, I tested my old X850XT-PE and got about the same results even from it. I KNOW that videocard isn't that weak. Yes, it's not the latest and greatest, but it was among the absolute best of its generation by quite a wide margin, despite the lack of SM 3.0 support.

More importantly, I've discovered something else that might matter more here: in general, the whole system has been running pretty badly. For example, viewing a web page (particularly one with Flash content) can be sluggish -- scrolling is poor and Flash animations run horribly. That's pretty bad. My EeePC, with its crappy Intel Atom processor (oh, how I wish I had gotten one of the last of the 900MHz models, as I need performance more than power saving, and those had darned good power saving on their own anyway), can even view HD videos on YouTube, so these results are definitely not good, and to me this seems like a darned good start toward figuring out what exactly is going on here...

I started running some PC benchmarks to get an idea of what's happening. Unfortunately, I haven't bothered with benchmarks in quite some time and didn't really know what to look for. I tried PCMark05 first, but it wouldn't give me any results, so that's out (though of course it didn't tell me that until AFTER I waited some 30 minutes for all the tests to complete...). I tried Prime95's benchmark, but I couldn't trust the results, since the reports on their site don't exactly match the client (some FFT sizes line up, but even those could use a different number of iterations.) If the numbers are to be believed, though, same-speed 3700+ CPUs should be producing numbers almost 10x faster than hers did...

Finally I remembered one I rather liked, CrystalMark -- among the rare few completely free benchmarks -- and tried to look up results from 3700+s at stock speeds. I'm still not 100% sure from this, as the results varied, but all of them were at least twice what I got from her system. In fact, here's the truly scary part: I ran CrystalMark on my EeePC, and the totals were less than 1000 apart, the EeePC scoring 22405 and her system 23289. That's mostly because the EeePC's GPU is so bad it performs worse than even an abysmally crippled normal videocard in her system. The CPU tests are the real story: the ALU and FPU totals were almost 4x higher on the EeePC... When a 2.2GHz Athlon 64 (with extra L2, I might add, on tests that CAN benefit from more cache) can't beat a 1.6GHz Intel Atom, you KNOW something is wrong...

(BTW, I didn't feel like running a full 3D benchmark, but I did run across one comparison where both the 8800GT and the X850XT-PE were present, and it looked like the 8800GT is about 2x as powerful as the X850XT-PE -- interestingly, with the 8800GT on a system much like mine and the X850XT-PE on one much like hers. Well, the OGL test on my system, which isn't even optimized for benchmarking, scored 38538 versus her 9629 at the absolute lowest possible video settings, so I think that says something, given that this benchmark is about as simple as it gets...)
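(If you want to check my arithmetic on those scores, here's the quick math -- nothing in it but the numbers already quoted above:)

```python
# Quick sanity math on the CrystalMark numbers quoted above.
eee_total, her_total = 22405, 23289
print(f"Her Athlon 64 box beats the EeePC by only {her_total - eee_total} points overall")

# The comparison I found put the 8800GT at roughly 2x an X850XT-PE, yet the
# actual OGL ratio here is about 4x -- the other 2x has to come from somewhere.
my_ogl, her_ogl = 38538, 9629
print(f"OpenGL score ratio: {my_ogl / her_ogl:.1f}x")
```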

It looks as if something is holding the whole system back; games were just the things that showed it the most (especially since she doesn't do things like encoding.) Earlier I tried different chipset drivers, including an "nForce Unified Remix" where later drivers were modified to install on nForce 2 and 3 systems (the hardware isn't actually all that different; nVidia just wanted to push people to upgrade sooner -- much like how MS refuses to put DX10 on XP.) I just have no idea what's causing this at this point though. I know that 3700+ is capable of a LOT more than this system is getting out of it right now... Well, probably the day after tomorrow I can take another crack at it and play around with some things to see if I can figure this out.
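One thing I do mean to check next time I'm over there is whether Windows even thinks that CPU is running at full clock. A quick sketch of the idea (the ~MHz registry value is real, though it only reflects the speed detected at boot, so something like CPU-Z would be more definitive):

```python
import winreg

# Read the CPU clock Windows recorded at boot. Anything far below the
# 3700+'s stock 2200MHz would suggest the CPU itself is stuck throttled,
# which would explain the across-the-board slowness better than the GPU.
KEY_PATH = r"HARDWARE\DESCRIPTION\System\CentralProcessor\0"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    mhz, _ = winreg.QueryValueEx(key, "~MHz")
    print(f"Reported CPU clock: {mhz} MHz")
```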


EDIT: Well, I gave up. I just went ahead and switched the motherboard+CPU back, since I know her old ones were working fine, and reinstalled. I still don't know what was wrong, but she seems to be up and running ok now. I tested Left 4 Dead with the HD3850 and it ran fine, at least on minimum settings (didn't test higher) -- no jerking or slowdowns. She should be able to play her MMOs fine now.
 