Playability Perception - Framerates & Clockspeeds

AuxNuke
Let me apologize ahead of time for the long post... but I hope some of you would either find this interesting or have some insight on this topic.

Previously, I had a single BFG 256MB 7800GTX which served me well over the last 8 months or so. About a month ago, I wandered into the realm of widescreen gaming with the purchase of a Dell 2405FPW. The 7800 wasn't really cutting it at this monitor's two supported widescreen resolutions, 1680x1050 and the native 1920x1200, in some of the more intensive games (CoD2, FEAR, Quake 4, etc.).

Knowing that the release of the 7900 series was right around the corner, I settled on playing some games at 1680x1050 with some eye candy turned down and AA OFF. With the release of the 7900 series, I pulled the trigger and purchased 2 7900GTX's. I was lucky enough to receive the cards yesterday at around 1pm and, ironically, also have a Friday off from my chaotic job.

After installation, I used the reviews here at the [H] and some other places as a guideline for what settings I might be able to play at. Those of you who read the graphics card reviews here at the [H] know that their benchmarking techniques measure the gaming experience on each card being reviewed rather than running a scripted apples-to-apples test. With the inclusion of widescreen testing in their latest review, I was excited to see such high-res settings being playable on a pair of 7900's!

After playing these games for hours yesterday and hours today, I have found the results of the [H] review to be fairly close to what I would also deem "playable" in a single player game. However, these settings are definitely NOT playable, IMHO, in multiplayer.

Let's use CoD2 as an example. Settings are 1920x1200, 2x TR MSAA, AF enabled in the game (Auto sets AF to 8x). Fire up one of my favorite servers and BAM... chop city. At first I thought something was wrong with my settings (like the terribly annoying fact that you have to click "Optimise for SLI" every time you start the damn game :mad: ). So I looked over everything and tried again. Same thing. After a restart of my computer, I tried again, this time on another server to rule out a problem with that particular server. Same thing... low pings... beautiful visuals... unplayable multiplayer. Only after stepping down to 1680x1050 was the game playable again in multiplayer. Even THEN it wasn't always up to my personal "standard."

So I fired up FRAPS, set the resolution back to 1920x1200 and did some testing to see if my system was on par with theirs. I benchmarked part of the 2nd level in the Russian campaign and got around 39.8 FPS avg. Looks about right. I ran the test again, but this time in multiplayer, and got around 30.2 FPS. Definitely lower frame rates, but not only that... the game felt a lot more sluggish.
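If anyone wants to sanity-check their own numbers the same way, here's a rough sketch of the math, assuming you've got a FRAPS-style frametimes log with one cumulative timestamp (in ms) per frame; the filename and exact CSV layout here are just placeholders, so adjust the parsing to whatever your log actually looks like:

# rough sketch -- assumes a FRAPS-style frametimes CSV with a header row
# and one cumulative timestamp (in ms) per frame; adjust parsing to your file
import csv

def fps_stats(path):
    times = []
    with open(path) as f:
        reader = csv.reader(f)
        next(reader)                      # skip the header row
        for row in reader:
            times.append(float(row[-1]))  # cumulative time in ms

    deltas = [b - a for a, b in zip(times, times[1:])]   # per-frame durations in ms
    total_s = (times[-1] - times[0]) / 1000.0
    avg_fps = len(deltas) / total_s
    worst_frame_ms = max(deltas)
    min_fps = 1000.0 / worst_frame_ms     # instantaneous FPS on the worst frame
    return avg_fps, min_fps, worst_frame_ms

avg, mn, worst = fps_stats("cod2_mp_frametimes.csv")     # placeholder filename
print("avg %.1f fps, worst frame %.1f ms (~%.1f fps)" % (avg, worst, mn))

The point being that an average in the high 30s can still hide individual frames in the 50+ ms range, and it's those spikes you feel as "sluggish" even when the average number looks fine.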

I then started doing a little video card OC'ing to see if that had any effect on the playability of the game. At 700 / 1750, the benchmarks showed some improvement but the real-world gameplay felt about the same. So then I decided to run my CPU at its max overclock. I had been doing much of my testing at 2.5GHz and proceeded to bump the CPU up to 2.6 @ 1.51v. THIS made a noticeable difference in the gameplay. The benchmarks were, again, only slightly higher than before, but the real-life gameplay was noticeably smoother.

All of this is making me start to wonder... is it our CPUs that are holding us back? These 7900GTXs in SLI seem to be able to handle anything the games of today can throw at them... as long as you have the fastest CPU to back them up. It also seems that certain games are more dependent on the CPU than others; CoD2, for example.

Playability, as we all know, is subjective and depends on the specific tolerances of the individual gamer. In my case, 30 FPS AVG in CoD2 or 30 FPS MIN in Quake 4 is unacceptable in multiplayer. What do you guys think?
 
I think that 50fps+ seems good. I'm using an integrated 6100 at the moment, but that's only until I can get something in the range of a 7600GT to a 7900GT. I've kinda gotten used to playing at low fps because I had a Celeron 1.3GHz, 192MB RAM, and a POS GeForce MX 4000 PCI, and then I got some cash together and built my system, which (until I switched the motherboard) was a Sempron 2800+ OC'd to 2GHz, 1GB RAM, and a Radeon 9250 128-bit 128MB. So I guess in my own way I'm just used to lower fps, but I play CS:S and get 30 and it isn't too bad, because I know that people have it worse ("cough" Intel integrated).

-TopGun

BTW: Nice system
 
Sounds like the CPU is definitely your bottleneck. In a lot of games, multiplayer mode will create its own set of threads to handle all the internet/socket/packet stuff. Take a game like Joint Operations: on a high-end system it hardly gives the GPU a workout; it's a good-looking game, but it can sustain 80+ FPS easily. However, I notice that frame rates drop whenever there is a lag spike. >.< (This is on a 3500+.)
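Just to illustrate the idea (this is a toy sketch in Python, not anything from an actual engine): the network side runs in its own thread stuffing packets into a queue, and the main loop has to burn CPU time draining and decoding them every frame on top of everything single player already does.

# toy sketch only -- not real engine code, just showing where the extra
# CPU work in multiplayer comes from
import queue, threading, time

packets = queue.Queue()

def network_thread():
    # stand-in for a socket recv loop: pretend ~20 snapshots arrive per second
    while True:
        time.sleep(0.05)
        packets.put("snapshot")

threading.Thread(target=network_thread, daemon=True).start()

while True:                     # main game loop
    while not packets.empty():
        packets.get()           # decode the snapshot, update other players, etc.
                                # -- CPU cost that single player never pays
    # ...then run the usual game logic and feed the GPU; if the CPU is
    # already maxed out here, the cards sit idle and the framerate drops
    time.sleep(0.016)           # stand-in for one frame of work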

Don't underestimate the importance of a fast CPU in any next-gen title. In CoD2 I notice the same thing, so I try to squeeze every last bit of speed from the CPU that I can (I overclock; heck, I even went out and spent 50 bucks on a [used] Audigy 2 ZS sound card because it offloads like 99% of the sound processing from the CPU, compared to the integrated audio that was on my mobo), and it seems to make a huge difference... although it's still not perfect.

What CPU are you using anyways?

 
I just responded to your FFXI thread, lol.

kaleb_zero said:
What CPU are you using anyways?
It's an X2 4400+, OC'ed right now to 2.6GHz (see system in sig). Normally I run at around 2.55 or so.

I guess my point was more to start a discussion on playability and how it relates to framerates. :)
 
AuxNuke said:
I guess my point was more to start a discussion on playability and how it relates to framerates. :)
What? Are you trying to ask how FPS relates to playability? If so, then there is a positive relationship, but you may be asking a totally different question. If that is the case, be more specific.
 
I wasn't really asking a question. I was more... opening a discussion of people's perception of playable framerates and the relation of playability to framerate / hardware / OC'ing.
 
AuxNuke said:
...opening a discussion of people's perception of playable framerates and the relation of playability to framerate / hardware / OC'ing.
You have to forgive me, but I still cannot grasp exactly what it is you are trying to accomplish. There is not much more to say on the topic, except that FPS is very important for playability. Multiplayer will lower your FPS considerably, but there is little you can do about that. Ask a question and more people will get involved.
 
I agree....even though I have a slower 256MB 7800GTX, at 1680x1050 in CoD2, multiplayer is rather....shitty....I do not like turning off all the eye candy though; that's NOT why I pay hundreds of dollars for a video card. My dual-core CPU at 2.6GHz is hardly "slow" either. I have to crank the res DOWN for MP if I want to make it "playable", and face it folks, 30fps is NOT playable for any FPS....just my opinion.

To be honest, I run the game at 1024x768 in online play.....kinda underwhelming for a "modern" gaming rig....

Also, I decided not to get a second GTX; it is not worth the few extra fps I get....not by a long shot.
 
For one, add this to your config_mp.cfg file

seta r_multiGpu "1"

It is located at C:\Program Files\Activision\Call of Duty 2\main\players\*name* by default

You won't have to set it every time this way.
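If you'd rather not open the file by hand every time you wipe your config, a throwaway script like this would do it (just a sketch; the *name* part of the path is your CoD2 profile folder, so fill that in yourself):

# sketch only -- appends the setting if the cfg doesn't already have it;
# replace the *name* placeholder with your actual profile folder
cfg = r"C:\Program Files\Activision\Call of Duty 2\main\players\*name*\config_mp.cfg"

with open(cfg, "r") as f:
    text = f.read()

if "r_multiGpu" not in text:
    with open(cfg, "a") as f:
        f.write('\nseta r_multiGpu "1"\n')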
 
The whole multiplayer thing... I will play single-player mode in a game at 30fps... I won't play multiplayer if it dips below 80... I will lower the settings for a constant 100fps if I can in multiplayer mode. I'll play 1024x768 with no eye candy for all I care... when I play multiplayer it's for the gameplay and playing against other people, not about great graphics (don't get me wrong, if I can play with all the candy on at 100fps I will, but the settings go down when the fps isn't high enough).
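Just to put numbers on those thresholds (simple conversion, frame time = 1000 ms divided by fps; the cutoffs themselves are obviously personal):

# quick conversion: how much time each frame gets at a given framerate
for fps in (30, 80, 100):
    print("%3d fps -> %4.1f ms between frames" % (fps, 1000.0 / fps))
# 30 fps is ~33 ms between screen updates vs 10 ms at 100 fps, which is a
# big part of why multiplayer feels so much more responsive at the high end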
 