How does real world gameplay FPS single player performance compare to multiplayer?

From what I've seen, I suspect that a full 30 player multiplayer server would be more stressful than single player when a lot of opponents are in a firefight together.
Since some of the guns fire 1200 rounds per minute, this equals 20 rounds per second, or 1 round every 50 milliseconds. Does that mean you have to keep your minimum frame rate above 20 so your gun will fire at full speed? If it does, a person whose minimum frame rate drops to 15 in a firefight would be at a disadvantage against someone who is staying at 50 FPS.
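Just to lay out the arithmetic I'm working from (a rough Python sketch; the 1200 RPM figure is the weapon stat, the frame rates are just examples I picked):

Code:
# Fire rate arithmetic for a 1200 rounds-per-minute weapon.
ROUNDS_PER_MINUTE = 1200
rounds_per_second = ROUNDS_PER_MINUTE / 60      # 20 rounds per second
ms_between_rounds = 1000 / rounds_per_second    # 50 ms between rounds

for fps in (15, 20, 50, 60):
    frame_ms = 1000 / fps
    # How many 50 ms fire intervals fit inside a single frame at this FPS.
    intervals_per_frame = frame_ms / ms_between_rounds
    print(f"{fps:>3} FPS -> {frame_ms:5.1f} ms per frame, "
          f"{intervals_per_frame:.2f} fire intervals per frame")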
 
The gun should fire at whatever rate no matter what FPS you are getting. What game are you playing? It may be different from the games I am thinking of.

I think you are at more of a disadvantage from the choppy FPS because you are not getting all the information available to you. When I play a game I can feel when I am dipping below 60 FPS. Just try and find the best settings to get you 60 FPS, and maybe a 10 FPS buffer for the firefights.
 
The gun firing is totally independent of your frame rate. Your video card isn't the thing that tells the server you're firing the gun. All the server knows is "he pressed his mouse button", and it begins firing the gun; "he let go of his mouse button", and it stops. That's a rather simplistic explanation, but more or less accurate.
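Here's a rough sketch of what I mean (just to illustrate the idea, not how COD4 is actually written): the server only needs the press and release times and can step the weapon at its own fire interval.

Code:
SHOT_INTERVAL_MS = 50        # a 1200 rounds-per-minute weapon fires every 50 ms

def shots_fired(press_ms, release_ms):
    """Count shots while the trigger is held, independent of the client's FPS."""
    shots = 0
    t = press_ms
    while t < release_ms:
        shots += 1                    # the server spawns a bullet at time t
        t += SHOT_INTERVAL_MS
    return shots

print(shots_fired(0, 1000))           # trigger held for one second -> 20 shots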
 
For this example I was thinking of COD4. If you crank your graphics settings up so high that you are running 20 FPS and you hold your mouse button down, you'll find your machine gun fires slower than at 60 FPS. If my system is preventing me from firing every 50 milliseconds, would this not prevent the game server from getting the info from me every 50 milliseconds? This is assuming UDP gets the info to the game server in half my ping of 80. I think the default snaps for COD4 servers is 20, which would be 20 updates per second from the server.
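For reference, here's what those numbers work out to (a quick sketch; treating half the ping as the one-way trip is just an approximation):

Code:
SNAPS = 20        # server snapshots per second (the default I mentioned)
PING_MS = 80      # my round-trip ping

snapshot_interval_ms = 1000 / SNAPS    # 50 ms between server updates
one_way_latency_ms = PING_MS / 2       # rough one-way travel time

print(f"server snapshot every {snapshot_interval_ms:.0f} ms, "
      f"plus roughly {one_way_latency_ms:.0f} ms of one-way travel time")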
 
You are confusing frame rate lag with server lag. Your gun firing slower is due to you seeing fewer frames of your gun firing per second. When you have 60 frames per second you are seeing your gun firing faster because you are seeing more frames in the same amount of time. However, your gun is firing at the same rate both ways; you are just seeing a difference. There is no actual difference besides that.
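If it helps, here's a little sketch of what I mean (just an illustration, not actual game code): the shots land on the same 50 ms timeline either way, your frame rate only changes how many frames you get to see of it.

Code:
# Shots happen every 50 ms regardless; frames just sample that timeline.
SHOT_INTERVAL_MS = 50

def frames_and_shots(fps, duration_ms=1000):
    shots = duration_ms // SHOT_INTERVAL_MS       # 20 shots either way
    frames = round(fps * duration_ms / 1000)      # how many frames you see
    return shots, frames

for fps in (20, 60):
    shots, frames = frames_and_shots(fps)
    print(f"{fps} FPS: {shots} shots fired, but only {frames} frames to show them")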
 
There are 2 parts to this thread.
Part A is to define a minimum frame rate so that you are not at a competitive disadvantage in a multiplayer firefight.
Part B is to take this minimum frame rate and apply it to the [H]Enthusiast recommended real world single player gameplay settings. I think you'll find the [H]Enthusiast settings are too aggressive for crowded multiplayer servers.

I've noticed when my frame rate drops to 30 in a firefight I don't do as well as when it stays above 60. Why is this? I think the data your computer sends out to the game server is linked to your frame rate. My friend had his graphics cranked up way too high in FEAR for his old computer, and he couldn't even follow me when he ran behind me. I could see him run a few steps, then stop, then run a few steps more. As soon as he dropped his resolution and graphics quality down, it smoothed out and he could keep up with me.
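Here's the kind of thing I imagine going on (just a sketch of the pattern I've read about in Quake-engine games, where the client builds one movement command per rendered frame; I don't know for certain that FEAR or COD4 work this way, and the cl_maxpackets cap and numbers are only examples):

Code:
def updates_per_second(fps, cl_maxpackets=30):
    """If the client builds one movement/fire command per rendered frame,
    the server can't receive more position samples per second than the
    client renders frames, no matter what the packet cap is set to."""
    commands_built = fps                       # one command per frame
    packets_sent = min(fps, cl_maxpackets)     # packet rate is capped separately
    return commands_built, packets_sent

for fps in (15, 30, 60, 125):
    built, sent = updates_per_second(fps)
    print(f"{fps:>3} FPS -> {built} commands built, about {sent} packets per second")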

The big question is how do we figure out what settings to use for multiplayer? I don't think any review site answers this question.
 
You are confusing frame rate lag with server lag. Your gun firing slower is due to you seeing fewer frames of your gun firing per second. When you have 60 frames per second you are seeing your gun firing faster because you are seeing more frames in the same amount of time. However, your gun is firing at the same rate both ways; you are just seeing a difference. There is no actual difference besides that.

If that is correct then the problem must be that your reaction time on your computer is slower when you have low frame rates. I've noticed I can't move my gun barrel as fast when that happens. Are we agreed on 60 FPS as a minimum frame rate?

Once we pick a number for the minimum FPS, how about some people reporting their minimum frame rates on crowded servers so we can figure out the proper settings.

I can tell you I can't keep my minimum above 40 on my rig in COD4 no matter what I do. It might start out at 100 FPS but it slows down to the mid 30s in a firefight. I compromised on 1024x768 with everything off or low quality except model detail normal. Texture settings are bilinear and normal for everything. It's just too ugly at 800x600.
 
There are 2 parts to this thread.
I think the data your computer sends out to the game server is linked to your frame rate.
I doubt it. Why would the netcode be coupled to the main game loop? I would imagine that the networking is done on its own time, as often as the game/Windows wants it to, and iterations of the game loop fit in between somewhere. *shrugs* But I've never written a game.
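Something like this is what I'm picturing (a toy sketch only, since I've never written a game): the network send runs on its own schedule, so it keeps its 20 Hz rate even when frames are slow.

Code:
import threading
import time

SEND_INTERVAL = 0.05                  # 20 network updates per second

def send_state_to_server():
    print("sent state at", round(time.monotonic(), 2))

def network_loop(stop_event):
    """Runs on its own thread, so its rate doesn't depend on frame time."""
    while not stop_event.is_set():
        send_state_to_server()
        time.sleep(SEND_INTERVAL)

stop = threading.Event()
threading.Thread(target=network_loop, args=(stop,), daemon=True).start()

for _ in range(10):                   # pretend render loop running at ~10 FPS
    time.sleep(0.1)                   # a slow 100 ms frame
stop.set()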
 
I'm no programmer but I'm hoping someone out there knows for sure how things work.
I've been playing first person shooters for over 15 years now, and I'm trying to find an explanation for some of the strange things I've seen over the years.

This is the way I understand it:
My computer renders the game environment.
The game server updates all the clients every 50 milliseconds with information on where all the opponents are, what weapons they are carrying, and what they are doing.
My computer then takes all this data and draws all the opponents that are in view on my screen with new updates occurring every 50 milliseconds.
My computer tells the game server where I am and what I am doing at a rate of half my ping, based on the User Datagram Protocol. Since it is my computer that tells the game server where I am shooting, it can't just update according to my ping. It has to wait for me to pull or hold the trigger, then decide if I killed that guy 200 feet away with a head shot, or if he moved before I told the game server where I was shooting.

This is why I think that if my frame rate drops below the fire rate of a specific weapon, I will be firing fewer bullets. If my computer is only firing every 100 milliseconds at what I am aiming at, how can the game server be getting data based on the weapon's maximum fire rate of every 50 milliseconds? How does it know where the extra shots were aimed? It doesn't make sense to me, which is why I'm asking the question in this forum of 100,000 plus users, looking for some facts.

I don't think a gun fires faster at 200 FPS than at 50 FPS. I think it fires slower when your FPS drops below the firing rate of the gun. In a game where the maximum fire rate is 1200 rounds per minute, this would be 20 FPS. The game server should know the firing rate for each gun so it won't let you exceed it, but your computer tells the game server when it fires a gun. If your FPS is 10, you are only telling the game server you are firing 10 times a second instead of 20 times a second (see the sketch below for the two ways this could play out).
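Here's the toy comparison I mean (my own sketch, not code from any real game): if the client can report at most one shot per rendered frame with no catch-up, the fire rate really is capped at the frame rate; if it accumulates time and can report several shots in one long frame, the rate stays at 20 no matter the FPS.

Code:
from fractions import Fraction    # exact math so rounding doesn't skew the counts

SHOT_INTERVAL = Fraction(50)      # ms between shots at 1200 rounds per minute

def shots_per_second(fps, catch_up):
    frame = Fraction(1000, fps)
    since_last_shot = Fraction(0)
    shots = 0
    for _ in range(fps):                          # one second of frames
        since_last_shot += frame
        if catch_up:
            while since_last_shot >= SHOT_INTERVAL:
                shots += 1                        # a long frame fires 2+ shots
                since_last_shot -= SHOT_INTERVAL
        elif since_last_shot >= SHOT_INTERVAL:
            shots += 1                            # at most one shot per frame
            since_last_shot = Fraction(0)         # restart the 50 ms cooldown
    return shots

for fps in (10, 15, 20, 60):
    print(f"{fps:>3} FPS: no catch-up -> {shots_per_second(fps, False):>2} shots/s, "
          f"with catch-up -> {shots_per_second(fps, True):>2} shots/s")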

This isn't even the point of this thread. It's just part of determining the minimum FPS to shoot for, especially since the maximum firing rate in COD4 is 20 rounds per second. Anybody who plays online knows 20 FPS is too slow in a firefight. The question still remains: what is the lowest minimum FPS to aim for in multiplayer?
I can't believe nobody is watching their minimum FPS in a firefight to say 'hey, now that you mention it, my minimum FPS is dropping to 30 FPS on my quad core 9800 GTX on a full 48 player server. I never noticed it before. I guess I better turn off that 16X AA. No wonder I'm getting pwned.'

The command to turn on the FPS counter in COD4 is /cg_drawFPS 1 (entered in the console, which opens with ~).
Let's see some real multiplayer benchmarks and stop just using these canned time demos and single player benchmarks!
 