The Elusive Frame Timing Conundrum Or Why Do My Games Stutter?

cageymaru

We have all wondered why our games stutter even when we maintain at least 60 fps. Alen Ladavac, Chief Technology Officer (CTO) at game developer Croteam, tackles this issue head-on in a way that a layman or tech enthusiast can appreciate. He discusses how older games were hand-drawn so that a character moved an exact number of pixels per frame, and how games shipped with different animations depending on whether they were released for 50 Hz PAL or 60 Hz NTSC regions.

Then he moves on to how the game can think the GPU didn't make 60 fps because of other tasks multitasking on the PC, so it creates animation frames designed for a slower frame rate even though, in reality, everything was running smoothly. This causes the dreaded stuttering that we all experience in games. He goes on to state, "The real solution would be to measure not when the frame has started/ended rendering, but when the image was shown on the screen." None of the graphics APIs support this feature on all OS platforms, so Croteam is advocating for it to be added in the near future.

What happens here is that the game measures what it thinks is the start of each frame, and those frame times sometimes oscillate due to various factors, especially on a busy multitasking system like a PC. So at some points, the game thinks it didn't make 60 fps, so it generates animation frames slated for a slower frame rate at some of the points in time. But due to the asynchronous nature of GPU operation, the GPU actually does make it in time for 60 fps on every single frame in this sequence.
 
I bet a looooot of games would fail to pass that test with flying colors.
 
Correct me if I am wrong:
When animations were done by hand we didn't render games; we were working with sprites, aka 2-dimensional pictures, so you had to make all the pictures of the animation up front, and of course each picture had to fit time-wise.
Today we do 3D rendering, aka the computer makes the animation live according to the time that has passed.

It just seems like one thing is not really the same as the other, and the comparison is moot at best.
 
Yes, in 3D we use key frames and mathematically solve for the deformation throughout the rest of the animation. I was quite amused by this part of the linked article, as well. To think that stuttering has anything to do with frametimes is laughable. The feeling of choppiness happens with uneven frametimes regardless of how fluid the animations are.
 
No, the real solution is to get rid of fixed display refresh rates so you don't have these problems to begin with.
That doesn't really get rid of the problem described here.

There are a lot of different possible causes of stutter; this is just one that they found affecting their games.
I'd assume Croteam understands their own game engine well enough that we can believe them on this.
 
No, the real solution is to get rid of fixed display refresh rates so you don't have these problems to begin with.
And how will that fix this issue?

The what? Stutter is created by uniformity issues in the frametimes, nothing to do with animations or such.
Have you read the article?

I was quite amused by this part of the linked article, as well. To think that stuttering has anything to do with frametimes is laughable. The feeling of choppiness happens with uneven frametimes regardless of how fluid the animations are.
(reaction GIF)
 
*sighs*

I understand what he's saying, but he's stating it poorly. It's caused by uneven cadence. If you have ever watched a 24 fps theater movie on a 60 Hz screen and watched the credits scroll, you have seen the stutter. This is known as 3:2 pulldown.
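If the arithmetic isn't obvious, here is a tiny sketch of that cadence (simplified, assuming plain 3:2 pulldown of 24 fps content onto a 60 Hz progressive display, ignoring interlacing): each film frame is held for either 3 or 2 refreshes, so on-screen durations alternate between 50 ms and ~33 ms, and that alternation is the judder you see in the scrolling credits.

/* 3:2 pulldown sketch: 24 fps content shown on a 60 Hz display. */
#include <stdio.h>

int main(void) {
    const double refresh_ms = 1000.0 / 60.0;      /* one 60 Hz refresh */
    for (int frame = 0; frame < 8; frame++) {
        int repeats = (frame % 2 == 0) ? 3 : 2;   /* hold pattern 3,2,3,2,... */
        printf("film frame %d held for %d refreshes = %4.1f ms\n",
               frame, repeats, repeats * refresh_ms);
    }
    return 0;
}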

In this case we are dealing with predictive motion based on the previous frame.

When you calculate the next frame, you expect it to be on the display for 1/60th of a second. So if a character is moving 6 feet per second, you want to move the character 1/10th of a foot this frame: 6/60 = 1/10.

NOW, what happens if there is a glitch and you realize the last frame actually took 1/50th of a second? 1/10th of a foot * 50 frames per second = 5 feet per second. He's no longer moving at the same speed, and it looks like it's stuttering.

Now there are various ways to handle this.

You could switch to post movement.

timerStart = getTimer(); // in ms
RenderCharacter(position);
timerStop = getTimer();
timeDelta = timerStop - timerStart; // how long the frame actually took
movement = (timeDelta / 1000) * moveSpeedPerSecond;
position += movement;

Now in this example we are moving after the frame is done. But that means we're displaying where the character was. This sometimes caused glitches where it looked like you cleared a bullet, the game stalled, and then you died, and you were like WTF?

To be fair and honest, post-movement versus pre-movement delays are very small and below the perception threshold of even the best players. At 1/60th of a second you are relying on muscle memory to save your bacon, because your brain can't process that fast (proven fact).
 
To be fair and honest, post-movement versus pre-movement delays are very small and below the perception threshold of even the best players. At 1/60th of a second you are relying on muscle memory to save your bacon, because your brain can't process that fast (proven fact).
Well, then we're talking about something different, because the stuttering looks clear as day to me in that clip.
 
Now there are various ways to handle this.

You could switch to post movement.

timerStart = getTimer(); // in ms
RenderCharacter(position);
timerStop = getTimer();
timeDelta = timerStop - timerStart; // how long the frame actually took
movement = (timeDelta / 1000) * moveSpeedPerSecond;
position += movement;
If I understand the article correctly, this is what games are doing now: timerStart is when the second-to-last frame was sent and timerStop is when the last frame was sent, and the movement is then calculated from that for the current frame. The issue is what you bind the timer start/stop triggers to.
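To make that concrete, here is a small made-up simulation of the pattern (the numbers and the 6 ft/s speed are borrowed from the example above, not taken from any real engine): the game steps the world by the previous frame's measured CPU delta, while the display actually shows every frame for a steady ~16.7 ms, so the per-frame on-screen movement becomes uneven even though frame delivery is perfectly smooth.

/* Hypothetical measured CPU deltas that oscillate; the display interval does not. */
#include <stdio.h>

int main(void) {
    double measured_ms[] = {16.7, 16.7, 21.0, 12.4, 16.7, 16.7, 21.0, 12.4};
    const double speed_ft_per_sec = 6.0;               /* 6 feet per second */
    double position = 0.0;

    for (int i = 0; i < 8; i++) {
        double step = speed_ft_per_sec * measured_ms[i] / 1000.0;
        position += step;
        /* every one of these frames is actually on screen for ~16.7 ms */
        printf("frame %d: on screen ~16.7 ms, character moved %.3f ft\n", i, step);
    }
    printf("final position: %.2f ft\n", position);
    return 0;
}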
 
Well, then we're talking about something different, because the stuttering looks clear as day to me in that clip.
What's the refresh rate on your monitor? I know everyone has this argument all the time about "your eyes can't see the difference between 60 fps and 100 fps"; I always beg to differ, because when you game on 144 Hz vs 60 Hz, to me the difference is astronomical. I didn't see much of a difference in the clip, but I'm at work on a pretty standard 1080p monitor. Mine at home is a 144 Hz gaming monitor, so I'd be interested to see if the clip looks different.
 
What's the refresh rate on your monitor? I know everyone has this argument all the time about "your eyes can't see the difference between 60 fps and 100 fps"; I always beg to differ, because when you game on 144 Hz vs 60 Hz, to me the difference is astronomical. I didn't see much of a difference in the clip, but I'm at work on a pretty standard 1080p monitor. Mine at home is a 144 Hz gaming monitor, so I'd be interested to see if the clip looks different.

There's a difference between what you can see and what your muscles can react to. It takes time to process what you saw, and then it takes time for the impulse to travel down to your extremities. Just because you noticed a stutter doesn't mean your mind registered it the moment you saw it. There's a slight delay there.

https://duckduckgo.com/?q=fastest+human+reaction+time&t=h_&ia=web
 
What's the refresh rate on your monitor? I know everyone has this argument all the time about "your eyes can't see the difference between 60 fps and 100 fps"; I always beg to differ, because when you game on 144 Hz vs 60 Hz, to me the difference is astronomical. I didn't see much of a difference in the clip, but I'm at work on a pretty standard 1080p monitor. Mine at home is a 144 Hz gaming monitor, so I'd be interested to see if the clip looks different.
I'm at 60 Hz. It's not the worst stutter I've ever seen, but it's very apparent. It's all over the place in the first run, especially around the 0:02-0:03 mark. It's much better in the second run, although I can see two points where it stutters noticeably going in each direction.


EDIT:

Okay two things.

1. If you don't see the stutter, change your playback speed to 0.25 on YouTube; it should be clear as day then for the first pass.

2. At quarter speed, I actually only see one stutter point on the second pass (when it changes direction) now, which means the stutter I saw earlier must have been from YouTube itself. Anyone who has ideas on how to eliminate that, feel free to make suggestions.
 
What's the refresh rate on your monitor? I know everyone has this argument all the time about "your eyes can't see the difference between 60 fps and 100 fps"; I always beg to differ, because when you game on 144 Hz vs 60 Hz, to me the difference is astronomical. I didn't see much of a difference in the clip, but I'm at work on a pretty standard 1080p monitor. Mine at home is a 144 Hz gaming monitor, so I'd be interested to see if the clip looks different.

I agree that this has always been a false precept. On the one hand, what they are saying is technically true, but the other end, the one you and I are talking about, is also true. Your eyes will see what is presented to them, and if the impact of different frame rates or stuttering produces anomalies, you will see the anomalies.
 
It terrifies me to imagine what kind of FPS I tolerated playing Quake on a Pentium 90.

Now I have Afterburner/HWinfo64 OSD, Rivatuner framelimiter, and FRAFS for measuring 99%, 1%, & 0.01%....

I'm not complaining, but it seems like every time I install a new game there is a 30+ minute exercise in measuring various metrics/settings to achieve "smooth gameplay".

Again, not complaining but damn.... when does it become too much information?

Lastly, can someone explain to me why some games' built-in Vsync option sucks while the Nvidia control panel's Vsync spits out more stable FPS with higher 1% / 0.01% lows??
 
To be fair and honest, post-movement versus pre-movement delays are very small and below the perception threshold of even the best players. At 1/60th of a second you are relying on muscle memory to save your bacon, because your brain can't process that fast (proven fact).

Not to say you are fully wrong, but a lot of speedrunners and pro FPS players would disagree that it is fully and only muscle memory.
 
This explains the problem, but doesn't really go into the solution. The usual one is to smooth delta time, as explained well here (I have no affiliation): https://bitsquid.blogspot.com/2010/10/time-step-smoothing.html
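For anyone curious, the general shape of delta time smoothing looks something like this (just a bare-bones sketch of the idea, not the exact algorithm from that post): keep the last several raw frame times, drop the extremes, and advance the simulation by the average.

/* Minimal delta-time smoothing sketch: trim the min and max, average the rest. */
#include <stdio.h>

#define HISTORY 9

static double smooth_dt(const double history[], int count) {
    if (count < 3) return history[count - 1];      /* not enough samples yet */
    double sum = 0.0, lo = history[0], hi = history[0];
    for (int i = 0; i < count; i++) {
        sum += history[i];
        if (history[i] < lo) lo = history[i];
        if (history[i] > hi) hi = history[i];
    }
    return (sum - lo - hi) / (count - 2);          /* mean without the outliers */
}

int main(void) {
    /* Raw measured deltas in ms with one spike, made up for illustration. */
    double raw[HISTORY] = {16.7, 16.5, 16.9, 33.0, 16.6, 16.8, 16.7, 16.6, 16.7};
    double history[HISTORY];
    for (int i = 0; i < HISTORY; i++) {
        history[i] = raw[i];
        printf("raw %4.1f ms -> smoothed %4.1f ms\n", raw[i], smooth_dt(history, i + 1));
    }
    return 0;
}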

It isn't foolproof, and if Croteam didn't talk about it because they're still looking for a better way to approach the problem, I applaud them for it.

Recent games seem to be getting into a lot more trouble than this smoothing can fix, even when it's working exactly as intended. In addition to the obvious CPU->GPU pipelining, we've had sim->render pipelining on different CPU threads for quite a while now, and recently CPU pipelining has been getting a lot deeper even than that. This would make latency terrible if input sampling were only done at the very start of work on each frame, so low-latency inputs get sampled multiple times in a frame and used similarly to how VR reprojects at the last moment (except much earlier). Two problems arise from this (three if you count the excessive complexity itself and resulting bugs, which are unfortunately common):

* It isn't feasible to resample everything, and some inputs are stuck with terrible latency. The usual fix seems to be snapping animations ahead to make most of it balance out, which is better than nothing but still pretty bad.

* Actual frame start times are often all over the place when GPU-bound. It's possible to control when and where work happens well enough to make this a non-issue, but it seems to be too complex for the real world or something. If heavy enough delta time smoothing is used, frame delivery and animation can both be smooth despite this, but having wildly different input latency on each frame feels terrible, even if it's only on a few inputs. I've even had a game running ~50 fps have major problems with dropped inputs because some gaps between frame starts were that big.

I'd much rather play a UE3 game at 45 fps (UE3 has sim->render pipelining on different CPU threads but nothing else fancy) than a typical modern game GPU-bound at 75 fps. Even when there aren't more blatant errors in handling all this, modern games' controls just seem to need twice the framerate or more to feel as responsive and solid.

When games expose an FPS cap in their own settings, it usually operates on frame starts and ends up spacing them out correctly. Setting this just below the typical framerate the GPU can manage in a game (CPU framerate doesn't matter so much) is the most consistent (if still incomplete) fix I know of, and often a night-and-day difference in a game's control feel. Even when no shenanigans are in effect, a similar framerate cap keeps frames from piling up at the boundary between CPU and GPU and tends to help latency a bit. (I never use vsync, FWIW.)
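In case it helps, a frame-start limiter in its simplest form is just this (a rough sketch, not lifted from any particular engine; it assumes POSIX clock_gettime/nanosleep, and do_game_frame() is a stand-in for the real per-frame work). The point is that the wait happens before each frame starts, so frame starts, and with them input sampling, get spaced out evenly instead of bunching up at the CPU/GPU boundary.

/* Bare-bones frame-start limiter: space frame *starts* at a fixed interval. */
#include <stdio.h>
#include <time.h>

static double now_sec(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec * 1e-9;
}

static void sleep_sec(double s) {
    if (s <= 0.0) return;
    struct timespec ts = { (time_t)s, (long)((s - (time_t)s) * 1e9) };
    nanosleep(&ts, NULL);
}

static void do_game_frame(void) { sleep_sec(0.004); } /* pretend the frame takes ~4 ms */

int main(void) {
    const double target = 1.0 / 58.0;      /* cap slightly below the typical GPU framerate */
    double next_start = now_sec();
    for (int i = 0; i < 10; i++) {
        sleep_sec(next_start - now_sec()); /* wait for the scheduled start */
        printf("frame %d started at %.4f s\n", i, now_sec());
        do_game_frame();
        next_start += target;              /* schedule the next start; no drift accumulation */
    }
    return 0;
}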
 
No, the real solution is to get rid of fixed display refresh rates so you don't have these problems to begin with.

G-Sync stutters too. World of Warcraft stutters in exactly the same way shown in the video with G-Sync.
G-Sync is not a fixed refresh rate, btw.
 
I agree that this has always been a false precept. On the one hand, what they are saying is technically true, but the other end, the one you and I are talking about, is also true. Your eyes will see what is presented to them, and if the impact of different frame rates or stuttering produces anomalies, you will see the anomalies.
Exactly, again, I'm on a 60Hz monitor, so we're not talking about the difference between 144Hz and 60Hz, we're talking about a 60fps clip playing on a 60Hz monitor, and to me, the stutter on the first pass is obvious. I can admit some people are going to notice this thing more than others though.
 
Not to say you are fully wrong, but a lot of speedrunners and pro FPS players would disagree that it is fully and only muscle memory.

I didn't realize pro gamers were also neural scientists with CT, EKG, and MRI scans and a ton of data. :D

How silly of me.
 
This explains the problem, but doesn't really go into the solution. The usual one is to smooth delta time, as explained well here (I have no affiliation): https://bitsquid.blogspot.com/2010/10/time-step-smoothing.html

It isn't foolproof, and if Croteam didn't talk about it because they're still looking for a better way to approach the problem, I applaud them for it.

Recent games seem to be getting into a lot more trouble than this smoothing can fix, even when it's working exactly as intended. In addition to the obvious CPU->GPU pipelining, we've had sim->render pipelining on different CPU threads for quite a while now, and recently CPU pipelining has been getting a lot deeper even than that. This would make latency terrible if input sampling were only done at the very start of work on each frame, so low-latency inputs get sampled multiple times in a frame and used similarly to how VR reprojects at the last moment (except much earlier). Two problems arise from this (three if you count the excessive complexity itself and resulting bugs, which are unfortunately common):

* It isn't feasible to resample everything, and some inputs are stuck with terrible latency. The usual fix seems to be snapping animations ahead to make most of it balance out, which is better than nothing but still pretty bad.

* Actual frame start times are often all over the place when GPU-bound. It's possible to control when and where work happens well enough to make this a non-issue, but it seems to be too complex for the real world or something. If heavy enough delta time smoothing is used, frame delivery and animation can both be smooth despite this, but having wildly different input latency on each frame feels terrible, even if it's only on a few inputs. I've even had a game running ~50 fps have major problems with dropped inputs because some gaps between frame starts were that big.

I'd much rather play a UE3 game at 45 fps (UE3 has sim->render pipelining on different CPU threads but nothing else fancy) than a typical modern game GPU-bound at 75 fps. Even when there aren't more blatant errors in handling all this, modern games' controls just seem to need twice the framerate or more to feel as responsive and solid.

When games expose an FPS cap in their own settings, it usually operates on frame starts and ends up spacing them out correctly. Setting this just below the typical framerate the GPU can manage in a game (CPU framerate doesn't matter so much) is the most consistent (if still incomplete) fix I know of, and often a night-and-day difference in a game's control feel. Even when no shenanigans are in effect, a similar framerate cap keeps frames from piling up at the boundary between CPU and GPU and tends to help latency a bit. (I never use vsync, FWIW.)

(image macro: "I like your style, new guy. Welcome to the forum.")
 
Exactly, again, I'm on a 60Hz monitor, so we're not talking about the difference between 144Hz and 60Hz, we're talking about a 60fps clip playing on a 60Hz monitor, and to me, the stutter on the first pass is obvious. I can admit some people are going to notice this thing more than others though.


I should qualify my comment a little better, though.

Let's say you're running like many of us did before adaptive sync, with V-Sync off on, say, a 60 Hz refresh rate, and the title is demanding enough that the card can't produce perfect frames. We all know what it's going to look like at times: torn images. Now, we know we are getting more than, say, 50 FPS on average, but we are certainly seeing the artifacts, and just because the refresh rate and frame rate are greater than 30 fps doesn't mean the flaws in what is presented to us are magically hidden.
 
Yes, in 3D we use key frames and mathematically solve for the deformation throughout the rest of the animation. I was quite amused by this part of the linked article, as well. To think that stuttering has anything to do with frametimes is laughable. The feeling of choppiness happens with uneven frametimes regardless of how fluid the animations are.

I think the spikes and stuttering are due to the gaps between the input and the resulting image being drawn on screen increasing too much. It's got nothing to do with 2D/3D imo.
 
I should qualify my comment a little better, though.

Let's say you're running like many of us did before adaptive sync, with V-Sync off on, say, a 60 Hz refresh rate, and the title is demanding enough that the card can't produce perfect frames. We all know what it's going to look like at times: torn images. Now, we know we are getting more than, say, 50 FPS on average, but we are certainly seeing the artifacts, and just because the refresh rate and frame rate are greater than 30 fps doesn't mean the flaws in what is presented to us are magically hidden.
Oh okay, yeah, that's a separate issue from what I'm talking about, which the video illustrates. In other words, in some games you have a situation where you hit 60 fps easily, have vsync on a 60 Hz display, yet still have stuttering because of how the game is operating.
 
Here's another visualization of how the original subject can be a problem, if it helps at all. Frame delivery is perfectly smooth in this case thanks to pipelining and consistent GPU frametimes, but variable CPU frametimes and a (implied) lack of delta time smoothing are making animation messy anyway:

(attached diagram: framePipelining.png)


LOL, thanks. Hi everyone!
 
Always wondered why, no matter what monitor I own or what kind of graphics card I have, I always have some kind of stutter.
 
Here's another visualization of how the original subject can be a problem, if it helps at all. Frame delivery is perfectly smooth in this case thanks to pipelining and consistent GPU frametimes, but variable CPU frametimes and a (implied) lack of delta time smoothing are making animation messy anyway:

View attachment 91992


LOL, thanks. Hi everyone!

Yes, pre-rendering. This is very popular for streaming game rendering services, where the cloud can calculate several different render paths for your character at once, then pack it into a delta frame and send it over the web. But it falls apart if you move in a different way than the algorithm guessed. The nice thing about this method, though, is that it works particularly well if the character has a steady, predictable animation... like a slicey wheel of death. So you can handle the draw call and store the result from the draw call.
 
The work on both sides isn't quite that clear-cut, but that level of disconnect between CPU and GPU is a normal thing happening just about everywhere. Setting flip queue (AMD) or maximum pre-rendered frames (Nvidia) to 1 tightens it up a bit. That graphic should correspond to a flip queue of 2; 3 (the usual default) would make the CPU's work on frames 4 and 5 happen half a frame sooner, and 1 would make frame 4 actually late getting to the display as you would think when someone says "stutter". The graphics driver tries to do some management of this to keep average latency down, but IME it too sometimes trips over itself and does more harm than good.

The real trouble in that visualization is that having every fifth CPU frame take 4x as long is a pathological case.
 
Just wanted to say thanks to cageymaru for posting this story, but also thanks to all posting in the thread. Actually learning something here.

From the first days of having 60 Hz displays to my current 144 Hz, v-sync or not, G-Sync or not, improving rigs every which way to increase game render speeds, I've seen odd but consistent stuff no matter what my FPS counters said. By the time I got my 1080 Ti hooked to a 1440p/144 Hz monitor, I figured it'd be over. It disproved much of what was being attributed to SLI/microstutter: the same stuttering existed, just less pronounced. I keep a close eye on all the usual metrics: disk usage, RAM, CPU, GPU, temps, power, etc. Sometimes you can see a clear correlation, sometimes not at all. I always find it hilarious when the counters register 70-110 fps and some NPC/background animation is obviously running at ~30 fps or less.
 
G-Sync stutters too. World of Warcraft stutters in exactly the same way shown in the video with G-Sync.
G-Sync is not a fixed refresh rate, btw.

It's not fully dynamic either.

You'll always have *some* stutter, simply because some frames are going to come out significantly later than others. You'd need a combination of variable refresh rate and interpolation to fully remove stutter. But VRR removes the biggest hurdle: the fact that you are tied to a 16 ms refresh window.
 
I didn't realize pro gamers were also neural scientists with CT, EKG, and MRI scans and a ton of data. :D

How silly of me.

Not sure if serious..

Anyway, there are a ton of factors for each person (not all eyes nor brains are the same) and for what is causing stutter: anything from the network, to engine data streaming, to scalers in monitors, to even a bad USB device causing hangs.
 