Can someone explain 0.1% low FPS to me?

princeboy47

I want to know what it is, why it happens, and whether it's always a problem or can sometimes happen as a result of bad rig optimisation.
 
So FPS is frames per second, which can also be expressed as "frame time", or how long it takes to render each frame. For example, 100 FPS corresponds to an average frame time of 0.01 seconds.
FPS is typically an average of that over time. The 0.1% low FPS is the frame rate of the slowest 0.1% of frames.
You want the 0.1% low to be as close to the average frame rate as possible; if it's a lot slower, that means your frame rate is inconsistent and you'll probably see hitches.

A very low 0.1% low FPS could be because the game is crap, or it could be your hardware or configuration. It's a pretty important benchmark to look at for CPUs.
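
To put rough numbers on that, here's a quick Python sketch (the frame times are made up purely for illustration):

Code:
# 999 smooth 10 ms frames plus one 100 ms hitch
frame_times_ms = [10.0] * 999 + [100.0]

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
print(f"average frame time: {avg_ms:.2f} ms")      # ~10.09 ms
print(f"average FPS: {1000.0 / avg_ms:.1f}")       # ~99.1 FPS, looks great

worst_ms = max(frame_times_ms)
print(f"worst frame: {worst_ms:.0f} ms = {1000.0 / worst_ms:.0f} FPS")   # 100 ms = 10 FPS

The average barely moves, but that one 100 ms frame is a visible hitch, which is exactly what the 0.1% low is meant to catch.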
 
So the only solution is to cap the FPS? My monitor is an ASUS VS248H and it has a 60 Hz refresh rate. I get 0.1% low dips on every camera transition. I have my Afterburner OSD set to show a frametime graph, and I notice a lot of ups and downs in a variety of games. When I cap it to 30 FPS I get slow frames that feel stretched out, and the monitor's 1 ms response time doesn't help with that. Is there anything I can do about it?
 
Turn off V-Sync if you don't mind tearing. With V-Sync on, if your card can't sustain 60 FPS, the game will basically fluctuate between 30 and 60 FPS all the time. Otherwise there isn't much you can do apart from getting a FreeSync/G-Sync monitor, or at least one with a high refresh rate.
 
I already have V-Sync turned off, with AMD Enhanced Sync and AMD Anti-Lag enabled, and RTSS limiting the FPS to 59. I wanted to use Scanline Sync but I don't know if it's better or not. I get 59 FPS all the time, though sometimes I get drops.
 
The 0.1% low is determined by taking 0.001 × the total number of frames analyzed and looking at that many of the slowest frames: the bottom 0.1% of the frame rate.

The 0.1% low FPS is the frame rate, or threshold, that 0.1% of the frames fall below.

If the 0.1% low was 30 FPS, then the other 99.9% of frames would be above 30 FPS, while 0.1% of the frames were at 30 FPS or less.

The 0.1% figure can be erratic due to the small sample size of a typical benchmark run; the 1% low seems to be more usable and filters out things like game-engine loading screens.

The 90th percentile is the threshold for 90% of the frames: if it's 80 FPS, then 90% of the frames are at 80 FPS or above.
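
If it helps, here's a rough Python sketch of how a tool might compute these from a frame-time capture. The exact method varies between benchmarks (some report the percentile threshold, some average the slowest slice of frames), and the numbers below are made up:

Code:
def low_fps(frame_times_ms, fraction, average_slice=False):
    """'X% low' FPS from a list of frame times in milliseconds."""
    worst_first = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst_first) * fraction))         # e.g. 0.001 x frame count
    if average_slice:
        slice_ms = sum(worst_first[:n]) / n              # average of the slowest slice
    else:
        slice_ms = worst_first[n - 1]                    # threshold frame time
    return 1000.0 / slice_ms

# made-up capture: mostly ~10 ms frames with a handful of big spikes
times = [10.0] * 4990 + [25.0] * 5 + [50.0] * 5
print(low_fps(times, 0.001))                      # 0.1% low (threshold style): 20.0 FPS
print(low_fps(times, 0.01, average_slice=True))   # 1% low (average style): ~64.5 FPS

Either way, the point is the same: you're only looking at the slowest sliver of frames instead of the overall average.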
 
The 1% or 0.1% low can be argued to be more important than the total average frame rate.
 
In my 25 years of gaming I've never cared about this.
0.1% of frames being slow is irrelevant.
It's a game, not a death penalty if your FPS drops for a split second.
 
totally wrong

If your 0.1% low is way lower than your average, that means it's a stuttering mess. Super easy to notice and basically unplayable.
 
Do you have an example?

Let's say your average is 100 fps. That means each frame displays for 10 ms on average.
If your 0.1% low average is 10 FPS, that means those slowest 0.1% of frames are displaying for 100 ms on average.

That means you're getting a silky smooth 100 fps then all of a sudden the screen is paused for a tenth of a second, then you're back to 100 fps.
I've had this happen in Fallout New Vegas when there were some weird driver issues with the game. It was basically unplayable.

That's a more extreme example. Something more common is the 0.1% low being 40 FPS with an average of 100, which would be super annoying at a fixed refresh rate. It can be greatly mitigated with VRR, but it's still noticeable and annoying. For example, you're playing an FPS running around at 100 FPS, but every time there's a big explosion or something your CPU can't handle, your FPS dips way down. Depending on the game that might not be a big deal, but if you're trying to aim during those drops it's going to suck.
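
Roughly why a dip hurts more without VRR, as a back-of-the-envelope Python sketch (it ignores buffering details, and the 60 Hz / frame-time numbers are just examples):

Code:
import math

REFRESH_MS = 1000.0 / 60.0   # one refresh interval on a 60 Hz panel (~16.7 ms)

def displayed_ms_fixed(frame_ms):
    # With V-Sync on a fixed-refresh display, a frame is held until the next
    # refresh boundary, so anything slower than one interval gets rounded up
    # to a whole number of refreshes.
    return math.ceil(frame_ms / REFRESH_MS) * REFRESH_MS

for frame_ms in (10.0, 25.0, 100.0):
    print(f"{frame_ms:5.1f} ms render -> shown ~{displayed_ms_fixed(frame_ms):.1f} ms at fixed 60 Hz")
# 10 ms  -> ~16.7 ms (fine)
# 25 ms  -> ~33.3 ms (a 40 FPS frame becomes a 30 FPS moment)
# 100 ms -> ~100 ms  (a full tenth-of-a-second pause either way)

With VRR the panel just waits the 25 ms and displays the frame, so the hitch is only as long as the frame itself instead of snapping to the next refresh.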
 
Hmmm, I haven't experienced that.
Not claiming it doesn't exist. Just that I haven't noticed.
I feel like there's a test for everything. Who came up with 0.1% anyway? How do they know 0.1% is the right cutoff for every situation?
 