What is min/max framerate?

I've run a benchmark on Cyberpunk 2077, and this is what I got:

benchmark completed, 4216 frames rendered in 76.406 s
Average framerate : 55.1 FPS
Minimum framerate : 46.5 FPS
Maximum framerate : 73.7 FPS
1% low framerate : 41.3 FPS
0.1% low framerate : 31.7 FPS

How is it possible for the minimum fps to be higher than the 1% low and the 0.1% low? Isn't the minimum supposed to be representative of the absolute worst frame during the entire benchmark?
Or is the minimum framerate the worst 1-second span of the benchmark, i.e. the second during which the fewest frames were rendered? But then, in turn, that would mean the maximum isn't the fastest rendered frame either, but the 1-second span during which the average was highest?

Is this the common understanding? I always thought the min/max were calculated from a single frame.
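To make it concrete, this is the per-frame interpretation I had in mind, as a rough Python sketch (the frametime numbers are made up, nothing here comes from Afterburner):

frametimes_ms = [18.1, 17.9, 21.5, 16.4, 45.2, 17.0, 13.6, 19.8]  # invented, one entry per frame

avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
min_fps = 1000.0 / max(frametimes_ms)   # slowest single frame -> lowest instantaneous FPS
max_fps = 1000.0 / min(frametimes_ms)   # fastest single frame -> highest instantaneous FPS

print(f"avg {avg_fps:.1f} FPS, min {min_fps:.1f} FPS, max {max_fps:.1f} FPS")

Read that way, the minimum would have to sit at or below the 0.1% low, which is why the Afterburner numbers confused me.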
 
Screen tearing maybe? Turn on V-Sync and G-Sync/FreeSync if your monitor supports them. You really want, at a minimum, a FreeSync 2 monitor (skip FreeSync 1) or a G-Sync Compatible one.

That doesn't sound like the game's built-in benchmark; run that and compare.

I agree those results are off.
The 0.1% low should equal the minimum framerate. The 1% low should, of course, be above that and below the average and the max.
 
I think you are right that the minimum and maximum use the average fps over a short span of time (like 1 second), and that averaging can make the minimum end up higher than the 0.1% low.

If they took 1 / (the shortest time between any two frames) for the max, and the opposite for the min, it would probably always be some ridiculous number.
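Something like this is what I'm picturing, as a rough Python sketch with invented frametimes and 1-second buckets (I have no idea whether any tool actually does it this way):

frametimes_ms = [16.7] * 50 + [40.0, 45.0, 50.0] + [16.7] * 50 + [12.0] * 30  # invented log

buckets, current, elapsed = [], [], 0.0
for ft in frametimes_ms:
    current.append(ft)
    elapsed += ft
    if elapsed >= 1000.0:            # a full second has passed, close the bucket
        buckets.append(current)
        current, elapsed = [], 0.0
# the trailing partial second is simply ignored in this sketch

bucket_fps = [1000.0 * len(b) / sum(b) for b in buckets]
print("windowed min/max:", min(bucket_fps), max(bucket_fps))
print("per-frame min/max:", 1000.0 / max(frametimes_ms), 1000.0 / min(frametimes_ms))

With those invented numbers the windowed minimum comes out far above the per-frame minimum, because the slow frames get averaged together with fast ones inside the same bucket.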
 
I think you are right that the minimum and maximum use the average fps over a short span of time (like 1 second), and that averaging can make the minimum end up higher than the 0.1% low.
That is the only plausible explanation, unless it is a bug and this is not supposed to happen. Assuming it is not a bug, this opens up a whole can of worms about what formula each benchmark tool uses to calculate min/max fps. I used MSI Afterburner.
If they took 1 / (the shortest time between any two frames) for the max, and the opposite for the min, it would probably always be some ridiculous number.
Yes, it would be the worst/best outlier, which is what I thought min/max were until now: the FPS calculated from the single frame that took the most/least time to render.
 
I've run a benchmark on Cyberpunk 2077, and this is what I got:

benchmark completed, 4216 frames rendered in 76.406 s
Average framerate : 55.1 FPS
Minimum framerate : 46.5 FPS
Maximum framerate : 73.7 FPS
1% low framerate : 41.3 FPS
0.1% low framerate : 31.7 FPS

How is it possible for the minimum fps to be higher than the 1% low and the 0.1% low? Isn't the minimum supposed to be representative of the absolute worst frame during the entire benchmark?
Or is the minimum framerate the worst 1-second span of the benchmark, i.e. the second during which the fewest frames were rendered? But then, in turn, that would mean the maximum isn't the fastest rendered frame either, but the 1-second span during which the average was highest?

Is this the common understanding? I always thought the min/max were calculated from a single frame.

When you calculate the 0.1% and 1% lows, the remaining frames become the 99% and 99.9% sets in most software. So you've removed the "lowest low" and now have a new "low" from the bigger set of data. It's basically chopping the ends off of the bell curve.
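If I had to guess at it in code, roughly like this; the trim-then-take-the-minimum step is my assumption, not documented behaviour of MSI Afterburner or any other tool:

def trimmed_min_fps(frametimes_ms, trim_fraction=0.01):
    ordered = sorted(frametimes_ms)                  # slowest frames at the end
    n_trim = int(len(ordered) * trim_fraction)       # number of outlier frames to drop
    kept = ordered[:len(ordered) - n_trim] if n_trim else ordered
    return 1000.0 / max(kept)                        # slowest remaining frame as FPS

Something like that would at least explain a reported minimum sitting above the 1% low.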
 
When you calculate the 0.1% and 1% lows, the remaining frames become the 99% and 99.9% sets in most software. So you've removed the "lowest low" and now have a new "low" from the bigger set of data. It's basically chopping the ends off of the bell curve.
That doesn't make any practical sense. I'm not saying you are wrong, but what do I gain by knowing the lowest value of a dataset that already excludes the lowest 1%?
 
That doesn't make any practical sense. I'm not saying you are wrong, but what do I gain by knowing the lowest value of a dataset that already excludes the lowest 1%?

It just removes outliers due to outside factors, I guess, like Windows updates or whatever. Maybe they would be better described as "average low" and "extreme low", idk.
 
When you calculate the 0.1% and 1% lows, the remaining frames become the 99% and 99.9% sets in most software. So you've removed the "lowest low" and now have a new "low" from the bigger set of data. It's basically chopping the ends off of the bell curve.
Only ~4,200 frames, so I imagine anything is possible, but if you remove all the frames rendered at slower than a 41.3 fps pace, wouldn't it be strange for the minimum not to be really close to 41.3 then? 46.5 is a good jump.
 
Only ~4,200 frames, so I imagine anything is possible, but if you remove all the frames rendered at slower than a 41.3 fps pace, wouldn't it be strange for the minimum not to be really close to 41.3 then? 46.5 is a good jump.
41.3 is the average of the 1% slowest frames, not the threshold, so 46.5 can be the cutoff point.
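In other words something like this, assuming the tool really does average the slowest 1% of frames (which is a guess on my part):

def one_percent_low(frametimes_ms, fraction=0.01):
    ordered = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(ordered) * fraction))        # at least one frame
    worst = ordered[:n]
    return 1000.0 * len(worst) / sum(worst)         # average FPS over that slowest slice

On that reading, 41.3 is the mean of the whole tail, so the cutoff frame at the top of that tail can easily be 46.5.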
 
41.3 is the average of the 1% slowest frames, not the threshold, so 46.5 can be the cutoff point.
I always thought those 0.1% and 1% figures were distribution percentiles and not averages (i.e. 1% of the frames were rendered at a 41.3 fps pace or lower, not that 41.3 is the average frame rate of the slowest 1%).
 
I always thought those 0.1% and 1% figures were distribution percentiles and not averages (i.e. 1% of the frames were rendered at a 41.3 fps pace or lower, not that 41.3 is the average frame rate of the slowest 1%).
I'm just assuming, based on what could produce these numbers. I've searched far and wide and found absolutely no documentation on this, so it's anyone's guess. It's possible that different benchmarking tools use completely different formulas to calculate them.
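For what it's worth, the two candidate formulas from this thread do give different numbers on the same data. A quick Python comparison with made-up frametimes (neither formula comes from any tool's documentation, they're just the two guesses floated above):

import random

random.seed(0)
frametimes_ms = [random.uniform(14.0, 22.0) for _ in range(4000)]
frametimes_ms += [random.uniform(25.0, 35.0) for _ in range(40)]   # a few slow outliers

ordered = sorted(frametimes_ms, reverse=True)     # slowest frames first
n = max(1, len(ordered) // 100)                   # the slowest 1% of frames
worst = ordered[:n]

percentile_low = 1000.0 / worst[-1]               # (a) cutoff: 1% of frames are this slow or slower
average_of_tail = 1000.0 * n / sum(worst)         # (b) average FPS across the slowest 1%

print(f"percentile-style 1% low: {percentile_low:.1f} FPS")
print(f"average-of-slowest-1%:   {average_of_tail:.1f} FPS")

Both land in the same ballpark here, but the average-of-the-tail version always comes out lower than the percentile cutoff, and neither has to match a windowed minimum.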
 