Maximum Pre-Rendered Frames

evilsofa

In another thread, SpongeBob brought up a question about the Maximum Pre-Rendered Frames settings: what is it, and how should it be used?

MavericK96 pointed out this relatively new thread with some interesting benchmarks and comments, and the confusing conclusion that in the case of MPRF, 0 = 3, so settings of 1 or 2 will be lower than a setting of 0.

http://forums.laptopvideo2go.com/topic/29333-max-pre-rendered-frames-benchmarks/

I found a slightly older thread, in which the comment was made that:

"It should be used based on each game really...
High = Mouse input lag, but smoother gameplay.
Low = Fast input speed and possibly jittery gameplay."

http://forums.nvidia.com/index.php?showtopic=211621
 
I'd like to know how this affects benchmarks and gameplay as well.
 
Well, in previous versions of the nVidia drivers, you had the option of 0-8, with 3 being the default. Apparently, now you get the option of "Use Application Default", and 1-4. Supposedly, "default" is 0, and you can manually set it to 5-8 in nVidia Inspector.

I would like to know more about this as well. Hopefully we can get some real results going.
 
Anybody figure this out?

Reason I ask is that I'm having a small, random amount of stutter in BF3 Close Quarters, and someone suggested setting it to 1 in the NV control panel...
 
I'm kind of sad that this thread fell into obscurity. I was hoping someone ([H]? wink wink) would do a short study on it.
 
You could do a short study on it. I personally set my global to render 1 and my stuttering is gone in most games. I also v-sync my games so this might be why I was getting stuttering in the first place.
 
Don't you need proper hardware to do studies on micro-stutter? Kyle said something about nVidia helping them get a setup going to accurately measure it, but I don't recall hearing about that again...
 
Heh, I used to have a room-mate that insisted this setting was the same thing as enabling 'Triple Buffering' lol.
 
Max pre-rendered frames is the number of frames the driver lets your CPU line up in a queue for rendering and rasterization. Basically, all the stuff in a given scene (objects, effects, etc.) is already in place and lined up, but needs to be fed to your GPU to actually be rendered.

This number is how many frames waiting to be rendered are kept cached and ready to be fed to your GPU. Higher values will cause input lag, so it's best left at the default unless you're getting really jumpy gameplay even though your frame rate is holding up. Set it to 1 if everything is hunky-dory. Try 2-3 if you're getting stutter despite a solid framerate. Set it higher as needed, balancing input lag against butter-smooth gameplay. Also, it WILL compound with Vsync as far as introducing input lag goes.
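To see why higher values add input lag, here's a toy back-of-the-envelope model (my own illustration, not NVIDIA's actual implementation; the function name and numbers are assumptions): if the CPU is allowed to run up to N frames ahead of the GPU, input sampled now sits behind every queued frame before it reaches the screen.

```python
# Toy model: with a pre-render queue of depth N, input captured for a new
# frame waits for the N queued frames plus its own frame to be drawn, so
# worst-case added latency scales roughly with (N + 1) * GPU frame time.
# This ignores display scanout and driver overhead; it's only a sketch.

def worst_case_input_lag_ms(max_prerendered: int, gpu_frame_ms: float) -> float:
    """Queued frames ahead of us, plus the frame carrying our input."""
    return (max_prerendered + 1) * gpu_frame_ms

# At a GPU-bound 60 FPS (~16.7 ms per frame):
for depth in (1, 2, 3):
    print(depth, round(worst_case_input_lag_ms(depth, 1000 / 60), 1))
# → 1 33.3 / 2 50.0 / 3 66.7
```

That's why the setting matters most when you're GPU-bound: the queue actually fills, and each extra pre-rendered frame tacks on roughly one more frame time of lag.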

It's also worth noting that it'll only work in DirectX games, and not OpenGL, unless they've changed something recently.
 
On my system it varies from game to game. Some games (Oblivion, Deus Ex) are smoother as a result of lowering it to 1, while others like Diablo and Crysis 2 get really stuttery. I tend to adjust that value more than pretty much any other.
 
"It's also worth noting that it'll only work in DirectX games, and not OpenGL, unless they've changed something recently."

They have, it works in OpenGL now. Unfortunately nvidia forums are down so I can't show you the threads and benchmarks that are there to prove it.
 