Introduction to Triple Buffering

Bo_Fox

[H]ard|Gawd
Joined
Aug 23, 2006
Messages
1,544
Here's my own article, or a mini-guide:

Triple Buffering with Vsync enabled offers the best visual image quality. However, a 60 Hz refresh rate limits the frame rate to 60 fps with Vsync on. If Triple Buffering is not used and the frame rate dips below 60, it drops straight to 30 fps, which is 1/2 of 60 Hz. At 30 fps, the same frame is displayed for two refresh intervals in a row before the buffer swaps to the next frame on the third refresh interval.

If it dips below 30 fps, it drops to 20 fps, which is 1/3 of 60 Hz. That causes THREE identical frames to be displayed in sequence before you see the next frame, which makes the whole screen look much choppier in an ugly way. Below 20 fps it becomes 15 (1/4), then 12 (1/5), and so on...
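This snapping pattern follows from each frame being held on screen for a whole number of refresh intervals. As a quick sketch (my own illustration in Python, not anything from the drivers), the effective frame rate under Vsync with double buffering is:

```python
import math

def effective_fps(render_fps, refresh_hz=60):
    # With Vsync and double buffering, every frame stays on screen for a
    # whole number of refresh intervals, so the displayed rate snaps to
    # refresh_hz/1, refresh_hz/2, refresh_hz/3, ...
    intervals = math.ceil(refresh_hz / render_fps)  # refreshes per frame
    return refresh_hz / intervals

print(effective_fps(45))  # 30.0 (45 fps of raw rendering snaps down to 1/2)
print(effective_fps(25))  # 20.0 (1/3 of 60 Hz)
print(effective_fps(90))  # 60.0 (capped at the refresh rate)
```

This is why, without triple buffering, you only ever see 60, 30, 20, 15... and nothing in between.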

With triple buffering, the frame rate can fluctuate freely below the refresh rate, even if the computer is not powerful enough to hold a minimum frame rate that matches the monitor's refresh rate. Please keep in mind, though, that with triple buffering on, your computer should be able to sustain at least 1/3 of the refresh rate you are playing at, because triple buffering only provides 2 "back buffers". Otherwise, you will experience noticeably worse choppiness whenever the frame rate dips below 1/3 of the refresh rate. If Vsync is an absolute must even with frame rates that low, you can add a third back buffer through a utility called DXTweaker (type "3" into the backbuffer count in the "Present Changer" module), effectively making it "Quadruple Buffering". That is only worthwhile in games that are tolerable at such low frame rates, such as The Sims 2 or The Elder Scrolls IV: Oblivion.

60 Hz is rarely desirable, though. For one thing, on CRTs it causes screen flickering that is very noticeable and fatiguing to your eyes. Second, if your computer is powerful enough to run at 85 frames per second, why limit it to 60 in any game (except Doom 3, which is already capped at 60 fps)? 85 Hz allows for noticeably smoother, more fluid gameplay, regardless of whether Vsync is on or off.

If you cannot or simply do not want to enable Triple Buffering, then setting the refresh rate to 60 Hz instead of 85+ Hz only makes sense if the game drops below 85 fps but stays above 60 fps. If you set it to 85 Hz with Vsync on but your computer can only maintain around a 65 fps minimum, that is understandable, since you do not want drops to 1/2 of 85 Hz, which gives only 42.5 fps. In that case, just turn on triple buffering! Besides, the small lag that triple buffering adds is usually preferable to the horrible screen flickering at 60 Hz on CRT monitors. If it does not take effect in DirectX games under Nvidia or ATI drivers, just force it through DXTweaker or ATI Tray Tools.

Vsync at 60 Hz can also make sense with games like Doom 3 / Quake 4 that are capped at 60 fps. Most LCD monitor panels only do 60 Hz anyway, which sucks!

If you have a good CRT monitor, it would make even more sense to just upgrade your computer to something slightly faster so that you can maintain a minimum of 85 fps. (ha ha..) That gets rid of the screen flickering that is so evident at 60 Hz.

Triple buffering with Vsync on is not recommended for extremely fast-paced multiplayer games, especially online, or for really fast racing games. Triple Buffering causes about 32 ms of lag at 60 Hz, since you will be looking at a frame that was drawn 2 frames (or at least 2 screen refreshes) ago. If you want the absolute least lag when playing fast-paced games, just disable Vsync and Triple Buffering altogether. Vsync by itself, without triple buffering, still causes about 16 ms of lag at 60 Hz. However, without Vsync at 60 Hz (common on most LCD monitors), the screen tearing is much worse than at higher refresh rates like 85+ Hz. Some would find the screen tearing completely unbearable at 60 Hz, so Vsync is almost a "must" for any game on LCDs.
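The lag figures above come from simple arithmetic: each buffered frame waits one refresh interval before reaching the screen, so the 32 ms figure is roughly two intervals (2 × ~16.7 ms ≈ 33 ms). A tiny sketch, using my own hypothetical helper, not a real driver API:

```python
def vsync_lag_ms(refresh_hz, queued_frames):
    # Each frame sitting in the swap chain waits one full refresh
    # interval (1000/refresh_hz milliseconds) before it is displayed.
    return queued_frames * 1000.0 / refresh_hz

print(vsync_lag_ms(60, 1))  # ~16.7 ms: Vsync with plain double buffering
print(vsync_lag_ms(60, 2))  # ~33.3 ms: Vsync with triple buffering
print(vsync_lag_ms(85, 2))  # ~23.5 ms: triple buffering hurts less at 85 Hz
```

As the last line shows, raising the refresh rate shrinks the triple-buffering penalty too, which is another argument for 85+ Hz on a CRT.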

That's why CRTs still have a huge advantage over LCD panels for games, aside from the fact that most LCDs have an additional 10-50+ ms of variable lag. I can only pray that there will soon be LCD monitors that display 120 Hz at native resolution.


Edit: Added a link with more info on Triple Buffering and DXTweaker:

http://www.ocworkbench.com/2006/articles/DXtweaker/

If you experience hitches at high resolutions with 4X FSAA after turning on Triple Buffering, you might be running out of video memory. That would be a good time to upgrade to a card with more video memory. (Note that the hitches with "liberal" frame rates can sometimes actually be preferable to the *fractioned* frame rates caused by Vsync's default Double Buffering, which causes a different kind of stuttered slow-down!) The link above explains the math of the video memory used by Triple Buffering.
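As a rough sketch of that math (my own back-of-the-envelope model, not the exact figures from the linked article; real drivers allocate memory differently):

```python
def framebuffer_mb(width, height, back_buffers=2, msaa_samples=4,
                   bytes_per_pixel=4):
    # Rough estimate only: one front buffer, plus multisampled color
    # back buffers, plus one multisampled depth/stencil surface.
    # Treat the result as a ballpark figure, not a driver-exact number.
    front = width * height * bytes_per_pixel
    color = back_buffers * width * height * bytes_per_pixel * msaa_samples
    depth = width * height * bytes_per_pixel * msaa_samples
    return (front + color + depth) / (1024 * 1024)

# 1600x1200 with triple buffering (2 back buffers) and 4X FSAA:
print(round(framebuffer_mb(1600, 1200), 1))  # ~95 MB of video memory
```

Under these assumptions, a 128 MB card is already tight at 1600x1200 with 4X FSAA and triple buffering, before textures and geometry even enter the picture.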


Any questions or comments?
 
Pretty good guide.. how about talking about Vmem limitations when using AA, high res, and Triple Buffering? Personally I can't stand vsync, it seems to lag much more.. but I still use it, and DXTweaker, in some games that tear big time, like Oblivion.

Another comment: triple buffering in D3D games does not work with Nvidia. Even though the checkbox is there, it only works in OGL; you must use DXTweaker.
 
Wow, after years of gaming I finally know why certain games I play always fluctuate between 60, 30, or 20 fps and never in between.
 
I have a question. My monitor is capable of running at 100mhz at most resolutions below 1600x1200, but I'm always worried that this high refresh rate will cause my monitor to go bad in some way faster than if I use an 85mhz refresh rate. Is this in any way possible?
 
I believe if it is rated to go 100mhz, then it is not unhealthy for the CRT to do so. It should not significantly lower the life of your monitor, as it was designed to go that fast.


And I have a few questions...


One: Does an 85 Hz refresh rate matter at all if VSync is not on? Is it still better to use a higher refresh rate? Does it reduce flickering? What about FPS? Will a higher refresh rate without VSync be detrimental to FPS, or will it contribute to better gameplay (e.g. reduced tearing, etc.)?


That's it... Well... It is only one question, but it's got some different parts :)

Just some content that you could maybe add to your guide. Good questions, methinks :)
 
legendz411 said:
I believe if it is rated to go 100mhz, then it is not unhealthy for the CRT to do so. It should not significantly lower the life of your monitor, as it was designed to go that fast.


And I have a few questions...


One: Does an 85 Hz refresh rate matter at all if VSync is not on? Is it still better to use a higher refresh rate? Does it reduce flickering? What about FPS? Will a higher refresh rate without VSync be detrimental to FPS, or will it contribute to better gameplay (e.g. reduced tearing, etc.)?


That's it... Well... It is only one question, but it's got some different parts :)

Just some content that you could maybe add to your guide. Good questions, methinks :)

VSync's only function is to synchronize the release of the frame from the back buffer with the beginning of the screen scan. However, with an 85 Hz refresh rate your screen can still only refresh 85 times per second. This means frame rates in excess of 85 fps can still result in tearing.

Higher refresh rates do reduce flickering, because the screen is redrawn more often. Flicker occurs because the electron gun scans the screen and causes the phosphors to glow; if a phosphor is not scanned again quickly enough, it begins to fade. It is this cycling between the fully charged and faded states that appears as flicker, much like a strobe light effect.

Without VSync the refresh rate will have no effect on FPS but, again, can result in tearing when the FPS exceeds the refresh rate.
 
I also think it is important to discuss Vmem usage with AA and Triple Buffering on.

I do not remember the math, but if you have Triple Buffering on and 4xAA, your Vmem fills up with all these upsampled pre-drawn frames. What happens when you max out your Vmem? And does Vista solve this problem with its ability to share system memory with the video card?

This should be touched on and explained.
 
So then, is this to say that for all the old bragging about 100-200-300 FPS in Q2/HL/Q3, all that was actually visible was the refresh rate of your monitor?

In which case, most performance tests where people are talking 70+ FPS would be pretty much useless for real-life gaming, would they not? (As if I haven't already been arguing this)
 
I did not think flicker was an issue with LCD's, and that is why a 60hz refresh rate is recommended/defaulted to?
 
Tae said:
I did not think flicker was an issue with LCD's, and that is why a 60hz refresh rate is recommended/defaulted to?

Right, it is not an issue with LCD's. But the refresh rate is still a limit on how often the screen can be visually updated in games. Without Vsync enabled, there are severe shearing and tearing artifacts at only 60 Hz, even if the frame rate is *lower* than 60 fps.
 
legendz411 said:
I believe if it is rated to go 100mhz, then it is not unhealthy for the CRT to do so. It should not significantly lower the life of your monitor, as it was designed to go that fast.


And I have a few questions...


One: Does an 85 Hz refresh rate matter at all if VSync is not on? Is it still better to use a higher refresh rate? Does it reduce flickering? What about FPS? Will a higher refresh rate without VSync be detrimental to FPS, or will it contribute to better gameplay (e.g. reduced tearing, etc.)?


That's it... Well... It is only one question, but it's got some different parts :)

Just some content that you could maybe add to your guide. Good questions, methinks :)

Yes, higher refresh rates reduce screen tearing if you want to keep Vsync disabled in order to minimize lag as much as possible. Also, with Vsync disabled, the game can sometimes appear smoother despite some shearing (especially if the frame rate is lower than 60 fps).

Again, Vsync is almost a "must" for LCD panels due to the horrible screen tearing/shearing at only 60 Hz in most games.
 
Would you override Windows defaults on LCD's and just go with the manufacturer's reported high refresh rate?
 
Tae said:
Would you override Windows defaults on LCD's and just go with the manufacturer's reported high refresh rate?

A few LCD's support 75 or 85 Hz refresh rates at 1280x1024 or lower. If your LCD does not support more than 60 Hz at the resolution that you are using, then there is basically no way to force the LCD to display higher refresh rates at that resolution.
 
pain.angel said:
I have a question. My monitor is capable of running at 100 Hz at most resolutions below 1600x1200, but I'm always worried that this high refresh rate will cause my monitor to go bad in some way faster than if I use an 85 Hz refresh rate. Is this in any way possible?
I fixed the units for you.

Bo_Fox said:
60 Hz is rarely desirable, though. For one thing, it causes screen flickering that is very noticeable and fatiguing to your eyes. Second, if your computer is powerful enough to run at 85 frames per second, why limit it to 60 in any games (except for Doom3 that is already capped at 60fps)? 85 allows for a much smoother gameplay that does look smoother and more fluid, regardless of whether Vsync is on or off.
You should make clear that this only relates to CRTs. LCDs are fine at 60Hz, due to the different underlying technology.

Bo_Fox said:
If you want the absolute least lag when playing fast-paced games, just disable Vsync and Triple Buffering altogether. Vsync itself without triple buffering still causes 16 ms of lag at 60 Hz.
Are you certain of this? 60Hz does imply that there are ~16ms between two refreshes, but wouldn't the average-case lag be ~8ms, since you never know when the video card finishes rendering the scene?

Aelfgeft said:
So then, is this to say that for all the old bragging about 100-200-300 FPS in Q2/HL/Q3, all that was clearly visible was the refresh rate of your monitor?

In which case, with most performance tests where people are talking 70+ FPS would be pretty much useless for real-life gaming, would it not? (As if I haven't already been arguing this)
Oftentimes people quote average, not worst-case, FPS. And if you want vsync on, then you'd want the worst case to be greater than your monitor's vertical refresh rate.
 
drizzt81 said:
You should make clear that this only related to CRTs. LCDs are fine at 60Hz, due to the different underlying technology.

Done that. Edited.

drizzt81 said:
Are you certain of this? 60Hz does imply that there are ~16ms between two refreshes, but wouldn't the average-case lag be ~8ms, since you never know when the video card finishes rendering the scene?

Yes, the average-case lag would be ~8 ms. This is for serious gamers who want to minimize lag as much as possible. For example, with a ping of 35 ms, who would want to add another ~8 ms of lag on top of that? 8 ms is a big deal to Jonathan "Fatal1ty" Wendel or anybody who competes at LAN tournaments with near-zero ping.
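To double-check that ~8 ms figure, here is a tiny Monte Carlo sketch (my own illustration, assuming frame completion times land uniformly at random within a refresh interval):

```python
import random

def avg_wait_ms(refresh_hz=60, trials=100_000, seed=1):
    # If a frame finishes at a uniformly random moment inside a refresh
    # interval, Vsync holds it until the next refresh. The mean wait
    # works out to half an interval: ~8.3 ms at 60 Hz.
    rng = random.Random(seed)
    interval = 1000.0 / refresh_hz
    total = sum(interval - rng.uniform(0.0, interval) for _ in range(trials))
    return total / trials

print(round(avg_wait_ms(), 1))  # close to 8.3 (half of a 16.7 ms interval)
```

So the worst case is a full interval (~16.7 ms) and the average is about half of it, which matches drizzt81's reasoning.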

drizzt81 said:
Often times people quote average, not worst-case FPS. And if you want vsync on, then you'd want the worst case to be greater than your monitors vertical refresh rate.

That is what triple buffering is for, and that is why I often mentioned minimum (worst-case) FPS.



I found a good article / guide on Triple Buffering with information on the video memory used at high resolutions with 4X FSAA. Here's the link:

http://www.ocworkbench.com/2006/articles/DXtweaker/
 
Nice thread BTW. Great for people to come and read about the confusing topic that is VSync.
I like :)
 
Does anyone else notice that tearing is significantly reduced in games with frame rate limiters in place (as long as your card(s) can render at the locked rate consistently)?

Like with BF2: with my monitor @ 75hz, if I use the console command to limit my frame rate to 80-85, it seems to render at around 75fps consistently.
 
I should give that a try. What console command are you using? Right now I have triple buffering on and am forcing Vsync at 75 Hz, the highest my monitor will go. Picture quality has definitely increased for me, as well as picking up FPS. I used to see some tearing, or what I thought was tearing. My frames used to sit around 60 and I could never figure out why until this thread. I had trip. buff. and vsync on with a 60 Hz refresh rate. "Doh"

Great Thread!
 
Yeah, I can run 85hz at 1600x1200, but I don't, because text gets a little fuzzy on the desktop, and for some reason I am worried about running a game at a different hz than the desktop. (I think this stemmed from a bad experience in the past where a game would revert to 60hz if it didn't match... I am not sure if this is still an issue with games or if it just happened to be a driver bug.)

Anyway command for frame limit on bf2 is:

game.lockFps [framerate]

When you type in the desired rate don't use the brackets...
 
Bo_Fox said:
A few LCD's support 75 or 85 Hz refresh rates at 1280x1024 or lower. If your LCD does not support more than 60 Hz at the resolution that you are using, then there is basically no way to force the LCD to display higher refresh rates at that resolution.

Hmm. Well my native resolution is 1680x1050 with a max reported refresh rate of 75hz. Since Viewsonic recommends native res, I assumed the 75hz was for it. I don't really know though. I have kept the windows recommended 60hz because I understood that flicker was no longer an issue.

I have to enable vsync in Oblivion or the tearing is massive, but Company of Heroes does not even have the option, and I really don't notice tearing anyway.

CCC allowed me to select triple buffering, but not vsync. I noticed my max fps in CoH dropped, but it really did not affect min and avg, so I am not really sure what it did.
 
COH has a vsync option, though I don't notice any tearing either.

Actually, for whatever reason, the only tearing I ever notice is in Source games, and even then it's rare and hardly noticeable. In BF2/FEAR/Prey/Doom3 I see no tearing whatsoever.

And I'd never go back to a CRT :p
 
squishy said:
COH has a vsync option, though I don't notice any tearing either.

Is it an external .ini option? Because it is not available through the options.
 
Am I the only one here that has tried running an LCD at 100Hz?

If I don't install the driver .inf's for my lcd (Samsung 940b using DVI) and just use windows to set the refresh, then I can select all the way up to 160. Every utility I've used to check the refresh rate reports back at whatever I've set it at (I'm running 120hz now). However, the LCD panel's info shows 75Hz. The confusing part is that everything seems to run smoother when it's set to a higher refresh. The mouse even seems quicker. It's not just me either, my wife noticed it too (and she didn't know I was messing with the refresh rates...she was bitchin' that I increased the mouse speed to mess with her).

Care to shed some light on what's happening there?
 
Tae said:
Is it an external .ini option? Because it is not available through the options.

You're right, it's not there. I could have sworn I saw an option.

/sorry
 
nst6563 said:
Am I the only one here that has tried running an LCD at 100Hz?

If I don't install the driver .inf's for my lcd (Samsung 940b using DVI) and just use windows to set the refresh, then I can select all the way up to 160. Every utility I've used to check the refresh rate reports back at whatever I've set it at (I'm running 120hz now). However, the LCD panel's info shows 75Hz. The confusing part is that everything seems to run smoother when it's set to a higher refresh. The mouse even seems quicker. It's not just me either, my wife noticed it too (and she didn't know I was messing with the refresh rates...she was bitchin' that I increased the mouse speed to mess with her).

Care to shed some light on what's happening there?

I'd imagine that Windows is sending 160 Hz to the monitor and the monitor is just displaying what it can?

The higher frame/refresh rate would affect your mouse. I'm not sure why it happens, but in FPS games your mouse sensitivity is heavily affected by your framerate.
 
Then that could be a way to run things at a super-high refresh rate for games and still not damage the monitor, if that's what's happening. I've never had any tearing in games either when the rate is set that high, and they're smooth as butter.
 
nst6563 said:
Am I the only one here that has tried running an LCD at 100Hz?

If I don't install the driver .inf's for my lcd (Samsung 940b using DVI) and just use windows to set the refresh, then I can select all the way up to 160. Every utility I've used to check the refresh rate reports back at whatever I've set it at (I'm running 120hz now). However, the LCD panel's info shows 75Hz. The confusing part is that everything seems to run smoother when it's set to a higher refresh. The mouse even seems quicker. It's not just me either, my wife noticed it too (and she didn't know I was messing with the refresh rates...she was bitchin' that I increased the mouse speed to mess with her).

Care to shed some light on what's happening there?

Your monitor is almost certainly running at 75Hz, not 120Hz. There are hardly any LCDs on the market that run at more than 75Hz. It's probably just Windows being stupid and thinking your LCD can do something it cannot. All other programs are likely just reading the setting from Windows, not the real output.

One thing to note about LCDs and refresh rates: setting your refresh rate above 60Hz sometimes hurts the effectiveness of overdrive, which in turn can make your response times slower and increase ghosting. The best setting for an LCD is 60Hz, even if it can do 75.
 
DarkBahamut said:
Your monitor is almost certainly running at 75Hz, not 120Hz. There are hardly any LCDs on the market that run at more than 75Hz. It's probably just Windows being stupid and thinking your LCD can do something it cannot. All other programs are likely just reading the setting from Windows, not the real output.

One thing to note about LCDs and refresh rates: setting your refresh rate above 60Hz sometimes hurts the effectiveness of overdrive, which in turn can make your response times slower and increase ghosting. The best setting for an LCD is 60Hz, even if it can do 75.


Dunno how well you read, but you apparently missed this statement:
However, the LCD panel's info shows 75Hz

If you're talking about overdrive on the video card, I couldn't care less if the card uses overdrive. Everything I play (whenever I have time to play, that is) runs butter smooth at 60hz with vsync on, and even better at 120 or 160hz.

If you're talking about something called overdrive in the LCD panel, then I've never heard of it, and even though the PC is set at 120 or 160hz and the LCD is at 75hz, I've never seen ghosting or blurring or any other visual anomalies.
 
nst6563 said:
Am I the only one here that has tried running an LCD at 100Hz?

If I don't install the driver .inf's for my lcd (Samsung 940b using DVI) and just use windows to set the refresh, then I can select all the way up to 160. Every utility I've used to check the refresh rate reports back at whatever I've set it at (I'm running 120hz now). However, the LCD panel's info shows 75Hz. The confusing part is that everything seems to run smoother when it's set to a higher refresh. The mouse even seems quicker. It's not just me either, my wife noticed it too (and she didn't know I was messing with the refresh rates...she was bitchin' that I increased the mouse speed to mess with her).

Care to shed some light on what's happening there?

I tried that on my Dell 2405FPW. It does not seem to make a difference at all. Maybe I should uninstall the driver .inf's and try it again.
 
It's only made a difference on my machine without the Samsung drivers. As soon as I install the Samsung drivers, the highest Windows will let me set the refresh rate is 75hz. Take away the drivers and use the Windows default, and I can uncheck the "hide unsupported modes" box and select all the way up to 160hz. Obviously the LCD itself only runs at 75hz, but you can definitely tell a difference with the system when it's set to anything above 100hz.
 
So the analog circuit inside the monitor downsamples the refresh rate? That may not be a good thing at all.
 
BBA said:
So the analog circuit inside the monitor is downsamples the refresh rate, may not be a good thing at all.

Uhhhh... not exactly. It's the monitor driver .inf, or something like that, which sets the refresh rate. If you REALLY force the video card to output, say, a 90 or 100 Hz refresh rate, it just will not display on the LCD TFT panel.

But I am not 100% sure. I do think I am more than 50% sure, though. You might be right. Anybody out there know for sure??
 
That was a good read. I have to keep vsync going, and I need the video upgrade. When you get out of sync on an HDTV and that sort of thing, it starts to tear REALLY badly, so vsync always needs to be on.
 