Why don't people use vsync?

posted it here since it's more of a technical issue than a general gaming one.

as the topic says... why does it seem that most people leave V-sync off on a powerful machine that gets over 60 FPS most of the time? is page-tearing not an issue with LCDs?

if page-tearing still occurs, why would anyone care about 70, 80, even 130 FPS in any game when half the screen tears b/c the display can't keep up with the GFX card? i know v-sync generally lowers FPS besides "capping" it at your refresh rate (60 for almost all LCDs), but if you're getting a steady 80+ fps you can run v-sync at 60 without issues and not get page tearing.

am i missing something, or are people so e-peen-oriented that they need to claim 100+ FPS on all their e-peen forums?

i keep vsync on in any game where my system can support it without dropping much below 45-50 fps. i get a steady 60 in almost every game i regularly play, and there's no benefit whatsoever between 60 and a million FPS even if page-tearing didn't occur.

for games where i can't run v-sync without dipping below acceptable frame rates, i disable it and pick up an extra 5-10 fps, but since the frame rate is probably still low enough that tearing isn't obvious, it's a moot point.

thoughts?
 
Because running the video stress test in CSS and getting 60 is lame. I've never had a problem with page tearing even getting well over 100 fps, so I just don't use it.
 
People just want to see how fast their video card can render their games. Why get a good vid card if you have to limit the hp? I don't notice the tearing and I just got used to it. FEAR plays smooth at 15-200 fps. Who cares? You're too busy trying to kill others in MP mode.
 
I disable vsync in CS:S only because double-buffering incurs a slight input latency hit. That's the only time it's ever disabled.
 
There's no benefit to getting over 60? You clearly don't play fps games very often, or you might not be that great at them... Not a personal attack on you, but when you're used to getting 150+ and then you cap yourself at 60, you notice a lot of the finer details that disappear due to fewer frames.
 
Heh. Explain that ;)

Counterstrike Source: the rate at which the flashbang wears off, precision in awping, player movement. I suppose they're things you wouldn't really notice unless you were really into the game, but it all makes a difference to me.

I don't want to get into other games since this is what I spend most of my time on, but the same could be said for, say, UT3. By default the FPS is capped to around 60, but if you edit the config, you can set it higher and the game feels a lot smoother and more fluid.
 
I don't want to get into other games since this is what I spend most of my time on, but the same could be said for, say, UT3. By default the FPS is capped to around 60, but if you edit the config, you can set it higher and the game feels a lot smoother and more fluid.

Doesn't the refresh rate of the monitor "cap" the framerate to 60 anyway?
 
Well, I don't think the flashbang effect wears off any faster at 30fps than it does at 200fps. The rate at which the effect wears off is probably tied to the engine's internal tick rate, not the frame rate.

Neither weapon firing nor player movement is tied to the frame rate in Source engine games; both are fixed to the game's tick rate.
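
if you've never seen how that separation works, here's a minimal fixed-timestep loop sketch (purely illustrative - the 66 ticks/sec figure and the names are made up for the example, this isn't Source code):

Code:

# sketch of a fixed-timestep game loop: the simulation advances at a fixed
# tick rate no matter how fast or slow rendering runs
import time

TICK_RATE = 66            # ticks per second (illustrative value only)
TICK_DT = 1.0 / TICK_RATE

def run(duration=1.0):
    accumulator = 0.0
    ticks = frames = 0
    start = prev = time.perf_counter()
    while time.perf_counter() - start < duration:
        now = time.perf_counter()
        accumulator += now - prev
        prev = now
        # advance game state in fixed steps: flashbang timers, weapon
        # cooldowns, movement - all tied to ticks, not frames
        while accumulator >= TICK_DT:
            ticks += 1
            accumulator -= TICK_DT
        frames += 1       # render once per loop pass, as fast as possible
    print(f"ticks: {ticks}, frames: {frames}")

run()   # ticks stays ~66/sec whether frames is 60 or 600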
 
There's no benefit to getting over 60? You clearly don't play fps games very often, or you might not be that great at them... Not a personal attack on you, but when you're used to getting 150+ and then you cap yourself at 60, you notice a lot of the finer details that disappear due to fewer frames.
What finer details? A screen CANNOT display more frames than its refresh rate allows. 60 Hz = 60 fps.

Let me explain it this way:
VSync off, Fps:
100, 100, 100, 90, 80, 80, 85, 50, 40, 35, 35, 40, 90, 100

VSync on, 60 Hz, Fps:
60, 60, 60, 60, 60, 60, 60, 50, 40, 35, 35, 40, 60, 60

Anything over 60 Fps on a 60 Hz screen is simply not visible. Therefore, you will not SEE any difference between VSync on or off when the frame rate is above 60. I keep mine off because I like to see my card's actual performance.
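
You can sanity-check those two lists in a couple of lines - this is the naive "cap" view, and it deliberately ignores the half-refresh quantization that comes up later in the thread:

Code:

REFRESH = 60   # Hz

rendered  = [100, 100, 100, 90, 80, 80, 85, 50, 40, 35, 35, 40, 90, 100]
displayed = [min(fps, REFRESH) for fps in rendered]
print(displayed)
# [60, 60, 60, 60, 60, 60, 60, 50, 40, 35, 35, 40, 60, 60]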

However, as phide pointed out, there may be some instances where it is beneficial to turn it on or off. Search "vsync input lag" and you'll find a lot of information on the subject.
 
I love vsync. I see a huge difference when it is off, as I get a ton of page tearing during the game and cutscenes. It drives me nuts. So for me, the point is to have a great vid card that allows me to play with vsync on at acceptable frame rates (30-60).

When I benchmark vid cards I usually do it with vsync on.

But apparently many people do not notice the page tearing. To each his/her own, eh.
 
Screen tearing can be very annoying, but I don't force V-sync in CCC; I enable it on a game-by-game basis. Prey can seem almost unplayable with v-sync off.
 
I meant that the movement and such is smoother, letting you react faster. Of course the flashbang is still going to wear off at the same rate, but you're going to get more frames of it going away, meaning you can see again sooner, whereas with, say, 30 frames, it's going to be choppier and you can't react properly. Same goes for my statement on player movement: I know they're not moving faster, but they're moving more smoothly, as is your aim, which enhances your own gameplay.
 
I use it all the time, you just can't beat the smooth, rock-steady image you get with it, makes things look so much more solid. I really can't stand the horrible rippling/tearing and general roughness of the screen without it.
 
I use it all the time, you just can't beat the smooth, rock-steady image you get with it, makes things look so much more solid. I really can't stand the horrible rippling/tearing and general roughness of the screen without it.
that's how i feel.

also, i play primarily FPS games. i started with the original Unreal and have been playing FPS games primarily for 10 years. i played CS 1.1 through 1.6 semi-competitively and now play TF2 competitively. extra FPS don't add anything to the game, although i was unaware of the input lag issue.

edit: upon some quick research i've found 2 important things regarding vsync and input lag:
1 - the input lag is incredibly noticeable when it occurs.
2 - enabling Triple Buffering greatly reduces the lag in most games, to the point where it's no more noticeable than having vsync off.

there's an option in RivaTuner's D3D options panel (in the V-Sync tab) to set the pre-render limit. this should be set to 3. higher numbers may increase performance but can also increase input lag. 3 is required for Triple Buffering and is recommended by most people with an opinion on the subject. the default is generally double buffering, or 2 pre-rendered frames. Triple Buffering gives better performance, and on many systems in many games it's the best option without any noticeable added input lag. some games may require Double Buffering, though.

looking back, i think this may be where my weird Bioshock input lag came from. from everything i've read, it's safe and better to leave vsync on unless there's an obvious 200ms+ mouse delay.
 
I meant that the movement and such is smoother, letting you react faster. Of course the flashbang is still going to wear off at the same rate, but you're going to get more frames of it going away, meaning you can see again sooner, whereas with, say, 30 frames, it's going to be choppier and you can't react properly. Same goes for my statement on player movement: I know they're not moving faster, but they're moving more smoothly, as is your aim, which enhances your own gameplay.

You really won't. Once you exceed your monitor's refresh rate it turns purely into a placebo effect (aside from vsync input lag). Let's say you get 120 FPS in CSS but your refresh rate is 60 - it just means that 60 of those frames are being thrown away or only partially drawn (and you'd get the same 60 frames if vsync were on). You cannot exceed 60 FPS because your monitor can't. You also can't compare over-60FPS to 30FPS - less than 60fps isn't the point here.
 
I first started using v-sync about 2 years ago and I've never looked back. It's been with me through about 10 major FPS shooters and I've never once thought about not using it (except of course on those rare occasions when I'm testing frames on a timedemo and such). Don't get me wrong - back when I didn't really understand what V-sync did, I got used to the screen tearing in shooters, but man oh man, once I saw the difference there was no contest. I guess it all depends on your preferences. I'm a big graphics whore and I demand that my games "look" as pretty as possible. Unless you're only moving in a straight line and don't do much looking from side to side, FPS games just don't look very good on LCDs with lots of screen tearing. I don't know about that guy who "sees" much more detail with v-sync off, but to me, aesthetically, games look better with it on.
 
aesthetically, yes, because it solves tearing, but for some odd reason some games develop motion lag... especially when the frame rate falls below 60fps and drops to 30fps... so i only turn it on if i can stay above 60 all the time.
 
i run with vsync on because i don't see a reason for the system to output images that won't get used....
 
aesthetically, yes, because it solves tearing, but for some odd reason some games develop motion lag... especially when the frame rate falls below 60fps and drops to 30fps... so i only turn it on if i can stay above 60 all the time.

well said. the 30fps drop is very noticeable and lame. but anyone who is really worried about a 60fps ceiling needs to go outside and throw a football around or something.. i'm sure some day 120Hz monitors will be the norm, so until then just don't worry about it.
 
I use V-Sync. But then again, I'm not really competitive (as in I don't play with other people [not online anyways]). I find the whole competitive nature of online gaming a kill-joy.
 
Always off. Maybe I don't notice it and would rather not have the input lag or the drop in frames. Or maybe my Trinitron CRT doesn't get much tearing because I have the refresh at 75.
 
I always turn it on. That way my machine does less work, uses less power, creates less noise, and, more importantly, the CPU is able to do more background tasks. Due to the lower CPU usage, things like video encoding still get done. With Vsync off, video encoding typically gets 1% of my CPU resources whilst playing a game, as the game hogs 90-99% of my CPU in a vain effort to produce as many FPS as possible.

It was great when EverQuest got a frame rate cap slider; it meant that if you were just trading/sitting in PoK/guild halls etc., you could tell it to cap you at 5fps and use the rest for your other programs.
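
a frame-rate cap really doesn't have to do much more than this (rough sketch, obviously not EverQuest's actual code):

Code:

import time

def capped_loop(max_fps, render_frame, duration=2.0):
    # render at most max_fps frames per second, sleeping off the slack
    budget = 1.0 / max_fps
    start = time.perf_counter()
    while time.perf_counter() - start < duration:
        frame_start = time.perf_counter()
        render_frame()
        slack = budget - (time.perf_counter() - frame_start)
        if slack > 0:
            time.sleep(slack)   # idle time the OS can give to encoding etc.

capped_loop(5, lambda: None)    # e.g. parked in the guild hall at 5fps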
 
aesthetically, yes, because it solves tearing, but for some odd reason some games develop motion lag... especially when the frame rate falls below 60fps and drops to 30fps... so i only turn it on if i can stay above 60 all the time.

Triple buffering solves that problem :) (I forced VSync + Triple Buffering globally in the nvidia control panel and haven't looked back)
 
I've always left V-Sync on. Tearing in games is one of the most annoying things ever for me.
 
I think it depends. If you primarily play online twitchfest games, then yeah, V-Sync is anathema to everything you stand for. 90fps helps more than 60fps in those kinds of games.

But if you're playing slower-paced online games (like MMOs) or are playing singleplayer, then why wouldn't you enable V-Sync? You don't need the extra frames after 60FPS, because nothing's riding on it.

In Team Fortress 2, I have V-Sync disabled. In the various HL2 games, I have V-Sync enabled. I'm happy.
 
If you are using an LCD monitor, what benefit can there be from generating more than 60 fps when your monitor is limited to 60 fps? LCD users should always use Vsync and also force triple-buffering (to reduce the input lag) in the video card driver control panel. Most games don't seem to include triple-buffering in their display options, so if you don't know about this, you may have tried vsync without it.

For those who don't know what tearing is, run a game which can generate lots of FPS, then simply spin in place. The effect of the lower half of the screen not keeping up with the upper half should be very apparent. There's no reason to put up with that with vsync and triple-buffering on.

If you are using a CRT (as I gather competitive gamers often do), then your monitor is not capped to 60 fps, so that changes the whole discussion. But unless you are earning a living based on whether 100 fps will get you more headshots, I think it's long past time to get an LCD monitor - they really have caught up to CRTs in image quality.
 
If you are using a CRT (as I gather competitive gamers often do), then your monitor is not capped to 60 fps, so that changes the whole discussion. But unless you are earning a living based on whether 100 fps will get you more headshots, I think it's long past time to get an LCD monitor - they really have caught up to CRTs in image quality.

VSync caps it at the refresh rate, which isn't necessarily 60hz. If your CRT is set to a 75hz refresh rate, then VSync caps it at 75fps. LCD or CRT - it doesn't matter, VSync + Triple Buffering is win-win.
 
Well, in Call of Duty games there are specific jumps that can only be achieved at certain FPS or above, so that's one reason I know of.
 
For CS 1.6 (GO OLDIES!), I turn off vsync because if I have vsync on and I move my mouse from left to right, it's so smooth that my movement on the screen feels slower than my reaction time. With vsync off, as soon as I move my mouse from left to right, the screen lands where I want it, no delays whatsoever. I can put up with the tearing as long as matches are won in events.

For FPS games it's off for me... but for RTS or anything with more laid-back mouse movement (IMHO), I leave it on.
 
I used Vsync in Crysis and it helped. The blur wasn't enough to cover up the tearing.
 
Mine can do 75hz

It's not really doing 75Hz. Most (all?) LCD monitors run at 60Hz regardless of whether they say 70, 72, or 75Hz. At 75Hz they skip every fifth frame, and your OSD (if you have one) will probably still say 60Hz. All the software thinks it's running at 75Hz, but the hardware still only draws 60 frames per second.
 
I always turn vsync on. I have a real problem with tearing. It seems to annoy me more with LCDs than it did in the CRT days.
 
From TweakGuides.com on VSync, FPS, and Triple Buffering:

FPS & VSync

When VSync is disabled, your FPS and refresh rate have no relationship to each other as such. This lets your graphics card work as fast as it wants, sending frames to the monitor as fast as it can draw them. Whether the monitor can actually show all these frames properly or not is another matter, which we've already discussed above. Clearly if disabling VSync can cause graphical glitches, however minor they may be, wouldn't it make sense to always enable VSync so that your graphics card doesn't wind up wasting its efforts only to generate more tearing? Well once again, things are not as simple as that.

When VSync is enabled, what happens is that your graphics card is told to wait for your monitor to signal when it's ready for a new frame before supplying a single whole frame, each and every time. It can't race ahead, it can't just pump out lots of partially completed frames over old ones whenever it's ready - it has to provide a single whole frame to the monitor whenever the monitor says it's ready to refresh itself during VBI. The first noticeable impact is that your FPS becomes capped at a maximum equal to your current refresh rate. So if your refresh rate is 60Hz for example, your framerate can now only reach a maximum of 60FPS. By itself this isn't really a problem, since every monitor can do at least a 60Hz refresh rate at any resolution, and as we've discussed under the Frames Per Second section, if your system can produce 60FPS consistently in a game this should be more than enough FPS to provide smooth natural motion for virtually any type of game.

There is however a more fundamental problem with enabling VSync, and that is it can significantly reduce your overall framerate, often dropping your FPS to exactly 50% of the refresh rate. This is a difficult concept to explain, but it just has to do with timing. As we know, when VSync is enabled, your graphics card pretty much becomes a slave to your monitor. If at any time your FPS falls just below your refresh rate, each frame starts taking your graphics card longer to draw than the time it takes for your monitor to refresh itself. So every 2nd refresh, your graphics card just misses completing a new whole frame in time. This means that both its primary and secondary frame buffers are filled, it has nowhere to put any new information, so it has to sit idle and wait for the next refresh to come around before it can unload its recently completed frame, and start work on a new one in the newly cleared secondary buffer. This results in exactly half the framerate of the refresh rate whenever your FPS falls below the refresh rate.

As long as your graphics card can always render a frame faster than your monitor can refresh itself, enabling VSync will not reduce your average framerate. All that will happen is that your FPS will be capped to a maximum equivalent to the refresh rate. But since most monitors refresh at 60Hz or above, and in most recent games it is difficult to achieve 60FPS consistently at your desired resolution and settings, enabling VSync usually ends up reducing your FPS. Fortunately, because this problem is pretty much caused by the frame buffers becoming filled up, there is a solution and that's to enable a third frame buffer to allow more headroom. However this is not a straightforward solution, and to read more about this see the Triple Buffering section.
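
(Not part of the guide, but that timing argument is easy to check with a quick sketch. Assuming plain double buffering on a 60Hz screen, a finished frame can only flip on a vblank, so the display interval rounds up to a whole number of refreshes:)

Code:

import math

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ

def displayed_fps(render_fps):
    render_time = 1.0 / render_fps
    # refreshes the monitor gets through before the next frame can flip
    intervals = math.ceil(render_time / REFRESH_INTERVAL)
    return REFRESH_HZ / intervals

for fps in (100, 60, 59, 45, 31, 30, 29):
    print(f"card renders {fps:3d} fps -> screen shows {displayed_fps(fps):.0f} fps")
# 100 -> 60, 60 -> 60, 59 -> 30, 45 -> 30, 31 -> 30, 30 -> 30, 29 -> 20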

So Which is Best, VSync On or Off?

VSync poses a real dilemma for many people: with VSync off, tearing can occur whenever your graphics card and monitor go out of sync, and this can be very annoying for some people, especially in fast motion games. However with VSync on, your FPS can often fall by up to 50%. This can be resolved on many systems using Triple Buffering, but that also brings with it a range of possible problems. So which choice is right for you?

Well clearly I can't give you a one-size-fits-all answer, but I can provide some suggestions. To start with, I strongly recommend setting VSync to 'Application Preference' (or similar) in your graphics card's control panel. This is because ideally you should set your VSync preference on a game-by-game basis, preferably using the in-game settings, as the choice will differ depending on the type of game you are playing. Newer games with complex graphics for example will be different to older games which your system can run much more easily. Remember, in games where your FPS is consistently above your refresh rate, enabling VSync is perfectly fine and results in no drop in FPS.

In general, I recommend starting off with VSync disabled in any game as this is the most trouble-free method of gaining the fastest possible performance. This is the simplest solution, and on monitors which have lower refresh rates, or for games in which your framerate is not very high, this appears to be the best solution. You may notice some tearing, but this will generally be minimal if your FPS remains below your refresh rate anyway. Remember though that even if your FPS matches your refresh rate exactly, or is even below it, whenever VSync is disabled the graphics card and monitor are not strictly in sync, and tearing (however minor) can occur at any time.

In any game if you find tearing annoying, you should enable VSync. If you find your FPS has halved, you should then specifically try enabling Triple Buffering, as this can help fix the FPS drops related to enabling VSync, but it introduces the possibility of hitching on graphics cards with less VRAM, and possible control lag on some systems. See the Triple Buffering section for details.

There is no clear choice for everyone when it comes to VSync, and this is why the option to enable or disable VSync exists both in the graphics card control panel and in games. As long as you understand what it does however, you can make an educated choice to suit your hardware and tastes.

Triple Buffering

In the Graphics Process section of this guide under Step 8, an overview is provided of the way in which the graphics card holds rendered frames in the Frame Buffer. There are actually two buffers on modern graphics cards, the Primary Buffer and the Secondary Buffer, also often called the Front Buffer and the Back Buffer. Both are storage areas on the Video RAM of the graphics card, and the process of using two buffers at one time is called Double Buffering. It was only relatively recently that graphics cards had enough VRAM to provide two buffers at all resolutions, since a single frame of high resolution detailed graphics can take up a great deal of video memory, much less two of them.

The graphics card uses the secondary buffer to compose a new frame while the primary buffer is sending an existing completed frame to the monitor. When these tasks are done, the buffers are essentially 'flipped' around so that the recently completed frame in the secondary buffer now becomes the primary buffer ready to send to the monitor, while a new frame begins composing in what was the primary buffer a moment ago. This is repeated over and over and thus the use of two buffers means that the graphics card is not constantly waiting for a single frame buffer to be cleared before getting on with rendering more frames to store there. It's like putting out a fire using two buckets of water instead of just one - one bucket can be filled with water while the contents of the other is being thrown on the fire, and then they're switched and the process repeated; much faster than just using a single bucket.

There is still a problem with double buffering, and that is when VSync is enabled, the graphics card can often fill both buffers and then have to stop working on any new frames until the monitor indicates it is ready for a new frame for its next refresh. Only then can the graphics card clear the primary buffer, switch buffers and begin rendering the next frame in the secondary buffer. This waiting is what causes a drop in FPS when VSync is enabled on many systems, and is covered in more detail in the Vertical Synchronization section.

Wouldn't it then make sense to have more than two buffers? Why not three buffers for example - that would give the graphics card more room to render frames without having to worry about where to store them before they're sent to the monitor, even if VSync is enabled. Well there is an option which does just that, called Triple Buffering. And it generally does precisely what the name implies, it creates a third buffer in the VRAM, which we can call the Tertiary buffer.
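
(Again, not from the guide, but here's a toy simulation of the bucket analogy: a card that renders a frame every 1/45th of a second driving a 60Hz screen, with one back buffer versus two. Exact fractions are used so ties between frame completions and vblanks don't wobble:)

Code:

from fractions import Fraction as F

REFRESH = 60                       # Hz
VBLANK = F(1, REFRESH)

def avg_fps_on_screen(render_fps, back_buffers, seconds=10):
    # new frames shown per second when flips can only happen on a vblank
    render_time = F(1, render_fps)
    queued = 0                     # finished frames waiting in back buffers
    busy_until = render_time       # when the frame being drawn will be done
    stalled = False                # GPU blocked because every buffer is full
    shown = 0
    for i in range(1, REFRESH * seconds + 1):
        t = i * VBLANK             # time of this vblank (exact fraction)
        # finish frames while there's time and a free buffer to draw into
        while not stalled and busy_until <= t:
            queued += 1
            if queued < back_buffers:
                busy_until += render_time   # start the next frame at once
            else:
                stalled = True              # nowhere to draw - GPU waits
        if queued:                          # the flip
            queued -= 1
            shown += 1
            if stalled:                     # a buffer just freed up
                stalled = False
                busy_until = t + render_time
    return shown / seconds

print("double buffering (1 back buffer):", avg_fps_on_screen(45, 1), "fps")
print("triple buffering (2 back buffers):", avg_fps_on_screen(45, 2), "fps")
# double buffering (1 back buffer): 30.0 fps
# triple buffering (2 back buffers): 45.0 fps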

Problems with Triple Buffering

It may seem odd that if Triple Buffering resolves the problem of low framerates when VSync is enabled, it doesn't appear as a standard option in many games, or in the Direct3D-related settings of the Nvidia or ATI graphics control panel. There are three main concerns that appear to be the reason behind this:

1. If it is not properly supported by the game in question, it can cause visual glitches. Just as tearing is a visual glitch caused by information being transferred too fast in the buffers for the monitor to keep up, so too in theory, can triple buffering cause visual anomalies, due to game timing issues for example.

2. It uses additional Video RAM, and hence can result in problems for those with less VRAM onboard their graphics card. This is particularly true for people who also want to use very high resolutions with high quality textures and additional effects like Antialiasing and Anisotropic Filtering, since this takes up even more VRAM for each frame. Enabling Triple Buffering on a card without sufficient VRAM results in things like additional hitching (slight pauses) when new textures are being swapped into and out of VRAM as you move into new areas of a game. You may even get an overall performance drop due to the extra processing on the graphics card for the extra Tertiary buffer.

3. It can introduce control lag. This manifests itself as a noticeable lag between when you issue a command to your PC and the effects of it being shown on screen. This may be primarily due to the nature of VSync itself and/or some systems being low on Video RAM due to the extra memory overhead of Triple Buffering.

However it appears that most recent graphics cards and most new games will not experience major problems by enabling Triple Buffering. Given the fact that it can help to both remove tearing while also preventing the significant FPS drop encountered when VSync is enabled, it is at least worth trying for yourself to see the results on your system.

By the way, I'm not aware of any compelling reason why ATI or Nvidia don't build Triple Buffering for Direct3D into their standard graphics control panels as an option. For the moment, if you wish to use Triple Buffering in D3D games you will have to use third-party utilities like DXTweaker, RivaTuner or ATI Tray Tools until it is incorporated into more games and finally into the graphics driver control panels.
 
Vsync is fine as long as you can guarantee your framerate remains above 60fps.

If your framerate drops below 60 fps (say for example, 45fps), it causes jitter because your video card alternates between 30 and 60 fps.

Basically, you see the following frame progression (this is a fixed-size font, so they should all line up at the same framerates):
Code:

NO VSYNC (60fps CONSTANT)
[1][2][3][4][5][6][7]

VSYNC (60fps CONSTANT)
[1][2][3][4][5][6][7]

NO VSYNC (45fps CONSTANT)
[1 ][2 ][3 ][4 ][5 ][6 ][7 ]

VSYNC (45fps CONSTANT)
[1][2  ][3][4  ][5][6  ][7]

See the annoyingly long gaps when the 45fps VSYNC ON case renders at 30fps

Hopefully the above makes it clear: the systems running at 60fps with and without VSYNC show no difference, but the system running at a constant 45fps shows much more consistent results with VSYNC off. With VSYNC ON, the system alternates between 30 and 60fps, creating annoying jitter.

And the above only applies if your framerate is constant. If it dips below 30fps, you will see your framerate drop to 1/3 of 60, or 20fps.

The reason people would rather run without VSYNC is that you need to double the average framerate to get playability with VSYNC on. Basically, if you shoot for 60fps (average, not constant like above), it's playable without VSYNC, but with VSYNC the video will stutter. The reason the video stutters is that half the time the framerate is below 60fps, and it renders as if it were at 30fps, just as in the example above.

Thus, you would have to get in the realm of 120fps average before VSYNC would be silky-smooth, double the framerate you need without it.
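
The alternating gaps in the diagram are easy to reproduce. Assuming the card can queue frames ahead and finishes one every 1/45th of a second, each frame appears at the first vblank after it's done (the exact short/long rhythm depends on the ratio, but the mix of "60fps" gaps and "30fps" gaps is the jitter):

Code:

from fractions import Fraction as F

VBLANK = F(1, 60)     # 60Hz refresh
RENDER = F(1, 45)     # the card finishes a frame every 1/45 s

# each frame is shown at the first vblank at or after it's finished
shown = [-(-(k * RENDER) // VBLANK) * VBLANK for k in range(1, 10)]
gaps_ms = [float(b - a) * 1000 for a, b in zip(shown, shown[1:])]
print(" ".join(f"{g:.1f}" for g in gaps_ms))
# 16.7 16.7 33.3 16.7 16.7 33.3 16.7 16.7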
 
that's how i feel.

also, i play primarily FPS games. i started with the original Unreal and have been playing FPS games primarily for 10 years. i played CS 1.1 through 1.6 semi-competitively and now play TF2 competitively. extra FPS don't add anything to the game

What does semi-competitively mean? The difference between fps_max 100 and 60 is quite noticeable with any type of snap aiming, and I don't know of a single serious player who would back up your statement. fps_max 100, 100Hz or higher (CRT only for competitive gaming), and Vsync off. The main reason for disabling Vsync was the FPS hit. Even on the super old HL engine, smokes caused fps problems 3+ years ago.
 
Vsync is fine as long as you can guarantee your framerate remains above 60fps.

If your framerate drops below 60 fps (say for example, 45fps), it causes jitter because your video card alternates between 30 and 60 fps. [...]

Thus, you would have to get in the realm of 120fps average before VSYNC would be silky-smooth, double the framerate you need without it.

Or you could read the thread and learn that triple buffering solves the problem you are describing. You can still get 45 FPS and have VSync enabled.
 
Or you could read the thread and learn that triple buffering solves the problem you are describing. You can still get 45 FPS and have VSync enabled.

No, you can't - not a constant 45fps. The buffer will eventually empty if the video chip is supplying frames at 45fps and the screen is eating them at 60fps; at that point, the triple buffer becomes a double buffer and has the same problems.

What a triple buffer CAN do is give you a fairly smooth experience if you have, say, 60fps AVERAGE. The buffer will drain to smooth out temporary dips in framerate, and will refill when the framerate spikes back above 60fps.

HOWEVER: triple buffering takes 50% more framebuffer memory and doubles your video lag. In other words, there is no easy solution to the problem; they all have their drawbacks.
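
For the memory point, back-of-the-envelope at an illustrative 1600x1200 with 32-bit color (ignoring AA and depth buffers):

Code:

width, height, bytes_per_px = 1600, 1200, 4   # illustrative, 32-bit color
buf = width * height * bytes_per_px
for n, name in ((2, "double"), (3, "triple")):
    print(f"{name} buffering: {n * buf / 2**20:.1f} MiB of framebuffer")
# double buffering: 14.6 MiB of framebuffer
# triple buffering: 22.0 MiB of framebuffer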

Which one drives you crazier? Input lag, or tearing? For me, it's definitely input lag.

NO VSYNC: best performance, has tearing.

VSYNC: no tearing, but requires MINIMUM framerate of the monitor refresh rate to prevent jitter.

VSYNC WITH TRIPLE BUFFER: no tearing; the AVERAGE framerate required for smooth gameplay is the same as the monitor refresh. HOWEVER, with an AVERAGE framerate lower than the monitor refresh, you get jitter. Also takes more memory and adds input lag.
 