vsync on or off?

With three buffers, there is no contention, and rendering is completely asynchronous; there is no delay, and there is no problem.
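The claim above is easy to check in a toy model. The sketch below uses invented numbers (60 Hz display, GPU that renders a frame in 1/90 s) and models the classic page-flip style of triple buffering, where the renderer overwrites the stale back buffer instead of waiting; it is not how any real driver implements it:

```python
# Toy model of double vs. triple buffering with vsync (hypothetical numbers,
# not a real graphics API). With one back buffer the renderer must wait for
# the vblank to free it; with two it can always overwrite the stale frame.

REFRESH = 1 / 60   # 60 Hz display
RENDER = 1 / 90    # the GPU can produce a frame every 1/90 s

def renderer_blocked_time(back_buffers, sim_time=1.0):
    """Seconds (out of sim_time) the renderer spends waiting for a buffer."""
    t = 0.0
    next_vblank = REFRESH
    completed = False          # is a finished, undisplayed frame waiting?
    blocked = 0.0
    while t < sim_time:
        if back_buffers == 1 and completed:
            # double buffering: the only back buffer is occupied, so wait
            blocked += next_vblank - t
            t = next_vblank
        t += RENDER            # draw the next frame
        # vblanks that pass while drawing display whatever frame was waiting
        while next_vblank <= t:
            completed = False
            next_vblank += REFRESH
        completed = True       # the new frame replaces the stale one, if any
    return blocked

print(f"double buffering: blocked {renderer_blocked_time(1):.2f} s out of 1 s")
print(f"triple buffering: blocked {renderer_blocked_time(2):.2f} s out of 1 s")
```

With one back buffer the renderer spends roughly a third of each second stalled; with two it never stalls, which is the "completely asynchronous" behavior described above.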

You're correct. With 2 backbuffers you could make it asynchronous. I don't know where I got the idea that it was using two frontbuffers for the swap instead of two backbuffers.
 
You may not notice it, but tearing can occur at ANY framerate, so there's nothing magical about 120 Hz in that regard. Heck, gunfire, flickering lights, and explosions can all cause screen tearing.

From what I understand, tearing occurs when the framerate output by the video card exceeds what the monitor can handle, causing partial frames to be displayed on the monitor. If your monitor is 120 Hz and your framerate never exceeds that, then you shouldn't experience tearing. Explosions and other full-screen changes should make tearing apparent to your eye if it's present, not actually cause it.
 
If you are going for image quality, Vsync on.
For everything else, Vsync off.

I also find it funny how people are just rambling random comments with no clue what they are talking about. One guy says Vsync is mostly a CRT problem, another says it's mostly just noticeable on LCDs. Clearly someone is not right.

For me personally, I don't care about tearing; having the smoothest, most recent picture is the most important thing to me, as I can aim much better. So I keep Vsync off at all times.
 
From what I understand, tearing occurs when the framerate output by the video card exceeds what the monitor can handle, causing partial frames to be displayed on the monitor. If your monitor is 120 Hz and your framerate never exceeds that, then you shouldn't experience tearing. Explosions and other full-screen changes should make tearing apparent to your eye if it's present, not actually cause it.
100% incorrect, and all you have to do is look at a game to see this. Yes, technically there should be MORE tearing when you exceed your refresh rate, but again, it can happen at ANY framerate. I can't even average 30 fps in Metro 2033 on very high settings, much less 60 fps, but the game tears insanely badly. And yes, flickering lights, gunfire, explosions, and other stuff from within the game can make the screen tear really badly no matter how low your framerate is. This isn't a theory, and again, all you have to do is look at it for yourself. Sometimes I wonder if anybody even pays attention to anything when they play games.
 
sometimes I wonder if anybody even pays attention to anything when they play games.
That's exactly what I was thinking after reading some of this thread.

Vsync can also cause input delay, not just decreased framerate.

A higher framerate also gives you more control input, because the game IS being processed at a higher framerate, whether or not you can see all of the frames. A great example of this is vanilla Quake 3, in which the maximum speed is reached by following an extremely exact curve of velocity augmentation with the mouse. Doing this to an engine-perfect extent requires 125 fps, much higher than the refresh rate of almost all of the monitors currently being sold.
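The framerate-dependence described above can be reproduced with a generic per-frame integrator. This is purely an illustrative sketch: the constants and the Euler stepping are invented for the example and are not Quake 3's actual movement code. The point is that when physics is stepped once per rendered frame, the outcome depends on the tick rate:

```python
# Illustrative only: per-frame Euler integration of a jump. The constants
# (jump_velocity, gravity) are made up, not Quake 3's real values. The point
# is that the simulated peak height depends on the framerate the physics
# happens to run at.

def jump_peak(fps, jump_velocity=270.0, gravity=800.0):
    """Peak height of a jump when physics is stepped once per frame."""
    dt = 1.0 / fps
    v, y, peak = jump_velocity, 0.0, 0.0
    while y >= 0.0:
        y += v * dt          # move first, then apply gravity: per-step order matters
        v -= gravity * dt
        peak = max(peak, y)
    return peak

for fps in (60, 125, 333):
    print(f"{fps:3d} fps -> peak {jump_peak(fps):.2f} units")
```

Each framerate produces a slightly different peak height, which is why engines that tie physics to the render loop reward running at a specific fps even on a 60 Hz monitor.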
 
From what I understand, tearing occurs when the framerate output by the video card exceeds what the monitor can handle, causing partial frames to be displayed on the monitor. If your monitor is 120 Hz and your framerate never exceeds that, then you shouldn't experience tearing. Explosions and other full-screen changes should make tearing apparent to your eye if it's present, not actually cause it.

No, tearing occurs when a new frame is finished in the middle of sending the old frame to the monitor. You can get tearing at 1fps with a refresh rate of 1000000hz. A higher refresh rate just reduces the duration that a tear is on the screen (so in that example the torn frame would only be on screen for 1/1000000th of a second, give or take), it doesn't reduce the chances of tearing.
 
Yeah, all that changes as your framerate increases is the probability of tearing a frame, but it's never zero.

Once it takes less time to render a frame than it takes to send it to the monitor, you're guaranteed a tear on every single frame, and you have a chance of seeing two or more. Note that it doesn't take an entire refresh cycle to send a frame, so this threshold is higher than 60fps on a 60Hz monitor.
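The two posts above can be checked with a quick simulation. This is a rough sketch under invented assumptions (a 60 Hz display that spends 90% of each refresh interval actively scanning out, with buffer flips arriving at a slightly jittered framerate); the numbers are illustrative, not measurements:

```python
import random

REFRESH = 1 / 60            # 60 Hz refresh interval (s)
SCANOUT = 0.9 * REFRESH     # assume 90% of each interval is active scanout, the rest vblank

def torn_fraction(fps, seconds=100, seed=0):
    """Fraction of refresh intervals containing at least one buffer flip
    that lands inside the scanout window (i.e. a visible tear)."""
    rng = random.Random(seed)
    torn = set()
    t = rng.uniform(0.0, 1.0 / fps)          # first flip at a random phase
    while t < seconds:
        if t % REFRESH < SCANOUT:            # flip hit active scanout -> tear
            torn.add(int(t / REFRESH))       # mark this refresh as torn
        # jitter the frame time slightly so flips drift against the refresh
        t += (1.0 / fps) * rng.uniform(0.9, 1.1)
    return len(torn) / (seconds / REFRESH)

for fps in (10, 30, 60, 240):
    print(f"{fps:4d} fps -> {torn_fraction(fps):.0%} of refreshes show a tear")
```

Even at 10 fps a nonzero fraction of refreshes are torn, and once frames arrive faster than the scanout, essentially every refresh contains at least one tear, matching the reasoning above.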
 
I have been using vsync in games for 10 years, but for some reason I am beginning to notice massive input lag in games like Bioshock. I don't know if triple-buffering isn't working properly or something, but it is extremely noticeable.
 
Using D3DOverrider? The one in your control panel is OpenGL-only.

Ah thank you!
So does that actually remove all input lag? I have really been digging the awesome responsiveness of vsync off. I guess I will just download d3dov and see for myself.
 
I generally enable vsync in all my games, as I like my games to be locked at 60 fps. When my fps fluctuates, the game doesn't appear as smooth as a locked 60 fps.
 
I wish I had a CRT that could run 1920x1200 with 100hz refresh rate.
That would be the ultimate for FPS games.
 
lol guys, a lot of you can find such CRTs for $30 at a local surplus store.
 
lol guys, a lot of you can find such CRTs for $30 at a local surplus store.

Explain yourself at once.
Don't toy with my heart this way!

No, but seriously I remember searching ebay and craigslist for a couple months straight and never seeing one in proper working condition within 50 miles of me for under $200. And I live 40 miles from Chicago.
 
I usually have Vsync off unless it's an RTS or RPG. I just can't stand the lag that Vsync introduces.
 
I don't know what to say; you just have to find the places. Over the last 5 years, most big organizations have been trying to ditch CRTs to reduce energy costs. Most of these places had a hundred or more really nice big CRTs, and there is usually some way they get rid of them. For instance, near me there is a university with a surplus store. They literally had pallets of these monitors. I have bought 3 FW900s and a whole bunch of 21" displays, all capable of high resolution and high refresh rates. Many of them were priced at over $1000 new; I never paid more than $80 for any of them. This is holding me over until LCDs or something else can get competitive with CRTs.
 
You may not notice it, but tearing can occur at ANY framerate, so there's nothing magical about 120 Hz in that regard. Heck, gunfire, flickering lights, and explosions can all cause screen tearing.

I have also read that tearing can in fact happen at 120 Hz below 120 fps, but for me at least not to an extent noticeable enough to hinder my gaming experience. From my experience at 60 Hz, the tearing I notice is MUCH worse once I surpass the 60 fps mark. Tearing can happen even before that threshold is passed; however, my eye can't seem to catch it. At 120 Hz it's much, much harder to produce noticeable tearing above 120 fps, and I have found that the tearing below 120 fps is near impossible to notice, especially since my monitors refresh the frames much faster than my eye can see them. BTW, I'm referring to true 120 Hz polling rate monitors like the ones in my sig, not the flimsy tricks implemented on some TVs.

From what I understand, tearing occurs when the framerate output by the video card exceeds what the monitor can handle, causing partial frames to be displayed on the monitor. If your monitor is 120 Hz and your framerate never exceeds that, then you shouldn't experience tearing. Explosions and other full-screen changes should make tearing apparent to your eye if it's present, not actually cause it.

Tearing can occur at any fps; however, the only noticeable tearing I have experienced is in the scenarios you just described. When below 60 fps at 60 Hz, or 120 fps at 120 Hz, I can't remember noticing in-game tearing. If it happens, that's fine, as long as my eye can't catch it. When your refresh rate is faster than your fps, many of the torn frames are refreshed before your eyes can even pick up on them. I guess ignorance is bliss in these scenarios. I would, however, strongly recommend a 120 Hz monitor for anyone who can't stand noticeable in-game tearing, as when it is noticeable, at least by me, it's an experience killer.
 
Lord_Exodia, to be honest, the games I notice the most tearing in are actually the games with the lowest framerates. For example, in Metro 2033 and Just Cause 2 I see tons of tearing even though my framerate is not usually above 30-50 fps in those games. Yet take a game like CSS, and I hardly notice tearing at all, even at 250 fps.

Heck, here is Metro 2033 showing lots of tearing, even a wave-like effect across the screen, and I am only averaging 25-30 fps. The video does not even capture how bad it really looks. http://www.youtube.com/watch?v=RkO8U8r2Tf8
 
I disable vsync immediately for all games. For one, the input lag is goddamned awful, and two, it feels a hell of a lot smoother without it on. Sure, I might not see those extra frames, but IMO I can definitely feel them. There are also the special jumps that can only be done with vsync off in some games.

I can deal with some screen tearing if need be. If anyone knows of a way to fix the above then by all means.
 
Lord_Exodia, to be honest, the games I notice the most tearing in are actually the games with the lowest framerates. For example, in Metro 2033 and Just Cause 2 I see tons of tearing even though my framerate is not usually above 30-50 fps in those games. Yet take a game like CSS, and I hardly notice tearing at all, even at 250 fps.

Heck, here is Metro 2033 showing lots of tearing, even a wave-like effect across the screen, and I am only averaging 25-30 fps. The video does not even capture how bad it really looks. http://www.youtube.com/watch?v=RkO8U8r2Tf8

This is 100% proof that tearing can indeed occur at lower fps, but it's not as bad as the straight vertical tearing that occurs when your fps exceeds 60 at 60 Hz without vsync or triple buffering to help. I notice the tearing in the video, no doubt at all. It seems to be more of a jitter tear when the character turns and the image has a problem syncing with itself. The vsync tear I'm talking about can also be seen in videos; it's more of a vertical or horizontal line straight across the image, and it's even more annoying. However, I'm confident that with my monitors a jitter tear like you're seeing in this video would be much less noticeable, or not noticeable at all. Eventually the frames catch up and the picture re-synchronizes, even in this video. If the monitor refreshed the frames faster, it would re-synchronize faster, and many of the torn frames would be refreshed so fast that your eye would have a much harder time catching them. I played Metro in portrait Eyefinity, and it was one of the first games that I played with my new system, and I could not notice the same jitter and tearing that this video clearly shows. That's my point, though: true 120 Hz polling rate gaming FTW.

My gut tells me that in due time more people will begin to see the light. However, since 120 Hz is mainly associated with Nvidia Surround, too many people dismiss it because they don't know what they're missing. I'm not gonna be shocked when 120 Hz gaming panels are in ridiculously high demand in about a year or so. You should see how quickly 3 of my 4 closest friends sold their IPS panels and got 120 Hz TN panels. Yes, that's strange as hell, but it happened because 120 Hz gaming really does make that much of a difference. Once they saw my rig in action, they understood why I paid over $300 a screen for only 24-inch screens.

BTW, I recently went to a local LAN party which had its own panels there, and I brought my monitors, and people couldn't stop looking at them in game. The fluid motion just really took everyone by surprise. They also loved the small inner bezels underneath my monitors' stock bezels. I seem to have converted quite a few people, and I simply sat there, hooked them up, and gamed. All those gamers came to the conclusion by themselves that it was more fluid and better. They couldn't believe that my panels were TN.
 
There is no concrete answer because some games run smoother with it off and others with it on. It's not just about frame rates but also smoothness that can be ruined by hitching and quite often that is caused by vsync on/off. You have to test both yourself to see which is best.
 
Lord_Exodia, to be honest, the games I notice the most tearing in are actually the games with the lowest framerates. For example, in Metro 2033 and Just Cause 2 I see tons of tearing even though my framerate is not usually above 30-50 fps in those games. Yet take a game like CSS, and I hardly notice tearing at all, even at 250 fps.
Yeah, that's the other side of it... Although the chance of a tear is less at a lower framerate, it tends to be more severe, as the time between frames is higher. Nothing much happens in 1/250th of a second, so even though you're probably seeing four partial frames on every refresh, they're going to be similar enough to the adjacent one that it's hard to pick the transitions.
 