vsync on or off?

I've always had it off. Tearing doesn't bother me as much as a dip from 60 fps to 30-15-7.5 fps does.

Edit: doh! Didn't know I was feeding into a resurrected thread. Trololol
 
Haha! I clicked on this thread and saw I was the OP 5 years ago. Too funny.
 
Llama[Style] said:
I've always had it off. Tearing doesn't bother me as much as a dip from 60 fps to 30-15-7.5 fps does.

FYI, triple buffering fixes that.
 
FYI, triple buffering fixes that.

Not totally, but it does HELP. Assuming triple buffering is working correctly, you will usually need to be able to maintain 70 fps with v-sync off in order to maintain 60 fps with v-sync on; otherwise it will drop to the next refresh interval (interval 2 = 30). With double buffering you will probably need to be able to maintain 80-90 fps with v-sync off in order to maintain 60 fps with v-sync on. So it helps, but not as much as people give it credit for. V-sync will always lock your framerate to an interval of your refresh rate whether you're using double buffering or triple buffering. Of course, sometimes people try to force triple buffering with third-party software like D3DOverrider and assume it's working correctly even when it's not (if you ever see a framerate other than 60/30/20/15 on a 60Hz monitor, v-sync is not working), but that's a whole different story.
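
Just to put numbers on the "interval of your refresh rate" part, here's a toy calculation I made up (not anything from a real driver) showing where the 60/30/20/15 steps come from with plain double buffering and v-sync: a frame that misses a vblank gets held for the next one, so the displayed rate snaps to 60 divided by a whole number.

Code:
#include <cmath>
#include <cstdio>

// With v-sync and plain double buffering, a frame that takes frame_ms to render
// is held until the next vblank, so the displayed rate snaps to refresh / n for
// the smallest whole number of refresh periods n that covers the frame time.
double vsync_rate(double refresh_hz, double frame_ms) {
    double periods = std::ceil(frame_ms / (1000.0 / refresh_hz));
    return refresh_hz / periods;
}

int main() {
    // On a 60Hz panel: 15ms -> 60fps, 18ms -> 30fps, 35ms -> 20fps, 55ms -> 15fps
    const double times_ms[] = {15.0, 18.0, 35.0, 55.0};
    for (double ms : times_ms) {
        std::printf("%.0fms per frame -> %.0ffps displayed\n", ms, vsync_rate(60.0, ms));
    }
    return 0;
}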

I personally only turn on v-sync when I can maintain 60fps constantly even with it on, which is usually only in older games. In all other situations I find the cons outweigh the pros. In many situations a simple framecap of 60fps via 3rd party software will eliminate most of the severe frame tearing (although it won't do a perfect job like v-sync will) without causing huge drops in framerate.
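
For what it's worth, the frame cap idea is nothing magic: the limiter just makes sure each frame takes at least 1/60th of a second before the next one goes out. A bare-bones sketch of the idea (not how D3DOverrider or any real limiter is actually implemented; render_frame/present are made-up stand-ins):

Code:
#include <chrono>
#include <thread>

// Bare-bones 60fps cap: pace the loop so each iteration takes at least ~1/60 s.
// render_frame() and present() are hypothetical stand-ins for the game's work.
void game_loop() {
    using clock = std::chrono::steady_clock;
    const auto budget = std::chrono::microseconds(16667);   // ~1/60 of a second

    auto next = clock::now() + budget;
    while (true) {
        // render_frame();
        // present();                        // swap buffers, v-sync off
        std::this_thread::sleep_until(next); // wait out the rest of the frame
        next += budget;
    }
}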
 
How can you take a "real" hit in FPS if your monitor is, say, 60Hz? That's 60 refreshes a second. If your card is outputting, say, 200 fps, you're still only getting 60 refreshes. Those extra 140 fps go to waste.

Just because you limit FPS to 60 with vsync and your LCD is 60Hz doesn't mean the frames actually line up with every refresh. You can take an FPS hit because sometimes they don't. That is why you have triple buffering, but that doesn't solve all the issues either.

It's an option, use it as needed and turn it off as needed....

Just saw this is an OOOOLD thread :rolleyes:
 
I'll turn it on in single-player games where I can consistently get framerates over 60fps, but other than that I leave it off. For multiplayer games, vsync seems to have a bit more latency than without, and the tearing doesn't bother me. Like above, usually the cons of vsync outweigh the pros for me.
 
I'll turn it on in single-player games where I can consistently get framerates over 60fps, but other than that I leave it off. For multiplayer games, vsync seems to have a bit more latency than without, and the tearing doesn't bother me. Like above, usually the cons of vsync outweigh the pros for me.

I'm similar: the only MP FPS I'm playing these days is BC2, in which my rig cannot maintain 60 fps even at medium settings, so I turn vsync off to have smoother overall FPS. The tearing does bother me, especially in situations like gunning from a heli (the muzzle flash looks awful). In TF2 I can maintain 60 fps almost all the time, so I have vsync on and don't feel any mouse lag.

Back in the days of CS on a 120hz CRT, I never used vsync, and never had issues with mouse lag or dropped frames.

In single player I always have it on, as I want the best-looking graphics rather than twitch reaction time. That said, my twitch/aim sucks now compared to 10 years ago; guess I am too old to keep up with the kids :).

D3DOverrider seems to help; at least in TF2 I will drop from 60 fps to 45 fps and then 30 fps.
 
Not totally, but it does HELP. Assuming triple buffering is working correctly, you will usually need to be able to maintain 70 fps with v-sync off in order to maintain 60 fps with v-sync on; otherwise it will drop to the next refresh interval (interval 2 = 30). With double buffering you will probably need to be able to maintain 80-90 fps with v-sync off in order to maintain 60 fps with v-sync on. So it helps, but not as much as people give it credit for. V-sync will always lock your framerate to an interval of your refresh rate whether you're using double buffering or triple buffering. Of course, sometimes people try to force triple buffering with third-party software like D3DOverrider and assume it's working correctly even when it's not (if you ever see a framerate other than 60/30/20/15 on a 60Hz monitor, v-sync is not working), but that's a whole different story.

Uh, what? If you have triple buffering on and your FPS drops below 60, it doesn't then drop to 30. You can get 50fps with vsync on and triple buffering, for example. "refresh interval", as you call it, doesn't exist with triple buffering. Your FPS is *not* locked to an interval of your refresh rate with triple buffering (which is pretty much the whole point). Thus, the problem doesn't exist with triple buffering.

If you are playing a game with vsync on and your FPS flips between 60 and 30, then you don't have triple buffering.
 
I've personally found that I can live with tearing, but I can't live with the huge mouse lag I get when turning it on.
 
Most people don't understand how triple buffering works - it's not like it's double buffering with one extra frame. Triple buffering works well with most games and is worth the extra effort of downloading a tool like D3DOverrider if need be to enable it. Try it with FRAPS if you're worried about fallacies like dropping to 30fps.
 
I'm actually kinda glad that this thread was resurrected, since I've now learned what triple buffering actually is. I'll have to give it a shot.
 
I run it because my LCD does 120Hz. So if I'm doing 120 fps and get hit with a drop to half, I'm still @ 60 fps :)
 
Uh, what? If you have triple buffering on and your FPS drops below 60, it doesn't then drop to 30. You can get 50fps with vsync on and triple buffering, for example. "refresh interval", as you call it, doesn't exist with triple buffering. Your FPS is *not* locked to an interval of your refresh rate with triple buffering (which is pretty much the whole point). Thus, the problem doesn't exist with triple buffering.

Nope. Totally wrong. It doesn't matter if you're using double buffering or triple buffering, if v-sync is actually working you're locked to an interval of the refresh rate. How are you supposed to sync to the refresh rate without syncing to the refresh rate? That would make no sense. The whole point of v-sync is to eliminate screen tearing by syncing to an interval of the refresh rate, THAT'S WHAT IT DOES. It doesn't matter if you use 2 buffers, 3 buffers, 8 buffers, v-sync still does this.

If you're using triple buffering you have two frontbuffers and one backbuffer. The GPU alternates the rendertarget between the two frontbuffers on every other frame. After a refresh the backbuffer is swapped with the previous rendertarget. Both OpenGL and D3D have fixed functions that let you control the interval and number of buffers with a single function.
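
To make the knobs concrete: on the OpenGL side the swap interval is the thing you set with a single call (glfwSwapInterval below, via GLFW, which is just my choice for a short self-contained example, not something anyone in this thread mentioned), and on the D3D side the equivalents live on the swap chain (its BufferCount and the sync interval passed to Present in DXGI). A minimal sketch:

Code:
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;
    GLFWwindow* win = glfwCreateWindow(640, 480, "vsync sketch", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }

    glfwMakeContextCurrent(win);
    glfwSwapInterval(1);   // 1 = swap on every vblank (v-sync on), 0 = off,
                           // 2 = every second vblank (30fps on a 60Hz screen)

    while (!glfwWindowShouldClose(win)) {
        glClear(GL_COLOR_BUFFER_BIT);   // stand-in for the actual rendering
        glfwSwapBuffers(win);           // the flip that the interval above paces
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}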

It is also important to note that it is almost impossible to get an accurate reading of your actual framerate with any piece of software if you are using triple buffering. Any standard piece of software that measures framerate (let's use FRAPS as an example) looks at the number of draws to a rendertarget regardless of how many you have, since the programmer of that piece of software has no possible way of determining the number of frontbuffers being used by another piece of software. Therefore it will only read the number of frames per second drawn to the frontbuffer(s), not the number swapped with the backbuffer (which is the number of frames per second you will actually receive). However, with double buffering you only have one frontbuffer which is only drawn to once per refresh, so the number of swaps always matches the number of frames per second drawn to the frontbuffer. This is why it is easy to get an accurate reading of your real framerate with double buffering but not with triple buffering.

Triple buffering can reduce the performance hit of v-sync in exchange for increased input lag, since it essentially allows up to one additional frame to be drawn for each one that is swapped with the backbuffer. If your GPU draws 50 fps to the frontbuffers it will only swap 30 fps with the backbuffer, and you will only see 30 of those 50 frames it generated. From what I know, very few programs exist that can measure backbuffer swaps per second correctly.

Another possible explanation is that you are using 3rd party software like D3DOverrider to try and force it, which usually breaks v-sync or creates incorrect framerate readings due to "hackish" methods of implementing additional buffers or swap chains; in some cases 3rd party software uses swap chains instead of alternate frontbuffers to achieve three framebuffers.

What annoys me more than anything about this topic is the number of programs that get swap chains and triple buffering mixed up. Even game developers do it occasionally.
 
Nope. Totally wrong. It doesn't matter if you're using double buffering or triple buffering, if v-sync is actually working you're locked to an interval of the refresh rate. How are you supposed to sync to the refresh rate without syncing to the refresh rate? That would make no sense. The whole point of v-sync is to eliminate screen tearing by syncing to an interval of the refresh rate, THAT'S WHAT IT DOES. It doesn't matter if you use 2 buffers, 3 buffers, 8 buffers, v-sync still does this.

No, I'm totally right. With triple buffering and vsync the render FPS is not locked to an interval of the refresh rate, which is what matters. Also, triple buffering does not have two front buffers, but two *back* buffers. Go read up on it: http://www.anandtech.com/show/2794
 
How are you supposed to sync to the refresh rate without syncing to the refresh rate? That would make no sense. The whole point of v-sync is to eliminate screen tearing by syncing to an interval of the refresh rate,
You synchronise the buffer flips to the refresh rate, without synchronising frame rendering with the refresh rate.

Let's say, for simplicity's sake, that your refresh rate is 1Hz, with vblanks on the second, and that each frame takes 1.01 seconds to render (just under 1fps). (Note that as kllrnohj said, you've got your front and back buffers, uh, back-to-front :p ). With three buffers, rendering is asynchronous, and we can start drawing a new frame immediately after the old one. So:

At 1.0s, nothing happens, as the first frame isn't ready yet.
At 2.0s, we have frame 1, completed at 1.01s, and we flip it to the front buffer.
At 3.0s, we have frame 2, completed at 2.02s, and we flip it to the front buffer.
At 4.0s, we have frame 3, completed at 3.03s, and we flip it to the front buffer.
...
At 99.0s, we have frame 98, completed at 98.98s, and we flip it to the front buffer.
At 100.0s, we have frame 99, completed at 99.99s, and we flip it to the front buffer.

So, we've displayed 99 frames in 100 seconds.

Now if we only had two buffers, then the rendering is no longer asynchronous, because it needs to wait for a vacant buffer to start the next frame. So:

At 1.0s, nothing happens.
At 2.0s, we display frame 1, completed at 1.01s, and start frame 2.
At 3.0s, nothing happens.
At 4.0s, we display frame 2, completed at 3.01s, and start frame 3.
At 5.0s, nothing happens.
At 6.0s, we display frame 3, completed at 5.01s, and start frame 4.
At 7.0s, nothing happens.
...

After 100 seconds, we've only completed 50 frames.

So this is where the VSync halved-framerate thing comes in. But it only happened because we delayed the rendering, and we only delayed rendering because there weren't enough buffers to go around. With three buffers, there is no contention, and rendering is completely asynchronous; there is no delay, and there is no problem.
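
If you want to check the arithmetic, the whole example fits in a few lines of code (a toy simulation I made up; it just counts flips under the two buffering schemes, it's not a real renderer):

Code:
#include <cstdio>

// Count how many new frames reach the screen over `vblanks` refreshes (one per
// second, matching the example above) when each frame takes `frame_time` seconds.
int frames_shown(bool triple_buffered, double frame_time = 1.01, int vblanks = 100) {
    int displayed = 0;
    double render_done = frame_time;     // double buffering: when the pending frame finishes
    for (int t = 1; t <= vblanks; ++t) { // vblank at t = 1.0s, 2.0s, ...
        if (triple_buffered) {
            // Rendering never waits: frame k completes at k * frame_time, so by
            // vblank t there are floor(t / frame_time) finished frames. Flip one
            // new frame per vblank whenever one is available.
            if (static_cast<int>(t / frame_time) > displayed) ++displayed;
        } else {
            // Rendering waits for a free buffer: the next frame only starts at
            // the flip, so it finishes at (time of flip) + frame_time.
            if (render_done <= t) { ++displayed; render_done = t + frame_time; }
        }
    }
    return displayed;
}

int main() {
    std::printf("two buffers:   %d frames shown in 100s\n", frames_shown(false)); // 50
    std::printf("three buffers: %d frames shown in 100s\n", frames_shown(true));  // 99
    return 0;
}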
 
It is also important to note that it is almost impossible to get an accurate reading of your actual framerate with any piece of software if you are using triple buffering.
That's not true at all. Every single framerate counter I've ever seen bases its numbers on buffer flips, not rendered frames. If they didn't, I wouldn't be pegged at 60fps with triple-buffered VSync on. And this is in OpenGL, so no blaming it on "hackish" third-party tools...

Which, by the way, work just fine... I can tell tearing from no tearing, and I can tell 30fps from 55fps, so it's pretty obvious when D3DOverrider is working and when it's not.
 
Haha! I clicked on this thread and saw I was the OP 5 years ago. Too funny.

It's still an issue though. As risqu3 said, I can live with tearing (actually I don't even notice it), but not with mouse lag, which is very noticeable.
 
Just got used to turning it off all the time to avoid input lag. Rather have tears than lag.
 
What's the point in having nice graphics but having the screen tearing all over the place? Really, I hate both lag and tearing, so I just experiment on a game-by-game basis.
 
Always on, always with triple buffering and only via D3DOverrider.

Don't notice any mouse lag with that method. Terrible lag if Vsync/TB is enabled via the in-game settings.
 
I keep it on. I don't like tearing, and I don't notice a difference for anything over 60 fps.
 
On. Almost all my games tear pretty badly, and when the option is available I turn it on. I get a constant 60 fps on my rig with it on, so 60 fps with no tearing is ideal. I wish I knew why it was tearing and if I had a setting wrong - like my TV is operating at ___Hz and I have my game set to something different so it tears - I don't know.
When you have Vsync on, is there any reason to have any of the other settings on? The MSAA and AF and all those acronym settings?
 
Read this.


Yes.

Christ, dude, you've been here for 3 years, spent $250+ on a video card, and you honestly don't know what antialiasing is? :rolleyes:

Well, I've always had it on 6x or 8x etc... never really cared because the graphics looked fine and I got amazing FPS. BFBC2, which I just got for PC, is the first game I've had issues with, where my FPS is dropping below 30. And btw I have only had the card for a few months. Before that I've always had low to medium-low cards, so turning those on/up has never been an option. So I never learned what I wasn't using.

Reading the article now; it's a great explanation and an amazing article. Thank you!

My question now: where is the triple buffering option in games? What is it called? Is it AA? And if so, which of x4/x6/x8/x16 do I want?
 
Vsync off in the in-game settings. Force triple buffering via D3DOverrider.

I assume that's a third-party program; I'll find it and install it. So once I have that forcing triple buffering, do I want AA and AF etc. on or off?
 
Before that I've always had low to medium-low cards, so turning those on/up has never been an option. So I never learned what I wasn't using.
OK.

The fact that you're looking at pixels means diagonal lines on the edges of objects look like staircases, which tend to "crawl" when you move. AA smooths them out. Can carry a bit of a performance hit, though. I'd set this on a case-by-case basis.
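
If the staircase description sounds abstract, here's a tiny toy I made up (nothing to do with any actual AA setting or driver) that shades one row of pixels against a shallow edge with one sample per pixel versus sixteen. The single-sample column can only jump from 0 to 1, which is exactly the stair step that the extra samples smooth into in-between shades.

Code:
#include <cstdio>

// Estimate how much of pixel (px, py) lies under the shallow edge y < 0.3*x,
// using a grid x grid pattern of sample points inside the pixel.
double coverage(int px, int py, int grid) {
    int inside = 0;
    for (int sy = 0; sy < grid; ++sy)
        for (int sx = 0; sx < grid; ++sx) {
            double x = px + (sx + 0.5) / grid;   // sample position inside the pixel
            double y = py + (sy + 0.5) / grid;
            if (y < 0.3 * x) ++inside;           // is this sample under the edge?
        }
    return static_cast<double>(inside) / (grid * grid);
}

int main() {
    // One sample per pixel gives only 0 or 1 (a hard step); 16 samples give a blend.
    for (int px = 0; px < 6; ++px)
        std::printf("pixel %d: 1 sample -> %.2f   16 samples -> %.2f\n",
                    px, coverage(px, 0, 1), coverage(px, 0, 4));
    return 0;
}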

Textures can look quite blurry when you view them on an angle. The problem gets much worse with distance. AF sharpens them significantly. It's relatively cheap, so you can probably set it to 16x by default, and turn it down when you need to.

I think the big performance killer in BFBC2 is HBAO. If you're running into performance problems, this is the first thing I'd disable.

None of these have anything to do with vsync or triple buffering. Turn AA on if the aliasing bothers you, and turn VSync on if the tearing bothers you (and the input lag doesn't...). But always turn triple buffering on if you turn VSync on.
 
I assume that's a third-party program; I'll find it and install it. So once I have that forcing triple buffering, do I want AA and AF etc. on or off?
Triple buffering is just to help out when vsync is on. It's not perfect though and can cause its own issues.

AF and AA have nothing to do with vsync. They are simply settings to make the visuals look better. Use Google if you need a basic understanding of what they do. If you want to use AA or AF, it's best to use the in-game AA/AF settings whenever possible.
 
OK.

The fact that you're looking at pixels means diagonal lines on the edges of objects look like staircases, which tend to "crawl" when you move. AA smooths them out. Can carry a bit of a performance hit, though. I'd set this on a case-by-case basis.

Textures can look quite blurry when you view them on an angle. The problem gets much worse with distance. AF sharpens them significantly. It's relatively cheap, so you can probably set it to 16x by default, and turn it down when you need to.

I think the big performance killer in BFBC2 is HBAO. If you're running into performance problems, this is the first thing I'd disable.

None of these have anything to do with vsync or triple buffering. Turn AA on if the aliasing bothers you, and turn VSync on if the tearing bothers you (and the input lag doesn't...). But always turn triple buffering on if you turn VSync on.

OK, thanks. I fully understood that. I have read an article before about AA, so I had a clue what that was. Yeah, well, my performance was basically too crummy to play, so I turned some settings down to medium and it became playable (40 FPS average, probably). I then decided after reading the article to turn Vsync off, and everything back up to high (HBAO on). I was then getting 50-100+ FPS with, I'd say, minimal tearing (the tearing in Borderlands is the worst). With a CF 6950 I could turn on Vsync and keep a constant 60 FPS, but I only have one card. Interestingly, when I play Vietnam mode I get a constant 60 FPS with Vsync on, I think because the maps are so basic (huts and foxholes).

Questions: what is HBAO? I'd never seen it in a game until I got BFBC2.
Read your comment, Cannondale06; thanks, that answered a question I had.
 
HBAO is just a form of ambient occlusion. You are asking fairly common questions, so you can just Google any of those terms and get more info than you know what to do with.
 
If you have vsync off altogether then triple buffering does not do anything.

I knew someone would say this, but I promise it works. Just like turning vsync off in-game but forcing it through CCC, nHancer, ATI Tray Tools, etc.
 
I knew someone would say this, but I promise it works. Just like turning vsync off in-game but forcing it through CCC, nHancer, ATI Tray Tools, etc.
Please go read up on triple buffering, because it does NOTHING if you don't have vsync on too.
 
Off. With my 120Hz displays I notice no tearing in game, and I will probably never notice tearing until I exceed 120 fps in Eyefinity!! LOL. Not likely to happen, so it would be a meaningless fps drop for nothing in my case. However, my case is different from most.
 
Off. With my 120Hz displays I notice no tearing in game, and I will probably never notice tearing until I exceed 120 fps in Eyefinity!! LOL. Not likely to happen, so it would be a meaningless fps drop for nothing in my case. However, my case is different from most.
You may not notice it, but tearing can occur at ANY framerate, so there's nothing magical about 120Hz in that regard. Heck, gunfire, flickering lights and explosions can all cause screen tearing.
 