How VSync works, and why people loathe it

So is there a list of games that support triple buffering? Because I have seen that mentioned too.
All OpenGL titles should support triple buffering -- no specific game support required. Just enable it and go, though some games have in-game settings where you can enable it, so enable it there if possible. It's also possible to force triple buffering in D3D games on most hardware.

Is there any point to turning on triple buffering without using vsync as well?
As I understand it, sort of.
 
I enabled VSync in Crysis, and it seemed to fix my tearing problem. What's odd is that according to the in-game framerate counter, I am now often locked at 40 FPS. My monitor's refresh rate is 60 Hz, so doesn't this imply that triple buffering is enabled by default? (I never changed this setting.)
 
Tearing has been the worst in Bioshock and Crysis. Completely horrible. I have yet to put Vsync on in Crysis, because I can't sacrifice my already pitiful framerates.
 
After playing TFC for so long, I noticed anything that even barely affected my performance. When VSync was turned on, it felt like I was using my mouse underwater, even though I had 85fps.
 
Even with decent hardware, VSync will cost you frames and increase mouse input lag.

Anyone who seriously plays FPS games online or just generally likes a responsive mouse will disable vsync without even thinking about it. Triple buffering certainly doesn't help the mouse lag either.

I always have vsync off for this sole reason. And yes, the tearing does annoy me, but I'd rather that than have a laggy game experience.

I agree with this because when I have VSync enabled at 60fps, I feel slower and my shooting response is slower. I just cannot play with it on, especially after years without it. Perhaps that's why I am banned from servers for shooting "too fast".
 
I agree with this because when I have VSync enabled at 60fps, I feel slower and my shooting response is slower. I just cannot play with it on, especially after years without it. Perhaps that's why I am banned from servers for shooting "too fast".

I've never experienced any significant input lag from enabling V-Sync. There are some games where I can feel it, such as UT2004, but the difference wasn't massive. I was able to get used to it in short order. I still kick ass in terms of scores, so I don't think it affected my gaming prowess too much.
 
I run Crysis with VSync enabled @ 1680x1050, everything maxed, with Triple Buffering. I don't get GREAT frame rates (~20-25) but it feels very playable.

I absolutely hated the tearing.
 
Tearing has been the worst in Bioshock and Crysis. Completely horrible. I have yet to put Vsync on in Crysis, because I can't sacrifice my already pitiful framerates.

Same boat. Bioshock had horrific tearing, along with almost every game I've played, though. :p I guess I just notice it a lot more than most. Because seriously, I don't know how some people can ignore it.
 
So, how do you enable triple buffering? Where is the setting? Does this information tell me what kind of monitor I should buy? What are optimal settings?


Not sure if this has been answered yet, but here's how you turn on triple buffering for an Nvidia card.


DesktopRightclick.jpg


NvidiaControlPanel.jpg
 
Originally Posted by KillRoy X
Tearing has been the worst in Bioshock and Crysis. Completely horrible. I have yet to put Vsync on in Crysis, because I can't sacrifice my already pitiful framerates.

Turning on Vertical Sync and Triple Buffering actually improves performance when the game is running below your monitor's refresh rate. Nvidia claims this in their drivers.

TripleBuffering.jpg
 
When I use triple buffering, the tearing returns. Anyone else have this problem? WoW and HL2 are really bad, so I often run VSync only. Depends on the game, I guess.
 
When I use triple buffering, the tearing returns. Anyone else have this problem? WoW and HL2 are really bad, so I often run VSync only. Depends on the game, I guess.

You have to run triple buffering in conjunction with vertical sync.
Did you try that?
 
Lol, yeah, sorry, that's what I meant. Some/most games that I run VSync with can't run triple buffering without the tearing coming back.
 
So there is no built-in triple buffering support in the Source engine games, right?


How would I enable it for them?
 
Bump for linking to this thread a lot. I think the information is still relevant for CRTs. Not sure about LCDs.
 
Can anyone tell me whether anyone gets any mouse input lag, and also why it happens when VSync is enabled?
Apparently I read in another post that mouse input lag will be diminished only if the game FPS is lower than the refresh rate, or something like that. Is this true?


Now this is where the common misconception comes in. Some people think that the solution to this problem is to simply create an FPS cap equal to the refresh rate. So long as the video card doesn't go faster than 75 FPS, everything is fine, right? Wrong.

What is the difference between having VSync turned on and having a screen refresh lock plus an FPS limiter set to the same rate?
 
Can anyone tell me whether anyone gets any mouse input lag, and also why it happens when VSync is enabled?
I don't get it myself, or it doesn't bother me as I'm a pretty casual gamer, but the reason it happens is this: the frame that is shown was drawn right after the previous one was shown, then saved to be displayed at the next opportunity (the next blanking interval). Once the frame has finished drawing, though, the graphics engine does basically nothing; it's just waiting around for the back buffer to clear (i.e. for the frame to be displayed) before it can start drawing to it again. If you're running a fast machine that could maybe get 2x the refresh rate in FPS, that means the video card had time to draw a whole other frame while it was just waiting around for the sync interval. So if you had VSync off in that situation, you'd see a slightly newer frame than with VSync on.

I honestly am skeptical that people can actually perceive this, but it's the only explanation I can come up with to explain it if they do.
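
For concreteness, here's a toy Python timeline of that waiting (the numbers are just assumed for illustration: a 60Hz refresh and a card that renders a frame in 8ms, i.e. ~125fps uncapped; this isn't from any real driver):

Code:
# Toy model of double-buffered VSync: drawing restarts right after a swap,
# but a finished frame can only be shown at the next blanking interval.
REFRESH_MS = 1000 / 60   # 60 Hz display
RENDER_MS = 8.0          # assumed render time (~125 fps uncapped)

t = 0.0
for vblank in range(1, 4):
    finished = t + RENDER_MS
    shown = vblank * REFRESH_MS          # the next blanking interval
    print(f"frame done at {finished:.1f} ms, shown at {shown:.1f} ms "
          f"(card idle for {shown - finished:.1f} ms)")
    t = shown  # with one back buffer, drawing can't restart until the swap

Every frame sits finished for ~8.7ms before you see it, and that idle gap is exactly when a card with VSync off would have been drawing a newer frame.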

What is the difference between having VSync turned on and having a screen refresh lock plus an FPS limiter set to the same rate?

I don't know what you mean by a screen refresh lock; that sounds like VSync to me. Just limiting FPS doesn't synchronize the frame drawing with the refresh rate, so you still get tearing, but with all the problems of delaying frames as well.
 
Bump for linking to this thread a lot. I think the information is still relevant for CRTs. Not sure about LCDs.
It's equally relevant for all kinds of displays.

By the way, the OP is way off on a couple of key points (tears do not result from partially drawn frames, nothing is ever physically copied between buffers, and the section on double buffers is mostly nonsense). If you want a link to explain tearing, VSync and triple buffers, you're better off with this one.

Can anyone tell me whether anyone gets any mouse input lag, and also why it happens when VSync is enabled?
Mostly because of the artificial delays it introduces to the rendering process. The game is forced to wait before starting on the next frame, hence is forced to wait before processing input (assuming the engine doesn't use its own independent tick rate). Triple buffering never delays rendering, so the problem is minimised.

What is the difference between having VSync turned on and having a screen refresh lock plus an FPS limiter set to the same rate?
As the name implies, without VSync it's not synchronised, so it still tears. It might be rendering at 60fps and refreshing at 60Hz, but there's nothing stopping it from finishing a frame and flipping the buffer while the last frame is still being sent to the display.
 
Nice article and well written, but for FPS shooters VSync on is just horrible!!!

Some guys have touched upon it already, and whoever said it's like using your mouse underwater has got it perfectly right!! I tried it for a month in different games to be sure I'd get used to it, but I couldn't; the mouse cursor on screen always seems to be behind my actual mouse movements. It feels like I am waiting for the mouse all the time.

To the guy who said he was a bit skeptical that people can actually notice this: well, I feel the same way, just from the opposite side!! I can't believe that people who play FPSes with VSync on don't notice the lag.
 
To the guy who said he was a bit skeptical that people can actually notice this: well, I feel the same way, just from the opposite side!! I can't believe that people who play FPSes with VSync on don't notice the lag.

Thanks for all the answers guys :)

Are there any discussions in this forum on how to minimize mouse input lag when using VSync, with solid proof/outcomes?
 
Are there any discussions in this forum on how to minimize mouse input lag when using VSync, with solid proof/outcomes?

Triple buffering. It means the rendering engine can draw continuously at full speed (processing input continuously as well, if that's tied to rendering), and the newest complete frame is always shown, minimizing display lag.
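
A rough sketch of the mechanism, if it helps (toy Python with made-up frame numbers, not any particular API):

Code:
# Toy model of triple buffering: the renderer never waits. A completed
# frame simply replaces any older frame that hasn't been shown yet, and
# the newest complete frame is flipped to the front at each vblank.
front = None     # frame currently scanned out to the display
pending = None   # newest completed frame, waiting for the next vblank

def frame_completed(frame):
    global pending
    pending = frame                      # overwrite an older unshown frame

def vblank():
    global front, pending
    if pending is not None:
        front, pending = pending, None   # flip to the newest complete frame

# Say the card finishes frames 1, 2 and 3 between two vblanks:
for f in (1, 2, 3):
    frame_completed(f)
vblank()
print(front)  # 3 -- frames 1 and 2 are dropped without ever being shown

So the card stays busy (and keeps sampling input, if that's tied to rendering), and what you see is never more than one refresh behind the newest finished frame.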
 
A very good explanation overall, but with some minor inaccuracies. Everything is good up until this point, where you give an example.

Why does this happen? Let's take a specific example. Let's say your monitor is set to a refresh rate of 75Hz. You're playing your favorite game and you're getting 100FPS right now. That means that the monitor is updating itself 75 times per second, but the video card is updating the display 100 times per second, that's 33% faster than the monitor. So that means in the time between screen updates, the video card has drawn one frame and a third of another one. That third of the next frame will overwrite the top third of the previous frame and then get drawn on the screen. The video card then finishes the last 2 thirds of that frame, and renders the next 2 thirds of the next frame, and then the screen updates again. As you can see, this would cause a tearing effect, as 2 out of every 3 times the screen updates, either the top third or bottom third is disjointed from the rest of the display. This won't really be noticeable if what is on the screen isn't changing much, but if you're looking around quickly this effect will be very apparent.

Remember that if you match your frame rate with your refresh rate, for example 60fps on a 60Hz monitor, you're going to tear on average once per refresh; that is to say, each refresh will be a composite of 2 images and will have 1 tear line.

Your reasoning is close when you talk about a tear line being introduced once in every X many frames, but what you're describing is 1 extra tear line: each frame will tear once, and then every so many refreshes you'll actually tear twice on that refresh (a refresh composed of 3 images).

It should also be noted that you tear with frame rates that are lower than your refresh rate; it just happens once every X refreshes. With a refresh rate of 60Hz and a frame rate of 30fps, you're only likely to tear on half of the refreshes. It's very hard to notice, but it does happen.
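
As a back-of-the-envelope check (a toy model, assuming flips land at uniformly random points in the scanout):

Code:
# With VSync off, each buffer flip that lands mid-scanout adds one tear
# line, so you expect roughly fps / refresh tears per refresh on average.
REFRESH_HZ = 60
for fps in (30, 60, 120, 180):
    print(f"{fps} fps @ {REFRESH_HZ} Hz: "
          f"~{fps / REFRESH_HZ:.1f} tear lines per refresh on average")

That gives 0.5 tears per refresh at 30fps (tearing on half the refreshes, as above) and 1 per refresh at 60fps.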

Before I explain why, let me talk about double-buffering. Double-buffering is a technique that mitigates the tearing problem somewhat, but not entirely. Basically you have a frame buffer and a back buffer. Whenever the monitor grabs a frame to refresh with, it pulls it from the frame buffer. The video card draws new frames in the back buffer, then copies it to the frame buffer when it's done. However the copy operation still takes time, so if the monitor refreshes in the middle of the copy operation, it will still have a torn image.

Double buffering wasn't designed to mitigate tearing; it's just the way all video cards render. They cannot render straight to the monitor, as drawing to a buffer isn't instant, so they draw to a so-called back buffer which is not in use, then simply swap the back and front buffers, which can be done instantly.

VSync solves this problem by creating a rule that says the back buffer can't copy to the frame buffer until right after the monitor refreshes. With a framerate higher than the refresh rate, this is fine. The back buffer is filled with a frame, the system waits, and after the refresh, the back buffer is copied to the frame buffer and a new frame is drawn in the back buffer, effectively capping your framerate at the refresh rate.

More or less right, but the way we usually describe and see VSync in operation, that is, over periods of seconds and in terms of average frame rates, is a bit misleading. What is actually happening at a low level is that VSync looks at each individual frame's render time: if the frame hasn't been rendered in time for the next refresh, the buffer isn't swapped and the same frame stays in place for another refresh, essentially displaying the old frame again (one unique frame being displayed for 2 full refreshes).

What usually happens is that frame render time stays approximately the same, because in 1/60th of a second not much changes in the game, so we tend to get a whole row of frames that are displayed either once per refresh (at 60Hz that's 60fps) or with each frame not quite ready in time (30fps). But our real average for any one second could actually be in between 30fps and 60fps if the load on the graphics card changed rapidly. We might get one new frame for each refresh for the first 20 refreshes in an example second; then, say, an explosion goes off and we have lots more stuff to draw, so each frame takes more time to render than the refresh time and every frame is displayed twice for the remainder of the second. That gives us 20 frames in the first 20 refreshes, then only 20 unique frames in the last 40 refreshes, for an actual unique frame rate of 40fps.

Looking at VSync in terms of FPS (an average) helps people understand the concept more easily, but at a low level it's actually operating on a frame-by-frame basis.

Essentially this means that with double-buffered VSync, the framerate can only be equal to a discrete set of values equal to Refresh / N where N is some positive integer. That means if you're talking about 60Hz refresh rate, the only framerates you can get are 60, 30, 20, 15, 12, 10, etc etc. You can see the big gap between 60 and 30 there. Any framerate between 60 and 30 your video card would normally put out would get dropped to 30.

This is true in most practical terms, but remember that if the render time is swapping between the 60 and 30 buckets (or any of these values) within a given second, then your unique frames in that second will be somewhere between those values. Frame rate counters report frame rates locked at 60/30/20 etc. because the render time for each frame tends to stay approximately the same as the previous one; the load on the graphics card rarely changes very suddenly, and certainly isn't likely to change back and forth rapidly within one second.

You could write a game with a very render-heavy shader that is only applied every other frame, causing every odd frame to take longer than 1 refresh interval to render. Then only half of the frames would be duplicated, similar to my example above, and you'd have 40fps.
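
To make the frame-by-frame behaviour concrete, here's a little Python sketch of that bookkeeping (a toy model, assuming rendering starts at a vblank and a late frame holds the previous one on screen for an extra refresh):

Code:
import math

REFRESH_MS = 1000 / 60  # 16.7 ms per refresh at 60 Hz

def unique_fps(render_times_ms):
    """Count unique frames shown across 60 refreshes (one second)."""
    refreshes = frames = i = 0
    while refreshes < 60:
        rt = render_times_ms[i % len(render_times_ms)]
        # a frame occupies at least one refresh, plus one more for each
        # full refresh interval its render time overruns
        refreshes += max(1, math.ceil(rt / REFRESH_MS))
        frames += 1
        i += 1
    return frames

print(unique_fps([10]))      # 60 fps: every frame beats the 16.7 ms deadline
print(unique_fps([20]))      # 30 fps: every frame misses one vblank
print(unique_fps([10, 20]))  # 40 fps: the alternating-shader example above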

Whether or not you find 37.5FPS smooth doesn't change the fact that the framerate just cut in half suddenly, which you would notice. This is what people hate about it.

Furthermore, when VSync is duplicating frames you're getting a larger delay between your input from the mouse/keyboard/controller and actually seeing the response on screen. The delay is twice as long if VSync is repeating every frame (1/2 the frame rate), which a lot of people perceive as "input lag"; it makes a lot of FPS games basically unplayable.

If you're playing a game that has a framerate that routinely stays above your refresh rate, then VSync will generally be a good thing. However if it's a game that moves above and below it, then VSync can become annoying.

Exactly, and most games don't maintain steady frame rates nowadays; they can be up and down a lot. Since people want eye candy and sacrifice frame rates, we often see an average frame rate of something like 60fps where the min/max (trough/peak) values could easily be 30fps and 90fps.

However, triple-buffered VSync really is the key to the best experience as you eliminate tearing without the downsides of normal VSync (unless you consider the fact that your FPS is capped a downside... which is silly because you can't see an FPS higher than your refresh anyway).

This is where I tend to disagree, actually. The idea that you cannot see an FPS higher than your refresh rate isn't entirely true, and this is a bit controversial... so bear with me.

If you're rendering at, say, 180fps on a 60Hz screen (for easy maths), you get refreshes that are composed of 4 frames on average. 60 goes into 180 3 times, but the start of a frame and the start of a refresh aren't synced; almost always we're starting a new refresh with the tail end of an old frame.

At the end of a refresh we're left with a composite image of 4 frames: the top part is the oldest, representing the game state nearly 1 refresh ago; the next bit down is a bit more up to date; and so on until the bottom, where the last part of the refresh is based on very new information.

We still have 60 refreshes, because our monitor cannot adaptively provide a greater refresh rate (wouldn't that be nice!), but within our refreshes some parts of the screen, the lower areas, have newer information. I personally think that is of some benefit.

For example, if an object is moving quickly across the scene and the refresh tears through the middle of it, we can infer the direction and speed of that object's movement from one frame, without seeing the frame before or after it. Now, we don't consciously play games like that, but I personally believe our eyes pick up on it, and that's helpful to gamers, especially those who play competitively.

I hope this was informative and will help people understand the intricacies of VSync.

Yes, very. I think you did a good job; a few inaccuracies, but for the purpose of explaining how it works to the average gamer it's very well written.
 
I've been running all my games with VSync and Triple Buffering enabled (via the D3D Overrider utility) for some time now. Maybe it would be unnecessary on 120Hz monitors, but for 99% of LCDs out there it eliminates tearing when the game tries to push above 60 FPS. Feels smooth as silk to me.
 
Remember that if you match your frame rate with your refresh rate, for example 60fps on a 60Hz monitor, you're going to tear on average once per refresh
That's assuming it takes a full 16.7ms to send the frame. If it's refreshing at 60Hz and rendering at 60fps, but only taking 10ms to send out each frame, then it's got a 6.7ms window each refresh interval where it can flip the buffers without causing a tear (so on average, it'd only tear 60% of the time).
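
In numbers (same toy assumptions as above; the 10ms scanout is just a made-up figure):

Code:
# Chance that a buffer flip lands mid-scanout, assuming flips occur at a
# uniformly random point in the refresh interval and scanout takes 10 ms.
refresh_ms = 1000 / 60              # ~16.7 ms between vblanks
scanout_ms = 10.0                   # hypothetical time to send one frame
print(f"~{scanout_ms / refresh_ms:.0%} of flips tear")   # ~60%

(Though as pointed out further down, on real displays the scanout usually takes most of the refresh interval, so the tear-free window is much smaller.)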
 
Remember that if you match your frame rate with your refresh rate, for example 60fps on a 60Hz monitor, you're going to tear on average once per refresh; that is to say, each refresh will be a composite of 2 images and will have 1 tear line.

Normally, yes, but it IS possible that the 60fps happens to line up perfectly with the refresh rate ;)

Double buffering wasn't designed to mitigate tearing; it's just the way all video cards render. They cannot render straight to the monitor, as drawing to a buffer isn't instant, so they draw to a so-called back buffer which is not in use, then simply swap the back and front buffers, which can be done instantly.

Just a minor correction, but video cards *CAN* render with just a single buffer, it just looks really, really bad. The reason for double buffering is to prevent you from seeing the card doing the drawing.

This is where I tend to disagree, actually. The idea that you cannot see an FPS higher than your refresh rate isn't entirely true, and this is a bit controversial... so bear with me.

If you're rendering at, say, 180fps on a 60Hz screen (for easy maths), you get refreshes that are composed of 4 frames on average. 60 goes into 180 3 times, but the start of a frame and the start of a refresh aren't synced; almost always we're starting a new refresh with the tail end of an old frame.

At the end of a refresh we're left with a composite image of 4 frames: the top part is the oldest, representing the game state nearly 1 refresh ago; the next bit down is a bit more up to date; and so on until the bottom, where the last part of the refresh is based on very new information.

We still have 60 refreshes, because our monitor cannot adaptively provide a greater refresh rate (wouldn't that be nice!), but within our refreshes some parts of the screen, the lower areas, have newer information. I personally think that is of some benefit.

For example, if an object is moving quickly across the scene and the refresh tears through the middle of it, we can infer the direction and speed of that object's movement from one frame, without seeing the frame before or after it. Now, we don't consciously play games like that, but I personally believe our eyes pick up on it, and that's helpful to gamers, especially those who play competitively.

I disagree that you would be able to perceive and interpret the difference, or that you would be able to notice the updated sections independently of the tears.

But you still only get 60 complete frames either way - just that the frames are then composed of multiple snapshots of time instead of a single snapshot. :p
 
Thanks for all the answers guys :)

Are there any discussions in this forum on how to minimize mouse input lag when using VSync, with solid proof/outcomes?

Don't know; I have tried everything to get rid of the mouse lag with VSync on. Somebody mentioned how smooth it is with VSync on, and yeah, it's smooth as silk, but always way behind the action!!

I don't know any half decent FPS gamer that has vsync on.
 
For mouse lag, try lowering MaxFramesAllowed? That's the OpenGL registry key; not sure if D3D has its own... I assume it'd be in the control panel somewhere?
 
I've got a 120Hz LCD monitor and VSync... does it sound like I'm covered?

I read through each post on the first and last pages this morning. It sounds like I always want Triple Buffering enabled too, when available.
 
Don't know; I have tried everything to get rid of the mouse lag with VSync on. Somebody mentioned how smooth it is with VSync on, and yeah, it's smooth as silk, but always way behind the action!!

I don't know any half decent FPS gamer that has vsync on.

I actually found a cure for this.

From the Nvidia control panel, I enabled VSync (120Hz) with Triple Buffering and set max frames to render from 3 (the default) to 0. No more input lag with VSync on a medium-crowd map (12 people). However, when I get to a map with around 20 people, performance suffers and I get graphics stutter (not really lag, just stutter). I believe the reason is that my actual game frame rate drops below 120 (to 120/2 or 120/3, i.e. 60 or 40). Can anyone explain the stutter on crowded maps? Would setting max frames to render to 1 or 2 cure the graphics stutter on a crowded map?

I also hope PrincessFrosty can provide me an answer for this.
 
"Max frames to render" is probably the same thing as the MaxFramesAllowed OGL key. I'd try 0 & 1. Actually with Triple buffering, wonder if that would require maxframes 1 to work optimally.
 
I've only used triple buffering once, and it definitely wasn't a positive experience (maybe the game just sucked at doing it...).

Basically, I had just downloaded Left 4 Dead, and apparently triple buffering is enabled by default. The game felt laggy (not ping, not fps, but input lag, like when you're gaming on a crappy LCD TV that's not in "game mode"). It took me about 20 minutes of playing with all the video options till I figured out it was just triple buffering slowing it down. It felt really responsive after I turned it off.

Again, that's only one game; I haven't tried it anywhere else. It's a shame though, because I'm not exactly a fan of screen tearing either. Looks like 120Hz monitors are the way to go.
 
I've only used triple buffering once, and it definitely wasn't a positive experience (maybe the game just sucked at doing it...).

Basically, I had just downloaded Left 4 Dead, and apparently triple buffering is enabled by default. The game felt laggy (not ping, not fps, but input lag, like when you're gaming on a crappy LCD TV that's not in "game mode"). It took me about 20 minutes of playing with all the video options till I figured out it was just triple buffering slowing it down. It felt really responsive after I turned it off.

Again, that's only one game; I haven't tried it anywhere else. It's a shame though, because I'm not exactly a fan of screen tearing either. Looks like 120Hz monitors are the way to go.

Read what I posted above: change max render frames to 0 or 1 (the default is 3). This can be done from the Nvidia control panel. Enable VSync (with triple buffering) and it should be fine.
 
My problem with VSync is the input lag it creates in certain games, not the FPS limitations.
 
Read what I posted above: change max render frames to 0 or 1 (the default is 3). This can be done from the Nvidia control panel. Enable VSync (with triple buffering) and it should be fine.

Can't find such an option in ATI CCC.
 
Read what I posted above: change max render frames to 0 or 1 (the default is 3). This can be done from the Nvidia control panel. Enable VSync (with triple buffering) and it should be fine.
Turning on triple buffering from the Nvidia control panel has NO impact on DX games; you have to use a third-party app to force it on for non-OpenGL games.
 
That's assuming it takes a full 16.7ms to send the frame. If it's refreshing at 60Hz and rendering at 60fps, but only taking 10ms to send out each frame, then it's got a 6.7ms window each refresh interval where it can flip the buffers without causing a tear (so on average, it'd only tear 60% of the time).

The buffers don't flip at a fixed rate matched to the monitor refresh; they flip once the frame has been drawn. The tearing comes when the monitor is mid-refresh and the buffer flips, not from the buffer flipping while the graphics card is drawing to it.

I think there is a minor delay between refreshes on the screen. With CRTs there was a small delay where the electron beam would reposition from the bottom of the screen to the top, ready to start the new draw, but I believe it takes most of the 16.7ms to actually draw the frame. If a 60Hz monitor could draw the entire screen in, say, 8ms instead of 16.7, they'd sell it as a 120Hz monitor.

But fundamentally, it's possible for the buffer to flip while the screen is between draws, in which case you don't get a tear; it's just very unlikely.

Normally, yes, but it IS possible that the 60fps happens to line up perfectly with the refresh rate ;)

Yes, it probably happens once every X many hundred refreshes or so, but it's unlikely to occur repeatedly; you'd have to be very lucky ;)

Just a minor correction, but video cards *CAN* render with just a single buffer, it just looks really, really bad. The reason for double buffering is to prevent you from seeing the card doing the drawing.

Sure; my point was that double buffering wasn't designed to stop tearing.

I disagree that you would be able to perceive and interpret the difference, or that you would be able to notice the updated sections independently of the tears.

You don't consciously understand it frame by frame, but the overall movement of objects through the scene is going to seem smoother, because you're seeing the scene over a period of time rather than as a single snapshot. It's kind of like real motion blur captured by cameras with long exposure times: film at 24-25fps seems smooth because each frame captures information about the world over the period of the frame, which visually shows as motion blur, whereas traditional rendering shows snapshots and needs a higher frame rate to seem smooth.

But you still only get 60 complete frames either way - just that the frames are then composed of multiple snapshots of time instead of a single snapshot. :p

Yes, you still only get 60 complete frames/refreshes, but you see more information than a single frame is capable of conveying; as I said, you can infer the direction and speed of movement of objects in the scene from one frame. Good for fast-moving targets or rapidly changing viewport direction.

It also helps minimize "input lag", since you're essentially getting multiple bits of feedback per refresh rather than just 1. All FPS gamers who have tried VSync know just how awful the input lag feels; it's still noticeable for me even when getting 60 unique frames per second.

120Hz Monitor = no tearing and no need for vsync ;)

Nope.

Don't know; I have tried everything to get rid of the mouse lag with VSync on. Somebody mentioned how smooth it is with VSync on, and yeah, it's smooth as silk, but always way behind the action!!

I don't know any half decent FPS gamer that has vsync on.

FPS games are basically unplayable at anything other than a casual level with VSync on; the latency between input and feedback is too long. For a single button press that's not really that important, but for mouse aiming, where your brain takes continuous feedback from the screen and uses that information to subtly correct your aiming, that correction can no longer happen.

I actually found a cure for this.

From the Nvidia control panel, I enabled VSync (120Hz) with Triple Buffering and set max frames to render from 3 (the default) to 0. No more input lag with VSync on a medium-crowd map (12 people). However, when I get to a map with around 20 people, performance suffers and I get graphics stutter (not really lag, just stutter). I believe the reason is that my actual game frame rate drops below 120 (to 120/2 or 120/3, i.e. 60 or 40). Can anyone explain the stutter on crowded maps? Would setting max frames to render to 1 or 2 cure the graphics stutter on a crowded map?

I also hope PrincessFrosty can provide me an answer for this.

Triple buffering primarily helps with frame rate, not "input lag". It allows the video card to remain busy when it might otherwise just be idle. It's also already been mentioned that triple buffering in the driver options only applies to OpenGL games; almost all modern games are DirectX-based, so this option doesn't work for them. You can force it through 3rd-party tools, I believe.

The input lag feeling is introduced when a frame is being displayed and the video card is drawing to a back buffer, but the frame isn't drawn in time and the same buffer has to remain in place during the next refresh (the same frame being displayed twice in a row).

Nothing can solve the input lag feeling; it's inherent to VSync. VSync works by delaying rendered frames from reaching the monitor until the time is right, and that delay is what we commonly describe as input lag. The only way to minimize it is to run a frame rate high enough to maintain your monitor's refresh rate at all times; for 60Hz you probably want an average of 90fps. Or invest in a 120Hz monitor and a graphics card that can spit out something like 150-160fps average and VSync to 120Hz; that would feel a lot less laggy.
 
Nice post PrincessFrosty. ^^

Explains everything perfectly. I agree with you completely about the lag; nothing ever worked for me. I just can't understand how anyone can play FPS games with VSync on and not notice it :)

K, I had tried changing the max frames before and it didn't work for me at all, no matter which value I used.

Back to the old drawing board, K ;)
 