Vsync Input Lag Questions

twotone8

There's tons of contradictory information on the net about input lag. I just wanted to gauge the popular opinion on the causes of, and possible solutions to, input lag.

Input lag = amount of time between a button press/mouse movement, and the visible response to said movement on the screen.

Here's some info, and here's some more.

16ms is about 1 frame of input lag at 60Hz (1000ms / 60 ≈ 16.7ms). The human eye supposedly can't detect anything below about 30-50ms, but that's debatable.
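
Just to put numbers on that, here's a quick throwaway python sketch of the arithmetic (nothing official, just dividing):

# frame time at common refresh rates; 60Hz is where the ~16ms figure comes from
for hz in (30, 60, 75, 120):
    print("%3d Hz -> %.1f ms per frame" % (hz, 1000.0 / hz))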



As far as I can gather there are several causes. Here they are in no particular order.

1) Native display lag: Every LCD display is going to introduce some measure of input lag. Personally, my huge Samsung 42" LCD TV adds about 60ms of input lag in game mode. My 23" Acer monitor only adds about 14ms of lag. These measurements are derived from the extraordinarily handy auto-calibrate feature in Rock Band 2, which sends a visual cue to the special guitar controller and measures how long it takes to receive that cue.

I've noticed that the 60ms of input lag is quite inconvenient when trying to play games from the PC on the big honkin' TV. 60ms is just enough to make mouse movement feel sluggish, especially when combined with Vsync, but I'll get into that in a bit.

I've heard of people getting better results on large TVs by using the VGA input as opposed to HDMI, but to me it seems to make no difference whatsoever.

I've also heard that certain scaling options can contribute to lag, but the information there is pretty contradictory too. Supposedly running things at the native resolution of the display is the best way to avoid introducing input lag due to scaling, yet on my huge TV, the only way to reduce the input lag, it seems, is by running at 640x480 or lower resolutions. I think it's because most of the time the TV is running all sorts of post-processing algorithms, but doesn't seem to care when forced into outdated grandpa resolutions. Who knows.


2) Mouse acceleration: also known as "enhance pointer precision" under the Windows control panel settings. It's on by default on pretty much every computer out there. It basically means that the faster you move your mouse, the faster the pointer travels across the screen. It also means that it takes at least two frames to get a reference as to how fast the mouse is moving before implementing the movement on the screen, adding about 16ms of input lag (as I understand it). Unchecking the box in control panel seems to do the trick. Some games like bioshock seem to want to force this acceleration, adding to the input lag problem. I think it's a good idea to leave this setting off, but to each his own.

Mouse smoothing is a similar phenomenon. It also uses two frames of reference to decide how to implement mouse movement, adding that same 16ms of lag. Pretty obsolete settings meant to smooth out the movement of old dusty mouse balls. (heh)


3) Polling rate: This is a measurement of how frequently your USB device sends information to your computer. The default polling rate for pretty much every USB device out there is 125Hz. This introduces up to about 8ms of latency. Through some pretty scary registry tweaks it's possible to increase this on most devices up to about 1000Hz (1ms latency), but tread carefully. I went too deep down the rabbit hole once and ended up breaking my USB controllers, which made it impossible to interact with my computer through anything using a USB connection (thank god for remote desktop, which allowed me to log in and fix the driver issue). There are a few high-end gaming mice that allow you to up the polling rate through software, but those are few and far between.

Here's a link with more info about futzing with the polling rate under Windows 7.
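
For what it's worth, the latency from polling is just the poll interval. Here's my own back-of-envelope python sketch of the worst-case and average delay added by polling alone:

# delay added purely by USB polling: the PC only sees new mouse data once per poll
for poll_hz in (125, 250, 500, 1000):
    interval_ms = 1000.0 / poll_hz
    print("%4d Hz polling: up to %.1f ms, ~%.1f ms on average" % (poll_hz, interval_ms, interval_ms / 2))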

NOW HERE'S THE BIGGIE

4) Vsync: This is the main input lag culprit as I've surmised. It's the reason why I'm posting this in the 'video card' section of this forum. Enabling vsync means that your video card will wait for an entire frame to render and display before sending out a new one. On your screen, you'll see one entire crisp image at a time. If disabled, you might be subject to 'screen tearing'. Unfortunately, enabling vsync means you're most likely going to experience quite a noticeable amount of input lag, no matter how much you've futzed about with items 1-3 in this list. As long as your video card is able to render more frames than your refresh rate is capable of (generally anything over 60fps), your display will fall behind by at least one frame. This, in combination with other input lag variables, can really ruin your gameplay.

This phenomenon drives me completely bananas.
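
If it helps, here's roughly how I picture a plain double-buffered vsync'd loop working, written out as a toy python sketch with made-up numbers (so don't take the exact figures as gospel):

import time

REFRESH = 1.0 / 60                     # one vblank every ~16.7 ms

def render_frame():
    time.sleep(0.005)                  # pretend the GPU needs 5 ms per frame

next_vblank = time.time()
for frame in range(5):
    render_frame()                             # the frame is finished early...
    wait = max(0.0, next_vblank - time.time())
    time.sleep(wait)                           # ...but the swap still has to wait for the vblank
    next_vblank += REFRESH
    print("frame %d waited %.1f ms before being shown" % (frame, wait * 1000))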

There is hope, however. I've heard of at least two ways to combat the input lag introduced through vsync. One of which I'm severely confused about and could use some input on.

A). Frame limiting: THIS! THING! WORKS! It's also pretty rare. Some games give you an option to manually limit your framerate to whatever you set it to. This functionality, for some reason, is usually buried within .ini files or console commands, but I've found it to be one of the few viable weapons in the war against input lag. Vsync, by its very nature, limits your framerate to whatever your monitor's refresh rate is (usually 60Hz), but it still buffers those one or two frames. By both enabling vsync AND manually setting the framerate to 59 (or 1 frame below your refresh rate), it seems that the frame is displayed as soon as it's finished rendering, just as if your video card weren't capable of rendering over 60fps. It's the miracle cure for input lag, providing you with a seamless picture with no laggy side effects. The only problem is that it's horrifically uncommon. The 'fps_max' console command works for most source games (excluding the l4d series for some stoopid reason), and I've heard there's an .ini variable for most Unreal engine games. Use this. Not only does it work, but it's generally accepted that it works, as opposed to the following.
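
In case anyone's curious, this is basically all I imagine a frame limiter does under the hood. A rough python sketch with made-up names, not how any particular engine actually implements fps_max:

import time

def limited_loop(fps_cap=59, frames=5):
    budget = 1.0 / fps_cap             # time slice each frame is allowed
    deadline = time.time()
    for _ in range(frames):
        # render_frame() would go here
        deadline += budget
        spare = deadline - time.time()
        if spare > 0:
            # burn off the spare time so the next frame starts fresh
            # instead of piling up in a buffer ahead of the display
            time.sleep(spare)

limited_loop()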

B.) TRIPLE BUFFERING!!: This is the most confusing thing ever documented in god's green internet.

Here's a great article describing how triple buffering SHOULD work, but I'm not at all convinced.

Supposedly, triple buffering allows your video card to continue to render frames without waiting for one to display. In an ideal world, your video card will buffer two images, one after the other, and send the most recently completed one out as soon as an entire image is displayed. In theory, it SHOULD drastically reduce input lag, but it doesn't always seem to deliver as promised.
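
Here's how I understand the 'good' version is supposed to work, written out as a toy python sketch (hugely simplified, just two back buffers and a counter):

back_buffers = {"A": None, "B": None}  # which frame number each back buffer holds
drawing_into = "A"
newest = None

def gpu_finishes(frame):
    # finish a frame, then immediately start the next one in the other buffer
    global drawing_into, newest
    back_buffers[drawing_into] = frame
    newest = drawing_into
    drawing_into = "B" if drawing_into == "A" else "A"

for frame in range(1, 6):              # a fast card: five frames before one refresh
    gpu_finishes(frame)

# at the refresh the display grabs whichever buffer finished most recently;
# older frames just get overwritten instead of queuing up
print("this refresh shows frame", back_buffers[newest])   # -> 5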

There doesn't seem to be any general consensus whatsoever on whether or not it alleviates input lag or contributes to it. Some say that enabling triple buffering means your display is always showing the most recently completed frame as opposed to one that's been held in a buffer. Some report that triple buffering means that by buffering more than one frame, you're waiting that much longer to see what your graphics processor has been generating. Confused?! So am I?!?!

Every forum I've been to on this subject has been split pretty much down the middle on triple buffering. Some swear by it, while others curse its name. One possible contributor to this rift is the fact that two completely different rendering methods are often grouped under the same 'triple buffering' label. One of which being the genuine "glinda the good witch of the north" triple buffering, and the other being the "evil wicked witch of the west" version known as 'flip-queuing'. 'Flip-queuing', as far as I understand it (which isn't very far), puts all of your frames in a big line and has everyone wait their turn. Apparently some games/drivers implement 'flip-queuing', which does have positive side effects like smoothing out movement and making things very fluid and consistent, but label it misleadingly as 'triple-buffering'. All of this inconsistency makes it a nearly impossible subject to test and read up on.

It's also pretty rare for a game to come equipped with a triple buffering option. It's possible to force triple buffering through Nvidia drivers, but I've heard that it only works for OpenGL rendering. For Direct3D, there's a utility called D3DOverrider which comes included in the Rivatuner graphics tweak utility. I've messed with these a few times, but it's difficult to tell if it makes any difference. The only thing I was able to conclude was that it was in no way the magical cure-all some forums (and very strongly opinionated forum posters) had led me to believe.

Please, if anyone out there has any input or information on vsync, triple buffering, or any other tool against input lag, please let me know.

It's extraordinarily difficult to test this stuff seeing as you're dealing with milliseconds. I've spent hours tapping controller buttons and moving my mouse about trying to determine if a tweak has actually made any difference.

Here's a glorified LCD stopwatch which some people use to test for input lag. They set up a high speed camera, set their video card to display the stopwatch on both a CRT display and the LCD that they choose to test, and take a snapshot. By subtracting the CRT's value from the LCD's, you're able to determine how many milliseconds your display is falling behind.

Here's another handy utility which relies on human response time to gather its information. You just click a button when the stick turns blue, sort of like playing guitar hero. It's a good way to get a general feel for relative display response times, but it's far from perfect. I used to rely on this test, and I learned a lot about vsync (especially in regards to the default 'aero' theme in vista), but it's too reliant on the human factor to be conclusive.

All I want is a clear picture. One with no screen tearing, that pops up in real time like the good ol' NES days (as a depressing side note, even NES emulators are subject to vsync input lag).

Is there anything I should know about that might be able to shed some light on the subject or help? What are your opinions on triple buffering and such? Is this issue going to improve as new technology is developed? Is it going to get worse as LCD displays get larger and streaming technology like OnLive gets more and more popular? Are there any 3rd party frame limiters out there that might be able to help? Any info or insight would be greatly appreciated.

P.S. This doesn't seem to be an issue in modern consoles like the Xbox 360 and PS3, at least in regards to the input lag introduced by VSYNC. Everything seems to render in real time, while remaining perfectly synced to the refresh rate of the monitor. What's the deal? If my $200 gaming device can muster it, why can't my $800 pc?


WALL OF TEXT/ TOO LONG DIDN'T READ VERSION:

What's up with input lag? Is triple buffering supposed to help or hurt?
 
I would have thought alternate frame rendering using SLI/Crossfire would also add to input lag. By definition you are rendering the next two frames at once (one on each card), which I would have thought means another frame of input lag.
 
Great thread about triple buffering (use it with vsync): http://www.neogaf.com/forum/showthread.php?t=366413

No, it does not contribute to input lag any more than vsync does normally. It does stop vsync from limiting you to 30 fps when you can actually render 55 (but not 60).

ATI Tray Tools allows you to drop the flip queue on ATI cards, just like that Nvidia tool does for Nvidia cards.

I've updated everything about my setup to reduce input lag: 120hz monitor, 1000hz razer mouse, no flip queue. I honestly can't say there's a noticeable difference from my 60hz monitor, wireless mouse and a 3 frame flip queue in terms of input lag, but I never really noticed it to begin with.

Network lag still bothers me much more than anything else.
 
Hi twotone, very inspiring post. I've messed around with exactly this question a lot in the past. Being mainly an FPS player this issue is particularly important to my experience. The biggest things that I've found to help are the following:

* A good mouse with a high input rate, I have a Logitech G5, can do something like 2500 dpi (anything over 1200 is good)
* An adequate video card. 5870, good enough to render plenty of extra frames ahead of time/simultaneously.
* Good CPU, OC'd over 3.0GHz, I'm at 3.6, although 3.2 is fine as well.
* Remove the enhanced mouse precision option, and/or mouse smoothing as you suggested.
* Increase mouse polling rate to 1000Hz or higher (if not already there).

I cannot play games without vsync anymore because the tearing kills me. Those things above have given me a good enough experience where I'm not asking myself every time I sit down if I can make it better.

A comment on the last two things: changing the polling rate and removing any mouse smoothing does take getting used to. It will feel weird if you are used to playing with them at their defaults. I'm not even talking hours, I'm talking days or weeks. However, you *should* feel this difference to be positive.

You will never be able to do anything about network lag, so be sure you aren't confusing that for the graphical lag you are talking about -- depending on what your favorite game might be.

The biggest single impact for me out of all the things I listed above is a proper mouse. Also, don't ignore your mousepad either. I use a Ratpadz, the best surface I've ever come across for a mouse.

I know this explanation wasn't very technical and won't help you understand what's going on internally with the frames, but all put together it creates about as good an experience as you can get with input lag -- which you will never fully eliminate, but which you can grow accustomed to.
 
thanx for the replies guys.

apocrypha - i've heard, and somewhat deduced, that vsync doesn't add any lag whatsoever as long as your framerate never goes above 60fps. things display just about as soon as they're rendered in that case. i've never come in contact with a game where the framerate was cut in half, except for bioshock, which had a tendency to jump from 60fps to 30 every so often.

nvidia control panel has a 'max pre-rendered frames' setting which is set to 3 by default. i've always reduced it to 0, but i'm not exactly sure if it makes a difference. some say these 3d settings only apply to opengl games, which aren't very common amongst newer stuff. plus i also read somewhere that this only applies to cpu load as opposed to gpu, but i have no idea what the implications for that would be.

wow, a 120hz monitor?! so vsync would display 120 fps? that sounds wonderful. i'd hope that'd clear some of this nonsense up. any issues with timing in games like fallout 3 and beathazard which rely on a set framerate?

oh, and i just upped the polling rate a few weeks ago on my logitech mx518 (which i adore). it does make a noticeable difference, but it's not immune.

i'm still not completely convinced about what triple buffering does. there are so many strong opinions in either direction. yours is the first i've encountered claiming that it makes no difference whatsoever.


g-money

tearing tears me up too. i don't understand why my roomie's xbox never seems subject to input lag OR screen tearing. what magical technology have they developed to avoid it, i wonder. as much as it gets to me, i hate lag even more.

oh, and i mostly test input lag in single player games to avoid network lag confusion. at least i know that there's very little i can do about network lag.

i felt born again after i removed mouse acceleration/smoothing. it did take some getting used to, but it answered the age-old question of "how come i always suck as sniper". the polling rate is a much more subtle change, but also welcome. i've heard that console controllers poll 1000 times a second, so why are usb controllers stuck at 125 by default? do you think usb3 will remedy any of this?

my computer isn't the best in the world, but it can handle anything you throw at it on medium settings. amd x2 5800+ 3ghz processor, 6gigs of ddr2, good fast sata 1tb hd. geforce 9800gt graphics card.

the one thing i've noticed which has just started to piss me off is a slight graphical hitch every second or so. it's only about 1 frame, and i've happily ignored it 'till now, but i just downloaded the demo of arkham asylum on my xbox and noticed how smooth it looks compared to my pc version. wtf? i have yet to tear this issue apart, but i intend on futzing. hopefully i won't get so obsessed that i end up reformatting my computer again.

thanx for the replies you guys. keep up the good fight.
 
I think a large part of the confusion about triple buffering is that it's difficult to tell if it's on. It's obvious when Vsync is on - the tearing mostly goes away. But how does the user detect that triple buffering is working? If the user is having the halved framerate with Vsync and it goes away when triple buffering is enabled, then that's one way, but when you're not having that problem, how do you know? If you are not having the halved framerate problem, does triple buffering actually do anything different?
 
2) Mouse acceleration: also known as "enhance pointer precision" under the Windows control panel settings. It's on by default on pretty much every computer out there. It basically means that the faster you move your mouse, the faster the pointer travels across the screen. It also means that it takes at least two frames to get a reference as to how fast the mouse is moving before implementing the movement on the screen, adding about 16ms of input lag (as I understand it). Unchecking the box in control panel seems to do the trick. Some games like bioshock seem to want to force this acceleration, adding to the input lag problem. I think it's a good idea to leave this setting off, but to each his own.

I don't think mouse acceleration adds any input lag. I haven't investigated it, but on the surface it looks like it would just use a floating average and accelerate based off of that - way easier than the buffering method you are talking about and doesn't have any input lag.
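
Something along these lines, a quick python sketch of what I mean (the numbers are invented):

avg_speed = 0.0
SMOOTHING = 0.3      # weight of the newest sample in the running average
BASE_GAIN = 1.0
ACCEL = 0.05         # extra gain per count/ms of average speed (made-up numbers)

def apply_acceleration(dx, dy, dt_ms):
    global avg_speed
    speed = (dx * dx + dy * dy) ** 0.5 / dt_ms
    avg_speed = (1 - SMOOTHING) * avg_speed + SMOOTHING * speed
    gain = BASE_GAIN + ACCEL * avg_speed
    return dx * gain, dy * gain       # scaled immediately, nothing is held back a frame

print(apply_acceleration(10, 0, 8))   # slow move: close to 1:1
print(apply_acceleration(80, 0, 8))   # fast move: scaled up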

Enabling vsync means that your video card will wait for an entire frame to render and display before sending out a new one. On your screen, you'll see one entire crisp image at a time. If disabled, you might be subject to 'screen tearing'. Unfortunately, enabling vsync means you're most likely going to experience quite a noticeable amount of input lag, no matter how much you've futzed about with items 1-3 in this list. As long as your video card is able to render more frames than your refresh rate is capable of (generally anything over 60fps), your display will fall behind by at least one frame. This, in combination with other input lag variables, can really ruin your gameplay.

This phenomenon drives me completely bananas.

Not really. If you have vsync on and your video card can exceed the monitor's refresh rate, then your input lag will always be less than 1 frame. While frame A is being sent to the monitor, frame B is being rendered. The card then waits until frame A is fully sent before swapping the buffers so B is ready to be sent and then moves on to rendering C. There is no backlog of frames. And actually, if your card renders at exactly 60 fps, vsync will add *zero* input lag. The input lag comes from when your card exceeds 60 fps, in which case you will have some fraction of 1 refresh rate of input lag (but always less than 1 refresh rate). And if your card can't maintain 60 fps, then you will also get more input lag, but still less than 1 refresh rate of additional lag.

So if you have a 60hz monitor, vsync will add *no more* than 16 ms of input lag (and only up to 16 ms if your card can render instantly). If your card can pull 120fps, then your input lag will basically be about 8 ms due to vsync.
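
To put rough numbers on it, here's a simple back-of-envelope script assuming the card starts rendering right after each swap:

REFRESH_MS = 1000.0 / 60              # 16.7 ms between refreshes on a 60hz monitor

for render_fps in (60, 90, 120, 300):
    render_ms = 1000.0 / render_fps
    # the frame finishes render_ms after the last swap, then sits until the next vblank
    wait_ms = max(0.0, REFRESH_MS - render_ms)
    print("%3d fps render -> frame waits ~%.1f ms for the vblank" % (render_fps, wait_ms))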

A). Frame limiting: THIS! THING! WORKS!

No it doesn't. It doesn't eliminate tearing, and it has the *exact same* input lag characteristics as vsync when you match or exceed the cap. You get less input lag if you are below the cap, but then again in that case the main source of input lag would be how long it is taking to render.

B.) TRIPLE BUFFERING!!: This is the most confusing thing ever documented in god's green internet.

Here's a great article describing how triple buffering SHOULD work, but I'm not at all convinced.

Supposedly, triple buffering allows your video card to continue to render frames without waiting for one to display. In an ideal world, your video card will buffer two images, one after the other, and send the most recently completed one out as soon as an entire image is displayed. In theory, it SHOULD drastically reduce input lag, but it doesn't always seem to deliver as promised.

There doesn't seem to be any general consensus whatsoever on whether or not it alleviates input lag or contributes to it. Some say that enabling triple buffering means your display is always showing the most recently completed frame as opposed to one that's been held in a buffer. Some report that triple buffering means that by buffering more than one frame, you're waiting that much longer to see what your graphics processor has been generating. Confused?! So am I?!?!

That's because there are two entirely different things called "triple buffering". Triple buffering itself is as it's described in that anandtech article, and is fantastic. The problem is that DX doesn't support that at all. Instead, DX uses a render ahead queue, which queues up completed frames and shows them in the order they were rendered. This means that if your card is exceeding your refresh rate, you could end up with 5 frames of input lag. And when the queue is filled (which can be 3 to 7 frames), the card just waits.
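
Here's the difference in a nutshell. A toy python sketch of the render ahead queue, using a 3-deep queue since that's the usual default:

from collections import deque

MAX_QUEUE = 3                          # typical driver default
queue = deque()
frame = 0

for refresh in range(1, 6):
    while len(queue) < MAX_QUEUE:      # a fast card refills the queue every refresh,
        frame += 1                     # then stalls once it's full
        queue.append(frame)
    shown = queue.popleft()            # the display takes the *oldest* queued frame
    print("refresh %d shows frame %d (newest rendered: %d)" % (refresh, shown, frame))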

P.S. This doesn't seem to be an issue in modern consoles like the Xbox 360 and PS3, at least in regards to the input lag introduced by VSYNC. Everything seems to render in real time, while remaining perfectly synced to the refresh rate of the monitor. What's the deal? If my $200 gaming device can muster it, why can't my $800 pc?

Consoles typically render most shooters at 30 fps with a render ahead queue. They have *massive* input lag compared to a PC. You probably just don't notice because a thumbstick is nowhere near as precise or responsive as a mouse.

So no, your $200 gaming device can't muster it, not even close.
 
I think a large part of the confusion about triple buffering is that it's difficult to tell if it's on. It's obvious when Vsync is on - the tearing mostly goes away. But how does the user detect that triple buffering is working? If the user is having the halved framerate with Vsync and it goes away when triple buffering is enabled, then that's one way, but when you're not having that problem, how do you know? If you are not having the halved framerate problem, does triple buffering actually do anything different?

Exactly, there's one thing I still want to test with triple buffering though. When I was recording on fraps w/ vsync my frames would always drop to 60 and then eventually 30, and I think that has to do with the double buffering. Triple buffering might fix that.

Yes, vsync on a 120hz monitor syncs to 120fps, the extra smoothness is noticeable sometimes, it looks pretty damn nice but when it drops to 60fps it's the same as any other monitor anyway.

Honestly, I wouldn't be concerned about it at all if you haven't noticed any input lag.


If you aren't sure what input lag feels like play Deadspace on PC, turn on vsync in game and nowhere else and you will get the worst input lag ever.

I don't think mouse acceleration adds any input lag. I haven't investigated it, but on the surface it looks like it would just use a floating average and accelerate based off of that - way easier than the buffering method you are talking about and doesn't have any input lag.

I'd have to agree here, although it does feel like it adds lag sometimes. In the Borderlands menus it feels like your cursor has some deceleration and always goes farther than you think it should.

Consoles typically render most shooters at 30 fps with a render ahead queue. They have *massive* input lag compared to a PC. You probably just don't notice because a thumbstick is nowhere near as precise or responsive as a mouse.

So no, your $200 gaming device can't muster it, not even close.

That and they are usually played on TVs and possibly through receivers, I have played on some consoles that feel like upward of 150ms input lag easily.
 
One thing I can never find out much info on is vsync "stutter" that I've really only seen in the newest cards (both ATI and Nvidia) on a 64-bit OS.
I know it's out there and I only started getting it when I switched to Win 7/64, but there never seems to be any mention of a fix.
Googling it always seems to put me into forums with people having the same thing, but the solution is always "turn vsync off." My Vista 32 experience with the same hardware is always butter smooth, so there has to be something else going on.
Anyone have any input there?
 
One thing I can never find out much info on is vsync "stutter" that I've really only seen in the newest cards (both ATI and Nvidia) on a 64-bit OS.
I know it's out there and I only started getting it when I switched to Win 7/64, but there never seems to be any mention of a fix.
Googling it always seems to put me into forums with people having the same thing, but the solution is always "turn vsync off." My Vista 32 experience with the same hardware is always butter smooth, so there has to be something else going on.
Anyone have any input there?

Can't say I've ever felt it or even heard of it. What games do you see this on?
 
Can't say I've ever felt it or even heard of it. What games do you see this on?

Far Cry 2, any Half-Life/source engine game, and BF2 are the main culprits. While moving with the keyboard and looking left or right (a circle strafe for instance) there's a jittery movement. In some cases with source games you'll see the jitter just by moving forward near a wall. Unreal engine games do it, but to a MUCH lesser extent.

I'm playing on my TV and a bluetooth mouse/keyboard BUT it still happens with wired devices on a normal PC monitor, too. I've tried everything, but the only fix for me has been turning vsync off...which pains me because I hate tearing.

Some games don't do it, though. Crysis doesn't appear to do it at all. Nor do Dragon Age, the Witcher, or any "slower" paced games.
 
I played Farcry 2 for a tiny bit, but I play TF2 almost nightly and I've never seen anything like that.

I think I know what you mean though, some sort of stutter where it looks like you've moved backward for a frame? I'd try resetting your 3D settings to default in CCC. I'll check on other stuff when I get home.
 
I've reset every setting I can think of, I have used 2 different (fast) hard drives that have 0% fragmentation, and I've even gone through a complete format, too.
I'm convinced it has something to do with my 5870, my TV, and Win7/64 not working together properly even though they work fine in Vista 32.
 
Domingo, is your refresh rate set to 60hz? Drivers updated? I have practically the same system (i7 920, 5870, Win7 x64) and I've never seen anything like that.
 
thanx for the info guys.

kllrnohj
I wouldn't be surprised if you were right about the acceleration. What originally led me to believe that mouse acceleration contributes to input lag was my frustration in trying to reduce lag in the original bioshock. This is the game that made me realize that input lag was a problem to begin with. With vsync enabled, there were absolutely terrible levels of input lag. The only positive thing was that the forums were rife with information on how to reduce or get rid of it. Of course turning vsync off did the trick, but it was also well documented that if you disabled bioshock's proprietary method of mouse acceleration through its .ini files, it would pretty much fix the problem. Mouse acceleration might not be too great of a contributor in every sense, but it sure is with bioshock.

I also must attest to using frame limiters. One thing I wish I had specified is that you must use it in conjunction with vsync. Yes, 'fps_max 59' when used by itself in source games will still have tearing, but with vsync enabled as well, it looks smooth as silk. It also, in my case, cures the input lag in hl2, tf2, etc. If you set the frame limiter to below the max refresh rate of your monitor, things pretty much render in real time. When I don't use the limiter, I get a noticeable increase in lag for some reason. It might not be the case for everyone, but it sure is for me.

You mention that triple buffering isn't an option when using DX. Does that mean that the 'triple buffering' option in D3DOverrider is actually this 'render ahead queue' you mention? If so, is it not a viable solution for reducing input lag at all? Thanx for the info, you obviously know a bit more about how this works than I do. It's mostly been trial and error/the world's most annoying guessing game with me.

In terms of the console input lag questions, i'm mostly referring to 2d games.
For instance, I'm a big fan of Castlevania: Symphony of the Night.
On Xbox, the thing renders beautifully at full resolution completely vsync'd with no perceivable input lag.
When being played through an emulator with vsync enabled, there's still a noticeable frame or two of input lag. This goes away with vsync disabled of course, but i'd like to think that my computer could pull off anything the wee little xbox can. It might be the emulator's fault in this case though.

As for the vsync stutter, I'd love to figure out a way for this to go away. I too have noticed it, and I too am using windows 7 64-bit. I don't, however, have any basis of comparison as to how 32-bit operating systems render these things, so it's difficult to figure out the cause. Batman: Arkham Asylum and Mirror's Edge both have a noticeable stutter to them. I used to have a stutter when playing TF2 as well, but I noticed that if I closed my internet explorer as well as my podcast aggregator, the stutter went away completely. I think something might have been using the HD or network resources in the background. Hell, I STILL think that's what's going on in the games that still stutter, I just can't figure out a way to test this.

Thanx for the help.
 
Vsync would be nice if it worked, but the input lag is too much. It feels like there's 100ms of lag behind it, which is not playable for FPS games. In every game I have played, Vsync has had a bad effect on controls.

It does sort out the screen tearing though (MW2 muzzle flash was very bad the last time I played it; Vsync on for BF2 was cool as it makes it so you can see muzzle flash from max range, but you still have the screen lag).

The lag is more of an issue if the game is running at a faster FPS than the monitor that Vsync is trying to keep to. Set the game to 59 or 60 or whatever your monitor runs at and it should be smooth.
 
Domingo, is your refresh rate set to 60hz? Drivers updated? I have practically the same system (i7 920, 5870, Win7 x64) and I've never seen anything like that.

Yup. 60hz refresh, and I have all of the latest drivers across the board.
 
That's because there are two entirely different things called "triple buffering". Triple buffering itself is as it's described in that anandtech article, and is fantastic. The problem is that DX doesn't support that at all. Instead, DX uses a render ahead queue, which queues up completed frames and shows them in the order they were rendered. This means that if your card is exceeding your refresh rate, you could end up with 5 frames of input lag. And when the queue is filled (which can be 3 to 7 frames), the card just waits.

Interesting. So I take it OpenGL does support "true" triple buffering, which can be enabled via the NVIDIA driver. I assume D3DOverrider (which I always use) just enables the render ahead queue then? In BC2 at least, I'm not sure if that's related to the RenderAheadLimit config variable which I have set to 0 to minimize input lag. I have also done the same in the NVIDIA control panel.
 
One thing I can never find out much info on is vsync "stutter" that I've really only seen in the newest cards (both ATI and Nvidia) on a 64-bit OS.
I know it's out there and I only started getting it when I switched to Win 7/64, but there never seems to be any mention of a fix.
Googling it always seems to put me into forums with people having the same thing, but the solution is always "turn vsync off." My Vista 32 experience with the same hardware is always butter smooth, so there has to be something else going on.
Anyone have any input there?

If your FPS drops below 60fps, you will end up rapidly alternating between 30fps and 60fps, which will make everything feel choppy and slow and terrible. Triple buffering (either real or render ahead) solves the problem, so try it with one of those.
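
If you want to see why it snaps between 60 and 30 instead of giving you, say, 50, here's the basic math (a little sketch, nothing driver-specific):

import math

REFRESH_MS = 1000.0 / 60               # one vblank every ~16.7 ms

for render_ms in (10, 16, 17, 25, 33):
    # with plain double buffering, a frame that misses its vblank has to wait
    # for the next one, so it occupies a whole number of refresh intervals
    refreshes = max(1, math.ceil(render_ms / REFRESH_MS))
    print("%2d ms frame -> held for %d refresh(es) -> effective %d fps"
          % (render_ms, refreshes, round(60 / refreshes)))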

I also must attest to using frame limiters. One thing I wish I had specified is that you must use it in conjunction with vsync. Yes, 'fps_max 59' when used by itself in source games will still have tearing, but with vsync enabled as well, it looks smooth as silk. It also, in my case, cures the input lag in hl2, tf2, etc. If you set the frame limiter to below the max refresh rate of your monitor, things pretty much render in real time. When I don't use the limiter, I get a noticeable increase in lag for some reason. It might not be the case for everyone, but it sure is for me.

That is going to make things much, much worse, or do nothing at all. Quite possibly the game simply ignores the fps_max command with vsync on since it is redundant. Otherwise, your game is going to run at a nice 30fps since you aren't able to hit 60fps, which is going to increase input lag.

When you do that what does FRAPS report your FPS as?

You mention that triple buffering isn't an option when using DX. Does that mean that the 'triple buffering' option in D3DOverrider is actually this 'render ahead queue' you mention? If so, is it not a viable solution for reducing input lag at all? Thanx for the info, you obviously know a bit more about how this works than I do. It's mostly been trial and error/the world's most annoying guessing game with me.

I would assume D3DOverrider is using the render ahead queue, as real triple buffering must be done manually in DX, but I don't know. It will not reduce input lag at all, and will actually increase it. But it does create smoother action.

In terms of the console input lag questions, i'm mostly referring to 2d games.
For instance, I'm a big fan of Castlevania: Symphony of the Night.
On Xbox, the thing renders beautifully at full resolution completely vsync'd with no perceivable input lag.
When being played through an emulator with vsync enabled, there's still a noticeable frame or two of input lag. This goes away with vsync disabled of course, but i'd like to think that my computer could pull off anything the wee little xbox can. It might be the emulator's fault in this case though.

There are a myriad of possibilities, from the emulator being at fault to you being at fault. It is fully possible that the xbox is having greater input lag but you just aren't noticing it because you are used to the xbox's input lag. Heck, you've already said your TV has some massive input lag, so the lag is there.

Vsync would be nice if it worked, but the input lag is too much. It feels like there's 100ms of lag behind it, which is not playable for FPS games. In every game I have played, Vsync has had a bad effect on controls.

Vsync adds *at most* 16ms of input lag on a 60hz monitor, never more. It will never get anywhere close to 100ms unless you have a 10hz display.

The lag is more of an issue if the game is running at a faster FPS than the monitor that Vsync is trying to keep to. Set the game to 59 or 60 or whatever your monitor runs at and it should be smooth.

That doesn't solve the input lag problem at all. Hell, limiting the FPS to the refresh rate is doing the same damn thing vsync is in the first place, just without solving the tearing issue.

Interesting. So I take it OpenGL does support "true" triple buffering, which can be enabled via the NVIDIA driver. I assume D3DOverrider (which I always use) just enables the render ahead queue then? In BC2 at least, I'm not sure if that's related to the RenderAheadLimit config variable which I have set to 0 to minimize input lag. I have also done the same in the NVIDIA control panel.

OpenGL does support real triple buffering, which can be enabled in both the nvidia and ATI control panels. The RenderAheadLimit variable might be controlling the maximum number of render ahead frames, in which case setting it to 0 will effectively disable render ahead entirely, with 1 solving the below 60fps problem while adding ~16ms of input lag when exceeding 60fps. But I haven't used an Nvidia card for a couple of years now, so I'm not all that familiar with the option, I'm just going off of what the name implies.
 
If you aren't sure what input lag feels like play Deadspace on PC, turn on vsync in game and nowhere else and you will get the worst input lag ever.

That's because Vsync in Deadspace locks you to 30fps (half your refresh rate). It isn't vsync's fault, it's retarded developers at fault. The solution seems to be to use either Nvidia's control panel vsync option or D3DOverrider to enable regular vsync.
 
If your FPS drops below 60fps, you will end up rapidly alternating between 30fps and 60fps, which will make everything feel choppy and slow and terrible. Triple buffering (either real or render ahead) solves the problem, so try it with one of those.

Been there, done that. The only option that I know of to force that is with D3DOverrider and it doesn't fix anything.
Funny enough using a gamepad seems to eliminate the issue, but not every game supports one...plus I'd much rather play FPS games with a mouse.
 
Been there, done that. The only option that I know of to force that is with D3DOverrider and it doesn't fix anything.
Funny enough using a gamepad seems to eliminate the issue, but not every game supports one...plus I'd much rather play FPS games with a mouse.

Huh, odd. I haven't seen any issues on Win7 64-bit with Vsync on my rig. Did you force vsync AND triple buffering with D3DOverrider, disabling the ingame vsync?
 
The only game I've noticed input lag from Vsync was Battlefield 1942. No other game have I noticed any. Well, maybe Deadspace, but changing VSync from the game to force on in the CP, along with enabling triple buffering, fixed that.
 
I'm going to make a very basic suggestion to start out with, if you haven't already tried it:

Format your box and start over. This may cure many things including input lag (which may be a result of not just the things you are mentioning). If your issues keep occurring, at least you can truly rule the rest of the system out. I'm only saying this because it seems like you've tried a myriad of solutions to eliminate your input lag and as we all know, a system can easily get cluttered even if you uninstall things you aren't using anymore.

Since you mention stuttering, one other thing to check is your sound drivers. Update these, whether they are on-board or a separate audio card. You could be surprised to know that corrupt audio drivers (or simply badly coded ones) can cause performance lag. This is of course when paired with medium to high quality settings in a game, although sometimes even low (or any/all) settings are affected.

If you are getting stuttering in a source game, it's going to be either audio related or potentially hard drive related (cache memory). So needless to say I'm sure you do this, but keep your system defragged (Win 7 has an excellent built in defragger).

I'll go back and read, but did you mention what monitor/lcd you are using?
 
I’m coming off of a format 3 weeks ago, both of my disks are 0% fragmented (primary = a 2TB WD Black, Secondary is a 1.5TB Seagate…I’ve tried running both as a primary), and my drivers are all the latest and greatest for everything. I have no software installed that I don’t use and my Win 7 install was clean on a new partition.
I’m currently using the HDMI audio from the 5870 and the Realtek driver for it, but I did try the baseline ATI driver to see if that did anything different. It’s going through a receiver, however the choppiness is there when plugged straight into the TV. At one time I was using an X-Fi PCIe, with the same issues.
My monitor is actually my TV, which I have to guess is the issue. I can only guess that while Vista vibed fine with it, Win 7 doesn’t. I’ll see if I can dig out an old monitor to test, too. I’ve tried using both HDMI and DVI/HDMI (which scale differently) but the effects are the same.
 
over 3 different systems i have had, i've always had input lag with Vsync. move the mouse and there is a delay before the screen moves (not sure what the delay would be, but it seems like more than 10ms. i thought it was a minimum of 15-25ms added when vsync was on, but it seems way higher than that)
 
Vsync creates an artificial delay on frames being output so that they're timed with the monitor's refresh, so fundamentally this delay is felt as "input lag"

However I don't think it ends there...

I put forward a theory in the last major vsync thread about "seeing" more information than you can display with a set refresh rate. Frames when they tear can portray more information in one monitor refresh than if you just sync one frame with that same refresh.

If, for example, you have a refresh that's made up of 10 different frames all sliced together, then the relative spacing between objects that are dissected by those frames tells us something about the movement of the objects relative to the point of view.

For example, if you rotate your view and stop mid-refresh, you see that stop when tearing lots of frames together, whereas with vsync on we don't see anything in between 2 separate refreshes. You have to wait for the next frame to see that you've stopped moving. I believe that this effect makes games feel more responsive; it makes something like 200fps more responsive than 60fps, even though you can only display 60 full frames on a modern monitor.
 
The problem with tearing is that it adds new information to the top of the screen, and that's about it. There are some things tearing will never show you, and some things it can confuse you about. I'd rather just not deal with it and take my 8ms (max @ 120hz) extra input lag. There are other sources that are much greater contributing factors here.
 
The problem with tearing is that it adds new information to the top of the screen, and that's about it. There are some things tearing will never show you, and some things it can confuse you about. I'd rather just not deal with it and take my 8ms (max @ 120hz) extra input lag. There are other sources that are much greater contributing factors here.

Well, it'd be the bottom half of the screen, but other than that I agree completely. I disagree with Princess's assertion that we can figure things out from the "torn" information. We've disputed that before :)
 
My monitor is actually my TV, which I have to guess is the issue. I can only guess that while Vista vibed fine with it, Win 7 doesn’t. I’ll see if I can dig out an old monitor to test, too. I’ve tried using both HDMI and DVI/HDMI (which scale differently) but the effects are the same.

Depending on your TV, it can have some serious response time lag. TVs aren't designed the same as LCD monitors. For the best gaming experience you want less than 16ms. Most hardcore gamers will have 8ms or less (mine is 2ms). Your TV could have anywhere from 16, 24, 32ms or worse! If you add that to your input lag time (it will always be there), that makes it pretty atrocious no matter what else you do.
 
kllrnohj - i went back and tested using fps_max 58 along with vsync in TF2 with fraps enabled, and it does register the frames per second as being at a solid 58, sometimes dipping to as low as 47. here's a pic
http://yfrog.com/9g58fpsj

like i said, i don't know why it works on my system, but i assure you that it does. i think the phenomenon is repeatable in the free source game 'alien source' if you want to test it yourself. simply enable the developers console and vsync, type into console 'fps_max 58', and see if there's any noticeable difference in input lag. i did notice in fraps that if i left 'fps_max' at its default value of '300' with vsync enabled, the framerate stayed at around 60, but it jumped back and forth very rapidly from about 60-64. is it possible that my monitor doesn't have a precise 60hz refresh rate, and that this might be causing my problems?

oh, and in terms of the xbox lag, i only bought the xbox last weekend. i've been obsessing over input lag for about 6 months now, and i couldn't bear to hook it up to my roomie's big laggy TV. i have it connected directly to my computer monitor in order to avoid as much lag as possible. my acer 23" monitor has an input lag of about 14ms compared to the tv's lag of 55ms (as measured through the autocalibrate feature used with the rock band 2 fender guitar controller). i ran castlevania in the emulator through the pc while playing it on the xbox 360 version simultaneously, switching the monitor's input from one to the other in order to get the comparison.

i might not have all of the reasoning behind why the input lag is there, but i assure you i have done a good bit of homework.


edit: i have noticed framerates jumping from a precise 60 to 30 in bioshock 1 with vsync enabled, but it doesn't seem to do that in source games.
 
kllrnohj - i went back and tested using fps_max 58 along with vsync in TF2 with fraps enabled, and it does register the frames per second as being at a solid 58, sometimes dipping to as low as 47. here's a pic
http://yfrog.com/9g58fpsj

like i said, i don't know why it works on my system, but i assure you that it does. i think the phenomenon is repeatable in the free source game 'alien source' if you want to test it yourself. simply enable the developers console and vsync, type into console 'fps_max 58', and see if there's any noticeable difference in input lag.

Interesting. I can't detect any input lag in any source games with vsync on, so I can't really investigate what is going on.

i did notice in fraps that if i left 'fps_max' at its default value of '300' with vsync enabled, the framerate stayed at around 60, but it jumped back and forth very rapidly from about 60-64. is it possible that my monitor doesn't have a precise 60hz refresh rate, and that this might be causing my problems?

I don't think it's your monitor. My initial reaction would be that the fps is simply being reported wrong. Do both source's FPS display and fraps agree with the bouncing between 60-64 fps thing? If so, that sort of stutter could be perceived as input lag even though it really isn't.

oh, and in terms of the xbox lag, i only bought the xbox last weekend. i've been obsessing over input lag for about 6 months now, and i couldn't bear to hook it up to my roomie's big laggy TV. i have it connected directly to my computer monitor in order to avoid as much lag as possible. my acer 23" monitor has an input lag of about 14ms compared to the tv's lag of 55ms (as measured through the autocalibrate feature used with the rock band 2 fender guitar controller). i ran castlevania in the emulator through the pc while playing it on the xbox 360 version simultaneously, switching the monitor's input from one to the other in order to get the comparison.

i might not have all of the reasoning behind why the input lag is there, but i assure you i have done a good bit of homework.

Then it's probably just the emulator. I assume the 360 version you are playing is an xbox live arcade version of the game or something?

edit: i have noticed framerates jumping from a precise 60 to 30 in bioshock 1 with vsync enabled, but it doesn't seem to do that in source games.

That would be vsync without triple buffering or render ahead queue. If that isn't happening in other games with vsync enabled, it means they are doing triple buffering or a render ahead queue.
 
in that case it might be this render ahead queue being on by default causing my problems with source games. it feels like it anyway. weird.

and yeah, it is the xbox live arcade version of castlevania.

and you were right about the misreporting of framerate. whereas the 'cl_showfps 1' fps count fluctuates between 60-64, fraps reports an even 60.

i'm guessing that without using fps_max 58, tf2 is queueing up two or three frames in a buffer before sending them out, which is why limiting the framerate is working to reduce the input lag. i sure wish it wouldn't by default.
 
I've been doing that fps_max 58 thing for a couple years now in CSS; it works well on many systems and monitors. It'll eliminate that horrid Vsync lag.

In other games, I prefer precise input over video smoothness so I disable Vsync.
 
I'm trying to do some testing right now with BC2 with regards to its VSync settings.

I'm using a GTX 460 with DX11. I noticed that even after I quit D3DOverrider and reopened the game, Fraps still shows my FPS dropping to the 40s and lower 50s. That's using VSync with the in-game menu. I disabled that and tried running D3DOverrider again, disabling triple buffering but enabling VSync from there, with the same results.

RenderAheadLimit is set to 0 in my BC2 config and in the NVIDIA control panel. Any thoughts? I would've thought BC2 was using the render ahead queue with VSync enabled, but it still occurs even with VSync disabled in-game and enabled using D3DOverrider or the NVIDIA driver.
 
hrm. help me wrap my brain around this question. are you saying that you expected different results when disabling d3doverrider apart from the 40-50fps? does something lead you to believe that bc2 is using a render ahead queue?

one thing that frustrates me about this entire process is that it's very difficult to determine which method of 'triple buffering' different programs are actually using. i wish there were a simple test or a resource that could help.
 
Why do you use d3doverrider to force Triple Buffering?

There is a setting in NV's control panel to enable triple buffering. I leave the setting on. Does this not work?
 