There's a ton of contradictory information on the net about input lag. I just wanted to gauge popular opinion on its causes and possible solutions.
Input lag = the amount of time between a button press/mouse movement and the visible response to said movement on the screen.
Here's some info, and here's some more.
16ms = about 1 frame of input lag at 60Hz. The human eye supposedly can't detect anything below about 30-50ms, but that's debatable.
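For the quick math: one frame lasts 1000ms divided by the refresh rate. A trivial sketch (the rates are just common examples):

```cpp
#include <cstdio>

// One frame lasts 1000 ms divided by the refresh rate, so at 60Hz each
// buffered frame represents roughly 16.7ms of potential lag.
int main() {
    const double rates_hz[] = {60.0, 75.0, 120.0};  // common refresh rates
    for (double hz : rates_hz) {
        std::printf("%5.0f Hz -> %4.1f ms per frame\n", hz, 1000.0 / hz);
    }
    return 0;
}
```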
As far as I can gather, there are several causes. Here they are in no particular order.
1) Native display lag: Every LCD display is going to introduce some measure of input lag. Personally, my huge Samsung 42" LCD TV adds about 60ms of input lag even in game mode. My 23" Acer monitor only adds about 14ms. These measurements come from the extraordinarily handy auto-calibrate feature in Rock Band 2, which sends a visual cue to the special guitar controller and measures how long the cue takes to arrive.
I've noticed that the 60ms of input lag is quite inconvenient when trying to play PC games on the big honkin' TV. 60ms is just enough to make mouse movement feel sluggish, especially when combined with vsync, but I'll get into that in a bit.
I've heard of people getting better results on large TVs by using the VGA input as opposed to HDMI, but for me it seems to make no difference whatsoever.
I've also heard that certain scaling options can contribute to lag, but the information there is also pretty contradictory. Supposedly running at the native resolution of the display is the best way to avoid introducing input lag through scaling, yet on my huge TV, the only way to reduce the input lag seems to be running at 640x480 or lower resolutions. I think it's because most of the time the TV is running all sorts of post-processing algorithms, but doesn't seem to bother when forced into outdated grandpa resolutions. Who knows.
2) Mouse acceleration: also known as "enhance pointer precision" in the Windows Control Panel. It's on by default on pretty much every computer out there. It means that the faster you move your mouse, the faster the pointer travels across the screen. It also means it takes at least two frames of reference to judge how fast the mouse is moving before the movement is applied to the screen, adding about 16ms of input lag (as I understand it). Unchecking the box in the Control Panel seems to do the trick. Some games like BioShock seem to force this acceleration back on, adding to the input lag problem. I think it's a good idea to leave this setting off, but to each his own.
Mouse smoothing is a similar phenomenon. It also uses two frames of reference to decide how to apply mouse movement, adding that same 16ms of lag. Both are pretty obsolete settings meant to smooth out the movement of old, dusty mouse balls (heh).
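To make the two ideas concrete, here's a made-up sketch; accelerate() and smooth() are hypothetical stand-ins for illustration, not Windows' actual algorithms:

```cpp
#include <cstdio>

// Made-up sketch of the two settings above; these are NOT Windows' actual
// algorithms, just illustrations of the ideas.

// Acceleration: the faster the mouse moves, the further each count travels.
double accelerate(double delta) {
    double speed = delta < 0 ? -delta : delta;
    return delta * (speed > 10.0 ? 2.0 : 1.0);  // crude two-step gain curve
}

// Smoothing: blend the current sample with the previous one. This is where
// the extra latency comes from -- the newest movement only counts for half
// until the following sample arrives.
double smooth(double current, double previous) {
    return 0.5 * (current + previous);
}

int main() {
    const double samples[] = {2.0, 12.0, 15.0, 3.0};  // raw per-frame deltas
    double prev = 0.0;
    for (double d : samples) {
        std::printf("raw %5.1f -> accelerated %5.1f, smoothed %5.1f\n",
                    d, accelerate(d), smooth(d, prev));
        prev = d;
    }
    return 0;
}
```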
3) Polling rate: This is a measure of how frequently your USB device sends information to your computer. The default polling rate for pretty much every USB device out there is 125Hz, which introduces about 8ms of latency. Through some pretty scary registry tweaks it's possible to push most devices up to about 1000Hz (1ms latency), but tread carefully. I went too deep down the rabbit hole once and ended up breaking my USB controllers, which made it impossible to interact with my computer through anything using a USB connection (thank god for Remote Desktop, which let me log in and fix the driver issue). There are a few high-end gaming mice that let you raise the polling rate through software, but those are few and far between.
Here's a link with more info about futzing with the polling rate under Windows 7.
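The latency math is simple enough: an input event waits, at worst, one full polling interval, and half of one on average. A quick sketch:

```cpp
#include <cstdio>

// An input event has to wait for the next USB poll: worst case one full
// polling interval, half of one on average.
int main() {
    const int rates_hz[] = {125, 250, 500, 1000};
    for (int hz : rates_hz) {
        double interval_ms = 1000.0 / hz;
        std::printf("%4d Hz: poll every %.1f ms, average added lag %.1f ms\n",
                    hz, interval_ms, interval_ms / 2.0);
    }
    return 0;
}
```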
NOW HERE'S THE BIGGIE
4) Vsync: This is the main input lag culprit, as far as I can surmise. It's the reason I'm posting this in the 'video card' section of this forum. Enabling vsync means your video card waits for the display to finish showing the current frame before it sends out a new one. On your screen, you see one entire crisp image at a time; with vsync disabled, you might be subject to 'screen tearing'. Unfortunately, enabling vsync means you're most likely going to experience a quite noticeable amount of input lag, no matter how much you've futzed about with items 1-3 on this list. As long as your video card can render more frames than your refresh rate allows (generally anything over 60fps), your display will fall behind by at least one frame. This, in combination with the other input lag variables, can really ruin your gameplay.
This phenomenon drives me completely bananas.
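To make the blocking behavior concrete, here's a toy simulation of my understanding of it: input is sampled when a frame starts, the GPU finishes quickly, and the finished frame then sits in the buffer until the next refresh. The 5ms render time is a made-up number; the point is that the lag comes out to a full refresh period anyway:

```cpp
#include <cmath>
#include <cstdio>

// Toy simulation of the buffering described above: input is sampled when a
// frame starts, the GPU needs only 5ms to render it (a made-up number), and
// with vsync on, the finished frame then waits for the next 60Hz refresh.
int main() {
    const double render_ms  = 5.0;
    const double refresh_ms = 1000.0 / 60.0;
    double t = 0.0;
    for (int frame = 1; frame <= 4; ++frame) {
        double sampled = t;                  // input read at frame start
        double ready   = sampled + render_ms;
        double shown   = std::ceil(ready / refresh_ms) * refresh_ms;  // next vblank
        std::printf("frame %d: input %5.1f ms, ready %5.1f ms, shown %5.1f ms"
                    " (lag %4.1f ms)\n",
                    frame, sampled, ready, shown, shown - sampled);
        t = shown;  // the render loop blocks until the swap completes
    }
    return 0;
}
```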
There is hope, however. I've heard of at least two ways to combat the input lag introduced by vsync, one of which I'm severely confused about and could use some input on.
A) Frame limiting: THIS! THING! WORKS! It's also pretty rare. Some games let you manually cap your framerate at whatever value you choose. For some reason this functionality is usually buried in .ini files or console commands, but I've found it to be one of the few viable weapons in the war against input lag. Vsync, by its very nature, limits your framerate to your monitor's refresh rate (usually 60Hz), but it still buffers those one or two frames. By both enabling vsync AND manually setting the framerate to 59 (or 1 frame below your refresh rate), it seems that each frame is displayed as soon as it finishes rendering, just as if your video card weren't capable of rendering over 60fps. It's the miracle cure for input lag, giving you a seamless picture with no laggy side effects. The only problem is that it's horrifically uncommon. The 'fps_max' console command works for most Source games (excluding the L4D series for some stoopid reason), and I've heard there's an .ini variable for most Unreal engine games. Use this. Not only does it work, but it's generally accepted that it works, as opposed to the following.
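For the curious, a frame limiter boils down to something like the minimal sketch below: do the frame's work, then sleep out the remainder of a fixed period. The printf stands in for the real input/render work; this isn't any particular game's implementation:

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

// Minimal sketch of the frame-limiting idea: do the frame's work, then sleep
// out the remainder of a fixed period so the loop can never outrun the cap.
int main() {
    using clock = std::chrono::steady_clock;
    const auto period = std::chrono::microseconds(1000000 / 59);  // cap at 59fps
    auto next = clock::now();
    for (int frame = 0; frame < 5; ++frame) {
        std::printf("frame %d\n", frame);     // placeholder for input + render
        next += period;
        std::this_thread::sleep_until(next);  // sleep out the rest of the period
    }
    return 0;
}
```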
B) TRIPLE BUFFERING!!: This is the most confusing thing ever documented on god's green internet.
Here's a great article describing how triple buffering SHOULD work, but I'm not at all convinced.
Supposedly, triple buffering allows your video card to keep rendering frames without waiting for one to display. In an ideal world, your video card alternates between two back buffers and sends out the most recently completed one as soon as the current image finishes displaying. In theory it SHOULD drastically reduce input lag, but it doesn't always seem to deliver as promised.
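Here's a toy model of that ideal version, assuming a GPU that finishes a frame every 6ms (made-up number) and never has to wait. At each vblank the display grabs whatever finished most recently, so the frame on screen is never more than a few milliseconds stale:

```cpp
#include <cstdio>

// Toy model of the "ideal" triple buffering described above: the GPU
// ping-pongs between two back buffers and never blocks, and at every vblank
// the display grabs whichever frame finished most recently. The 6ms render
// time is a made-up number for illustration.
int main() {
    const double render_ms  = 6.0;
    const double refresh_ms = 1000.0 / 60.0;
    double t = 0.0, latest_done = 0.0;
    int frame = 0;
    for (int vblank = 1; vblank <= 3; ++vblank) {
        double vb = vblank * refresh_ms;
        while (t + render_ms <= vb) {  // GPU renders flat out, never waiting
            t += render_ms;
            latest_done = t;
            ++frame;
        }
        std::printf("vblank at %5.1f ms shows frame %d, finished %4.1f ms earlier\n",
                    vb, frame, vb - latest_done);
    }
    return 0;
}
```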
There doesn't seem to be any general consensus whatsoever on whether it alleviates input lag or contributes to it. Some say that enabling triple buffering means your display is always showing the most recently completed frame, as opposed to one that's been held in a buffer. Others report that by buffering more than one frame, you're waiting that much longer to see what your graphics processor has been generating. Confused?! So am I?!?!
Every forum I've been to on this subject has been split pretty much down the middle on triple buffering. Some swear by it, while others curse its name. One possible contributor to this rift is the fact that two completely different rendering methods are often grouped under the same 'triple buffering' label: the genuine "Glinda the Good Witch of the North" triple buffering, and the evil "Wicked Witch of the West" version known as 'flip-queuing'. Flip-queuing, as far as I understand it (which isn't very far), puts all of your frames in a big line and has everyone wait their turn. Apparently some games/drivers implement flip-queuing, which does have positive side effects like smoothing out movement and making things very fluid and consistent, but label it misleadingly as 'triple buffering'. All of this inconsistency makes it a nearly impossible subject to test and read up on.
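If that flip-queue picture is right, the back-of-the-envelope math looks like the sketch below, assuming the driver keeps a FIFO of finished frames and the GPU always manages to keep it full:

```cpp
#include <cstdio>

// Back-of-the-envelope for the flip-queue case: if the driver keeps a FIFO of
// N finished frames and the GPU keeps it full, the frame reaching the screen
// was rendered roughly N refresh periods earlier.
int main() {
    const double refresh_ms = 1000.0 / 60.0;
    for (int queued = 0; queued <= 3; ++queued) {
        std::printf("%d frame(s) queued ahead -> about %2.0f ms of extra lag\n",
                    queued, queued * refresh_ms);
    }
    return 0;
}
```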
It's also pretty rare for a game to come equipped with a triple buffering option. It's possible to force triple buffering through Nvidia's drivers, but I've heard that only works for OpenGL rendering. For Direct3D, there's a utility called D3DOverrider which comes bundled with the RivaTuner graphics tweaking utility. I've messed with these a few times, but it's difficult to tell whether they make any difference. The only thing I was able to conclude was that it was in no way the magical cure-all some forums (and some very strongly opinionated forum posters) had led me to believe.
If anyone out there has any input or information on vsync, triple buffering, or any other tool against input lag, please let me know.
It's extraordinarily difficult to test this stuff, seeing as you're dealing with milliseconds. I've spent hours tapping controller buttons and moving my mouse about, trying to determine whether a tweak has actually made a difference.
Here's a glorified LCD stopwatch which some people use to test for input lag. They set up a high-speed camera, have their video card display the stopwatch on both a CRT and the LCD they want to test, and take a snapshot. By subtracting the LCD's value from the CRT's, you can determine how many milliseconds the display is falling behind.
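The arithmetic is just a subtraction; here it is spelled out with made-up readings:

```cpp
#include <cstdio>

// The arithmetic behind the photo test: both screens show the same running
// millisecond counter, and the lagging LCD displays an older value. The
// readings here are made up for illustration.
int main() {
    const int crt_reading_ms = 4783;  // hypothetical value photographed on the CRT
    const int lcd_reading_ms = 4751;  // hypothetical value photographed on the LCD
    std::printf("display lag: %d ms\n", crt_reading_ms - lcd_reading_ms);
    return 0;
}
```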
Here's another handy utility which relies on human response time to gather its information. You just click a button when the stick turns blue, sort of like playing Guitar Hero. It's a good way to get a general feel for relative display response times, but it's far from perfect. I used to rely on this test, and I learned a lot about vsync (especially in regards to the default 'Aero' theme in Vista), but it's too reliant on the human factor to be conclusive.
All I want is a clear picture: one with no screen tearing, that pops up in real time like in the good ol' NES days (as a depressing side note, even NES emulators are subject to vsync input lag).
Is there anything I should know about that might shed some light on the subject or help? What are your opinions on triple buffering and such? Is this issue going to improve as new technology is developed? Is it going to get worse as LCD displays get larger and streaming technology like OnLive gets more and more popular? Are there any third-party frame limiters out there that might help? Any info or insight would be greatly appreciated.
P.S. This doesn't seem to be an issue on modern consoles like the Xbox 360 and PS3, at least in regards to the input lag introduced by vsync. Everything seems to render in real time while remaining perfectly synced to the refresh rate of the display. What's the deal? If my $200 gaming device can manage it, why can't my $800 PC?
WALL OF TEXT / TOO LONG, DIDN'T READ VERSION:
What's up with input lag? Is triple buffering supposed to help or hurt?