How to identify input lag?

Maddnotez (Limp Gawd) - Joined: Nov 28, 2014 - Messages: 310
How can I identify how much input lag a monitor has? I can't find any info when I look at specs. Is it the contrast ratio that measures this?

I am on my knowledge quest regarding monitors. I will be getting a new monitor soon and I am just trying to learn everything I can about them to make a better choice when I buy.

Input lag is becoming something I am very concerned about. I see all this talk about response time, but what I do not see is a lot of talk about input lag, which in theory (in my head) should be more important to me.

I see lots of specs on monitors but no one seems to mention input lag. So what if you have a monitor with 1ms response time but very high input lag vs. a monitor with 5ms response time and very low input lag? Hope you see where I am going with this.

So just generally, where can I find specs regarding input lag when I am looking at monitors? Is it the contrast ratio? If so, what values are good and bad?

What else should I know about input lag?
 
Input lag is the delay between what is happening in the computer and what you see on the monitor. Think of it this way: if a monitor has one second of lag (which is ridiculous, unplayable lag, but works as an example) and you order your character to walk, it takes a second before you see the character move, or stop. High lag can make fighting games impossible, for example, because by the time you see the punch coming you can't block it; inside the computer the punch has already hit you.

Manufacturers do not publish lag numbers; only reviewers measure it. Just stick with monitors around or preferably below 30ms if you are a casual or relatively serious gamer, and below 20ms if you need to be competitive in twitch shooters or fighting games. Ideally we would aim for 0ms (which a CRT is capable of), but that is pretty much impossible with LCD panels because of all the processing/scaling and overdrive they do.
 
Further, input lag is a combination of the signal processing time of the scaler and the pixel response time of the panel.

Signal processing includes anything the scaler must do to the image data it receives before presenting it to the screen. Any extra processing will increase the amount of time before the image can be presented. Televisions will have more signal processing time due to things like interpolation or tuners/decoders that computer monitors do not have.

The response time is how long a pixel in the panel takes to change color from the state of the previous frame presented to the new one. Manufacturers advertise this, but it is usually just what the typical response time for the panel technology is or what the manufacturer quotes while using features like pixel overdrive.

Pixel overdrive increases the speed at which pixels change color by altering the voltage applied, which often comes at the cost of overshoot (missing the target color) that causes ghosting (trailing image of moving objects on the screen).

Since pixel response time directly affects input lag, it can give a very basic idea of what kind of input lag to expect from a display. Only testing will pinpoint the exact amount of lag, though. The amount of signal processing varies wildly across displays, so you should not rely on response time alone to determine input lag.

For gaming purposes you ideally want an input lag of one frame or less at the screen's refresh rate. At 60 Hz that is 16.67 milliseconds. This means no perceived delay between user input and what is displayed on screen.

A couple of examples of monitors with great input lag:
Acer Predator X34 = 9.20 ms (@100 Hz, 1 frame = 10.00 ms)
ASUS ROG Swift PG278Q = 4.00 ms (@144 Hz, 1 frame = 6.94 ms)
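
To put numbers on the one-frame guideline, here is a rough Python sketch that checks a measured lag figure against the length of one refresh cycle. The monitor numbers are the reviewer measurements quoted above, not official specs, and the function name is just made up for illustration.

```python
# Rough sketch: is a measured input lag within one refresh cycle?
# Lag figures are the reviewer measurements quoted above, not manufacturer specs.

def one_frame_ms(refresh_hz: float) -> float:
    """Length of a single refresh cycle in milliseconds."""
    return 1000.0 / refresh_hz

monitors = {
    "Acer Predator X34":     (9.20, 100),   # measured lag (ms), refresh rate (Hz)
    "ASUS ROG Swift PG278Q": (4.00, 144),
}

for name, (lag_ms, hz) in monitors.items():
    frame_ms = one_frame_ms(hz)
    verdict = "within one frame" if lag_ms <= frame_ms else "more than one frame"
    print(f"{name}: {lag_ms:.2f} ms vs {frame_ms:.2f} ms per frame -> {verdict}")
```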
 
Thanks for that input guys.
 
For gaming purposes you ideally want an input lag of one frame or less at the screen's refresh rate. At 60 Hz that is 16.67 milliseconds. This means no delay between user input and what is displayed on screen.

Not sure how that logic follows.
 
Interesting topic.

Basically the backlight on my 2209WA started to die, and I wanted more desktop real estate, so I ordered a 1440p IPS screen, the BenQ GW2765HT, which is an 8-bit IPS 1440p panel. All reviews of it were good aside from the processing/input lag, which seemed to be rated as mediocre. So I was a bit concerned, as the 2209WA had, according to reviewers, top-notch input lag.

It came today. I had to deal with Nvidia's buggy DisplayPort and eventually gave up, so I'm now using HDMI 1.4. I did some desktop tests, scrolling web pages, dragging windows around, etc., looking for signs of lag; it's a bit more than the 2209WA but not bad enough to be an issue. Then I played some Project CARS, and if there is lag, it's not enough for me to notice, and this is on a monitor with mediocre lag scores. It was rated at about 30-40ms lag, depending on the reviewer.

This is just my point of view though; it may be that this same display would annoy the hell out of someone else who is more sensitive to lag.

Ironically my old TN remains my slowest monitor; on that thing the moving mouse leaves a visible trail, and when scrolling web pages the text blurs with a shadow, etc.
 
I always forget to add the qualifier "perceived..."
That's still wrong though.
If we are using the full scanout time for 60Hz, then the minimum possible latency is 16.67ms - even on a CRT.
If your display has 16.67ms latency, you now have 33.33ms total latency.
Display latency doesn't cancel anything out, it adds to it.
 
Manufacturers do not publish lag numbers; only reviewers measure it. Just stick with monitors around or preferably below 30ms if you are a casual or relatively serious gamer, and below 20ms if you need to be competitive in twitch shooters or fighting games. Ideally we would aim for 0ms (which a CRT is capable of), but that is pretty much impossible with LCD panels because of all the processing/scaling and overdrive they do.

20ms input lag is a total joke for serious gaming in shooters. Rubbish, trash, unusable, not worthy of a single dollar!!

You might just as well down a few beers before the competition if you think 20 ms is ok...

That's still wrong though.
If we are using the full scanout time for 60Hz, then the minimum possible latency is 16.67ms - even on a CRT.
If your display has 16.67ms latency, you now have 33.33ms total latency.
Display latency doesn't cancel anything out, it adds to it.

What don't you understand about CRT? Whatever is being output by the VGA output on the back of your PC is immediately driving the electron gun.
 
It doesn't matter if you're using a CRT. Even with vsync off, you are limited by your framerate. If you initiate a mouse movement, for example, at the beginning of a frame, you have to wait until the next frame for the image to be updated (see flod's image at the bottom of this post).


And even if you have a ridiculously high framerate, with vsync off, the screen doesn't update all at once - it scans, line by line. So, suppose you have a program that draws a thin red line across the centre of the screen when you press a mouse button, and suppose you press the button right after the halfway point of the refresh. You now have to wait almost an entire refresh for the line to be drawn. If you press the button right before the halfway point, the line will be drawn immediately.

[Image: 9cSP1bM.png]
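
If it helps to see that line-drawing example as numbers, here is a toy Python sketch of it. It assumes vsync off, an effectively instant render, and a 60 Hz display that scans from top to bottom at a constant rate; the function name and timings are made up purely for illustration.

```python
# Toy model of the example above: a thin line at mid-screen, vsync off,
# and a display scanning top to bottom at 60 Hz (one sweep = 16.67 ms).

REFRESH_MS = 1000.0 / 60.0  # duration of one full top-to-bottom sweep

def wait_until_drawn(press_time_ms: float, line_pos: float = 0.5) -> float:
    """Milliseconds from the button press until the scan reaches the line.

    press_time_ms: when the press happens, measured from the start of a sweep.
    line_pos: vertical position of the line (0.0 = top, 1.0 = bottom).
    """
    line_time = line_pos * REFRESH_MS   # when each sweep reaches the line
    phase = press_time_ms % REFRESH_MS  # where the scan is when the press lands
    wait = line_time - phase
    if wait < 0:                        # the scan already passed the line,
        wait += REFRESH_MS              # so wait for the next sweep
    return wait

print(wait_until_drawn(8.0))  # pressed just before mid-screen: ~0.33 ms
print(wait_until_drawn(8.5))  # pressed just after mid-screen: ~16.5 ms
```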
 
Let's clear up some things before this snowballs any further. I'm going to quote TruckJitsu over from Smash forums because CRTs are still popular within the Smash Bros gaming community.

First and foremost:
  • All 60Hz Displays - Including CRTs - Lag one full frame before completion
That's right - The CRTs you know and love do not have lagless performance - This is a HUGE misconception because it is poorly explained. When we say "0" lag, we are talking about the START of the frame render. These monitors we're talking about render line by line top to bottom. At the TOP of the frame, it starts rendering at 0.00000067ms (only limited by the speed of light) - which is basically no lag. But by the time it gets all the way to the BOTTOM of the frame, it literally took 16.67ms to get there. How is that determined? By refresh rate! Which brings me to another misconception:

  • Stop saying Refresh Rate != Input Lag
This is just not true - at least not in the way people think it's true. Yes, refresh rate on its own does not make up the total display lag. But it's damn well part of the equation.

The equation for total display lag looks like this:

Display Lag = (Input Processing) + (Response Time) + (1000/Refresh Rate)


So when we talk about gaming monitors, for example, we are not factoring in transmission time (1000/refresh rate) - so when we say some gaming monitor has ~2ms of lag, we are really saying it has (assuming 60Hz) ~18.67ms of display lag, because all 60Hz monitors lag a MINIMUM of 16.67ms per frame. There are 1000 milliseconds in a second and the refresh rate is cycles per second, so that's where the 1000/refresh rate figure comes from. So when I talk about using a 144Hz monitor for less input lag than a console - that's real. Consoles are limited to 60Hz, which makes Dolphin a superior system for online or competitive play in terms of input lag. There is a new technology called OLED that can support refresh rates of 1000Hz. There are no 1000Hz models out, but OLED displays do exist and they're like $4,000 atm. Eventually there will be OLED gaming monitors with 1000Hz refresh rates that will literally FINISH rendering a full frame in 1ms, and that will be as close to lag free as we'll get for a long time. Another thing people say is, "it's only a 60FPS game" - yeah, so there will be some duplicate frames, but the initial frame will still complete at ~9ms instead of 16.67ms.

tl;dr The 1-5ms you see quoted for LCDs is derived from input processing and response time. This is the lag that matters - the third component of the equation is transmission lag, which is directly determined by the refresh rate. When we say 2ms, we're not actually saying 2ms - and this is the problem. Over and over people reply with the same "response time !== input lag" - that is not what I'm saying at all. The true input lag is ~18.67ms per frame. We don't normally say 18ms because that INCLUDES transmission lag, which we normally leave out. Every 60Hz display - including CRTs - lags a minimum of 16.67ms per frame. This is the biggest misconception people don't understand.
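
Taking the quoted equation at face value, a few lines of Python reproduce the numbers above. The ~2 ms processing/response figure and the 60 Hz vs 144 Hz comparison are the examples from the quote; the even 1 ms + 1 ms split is just an assumption for illustration.

```python
# Sketch of the quoted equation:
#   Display Lag = (Input Processing) + (Response Time) + (1000 / Refresh Rate)

def display_lag_ms(processing_ms: float, response_ms: float, refresh_hz: float) -> float:
    return processing_ms + response_ms + 1000.0 / refresh_hz

# A "2 ms" gaming monitor (split 1 ms + 1 ms purely for illustration):
print(display_lag_ms(1.0, 1.0, 60))   # ~18.67 ms at 60 Hz
print(display_lag_ms(1.0, 1.0, 144))  # ~8.94 ms at 144 Hz, i.e. roughly the ~9 ms above
```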
 
It often makes more sense to quantify that last component of the equation as a random variable with a uniform distribution that can take values from 0 to 1/refresh, unless, as zone74 qualified, we're interested in the full scanout time.

But then again, it depends on context. In the context of a first person shooter, sometimes information in only a portion of the screen height is useful, so we don't need to worry about scanning through the entire vertical height of the screen before our brains can use the current information, in a vsync off scenario. In those cases, frame rate is the main bottleneck for input lag.
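
For what it's worth, here is a minimal Python sketch of that framing: assuming vsync off and input timing uncorrelated with the refresh, the scan-position component of lag is roughly uniform over one frame time, so it averages about half a frame.

```python
# Treat the scan-position component of lag as uniform on [0, frame_time]
# (assumes vsync off and input timing uncorrelated with the refresh).
import random

def mean_scan_delay_ms(refresh_hz: float, samples: int = 100_000) -> float:
    frame_ms = 1000.0 / refresh_hz
    return sum(random.uniform(0.0, frame_ms) for _ in range(samples)) / samples

print(mean_scan_delay_ms(60))   # ~8.3 ms, about half of 16.67 ms
print(mean_scan_delay_ms(144))  # ~3.5 ms
```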
 
One thing to mention is that there are such things as "global displays" rather than the more common "rolling displays".
With rolling displays you have lower latency at the top of the frame than the bottom.
So with the CRT, that's effectively 0ms to 16.67ms at 60Hz depending on where you measure it.

With a global display, the frame is buffered in the display and then presented all at once rather than scanned out gradually. There may still be some difference between the top and bottom of the display, but it should be <1ms.
Plasma televisions, DLPs, and the OLED displays being used in VR headsets are examples of this.
When you measure input lag on them, a display which has ~23.33ms of processing delay at 60Hz (like the later Panasonic plasmas) will measure ~40ms at all positions on the screen (23.33ms display lag + 16.67ms scanout).
An LCD or typical OLED display with ~23.33ms processing delays would measure ~23.33ms at the top of the frame and ~40ms at the bottom.
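
Here is a small Python sketch of the rolling-vs-global arithmetic in that paragraph, using the ~23.33 ms processing figure from the plasma example and a 60 Hz scanout; the function names and the linear top-to-bottom spread are assumptions for illustration.

```python
# Rolling vs. global presentation, using the ~23.33 ms processing example above.
# Rolling: lag grows from top to bottom as the scan sweeps down the panel.
# Global: the frame is buffered over a full scanout, then shown all at once.

FRAME_MS = 1000.0 / 60.0  # one scanout period at 60 Hz

def rolling_lag_ms(processing_ms: float, screen_pos: float) -> float:
    """screen_pos: 0.0 = top of the screen, 1.0 = bottom."""
    return processing_ms + screen_pos * FRAME_MS

def global_lag_ms(processing_ms: float, screen_pos: float) -> float:
    return processing_ms + FRAME_MS  # same at every position on the screen

for pos, label in [(0.0, "top"), (0.5, "middle"), (1.0, "bottom")]:
    print(f"{label:>6}: rolling {rolling_lag_ms(23.33, pos):5.2f} ms, "
          f"global {global_lag_ms(23.33, pos):5.2f} ms")
```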

The reason that global displays are being used with VR headsets, is that the image skews with motion on a rolling display.
If you take a full height but narrow width window and drag it around on your monitor, watch the edge and you should see that the bottom lags behind the top of the window and that straight vertical line appears to bend. I find that it becomes more noticeable at higher resolutions.
Global displays solve this problem.

What I'm wondering is: since we know that G-Sync buffers the frame in the G-Sync module, does it still scan out like a rolling display, or is it globally updated? Even if it's not globally updated, does it scan out quicker?
I wonder if anyone has tested latency at the top and bottom of the frame when G-Sync is active to see if this is improved on a G-Sync monitor.
I'm guessing that they are probably still rolling displays, because otherwise they would probably market it as a feature, but I would definitely prefer a display which has ~17.67ms latency at 60Hz (my understanding is that G-Sync adds 1ms latency) if it's 17.67ms at all areas of the display, instead of 1–17.67ms depending on the position.
I guess you could argue that most action in FPS games takes place in the middle section of the screen so changing to a global display is potentially adding ~8.33ms latency, but I think it would be a worthwhile trade.
 
G-Sync on a 144Hz monitor refreshes the display in ~7ms in a 'rolling' fashion.
So lag is 0-7ms, averaging 3.5ms in the middle of the screen.

BTW, an analog monitor/TV such as a CRT paired with an analog 'rolling' camera has 0ms input lag over the whole screen. What is being drawn is exactly the same thing that is being seen by the camera. A horizontally moving vertical bar in front of the camera would not 'bend' on these, even despite being displayed on a rolling display. It would, however, bend when displayed on a 'global' display :)

BTW, most people either never noticed this bending of moved windows or think it's an Aero effect :)
 
G-Sync on a 144Hz monitor refreshes the display in ~7ms in a 'rolling' fashion.
So lag is 0-7ms, averaging 3.5ms in the middle of the screen.
Can you point to measurements for that?
I was hoping that it would either use a faster scanout, or ideally be globally updated.

BTW, an analog monitor/TV such as a CRT paired with an analog 'rolling' camera has 0ms input lag over the whole screen.
As previously discussed, the bottom of the image at 60Hz on a CRT is 16.67ms behind the top because it's a "rolling" display.
Measured with the Leo Bodnar lag tester:
[Image: lag_dell_e773c_vga_10ybrj1.png]


Obviously the tester is not measuring the very top and bottom of the screen here to produce a result of 0.8–15ms rather than 0–16.67ms.

What is being drawn is exactly the same thing that is being seen by the camera. A horizontally moving vertical bar in front of the camera would not 'bend' on these, even despite being displayed on a rolling display. It would, however, bend when displayed on a 'global' display :)
I'm not sure what you mean by this.
CCD cameras generally use a global shutter, while most CMOS cameras use a rolling shutter.
So CMOS cameras will skew and exhibit the "jello effect", while CCD cameras will not.

If you display footage captured with a CCD camera on a CRT, it will skew due to the CRT scanout.
If you display footage captured with a CMOS camera on a CRT, wouldn't it skew twice as much, since the video output contains skew and the CRT scanout adds even more?

The only way to eliminate skew is to display footage captured on a CCD (or global CMOS) camera on a globally-updated display: Plasma, DLP, or any other global display.

BTW, most people either never noticed this bending of moved windows or think it's an Aero effect :)
I find that surprising; it's very obvious to me.
Another place where it really stands out is scrolling on tablet devices. These are often very high resolution (which seems to show it more) and it really stands out when one edge of the screen is following your finger and the other is lagging behind as you scroll down a web page for example.
 
You guys are making it way too complicated.

Input lag = the time between the monitor receiving a signal and sending that out to be displayed.

For serious gaming you don't want any fancy over-overdrive, dynamic contrast or other bull, so it simply should be 0.

Compare it to customs checking packages that you get from another country: they let a package sit for some time in a warehouse, then check it and let it leave. Input lag on monitors is exactly the same; something "sits" on the signal and delays it.
 