Input Lag Measurement Thread.

Snowdog

[H]F Junkie
Joined
Apr 22, 2006
Messages
11,262
It is important to recognize that people have very different tolerances to input lag. It would be nice to have an objective measurement of input lag, rather than "I feel lag" vs "I don't feel lag" bun fights. It would also be nice to collect input lag figures for a bunch of monitors in one place for reference.

For testing I suggest using inputlag.exe from:
http://tft.vanity.dk/

Measure against a CRT in clone mode. Set your camera to a decently fast exposure (on cheap compacts, set ISO 400).

Take a number of shots. Include two images that show the range of the lag.

I will start the ball rolling with my modest TN Screen:

------------------------------------------------------------------------------------------->

Monitor: Dell 1707fp, panel type: TN, panel manufacturer: Samsung.

Lag Range: 0ms to 16ms (0-1 frame)

Images:
0lagjf2.jpg

1framelagwp1.jpg
 
This thread is not going to stop claims of "I don't feel input lag" or "OMG there's so much it's horrible you idiot". Most reviews already objectively measure the input lag of the monitors they cover, but that doesn't change the entirely subjective nature of whether an end user will notice or have a problem with it.

You're kind of wasting your time, unless all you really want to know is the input lag of monitors that haven't been covered in reviews.
 
I am already aware of the Digital Versus coverage, but there is some question about their methodology and accuracy, so more results for monitors they have already covered would still be good.

But yes, more importantly I am looking for broader coverage as well. I would also like to see lag numbers for 1080p LCD TVs, which are sorely lacking, and TVs have reports of high variability in lag between models/settings etc.

No, it won't end those arguments. But it would be great to have a database of objective results for those of us who know what kind of lag we can tolerate.
 
Well, I have an NEC 2690, whose lag is well known. I also now have an HP LP2480zx, whose lag I posted in the thread about that monitor.
 
I didn't run the inputlag utility (yet), but the other monitortest util is fantastic for my plasma.
By far the easiest-to-use basic calibration util I have come across.
Thanks :)

I'll get onto the input lag test; gotta go find my camera and a CRT.
As pointed out by Giolon, I haven't noticed any input lag, but for some reason I think it will be around 33ms.
It will be interesting to see.
 
I personally don't care if it resolves arguments about input lag, but at least we have a "user" benchmark that people can use as a measuring stick. There are some people who freak out over one frame of lag, and some who don't care if there are four.

Here ya go:

BenQ V2400W - April 2008 build date. Cloned on Windows XP clean build, ATI 3650 card, Catalyst 8.6. Results tested and basically identical regardless of port 1 vs. port 2 for the V2400W.

DVI connector, 1600x1200 resolution at 1:1 mapping. Camera set at ISO 200, 1/200 second exposure

20 shots vs. a Compaq P1210 Monitor. Delay counted in ms, delimited by commas.

15, 20, 0, 0, 16, 31, 15, 0, 0, 15, 16, 0, 0, 16, 0, 0, 16, 15, -15, 15

Average of 8.75 ms.

Best shot -15ms (V2400W was ahead)

2691107694_def805514c_b.jpg



Worst 31ms

2690296801_4fff2a2c23_b.jpg
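For anyone who wants to sanity-check the arithmetic, the average above can be reproduced from the listed readings. A minimal Python sketch; the list is copied verbatim from this post, with a negative value meaning the V2400W was ahead of the CRT in that shot:

```python
# 10e's 20 photographed V2400W delays, in ms (copied from the post above).
readings = [15, 20, 0, 0, 16, 31, 15, 0, 0, 15,
            16, 0, 0, 16, 0, 0, 16, 15, -15, 15]

average = sum(readings) / len(readings)
print(f"average: {average} ms")                      # 8.75 ms, matching the post
print(f"range: {min(readings)} to {max(readings)} ms")  # -15 to 31 ms
```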
 
I have the V2400W and tested it the same as 10e. I did my tests via HDMI though.

I came in with an average around 8-16ms. I had a few shots that were identical to the CRT; the rest either came in mid-change (usually the CRT was stable on, say, 531 and the LCD was changing to 531 from 516, usually about halfway done, which indicates to me about half of 16ms, or around 8ms) or came in consistently 15ms or 16ms behind. I think I had one shot out of a couple hundred that was more than a frame behind, so it's pretty safe to say the V2400W is under 1 frame of lag.

If you really want to get technical, this indicates true input lag on the V2400W to be around 3-11ms, as you have to add in response time (5ms here) to get the total of 8-16ms.

It's a great monitor for gaming and I'm happy.
 
....I'll get onto the input lag test; gotta go find my camera and a CRT.
As pointed out by Giolon, I haven't noticed any input lag, but for some reason I think it will be around 33ms.
It will be interesting to see.

I have the results.
My plasma TV (Panasonic 42PZ80B) lags a Dell 15" CRT by 31ms in almost every photo (7 results).
Connected via DVI/HDMI to the TV, DVI/VGA to the monitor, on an 8800GT.
 
I don't get it, it's just counting?

Different displays take different amounts of time to display the same thing.
So using a very fast counter and a camera, you can capture an instant in time and see what each display shows.
The display showing the lesser time is the slower one, and you can tell by how much by subtracting the lesser time from the larger.
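The subtraction is literally that simple. A one-line sketch, using the 531/516 pair mentioned earlier in the thread as the example readings:

```python
def lag_ms(reference_reading, test_reading):
    """Both counter readings in ms, photographed at the same instant.
    A positive result means the test display is behind the reference."""
    return reference_reading - test_reading

# CRT frozen on 531 while the LCD still shows 516:
print(lag_ms(531, 516))  # 15 ms, i.e. roughly one frame at 60Hz
```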
 
I made the LCD lag tester at http://lcdlagtester.googlepages.com/ The problem I found is that unless you are using some kind of VGA splitter, the results won't be accurate. Moving from one dual-screen system to another one with a different video card produced different results for me.

Also, monitors refresh from top to bottom (even LCDs), so you can't just use a timer in the middle of the screen to get accurate results. The refresh of each monitor will be out of sync.
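The top-to-bottom refresh point can be put in numbers: at 60Hz a full refresh takes about 16.7ms, so a row further down the screen is drawn proportionally later, and two out-of-phase displays can disagree at mid-screen even with zero processing lag. A rough sketch, assuming a simple linear 60Hz scan-out and ignoring blanking intervals:

```python
# Approximate when a given pixel row is drawn within a 60Hz refresh.
FRAME_MS = 1000 / 60  # ~16.67 ms per refresh

def scan_offset_ms(row, height=1200):
    """Time after the start of the refresh at which `row` is drawn."""
    return FRAME_MS * row / height

print(round(scan_offset_ms(600), 1))   # mid-screen: ~8.3 ms into the scan
print(round(scan_offset_ms(1199), 1))  # bottom row: ~16.7 ms
```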
 
I made the LCD lag tester at http://lcdlagtester.googlepages.com/ The problem I found is that unless you are using some kind of VGA splitter, the results won't be accurate. Moving from one dual screen system to another one with a different video card produced different results for me.

Also, monitors refresh from top to bottom (even LCDs), so you can't just use a timer in the middle of the screen to get accurate results. The refresh of each monitor will be out of sync.

I already took the liberty of posting your link :)
Anyway, how does your test work?
 
True,

I find nVidia to have a lot more variance. All my input lag testing is done on ATI. With nVidia (especially with Vista x64) the second port was always faster. With ATI I've never noticed a difference switching back and forth.

I plan to get a DVI splitter and try it out, but I highly doubt I'll see a large enough difference to quantify.

I downloaded your tool a long time ago, but I didn't try it just to keep results consistent with what we've seen before.

Face it, you are just too advanced for us. :)

I made the LCD lag tester at http://lcdlagtester.googlepages.com/ The problem I found is that unless you are using some kind of VGA splitter, the results won't be accurate. Moving from one dual-screen system to another one with a different video card produced different results for me.

Also, monitors refresh from top to bottom (even LCDs), so you can't just use a timer in the middle of the screen to get accurate results. The refresh of each monitor will be out of sync.
 
I am already aware of the Digital Versus coverage, but there is some question about their methodology and accuracy, so more results for monitors they have already covered would still be good.

Yes, after having done several tests, I consider the Digital Versus (prad.de, etc.) data a very rough approximation. Many factors may play a role.
BTW, their list is not complete.
 
Does anyone want to try this with their Soyo 24" MVA panel? I'd like to get a number to have in mind when comparing it to other monitors.
 
Hello,

Here is my unit:

NEC LCD2690WUXI-BK, built April 08, Revision 4F on an ATI 3650 card

Resolution at 1920x1080 @ 1:1 and 60Hz, overdrive toggled to "on" within the monitor's advanced OSD.

Camera at 1/200 exposure, ISO 200

18 out of 20 shots were readable with ms delay in comma separated format:

16, 47, 32, 32, 32, 32, 31, 47, 32, 47, 31, 15, 16, 31, 15, 16, 31, 31

Average of 29.67 ms.

The NEC is on the left, and the CRT on the right.

Best shot:

2692641312_78f4cec090.jpg


Worst shot:

2691827685_44d9c19107.jpg
 
Here is one to chew on. I wanted to see how screen location affected updates, and I prefer numbers to Yelnats' lines, so I ran 3 copies of the inputlag utility. It looks to me like the CRT updates top-down and the LCD bottom-up, or perhaps in some other strange pattern, most of the time. The LCD was in sync by the bottom. But I got one interesting anomaly: this was the only time I got a reading bigger than one frame.

tripletimenl8.jpg
 
Here is the BenQ FP241VW,

The camera has been set to 1/320 for this one and I upped the ISO to 400. Since it's CCD based the shots might appear grainy.

Same as before: ATI 3650 on a fresh installation of XP, DVI connector of the FP241VW, resolution of 1920x1200 @ 60Hz on both screens with the FP241VW in "standard" picture mode. The FP241VW is set to 1:1 mode for aspect scaling (i.e. no scaling).

I re-ran this twice because the results didn't make sense to me. I then reran it at different resolutions, and then turned GPU scaling on and off in the CCC (similar to "flat panel scaling" in the nVidia control panel). I found some weird stuff.

1920x1200 and 1920x1080 with GPU scaling OFF and "centred timings" ON:

1920x1200

16,0,0,0,0,0,0,0,16,0,16,16,31,31,0,0,0,16,15,15 for an average of 8.65 ms. Seems low

1920x1080

0,15,16,0,0,0,0,0,0,0,16,0,15,0,16,31,15,16,0,16 for an average of 7.8ms :confused: Seems lower ?

At 1440x900 with GPU Scaling set to off and "centred timings" set to ON

31,63,47,31,47,47,31,31,47,47,47,31,31,47,47,16,31,31,47 for an average of 39.5ms :mad:

At 1440x900 with GPU scaling set to ON and "centred timings" set to ON

0,16,0,16,0,15,31,16,15,0,16,0,16,16,0,0,0,16,0,0 for an average of 8.65ms.

So it seems that with this monitor the scaler gets used even when centred timings are set to ON and the display itself is set to 1:1 mode, and it causes input lag to rise dramatically (roughly 400%).

This also reinforces what we tend to tell people who ask about it: USE the video card SCALER.
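Putting an exact figure on that jump, using the two 1440x900 averages from this post:

```python
# Averages copied from the 1440x900 runs above.
gpu_scaling_avg = 8.65   # ms, GPU scaling ON (monitor scaler bypassed)
monitor_scaler_avg = 39.5  # ms, GPU scaling OFF (monitor scaler in play)

increase = (monitor_scaler_avg - gpu_scaling_avg) / gpu_scaling_avg * 100
print(f"{increase:.0f}% more lag")  # 357% more lag, i.e. roughly 4.6x
```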

Here's the best and worst with the FP241VW on the left:

Best:

2693619079_176cd76440.jpg


Worst:

2693619221_06a990b43a.jpg
 
Thanks for this tip with the FP241VW!

I thought it felt a little laggy without using the video card (nVidia) scaler along with the monitor set to 1:1, and you've confirmed what I felt. It isn't terrible lag, but it is quite noticeably worse. With both the card's scaler and the monitor's in use, the input lag feels basically zero! Without the card's scaler I can feel a good 2 frames of lag there for sure. I'm pretty sensitive to it.

I wonder why this is happening? It would be nice to just have the monitor not use any scaling (and its input lag) when set to 1:1 on its own.

This is very strange.
 
Here is the BenQ FP241VW,

The camera has been set to 1/320 for this one and I upped the ISO to 400. Since it's CCD based the shots might appear grainy.

Same as before: ATI 3650 on a fresh installation of XP, DVI connector of the FP241VW, resolution of 1920x1200 @ 60Hz on both screens with the FP241VW in "standard" picture mode. The FP241VW is set to 1:1 mode for aspect scaling (i.e. no scaling).

I re-ran this twice because the results didn't make sense to me. I then reran it at different resolutions, and then turned GPU scaling on and off in the CCC (similar to "flat panel scaling" in the nVidia control panel). I found some weird stuff.

1920x1200 and 1920x1080 with GPU scaling OFF and "centred timings" ON:

1920x1200

16,0,0,0,0,0,0,0,16,0,16,16,31,31,0,0,0,16,15,15 for an average of 8.65 ms. Seems low

1920x1080

0,15,16,0,0,0,0,0,0,0,16,0,15,0,16,31,15,16,0,16 for an average of 7.8ms :confused: Seems lower ?

At 1440x900 with GPU Scaling set to off and "centred timings" set to ON

31,63,47,31,47,47,31,31,47,47,47,31,31,47,47,16,31,31,47 for an average of 39.5ms :mad:

At 1440x900 with GPU scaling set to ON and "centred timings" set to ON

0,16,0,16,0,15,31,16,15,0,16,0,16,16,0,0,0,16,0,0 for an average of 8.65ms.

So it seems that with this monitor the scaler gets used even when centred timings are set to ON and the display itself is set to 1:1 mode, and it causes input lag to rise dramatically (roughly 400%).

This also reinforces what we tend to tell people who ask about it: USE the video card SCALER.

Here's the best and worst with the FP241VW on the left:

Best:

2693619079_176cd76440.jpg


Worst:

2693619221_06a990b43a.jpg


Hmm, wait, I don't really get it...

So as long as we run native resolution in computer games, we're going to get around a 10ms average?
 
I like this thread, subscribed.

When testing, are you all using both ports on the video card? Or are you using a VGA/DVI splitter? I will probably be buying Samsung's T240HD within the next 2 months. I definitely want to test the input lag on it. Digital Versus says around 50ms, which I find hard to believe for a TN panel.
 
I like this thread, subscribed.

When testing, are you all using both ports on the video card? Or are you using a VGA/DVI splitter? I will probably be buying Samsung's T240HD within the next 2 months. I definitely want to test the input lag on it. Digital Versus says around 50ms, which I find hard to believe for a TN panel.

It is the first monitor I've heard of with a TV tuner in it, so that may have some weird effect on it.
 
I finally got Samsung's T240HD. My test method was plugging each monitor into its own port and running clone mode, so it's up to you whether you want to trust my results or take them with a grain of salt.

System:
Intel x6800
P5W DH
2gb G.Skill GBHZ
8800 GTX – driver version 177.83
Windows XP

Monitors:
ViewSonic E70
Samsung T240HD
Resolution @ 1920 x 1200. I also have a batch done at 1280 x 1024 with and without monitor scaling, but I probably won't bother calculating them out. If someone wants to, I can send the pictures (~60MB each folder, so 120MB total).

30 pictures taken with Canon 870IS
Average top: 28ms
Average middle: 30ms
Average bottom: 29ms
Average: 29ms

29ms seems a lot better than Digital Versus’ 48ms average.

The shots posted are the timer for the middle of the panel.

Best 15ms
IMG_0116.jpg

Worst 47ms
IMG_0110.jpg
 
Philips 42" 1080p LCD (5ms response)
Sony GDM-FW900 24" Widescreen CRT

1920x1080 clone mode with the LCD TV on output 2 via a DVI->HDMI cable plugged into port 1 on the TV.

The CRT is on output 1 via a DVI->DB15 converter.

30-40ms lag! Youch!

input2.jpg

input.jpg

input3.jpg
 
A question has come up in this thread: how is input lag measured on a 30" screen at its native resolution of 2560x1600, when there are few, if any, CRTs that can run at 2560x1600? Even the FW900 runs at only 2304x1440. Does clone mode normally allow different resolutions on the two monitors?

That thread's author is attempting to do the input lag test on a Gateway XHD3000. Doing the input lag test at a lower resolution than native isn't valid, because the monitor's built-in scaler introduces input lag; the lower the resolution, the more input lag.
 
Clone mode can only be used with one resolution across both monitors; they cannot be set differently from one another.

My FW900 can do 2560x1600@60Hz, btw. :p:D Not that I would run at that resolution, cuz it looks like crap...
 
Wish I could borrow someone's old CRT for a while to test out this ViewSonic VX2265WM... coming from an LG L227WTG, I can definitely feel a difference input-lag-wise. The ViewSonic obviously has a better refresh rate at 120Hz, while my L227WTG was at 76Hz, both at their native resolution of 1680 x 1050.

The thing is, though, while I do enjoy the benefits of 120Hz, I have to maintain 120 fps at that native resolution, which is a little hard to do with my 8800GTS 320MB... I feel like when I dip below 120 fps, I start to feel a little bit of lag, but I'm not sure if it is input lag or something else. I never had this feeling with my LG monitor. It also sucked to go from a 10000:1 to a 1000:1 contrast ratio, yarg :|
 
Are you really sure about that? Could you doublecheck?

Yep. I tested it a while back using the nVidia control panel's custom resolution feature. For some reason the nVidia control panel won't let me use the resolution because the test "fails", although I can still use my computer at 2560x1600. Here is a screenshot I just grabbed as an example.

Not as bad looking as I remembered! Wonder if I could select this res in a game... :)
 
Must have been something Sony wasn't sure it could always do 100% correctly, then. When I had one, the listed resolutions only went up to 2304x1440, but it never occurred to me that I could force it higher. Not that I wanted to after one look at that resolution: 2560x1600 at 22.5" is a 0.189mm pixel size, and people complain about text being too small at 0.260mm.
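The 0.189mm figure checks out from the diagonal and resolution alone. A quick sketch, assuming 22.5" is the viewable diagonal of a 16:10 panel:

```python
import math

def pixel_pitch_mm(diag_inches, h_pixels, v_pixels):
    """Pixel pitch from a display's viewable diagonal and resolution."""
    diag_px = math.hypot(h_pixels, v_pixels)  # diagonal in pixels
    return diag_inches * 25.4 / diag_px       # 25.4 mm per inch

print(round(pixel_pitch_mm(22.5, 2560, 1600), 3))  # 0.189 mm
```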
 