input lag

phaseone

Hi. I know the Samsung 2233RZ is an old model, but that's what I'm looking to purchase. I want to get some feedback on, let's call it, "input lag theory" ...

I've noticed that nearly every review site focuses on the "average" input lag reading. Here's my concern: if we neglect the impact of the "bottom end" or "worst" reading, we may fail to judge the true impairment the screen's lag produces (in relation to hardcore gaming, of course).

In the case of the 2233RZ, the readings tend to average somewhere between 10 and 15 ms. That sounds great, but unfortunately the monitor seems to produce a 30 or 32 ms lag at its bottom end. Now, if that measurement came up, say, 5 or 20 times per second, I would be concerned. But if it came up only once every 2 or 3 seconds, I doubt it would be cause for concern.

The Samsung 2233RZ seems to be one of the few gaming monitors with such a high bottom end, but again, how frequently does the monitor produce that bottom-end reading?

http://www.prad.de/en/monitore/review/2009/review-samsung-2233rz-part9.html#Lag
http://www.tftcentral.co.uk/reviews/samsung_2233rz.htm

I don't know the specifics of how input lag is measured, so I hope the tech experts can weigh in and help me find some comfort in this purchase I'm likely to make soon.

Thanks
 
Lag doesn't fluctuate. It's the testing methods that are flawed. If lag fluctuated like that, there would be obvious jitter. Neither of those sites did a proper test for that monitor. It's likely that monitor has no lag at all other than pixel response times, or if there is lag, it's probably no more than one frame (8.33 ms at 120 Hz).

Here are the main problems with most timer-based tests:

Most timers are not guaranteed to update at any particular interval, especially that Flash-based Flatpanels.dk timer, which has a tendency to update erratically and doesn't update often enough to create an accurate result. I've seen pictures that show the timer didn't update for 20-30 ms, which is enough to miss an entire frame.

Monitors refresh from top to bottom, but mirroring two displays through the video card is also not guaranteed to be synchronized. It's still possible to do an accurate lag test with mirroring, but you have to be able to see where each monitor is refreshing.

Testing one part of the screen with a timer doesn't let you see where each monitor is refreshing. If a picture is taken right after the timer was refreshed on one monitor but the other monitor is a few milliseconds behind due to pixel response times and/or desynchronization from mirroring, it will appear to be one frame behind (or two frames behind if the timer missed a frame). That's how a 1-2 ms difference can appear to be 15-30 ms at times.

Using a timer also makes it hard to see if a new frame is starting to appear because pixel response times are not instant, so people end up seeing the stronger older number even if the newer number has already started to appear. The blending and tearing of numbers can also make numbers hard to read.
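
To make that last point concrete, here's a tiny simulation. It's only a sketch with made-up numbers (a constant 2 ms delay, a few milliseconds of pixel response, a timer that redraws once per 16.7 ms frame), not a model of any specific review's setup:

```python
import random

# Minimal sketch: photograph a stopwatch mirrored on a reference display and an LCD.
# The LCD's delay is held perfectly constant; the only variables are when the photo
# is taken and the fact that the on-screen timer can only advance in whole-frame steps.
TIMER_STEP_MS = 16.7      # the timer redraws once per 60 Hz frame
TRUE_LAG_MS = 2.0         # constant extra delay of the LCD vs. the reference
LEGIBLE_AFTER_MS = 6.0    # pixel transition time before a new digit is readable
PHOTOS = 10_000

def visible_value(now_ms, extra_delay_ms):
    """Last timer value that is fully readable on a display at time now_ms."""
    shown = now_ms - extra_delay_ms
    return TIMER_STEP_MS * int(shown // TIMER_STEP_MS)

readings = []
for _ in range(PHOTOS):
    t = random.uniform(0.0, 1000.0)                          # photo at a random instant
    reference = visible_value(t, 0.0)
    lcd = visible_value(t, TRUE_LAG_MS + LEGIBLE_AFTER_MS)
    readings.append(reference - lcd)                         # what gets written down

print("true delay      : %.1f ms" % TRUE_LAG_MS)
print("observed values : %s" % sorted(set(round(r, 1) for r in readings)))
print("observed average: %.1f ms" % (sum(readings) / len(readings)))
# Typical output: the only readings are 0.0 and 16.7 ms; a constant ~2 ms delay
# shows up as "sometimes a whole frame behind," never as 2 ms.
```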
 
Are you sure they don't fluctuate? I wouldn't expect fluctuation when measuring the numbers on a stopwatch, because those are simple, consistent images to produce. But when the screen is made to produce complex images, such as the ones in modern video games, as fast as possible and with different pre-processing techniques, I'd think there would be both input lag and variation.

I appreciate your technical explanations very much, and I too am very skeptical about testing methods, even though I'm hardly familiar with them. I wonder how it is that different sites end up with similar results. Could that be a result of dishonesty, or is it possible these guys are using testing methods that do produce accurate information?
 
That's a really long article; I did read a lot of it, thanks.
Please allow me to pose my question once more, perhaps with less clutter this time.

Very thorough websites, such as the one linked above, give figures for Samsung 2233RZ input lag along these lines:
average fluctuations from 0 to 16 ms of input lag
maximum lag readings of 32 ms

Please, someone tackle these readings: obviously there is doubt about the findings, obviously testing methods are flawed, but good review sites are producing this very result. What should I take from it?

I don't like the 32 ms result. But if someone were to confidently explain to me that this reading is likely a fluke or a miscalculation, I'd feel more confident purchasing this monitor. Likewise, if someone had information that this high lag reading is not produced "often" by this monitor, I'd again feel more confident.

Thanks for any response.
 
What you should take from it is that those tests were unfortunately done with substandard methods by sites that normally use sound ones. As you can see from those sites, as they get more information on how to do it correctly they are all switching over to oscilloscope measurements. That means their older tests can't be trusted, which means no one can tell you anything definitive unless they can retest with the better method. Someone already explained that the entire test should be seen as a fluke. Someone who owns the screen will have to weigh in.
 
Any value obtained using the Flatpanels stopwatch timer is usually totally wrong and often laughable, as are most values you will come across online, since they were usually obtained with that timer.

PRAD used the Flatpanels timer to test the 2233RZ back in 2009 because no one knew any better at the time.

The SMT Tool is currently the only timer+camera method proven to be accurate, IF done correctly. PRAD has been using an oscilloscope since mid 2010 to measure the signal delay and pixel response times.

Even if done correctly, the SMT Tool still poses a problem: we don't know at what percentage of completion of the pixel transition the camera is capturing the numbers, since the numbers become visible very early in the transition.

From my research there is no 100% proven method. In the PRAD article, the author mentions opening up a monitor and soldering the oscilloscope probe tips to a specific part of it (I can't remember which at the moment). One problem with combining the measured oscilloscope values (signal delay + pixel response times) is that panels with very slow pixel response times but minimal signal delays end up with misleading "total input lag/delay" values.

Let's say Bill Nye the Science Guy measures the Samsung F2380, well known to be delay free, which has a 25 ms+ average pixel response time, and gets a 1 ms signal delay, for a "total lag/delay" of 26 ms.

Bill Nye then measures the Dell U2713HM, which has a 16 ms signal delay and a roughly 6 ms average pixel response time, for a "total lag/delay" of 22 ms.

Which display feels more like a CRT? The Samsung F2380 does, and ToastX will testify to that.
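
To put numbers side by side (these are just the rough figures above, nothing official), the "total" hides where the time is actually spent:

```python
# Rough figures quoted above, not datasheet values.
displays = {
    # name:            (signal_delay_ms, avg_pixel_response_ms)
    "Samsung F2380": (1.0, 25.0),
    "Dell U2713HM": (16.0, 6.0),
}

for name, (delay, response) in displays.items():
    print(f"{name}: {delay:.0f} ms delay + {response:.0f} ms response = {delay + response:.0f} ms 'total'")

# The F2380's "total" is worse (26 vs. 22 ms), yet every new frame starts appearing
# on it about 15 ms sooner, which is why it feels closer to a CRT.
```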

So should pixel response times and the measured signal delay be combined? From my experience with tons of displays, I don't think so, but I also don't claim to understand the technology. I can tell you whether a monitor looks good or has lots of (or no) ghosting or input lag, but I cannot write a detailed paragraph about how LCD technology works.

I also think that a subjective evaluation goes a long way. All one needs to do is get a feel for a CRT and for a display that has been proven to have a one-frame delay, and compare them to whichever unknown display they are testing. If TFT Central had done this, they would not have published half of the ridiculous results they obtained with the Flash-based (Flatpanels) timer.
 
Thanks again for the contributions, guys.

I'm glad to know TFT Central is not a good place to get input lag readings from. Could you suggest how much I should trust the readings at digitalversus.com?

I am currently looking at the Dell 2209WA, since it seems to be one of the few monitors that doesn't (according to the readings) spike above 20 ms and also has a low average delay. As I've been explaining, I fear that if we ignore those big spikes we do ourselves a big disservice, because over the long run, say after a full minute of gaming (or even much less, a few seconds), one will have been hampered consistently by those high spikes regardless of what the average is.

Again, the Dell 2209WA seems to be one of the few, outside of the Samsung 226BW and some newer models like the BenQ XL2410T, that produce a "consistently" low input delay.

Please, knowledgeable readers, weigh in on the input lag results at digitalversus.com:
Go to: http://www.digitalversus.com/lcd-monitor/face-off
Click "Add"
Search for 2209WA
Select it and press Enter
Then select "input lag vs CRT" from the menu.

It will show a detailed graph of their readings compared to whatever other monitor you choose to add.
ps: I just discovered the DigitalVersus results for the 2233RZ, and they look better than the measurements on other sites, i.e. no high 30 ms spikes. This makes me even more interested in your advice on the accuracy over at DigitalVersus.

Plz, thx :p
 
ps: NCX: I think you're correct in questioning the importance of pixel response in comparison to input delay. Pixel response is already low on pretty much every panel; it's input delay that makes or breaks a monitor, imo (for the hardcore FPS gamer, of course).

Though, I don't know if it's possible to measure input lag without the pixel response affecting the result.
 
DigitalVersus uses the Flash-based timer.

There are no lag spikes.

Any value you come across that was not obtained with an oscilloscope or the SMT Tool is wrong, and many of TFT Central's SMT numbers are way off compared to PRAD's oscilloscope measurements, but I do not know why.
 
If, as you say, there is no fluctuation in a monitor's input delay, could you explain PRAD's wording here:

"25 percent of the measurements indicated a minimum lag of 10 ms. A further 25 percent demonstrated a lag of 30 ms or 2 fps. In 41 percent of the measurements, we measured a lag of 20 ms and just 9 percent of the measurements had a lag of as much as 40 ms or 2,5 fps. From these results, were able to establish an average input lag of just about 1,5 frames per second. This would sound good but for the 9 percent chance that lags of up to 2,5 frames may arise."

I've come across wording like this in many other places. It seems to indicate "delay fluctuations arise"

Assuming you and toasty are correct that there are no delay fluctuations, then the average readings are likely the important ones because the fluctuations are simply flaws in the testing methods.... am I correct? Still, please offer some explanation as to PRAD's wording....
 
Assuming you and toasty are correct that there are no delay fluctuations, then the average readings are likely the important ones because the fluctuations are simply flaws in the testing methods.... am I correct?

No. You can't simply average the figures obtained from a Flash-based timer. They are random and imprecise. You're only averaging values that come in whole-frame steps (0, 8.3 ms, 16.7 ms, 33.3 ms, etc.). Flash didn't have an accurate timer (maybe it does now), and the Flash timer itself fluctuates.
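
For what it's worth, here's a sketch of how a spread of readings with an occasional two-frame "worst case" can come out of such a test even when the delay never changes. Every number in it is an illustrative assumption, not a measurement of the 2233RZ:

```python
import random

FRAME_MS = 16.7          # 60 Hz frame; the timer can only advance in whole-frame steps
TRUE_LAG_MS = 5.0        # delay of the tested LCD, held perfectly constant
LEGIBLE_AFTER_MS = 4.0   # pixel transition time before a new digit is readable
PHOTOS_PER_RUN = 20      # roughly the number of photos a review averages

def one_photo():
    # An erratic Flash timer: usually it redraws every frame, sometimes it skips one.
    step = FRAME_MS * random.choices([1, 2], weights=[0.8, 0.2])[0]
    t = random.uniform(0.0, 1000.0)
    offset = TRUE_LAG_MS + LEGIBLE_AFTER_MS
    reference = step * int(t // step)          # value readable on the reference screen
    tested = step * int((t - offset) // step)  # value readable on the tested LCD
    return reference - tested

for run in range(1, 6):
    photos = [one_photo() for _ in range(PHOTOS_PER_RUN)]
    print("run %d: values %s -> average %.1f ms, max %.1f ms"
          % (run, sorted(set(round(p, 1) for p in photos)),
             sum(photos) / len(photos), max(photos)))
# Typical output: every reading is 0, 16.7 or 33.4 ms, the average wanders from run
# to run, and the "worst case" is one or two whole frames, all from a constant delay.
```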

Still, please offer some explanation as to PRAD's wording....

We thought the Earth was flat at one point. PRAD and everyone else thought using a Flash-based timer was a good idea. As mentioned above, ignore every Flash-timer input lag measurement ever made by anyone, regardless of who they are.
 