Bought TFT Central ZR2440w review

Overall it seems better than the U2410, but it has 20ms of input lag compared to the U2412M's 10ms. I haven't fully compared and read both of the reviews yet.
 
Is TFT Central's input lag measurement to be trusted? I've heard it called into question in other threads.

Because aside from that, the ZR2440w seems to be a slightly better version of the U2412M.

Does he comment on the AG coating?
 
They recently changed to a new method as of that ZR2440w review, I believe, using SMTT 2.0, which is much more reliable than traditional stopwatch methods. It's also an updated and improved version of SMTT v1, which is good. It was produced by the same guy who did all the input lag studies on Prad.de that everyone refers to.
 
I didn't see any comment about the AG coating. Basically, the ZR2440w is an improved version of the U2412M but with higher input lag. All I wonder is whether this monitor can handle BF3.
 
TFT Central does not measure input lag properly. The U2412M has 2ms of input lag when measured properly by PRAD.de. Unless the ZR2440w has a resolution scaler, it will not have any input lag either. Even if it does, most modern displays with resolution scaling only have around one frame of input lag (18ms), which will only be bothersome for players coming from a lag-free display.
 
I'm interested in how it stacks up against the U2412M on contrast ratio, color reproduction, and IPS glow. TFT Central also said they're putting it up against the ZR24w, which is CCFL. I was also wondering about the color characteristics there.

They also have a conclusion at the end which is very revealing of the author's general opinion.

I don't think they measure it wrong, because there are a lot of ways to measure it. From what I remember, TFT Central measures input lag from the moment the signal leaves the GPU to the moment the new frame is displayed on the screen. In any case, there is no "right" answer between "does it have 15ms lag or 2ms lag". No LCD monitor has a total input lag of 2ms; even CRTs have on the order of ~1ms if you count everything.

There's a classification (it's usually noted in reviews just before the input lag figures are listed). It's another guideline that should be useful.
 

My biggest interest in this screen is whether it can handle 16:9 properly from consoles, unlike the ZR24W and U2412M. The contrast is probably above 800:1, as the latest LG LED IPS panels have been up there.

For lag, they use Class 1, 2, 3. Class 1 is effectively no perceptible lag, Class 2 is under two frames, and Class 3 is above two frames, from memory.
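To make that concrete, here's a rough sketch of that classification in Python. The thresholds are my reading of the description above (from memory, per that post), assuming one frame is ~16.7ms at 60Hz:

```python
# Rough sketch of TFT Central's lag classes as described above.
# Assumption: Class 1 = under one frame, Class 2 = under two frames,
# Class 3 = two frames or more (one frame = ~16.7 ms at 60 Hz).
FRAME_MS = 1000 / 60

def lag_class(lag_ms: float) -> int:
    if lag_ms < FRAME_MS:
        return 1  # effectively no perceptible lag
    elif lag_ms < 2 * FRAME_MS:
        return 2  # under two frames
    return 3      # above two frames

print(lag_class(20))  # -> 2, matching the "Class 2" verdict below
```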

This would be considered "Class 2". If the lag is listed with no maximum or minimum, then they have indeed updated their testing procedures; lag is consistent and should not peak or fall.

20ms of input lag is totally unacceptable.

Next!

The U2412M has about 9ms of total lag, equal to the U2311H/U2312HM, and I have never seen you say one good thing about the U2412M.

In addition, TFT Central is known to use outdated methods to measure input lag.
 
My biggest interest in this screen is whether it can handle 16:9 properly from consoles, unlike the ZR24W and U2412M.
In its specs, it says: "1:1 scaling supports full HD 1080p letterboxing". So I guess it handles 16:9 properly from sources like consoles and DVD/BD players. That's a very nice and important feature.
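If that spec is accurate, the arithmetic is simple (a sketch, assuming the panel's native 1920x1200 resolution):

```python
# 1:1 letterboxing on a 1920x1200 panel: a 1080p source is mapped
# pixel-for-pixel, leaving unscaled black bars above and below.
panel_w, panel_h = 1920, 1200
src_w, src_h = 1920, 1080
bar_px = (panel_h - src_h) // 2
print(f"{bar_px} px black bars top and bottom")  # -> 60 px
```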
 
Does it say anything about color reproduction when comparing CCFL and LED backlighting? TFT Central states they reviewed it against the ZR24w, which is a CCFL-backlit screen.
 
TFT Central does not measure input lag properly. The U2412M has 2ms of input lag when measured properly by PRAD.de. Unless the ZR2440w has a resolution scaler, it will not have any input lag either. Even if it does, most modern displays with resolution scaling only have around one frame of input lag (18ms), which will only be bothersome for players coming from a lag-free display.

They are using SMTT 2.0, which is the same as 1.0 but works without V-sync. I don't know how that would impact results, though, outside of not having synced frames.

In my mind this might throw things off more, but I'm no expert.

I have never seen such a bad case of post-purchase rationalization before (he owns a U2312HM).

I don't get it either. It's completely irrational :confused:

Anyway, I picked up the review for a waste of five minutes and three dollars.

Here are the salient points:

1) Input lag of 20ms using SMTT 2.0, which is an updated version capable of working without V-sync. Seems legit.

2) Low dE94 of 2.0 at default factory settings, but with a slightly warm/reddish white color temperature, the opposite of the Dell U2412M, which is slightly cold. Same 1000:1 or higher contrast ratio as the U2412M.

3) Same sRGB coverage as the Dell U2412M, and low power usage.

4) Good pixel responsiveness with the overdrive toggled on/off. They said the responsiveness was equal to, or better than, the U2412M and ZR24W.

So if you have a console, this monitor should work well as long as one HDMI port is sufficient. For extreme PC gaming it might not be 100% great, but 20ms of input lag is not bad.

The AG coating was similar to the U2412M and ZR24W, according to the review.
 
Unless the HP has resolution scaling (I have not read TFT Central's review to find out if it does), I doubt it will have any input lag. I believe PRAD's review is coming out this month, so we shall soon find out.
 
The AG coating was similar to the U2412M and ZR24W, according to the review.

I also bought the review, and I got the impression that the AG coating was better (or "lighter" and "less grainy", as the review said) than the U2410's. It was not directly compared to the U2412M's.
 
So if you have a console, this monitor should work well as long as one HDMI port is sufficient.

Does the monitor have 1:1 or 16:9 scaling options? It's something that the U2412M lacks, which means that console content would have to be stretched (distorted) to 16:10.
 
Still looks like my next monitor purchase will be the U2412M.

The upside over the Dell just doesn't seem enough to warrant the increase in price.
 
So this has 20ms of input lag? That's not so bad. The old ZR24w has only 10. Will the Xbox 360 work with this?
 
PRAD's review is up: input lag is 18.3ms (it has some scaling features), and its response times are slightly slower than the Dell U2412M's, since there is some overshoot with the overdrive activated.
 
PRAD's review is up: input lag is 18.3ms (it has some scaling features), and its response times are slightly slower than the Dell U2412M's, since there is some overshoot with the overdrive activated.

How about contrast and color (subjective views, factory performance)? Any direct comparison to other monitors (U2412M, ZR24w)?
 
They are using SMTT 2.0, which is the same as 1.0 but works without V-sync. I don't know how that would impact results, though, outside of not having synced frames.

In my mind this might throw things off more, but I'm no expert.

Hi!

I am the developer of SMTT and SMTT 2.0. SMTT 2.0 is not "the same as v1"; it has some remarkable improvements (see the chart below).
SMTT Website: http://smtt.thomasthiemann.com

Both versions require V-sync to be disabled. Please read the articles that describe why it is necessary to disable V-sync.
In fact, V-sync does NOT synchronize the output of analogue and digital signals.

See examples --> http://www.prad.de/en/monitore/specials/inputlag/inputlag-part18.html

All V-sync does is keep the screen buffer unchanged long enough to feed both monitors with the same content. The implied delay may be up to 16ms.



What is measured by SMTT?
SMTT measures display lag + "some response time".

So you have to understand the measured values!
If you care how much time is needed until you notice a change in the display content, this is the most important value.
If you just want the raw signal processing time inside the monitor, you have to use an oscilloscope. But even then it can't be done precisely with the measurement methods that are often used, as they still rely on optical sensors placed on the panel. No measurement tips are applied to the electronics of the panel, and the measurement delay of the optical sensors used is often unknown.

PRAD gives you oscilloscope values for lag (without technical details of how they were acquired, but they should be fine) AND the response time. Sum these times up and you will get something that represents the experienced lag.

Depending on the response time, the experienced lag will differ. Even when the electronics inside the monitor have zero lag, you won't see new content on screen at once if the response time adds ~16ms before the liquid crystals start to turn to their new positions.

I'd like to add a graph that shows the performance of SMTT and SMTT 2.0:
[Chart: Comparison of display lag measurement methods]


The oscilloscope measurement was done with the high-end equipment described in the PRAD article linked above. This measurement includes about half the response time of that monitor.

There is no standard that says what has to be measured as "display lag" or "input lag". There is no rule that forces testers to include or to separate the response time from the raw processing time.

But don't fool yourself by taking measurements that show the raw processing time without response time and believing that this is the time after which you see a new picture.
If you do, you misunderstand the readings.

Please be a little more precise: look for information about what is actually being measured, and compare on an equal basis.

SMTT 2.0 is a good and reliable choice for this value. It will give you a result that represents what is really seen on screen. More detailed measurements can only be done with oscilloscopes and some expertise. But don't trust measurements that have never been described in detail and just appear out of nowhere.
 
:) Thanks for reading. One of the goals of the initial report about input lag was to wipe out wrong assumptions and misunderstandings concerning the "input lag" issue. The linked article is a rather poor translation from the German original; sorry for that.

I could sum up my last post like this:

There are two important facts about input lag measurements that you need to know:
1.) What the value a test/review shows actually represents, and
2.) How reliable that value is.

{end of summary}

1) You need to know what is measured (processing time + (half/full) response time, or without response time) to know whether you can compare it with other measurements.

2) If the given value is not reliable because of known errors in the measurement method (a single stopwatch, with or without V-sync), you can't compare it to anything else, as it is most likely random.


I don't want to overpraise SMTT. A well-done oscilloscope measurement is superior to SMTT and any other photo-based test.
But SMTT 2.0 is "very close" to an oscilloscope if you add the response time. And it is much better than any plain stopwatch test or a "time code" (which is nothing other than a prerecorded single stopwatch).
 
Thank you for the info.

I wonder if my constant harping about TFT Central's lag-testing methodology had anything to do with the recent changes :p
 
Mr. Smith is looking for a monitor as a general upgrade over the one he has now.
Practical goal: to find out whether a prospective replacement is worth the money.

Question #1 (Multiple choice)

Which one of these reviews should Mr. Smith base his choice on?

1. Review #1: input lag measured with oscilloscope is XX ms
2. Review #2: input lag measured with stethoscope is YY ms
3. Review #3: input lag measured with kaleidoscope is ZZ ms
4. Review #4: input lag is hardly noticeable in direct comparison with a CRT

#1?
#2?
#3?
#4?

Question #2 (True or False)

Mr. Smith is pretty happy with the XX ms input lag on his monitor, as tested last year.
The review below gives Mr. Smith precise info for comparison and making the right decision:

Review: input lag measured with oscilloscope is YY ms.

True?
False?
 
These "questions" are absurd as they imply that only oscilloscope measurements would be useable and these measurement would be automatically perfect.
Both assumptions are not correct.

Propositional logic tells us that false assumptions lead to true answers, no matter how ridiculous they are. So "True" would be the correct choice.

Even though that post seems to be an attempt to discredit my posts, I'd like to clarify things by answering it:

If "Mr. Smith" is looking for a new monitor, he should watch out for comparable standards in the measurement process.
Using an oscilloscope does neither mean "including response time" nor "raw processing time only". The latter will nowhere be measured correctly as no review site opens the monitor and solders measurement tips to the electronics of the monitor. So even their "raw lag time only" measurements will be done with a photosensor on front of the TFT.
So the holy grail #1 implies some error margin as well.
Let's continue with a Review that uses the raw signal processing time of the monitor and declares it as input lag (quite correct so far as there is no standard definition of "input lag"). Your Mr. Smith compares 5 Monitors with the following results:
1.) 3ms
2.) 6ms
3.) 7ms
4.) 9ms
5.) 5ms

So he selects the unit listed as "1.)".
What the review does not reveal (or Mr. Smith does not care about) is the pixel response time of each panel. Those look as follows:
1.) 13ms
2.) 11ms
3.) 9ms
4.) 9ms
5.) 8ms

Summing up to a total of lag + response time:
1.) 16ms
2.) 17ms
3.) 16ms
4.) 18ms
5.) 13ms

Oh, what a mess! The raw signal processing time is just one piece of the puzzle that has to be taken into account.
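In code, using the made-up numbers from the example above, the re-ranking is obvious (a sketch):

```python
# Hypothetical values from the example above: raw processing lag as the
# review reports it, plus the pixel response time the review omits.
processing_ms = [3, 6, 7, 9, 5]
response_ms = [13, 11, 9, 9, 8]

for i, (p, r) in enumerate(zip(processing_ms, response_ms), start=1):
    print(f"Monitor {i}: {p} + {r} = {p + r} ms")
# Monitor 1 "won" on raw lag (3 ms) but is mid-pack in total (16 ms);
# Monitor 5 is actually the fastest overall (13 ms).
```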

In fact, Mr. Smith won't notice the difference at all, but he will believe that his monitor is by far the best, as the review with the raw signal processing times told him so. And some "Mr. Smiths" may even show off their low lag timings in a discussion forum to tell others that their monitors are ridiculously slow and not worth purchasing, that their own decision was the best possible, and that only oscilloscope-based values are reliable - not knowing, or ignoring, the truth.

-

For the second question: "Mr. Smith is pretty happy with ... input lag on his monitor."
-> So why upgrade? Why bother with any further tests if he is already happy?

Truth be told, input lag is nothing to be happy about, and it is desirable to minimize it, but it is only an issue if the lag is really high.
None of the input-lag fanatics ever consider the fact that a plain screen refresh at 60Hz takes about 16ms.

So if something "important" happens at the lower end of the screen at the point in time when the first pixels start a new refresh, it will take almost these 16ms until the area showing the action starts to be refreshed. Even with a CRT!
OK, there are some happy people who own high-end CRTs that run acceptable resolutions at 100Hz. But even in this case, the aforementioned 16ms is only reduced to 10ms. So there is always some kind of lag, and as long as there is no brand-new display technique (including graphics cards that transmit the picture "at once"), you will never be able to reduce the lag to zero.
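A quick sketch of that refresh arithmetic:

```python
# Worst case: content at the bottom of the screen waits almost a full
# refresh period before its area starts updating, even on a CRT.
for refresh_hz in (60, 100):
    print(f"{refresh_hz} Hz -> up to ~{1000 / refresh_hz:.1f} ms wait")
# 60 Hz -> up to ~16.7 ms wait
# 100 Hz -> up to ~10.0 ms wait
```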
Instead of looking for a monitor that has 1 or 2ms less input lag, I'd suggest taking a closer look at display quality (e.g. color reproduction, etc.).
And instead of mocking measurement techniques as if they were as far out as stethoscopes and kaleidoscopes, I'd suggest reading something about the techniques first.
 
Thank you for the info.

I wonder if my constant harping about TFT Central's lag-testing methodology had anything to do with the recent changes :p

Probably not. If you ever stopped complaining about trivial stuff you'd either be dead or the world is about to end, so we'd have bigger issues ;)

BTW, the article on PRAD indirectly discusses why the DigitalVersus/LesNumeriques test based on Flash/Shockwave is basically useless: it cannot refresh at a consistent rate, regardless of what frame rate it's supposed to run at.

The [H]ardforum denizens have been bitching about poor input lag testing for years.

Thanks for the excellent responses Thomas!


I remember wondering back in 2008 why the output of my 8800GTS 640MB was not synced between its two ports. One port was always a little faster than the other; I think it was always the second port.

The article on PRAD helps confirm that.
 
@derGhostrider

Thank you, but you haven't helped Mr. Smith.
Let's help him before confusing him even further with the input lag + response time "puzzle" (summing up cows and chickens). :)
 
This is actually an interesting conversation, and I appreciate the contributions from all parties, even though I don't game and response time is low on my list of monitor criteria.

It seems that the challenge is going to be in developing a set of useful assumptions or rules of thumb that can be applied when looking at both a new monitor (i.e., one that hasn't been subject to any controlled response tests) and a monitor that has been subject to controlled testing.

To further confuse the terminology here, the overall response lag is composed of processing lag and refresh lag. Is there anything else that factors into the time between the video card sending a signal and the monitor displaying it?

The processing lag is determined by the rate at which the monitor's electronics interpret the signal. This is generally going to be the scaler -- are there any other functions in a monitor that can introduce significant processing lag? Is there a baseline below which no monitor can go for processing lag? If you assembled a dozen different monitors, all without scalers, would the processing lag be identical or similar for each? Are there scaler chips that are used across a range of monitors, and if so, do they perform similarly across that range?
 
This is actually an interesting conversation, and I appreciate the contributions from all parties, even though I don't game and response time is low on my list of monitor criteria.

Absolutely.

To further confuse the terminology here, the overall response lag is composed of processing lag and refresh lag.

Yes, it's confusing (to say the least).
To help clear up this confusion, please pay attention to how exactly input lag and response time visibly manifest on the screen. What would your descriptions be?
Mr. Smith will appreciate it.
 
Is there a baseline below which no monitor can go for processing lag?

Yes.
The minimum possible lag is very close to 0ms. At least, it is smaller than 1ms.

The minimum processing that has to be done works on a single-pixel basis:
- taking the transmitted 10-bit TMDS signal that represents one color component and decoding it to the 8-bit color value;
- applying the corresponding voltage that reflects this 8-bit color to the subpixel.

All three colors (RGB) are transferred synchronously on three separate wire pairs.

If you want to argue with higher precision than milliseconds, you can calculate the time needed to transfer the required 10 bits for each pixel when they are transmitted at the usual bit rate of 1.6Gb/s, as seen in the example below.
One bit is represented by a signal that takes less than 1ns: 6.25*10^(-10) s = 625 ps, to be precise (0.625ns, or 0.000000625ms, if that helps to convey how short this time is).

And it is quite tricky to capture such high-frequency signals. This is an example of 1920x1200@60Hz (1.6Gb/s) taken from the DVI connector. The signal quality is quite poor in this example due to poor soldering and shielding.
[Oscilloscope screenshot: dvi-trigger-problem_3.png]

This is the point in time where the blanking area ends and the very first pixels of the screen are transmitted. Each narrow spike (look at the right) represents one single bit. The wide spikes at the right represent several 1s in a row. There are 16 bits displayed within one unit of the major scale on the x-axis (10ns, divided into 5 smaller parts representing 2ns sections).

Transmitting one pixel (10 bits) takes 6.25ns. Add some time for the signal to travel, some time for the decoding, and some time for the final voltage to be applied.

That is the minimum time that has to be spent. It's negligible.
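The numbers above can be reproduced in a few lines (a sketch using the figures from this post: single-link TMDS at 1.6Gb/s per channel, 10 bits per pixel per color component):

```python
# TMDS timing arithmetic for 1920x1200@60Hz over DVI, as described above.
bit_rate = 1.6e9                # bits per second on each TMDS wire pair
bit_time_s = 1 / bit_rate       # one bit -> 625 ps
pixel_time_s = 10 * bit_time_s  # one 10-bit symbol per pixel -> 6.25 ns
frame_time_s = 1 / 60           # one full refresh -> ~16.7 ms

print(f"bit time:   {bit_time_s * 1e12:.0f} ps")
print(f"pixel time: {pixel_time_s * 1e9:.2f} ns")
print(f"frame time: {frame_time_s * 1e3:.1f} ms")
```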

I have already measured 0.625±0.037ms as the input lag of one monitor, taken with a very fast photosensor (which has far less lag than the measured time) and a 12.5GHz oscilloscope. So even within these measured 0.625ms there is still some amount of pixel response time, as I did not solder the tip to the electronics.

To sum it up as a final answer: it is possible to build monitors that have (almost) no delay in signal processing.


But:
The time needed for the first pixel to reach "at least 50%" brightness has been measured at 5±2ms.
(See: http://www.prad.de/en/monitore/specials/inputlag/inputlag-part25.html - minimal estimation of input lag)
That's almost 10 times the anticipated processing time, and it is influenced by the pulsed backlight (see the figures in the linked article!).
It's not a defective monitor: the backlights of monitors with CCFL are always pulsed. At the time I did not have a TFT with LED backlight to test.

There are many technical details about TFTs that people usually don't know. And they don't have to know them if they just want to use the monitor. That's why some of us are trying to find easier but still reliable access to the characteristics that may be of interest.

So with almost zero lag in the signal processing, the response time will be the major contributor to the perceived input lag of the monitor.

----

This is generally going to be the scaler
Is it?
Maybe it is one of the most common causes, but there are many monitors that try to enhance the image quality. They manipulate color, contrast, brightness and more. While a plain global color shift may be done on the fly, contrast enhancement needs to detect contrasts, and to detect contrasts you need at least some part of an image to process. This will most likely add additional lag.
No manufacturer known to me offers real measurement values and in-depth descriptions of what is done inside the monitor.
Some offer graphs meant to show that each component does its color calculations at very high precision, but without any timing specifications.
And I once tested a monitor that claimed to have a "gamer mode" that should remove all input lag - but that mode merely reduced the lag (if I remember correctly, from ~32ms to ~16ms).
As long as the manufacturer does not promise to process everything in less than 1ms, I wouldn't give much for vague promises like "ideal/best for gamers".

---

Is there anything else that factors into the time it takes from the video card sending a signal and the monitor displaying it?
The signals travel at almost the speed of light, so over short distances like 2m that doesn't matter. So there is just the processing within the monitor, the pixel response time, and a slight deviation caused by the backlight's pulse width.
Even though this deviation is beyond (human) perception, it adds a slight error margin to the measurement.

--
Error margins
That's one of the reasons why I do not understand some measurement results that are published as reliable tests:
values for signal processing time and pixel response time are published without any stated error margin.

Their readers misunderstand these values as being "precise, without any error", but they are not. Repeat these tests several times with a temporal resolution that is high enough and you will get several results, an average value, and a standard deviation that you should use as the margin of error. That's maybe not the full truth, as there may be other error sources as well, but it will most likely give a clue as to how reliable the measurement method is. If the deviation is much larger than the result, it's getting fishy...
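As a minimal sketch of what I mean (the readings here are invented for illustration):

```python
# Repeat the measurement, then report mean +/- standard deviation.
from statistics import mean, stdev

readings_ms = [18.9, 20.3, 19.6, 21.0, 19.4, 20.1]  # hypothetical repeats
avg, err = mean(readings_ms), stdev(readings_ms)
print(f"input lag: {avg:.1f} +/- {err:.1f} ms")
# If err came out much larger than avg, the method itself would be suspect.
```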
 
Perhaps I messed up the terms I chose above, using words with common (if ambiguous) usages in the monitor universe.

What about "controller lag" and "panel lag." Is it right to assume that all of the processing, including scaling, color correction & emulation, and pixel mapping are handled by the controller, which is probably going to be produced by the monitor manufacturer, beit Dell, HP, NEC or similar?

Does the panel, say one of the LG eIPS models, do any processing? If not, is the time it takes for a pixel to reflect color changes going to be generally consistent for all panels of similar make and model? For that matter, does lag vary based on where the measurement is taken on the panel, whether dead center or at the periphery?
 
What about "controller lag" and "panel lag." Is it right to assume that all of the processing, including scaling, color correction & emulation, and pixel mapping are handled by the controller, which is probably going to be produced by the monitor manufacturer, beit Dell, HP, NEC or similar?
I am not aware of the details of the internal signal processing within the monitor. They may use several DSPs to process the signal in several steps, or they may use an all-in-one chip that does all steps at once or in multiple passes. My studies focused on the result, not the cause.

Does the panel, say one of the LG eIPS models, do any processing?
As far as I know, the bare panel is just the combination of a glass plate, an array of thin-film transistors that cause the liquid crystals to flip their orientation, and some wiring to connect it to the rest of the electronics. Nevertheless, it may be sold as a unit with basic controller chips that handle the assignment of voltages to each subpixel. Due to wiring limitations the subpixels can't all be switched at the same time; they are rastered, comparable to the way the signal is transmitted.

If not, is the time it takes for a pixel to reflect color changes going to be generally consistent for all panels of similar make and model?
The time may shift from monitor model to monitor model and from brand to brand, as the same panel may be used with different kinds of overdrive (most important), or without any overdrive but with slightly different voltage "distributions" (how fast the target voltage is applied: edge steepness, overshoot, etc.).

For that matter, does lag vary based on where the measurement is taken on the panel, whether dead center or at the periphery?
I don't think so, as the techniques used do not differ from the center to the periphery:
All bits are manipulated by the same circuit, using the same algorithm, in the same way.
The (sub)pixels in the panel are accessed at a fixed frequency, one by one, line by line. If some pixels showed more or less lag than others, the frequency would have to be adjusted on a per-subpixel basis. This is not just rather strange to believe: it does not fit the results on screen.

Old versions of SMTT had a "flicker" test that I used for oscilloscope measurements. It offered the option to change the screen (or screen buffer) content at a fixed rate. As soon as it was set to frequencies above the refresh rate of the monitor, each displayed picture showed parts of several rendered frames: a 60Hz refresh rate at 600 FPS gives 10 horizontal stripes on screen with alternating colors.
If pixels at the periphery occupied more (or less) processing time than the pixels at the center, the first and last stripes would occupy less (or more) space on screen than the others. All pixel lines reach the left and right edges, but the first and last lines consist of periphery pixels only.
I could not see any difference between those areas. The stripes seemed to be equally distributed.
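The stripe count follows directly from the ratio of frame rate to refresh rate (a sketch):

```python
# Flicker test arithmetic: rendered frames that land in one displayed refresh.
refresh_hz = 60
fps = 600
stripes = fps // refresh_hz
print(f"{stripes} horizontal stripes per refresh")  # -> 10
```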

I'd say it is reasonable to assume that all 2.3 million pixels, or 6.9 million subpixels (1920x1200), are treated equally in terms of the drive for each subpixel of the panel (applying the voltage), as long as nothing suspicious is observed that contradicts this assumption.
 