High speed photos: 120Hz LCD vs CRT

Ritorix




So I pitted my Acer 120Hz LCD against an older LCD and a CRT, with a high-speed camera.

I have no DVI splitter cable, so this was in clone mode: two different DVI outputs from the same video card. Cloning is known to add input lag, as is running non-native resolutions, so as a test of actual input lag this fails, as will soon become obvious. Knowing all this going in... hell, it's neat, so I did it anyway. :D

Acer235hz vs old Samsung 204T

The old Samsung was not known for its responsive screen, but I was curious how this would turn out. Results: there was not always a difference; some frames were exactly the same. But when there was, it was always 33ms in the Acer's favor. Weird.


33ms difference
Exposure time here is 1/250th of a second, which looks fairly normal.




Acer235hz vs MAG986FS

Next the old CRT is dragged out of a closet for this test.


Faster shutter speed now... only 1/1000th of a second, 1ms. Things start getting weird-looking.


If you look closely, you can see the Acer ghosting from 944 to 977. The CRT is already a frame ahead, at 38 seconds 10ms, ghosting 977 and the 7 from '37'.


Agreement!





Exposure time is now 1/8000th of a second.




So why the magic number, 33? The most likely reason is that 33ms is one frame at 30FPS, so I was maxing out the capability of my poorly chosen flash-based timer. On the plus side, it also means the LCD was never 2 or more frames behind, or we would be seeing 66ms, 99ms, etc.
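To put rough numbers on that (just a plain Python sketch; the frame rates are the ones discussed here, nothing actually measured):

```python
# Minimal sketch of why a 30FPS timer quantizes everything to ~33ms steps;
# the frame rates below are just the ones from this thread.

def frame_period_ms(fps):
    """Time between visible timer updates, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps:>3} FPS timer -> smallest visible difference ~{frame_period_ms(fps):.1f} ms")

# A lag of N whole timer frames can only ever show up as N * ~33ms:
for frames_behind in (1, 2, 3):
    print(f"{frames_behind} timer frame(s) behind ~= {frames_behind * frame_period_ms(30):.1f} ms")
```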

I will have to do a 'round two' later with a proper DVI splitter and a decent timer. The larger issue of non-native resolution will not be resolved with this equipment. Anyone have a recommendation for a timer?
 
Huge thanks, man, for investigating this matter and sharing your results with the community. +1

Hopefully you'll be able to get your hands on some better equipment; then things will get very interesting. :)
 
High speed camera... you mean a Canon EOS-1D Mark II :p

Thanks, EXIF data.

Anyway, really interesting test here! Keep it up!
 
Ritorix, if you want very precise results, try this tool:
Refresh Rate Multitool — http://www.hardforum.com/showthread.php?t=1423433

Cloning does not add input lag as long as the LCD's (or LCDs') native resolution is used. What it does do is allow the two outputs to have slightly different refresh rates, for example 120.13 Hz and 119.84 Hz, in which case the two outputs would continually be going in and out of phase. If you happen to take photographs when they're not in perfect phase, it looks like input lag (but it isn't). The same thing happens if you do not use clone mode. Therefore, if you don't take this into account, you'll always get a randomized term added to your measured input lag (not to mention inaccuracies in the flash-based timer).

If you use the tool I linked above, it will be visually obvious when the two monitors' inputs (the video card's two outputs) are out of phase; you'll see tearing. You can wait for them to come into phase before taking a batch of photos. The tool is designed to be used in Clone mode, however, it's not a good idea to use Clone mode unless your CRT can be driven at the native resolution and refresh rate of the LCD. Otherwise, if you position the "logical" monitors so that one is on top of the other (in software, not physically) you can stretch a single RefreshRateMultitool window over both monitors even if they are at different resolutions.

At 120 Hz, each movement of a vertical bar from one space to the next will correspond to 1/120 second, so take that into account when translating the photographs into measurements.
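A rough sketch of that conversion, with made-up notch counts purely for illustration:

```python
# Each notch one display trails the other corresponds to one refresh period.
# The notch counts below are examples, not measurements from this thread.

def lag_ms(notches_behind, refresh_hz=120):
    """Input lag implied by how many notches one display trails the other."""
    return notches_behind * 1000.0 / refresh_hz

for notches in (0, 1, 2, 4):
    print(f"{notches} notch(es) behind at 120 Hz -> {lag_ms(notches):.2f} ms")
```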

I used this methodology (in Clone mode at native resolution) and found that my 3007WFP-HC has only about 0.2 ms of input lag relative to a CRT.
 
Just out of curiosity: why does everyone seem to use a stopwatch and not a frame counter for these tests?

Seems to me like a frame counter would be the most accurate, as it's pretty simple to lock on to v-sync and get a perfect counter, rather than relying on the internal clock and on stuff like framebuffer delay being constant.
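For what it's worth, a bare-bones frame counter is only a few lines. Here's a sketch assuming pygame 2 (its vsync flag is best-effort and only works with the SCALED or OPENGL flags, and the driver still has to honour v-sync):

```python
# Bare-bones v-sync-locked frame counter sketch (pygame 2 assumed).
import pygame

pygame.init()
screen = pygame.display.set_mode((1280, 720),
                                 pygame.FULLSCREEN | pygame.SCALED,
                                 vsync=1)
font = pygame.font.SysFont(None, 400)

frame = 0
running = True
while running:
    for event in pygame.event.get():
        if event.type in (pygame.QUIT, pygame.KEYDOWN):
            running = False

    screen.fill((0, 0, 0))
    number = font.render(str(frame), True, (255, 255, 255))
    screen.blit(number, number.get_rect(center=screen.get_rect().center))
    pygame.display.flip()   # waits for the next refresh when v-sync is active
    frame += 1

pygame.quit()
```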
 
Just out of curiosity: why does everyone seem to use a stopwatch and not a frame counter for these tests?

Seems to me like a frame counter would be the most accurate, as it's pretty simple to lock on to v-sync and get a perfect counter, rather than relying on the internal clock and on stuff like framebuffer delay being constant.

I've never seen a "frame counter". It sounds like it would work well and give proper results, but I don't think I've ever seen such an application.
 
Just wanted to comment on the yellow tint on some of the pictures. I'm also getting this on my AW2310 in some cases. Probably how it refreshes or something.
 
Please run this again once you locate a timer that updates faster than 120Hz...
 
High speed camera... you mean a Canon EOS-1D Mark II :p

Thanks, EXIF data.

Yeah, I'm not the camera geek; that's my fiancée's equipment. She started telling me all the technical details, but I just wanted fast pictures. ;)

It's 'high speed' in the sense that it can take one picture very fast. What it can't do is a rapid series of pictures; it does like 8fps or so, but each one can be 1/8000th of a second.


Ritorix, if you want very precise results, try this tool:
Refresh Rate Multitool — http://www.hardforum.com/showthread.php?t=1423433

Cloning does not add input lag as long as the LCD's (or LCDs') native resolution is used.

Thanks for the tool, I will need something more accurate on the software end. If I just had a way to display the Windows time in a large font, constantly updating, I would do that too. At best I will get down to an 8ms timer difference (120fps for my 120Hz) instead of the 33ms (30fps) I had in the first tests. The limiting factor will always be the frame buffer of the graphics card, which will only go as fast as my refresh rate. Even if my timer runs at a true 1000fps, the graphics card will only take snapshots of it as it draws each frame.
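As a sanity check on that, here's a tiny simulation (plain Python; the rates are just the ones from this thread) of what actually reaches the screen once the video card samples the timer per frame:

```python
# The 'snapshot' effect: even a fast timer only reaches the monitor at
# whatever value it held when each frame was drawn.

def on_screen_timer_values(refresh_hz, timer_hz, n_frames=6):
    """Timer readings (ms) that actually get drawn for the first n_frames."""
    values = []
    for frame in range(n_frames):
        # last timer tick at or before the moment this frame is drawn
        tick = (frame * timer_hz) // refresh_hz
        values.append(round(tick * 1000 / timer_hz, 1))
    return values

print("120Hz display, 1000Hz timer:", on_screen_timer_values(120, 1000))
print("120Hz display,   30Hz timer:", on_screen_timer_values(120, 30))
```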

Unfortunately, the native resolution issue probably cannot be solved. I have a CRT that goes up to 1600x1200, an LCD that does the same, and then the Acer does 1080p and can't do 1600x1200, so I settle at something like 1600x1080. Since that was the same way some review sites measured, I wanted to at least see if I could confirm that and maybe compare lag at different non-native resolutions - is it worse at 640x480 vs 1024x768 vs 1600x1080? Might just be an unsolvable problem with the equipment I have, but I gave it a try.

Another issue I'm not even touching is LCD lag with different color outputs (they give a grey-to-grey measurement for a reason!).

Just wanted to comment on the yellow tint on some of the pictures. I'm also getting this on my AW2310 in some cases. Probably how it refreshes or something.

You mean the yellow tint like here?
http://img690.imageshack.us/img690/4889/highspeed3.jpg
That's a result of the backlighting on the LCD, I think. It doesn't look like that unless you take a fast picture.
 
Hell if I know; at some point it became more about the neat pictures ;)

I can conclude that under the circumstances (non-native resolution, clone mode) I had no more than 1 frame of lag at 30Hz (the draw speed of the flash timer I used). I went through a pretty large number of photos and the numbers were never above a 33ms difference (the draw time of 1 frame @ 30fps).

The DigitalVersus test of the same Acer (also at non-native resolution and in clone mode) spiked up to 50ms, which I did not reproduce, as it would have shown up as a 66ms difference in the timers (2 frames).
 
And again with the ports / master display swapped. (hint, hint)

I had the Acer on DVI 1 and the CRT on DVI 2. I don't think either port gets an advantage, or we might see the CRT lag behind the LCD in the best case. The CRT was also going through a DVI-to-VGA adapter, since my video card has no VGA output.
 
...I don't think...

You don't know until it's tested. Also, there could be lag with one display set as the "master" or primary or whatever term applies to clone mode. By switching both (port and display assigned as master) we can eliminate both possibilities.
 
Thanks for the tool, I will need something more accurate on the software end.
Have you tried the tool? Have you found it to be inaccurate in any way? For me, it was perfectly accurate and synchronized to the video refresh rate. When I start it up, the bars start by moving slowly, but then quickly speed up and stabilize, at which point the bar moves exactly one notch to the right on each LCD/CRT frame, wrapping around after it reaches the rightmost notch. I verified its perfect accuracy by filming it at high speed (120 Hz with a video camera), and sure enough, there were no gaps in the cadence — every two frames in the 120 Hz video, the bar advanced one frame on screen (my LCD is 60 Hz).

It doesn't display numbers, but all you need to do is count the number of times the bar moves to the right (accounting for the fact that it wraps around) and divide this by the refresh rate (e.g. 120 Hz) to get a time measurement.

If I just had a way to display the Windows time in a large font, constantly updating, I would do that too. At best I will get down to an 8ms timer difference (120fps for my 120Hz) instead of the 33ms (30fps) I had in the first tests. The limiting factor will always be the frame buffer of the graphics card, which will only go as fast as my refresh rate. Even if my timer runs at a true 1000fps, the graphics card will only take snapshots of it as it draws each frame.
The thing is, displaying a timer on screen using fonts (i.e., numerals) is fundamentally flawed. The numbers can blend together, making reading of the timer subjective. This problem is reduced by displaying at least three timers, one on top, one in the middle and one at the bottom, but this doesn't eliminate the problem; it only reduces it.

And flash-based timers probably have an imperfect cadence, adding jitter, because they're not designed to count frames and don't implement v-sync. The same problem would apply if you displayed the Windows time by any means.

Case in point: One of your photos has the LCD ghosting from 37.944 to 37.977. That's a difference of 0.033, whereas if it were synced to the LCD refresh rate the difference should only be 0.008 or 0.009.

Unfortunately, the native resolution issue probably cannot be solved. I have a CRT that goes up to 1600x1200, an LCD that does the same, and then the Acer does 1080p and can't do 1600x1200, so I settle at something like 1600x1080. Since that was the same way some review sites measured, I wanted to at least see if I could confirm that and maybe compare lag at different non-native resolutions - is it worse at 640x480 vs 1024x768 vs 1600x1080? Might just be an unsolvable problem with the equipment I have, but I gave it a try.
If the native resolution issue can't be solved, then use the other technique I described. Run the LCD at its native resolution, but run the CRT at the highest resolution it can do that matches the refresh rate of the LCD but not its resolution. For example, run the LCD at 1920x1080@120Hz, but run the CRT at 1280x960@120Hz. Arrange the "logical" monitors so that one is on top of the other; for example if you're using Windows XP, you'd go into Control Panel -> Display -> Settings, and drag Monitor 2 placing it directly underneath Monitor 1. Then start up RefreshRateMultitool, and resize its window so that the smaller monitor is filled with one half of RefreshRateMultitool's window, and the other half of RefreshRateMultitool's window occupies the entire vertical height of the larger monitor, but only partially occupies the larger monitor's horizontal width.

The monitors obviously don't have to be physically arranged to be on top of each other. Just arrange them so that you can take a photograph with both monitors fully inside the camera's frame. And wait until the tearing disappears (floats into the overscan) before taking a round of photos (most likely you will get tearing, that periodically floats from the bottom to the top of one of the monitors, or from the top to the bottom, because the video card won't match both of its outputs precisely in refresh rate).
Another issue I'm not even touching is LCD lag with different color outputs (they give a grey-to-grey measurement for a reason!).
That's why using RefreshRateMultitool is so precise; it allows you to completely ignore "response time" and focus on input lag. With the vertical bars that RefreshRateMultitool displays, you can look in your high-speed photographs for the exact point at which the black begins to fade into gray; whereas when using a timer displayed on-screen using some font with numerals, there's no way to precisely see when white starts fading to black, or whatever color combination you use.
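For anyone who wants to take the eyeballing out of reading the photos, something along these lines could locate that edge automatically (a sketch assuming Pillow; the filename, strip coordinates and threshold are placeholders, not values from this thread):

```python
# Rough sketch: find where "black starts fading" along the bar's row in a
# photo, instead of judging it by eye.
from PIL import Image

def bar_edge_x(img, strip, threshold=40):
    """x offset within 'strip' where brightness first rises above threshold."""
    row = img.convert("L").crop(strip)     # 1-pixel-tall strip across the bar
    for x, value in enumerate(row.getdata()):
        if value > threshold:
            return x
    return None

photo = Image.open("highspeed_photo.jpg")             # placeholder filename
lcd_edge = bar_edge_x(photo, (100, 500, 900, 501))    # strip across the LCD
crt_edge = bar_edge_x(photo, (1100, 500, 1900, 501))  # strip across the CRT
print("LCD edge:", lcd_edge, "CRT edge:", crt_edge)
# The difference between the two edges, in notches, times 1/120 s is the lag.
```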

EDIT: Excellent, just noticed that Druneau has tried this technique:
http://www.hardforum.com/showthread.php?t=1499273
I'll read that thread now...
 
Here is a "slow speed photo" of how a mouse cursor looks at 60Hz vs 120Hz - an indicator of why every 120Hz reviewer finds the Windows experience smoother at 120Hz. The faster you move your mouse, look around in an FPS game, or pan around in an RTS game, for instance, the more FPS you need for it to look smooth. It is most noticeable when large parts of the screen move fast. Also, if your FPS is variable, it is better to have a high refresh rate, so when your computer finally manages to update the framebuffer, it "hits" closer to a screen refresh.

60hz_vs_120hz_smooth_mouse_cursor.jpg
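Rough numbers behind the picture, assuming a 2000 px/s cursor sweep (an arbitrary example, not measured from the photo):

```python
# The cursor is only drawn once per refresh, so at a given hand speed the
# on-screen "copies" land this far apart.
cursor_speed_px_per_s = 2000

for hz in (60, 120):
    print(f"{hz:>3}Hz: cursor positions ~{cursor_speed_px_per_s / hz:.0f} px apart")
```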
 