120Hz / VSync curiosities

Gabriel5

n00b
Joined
Sep 27, 2009
Messages
31
I have some inclinations as to why I'm observing what I am, but I don't really understand what's going on and I'd appreciate it if someone that understands more about the problem than I do would explain what's happening.

So, I have a somewhat robust computer (in sig) connected to a Samsung LN53630A HDTV. I really like frame interpolation in games because it makes everything look more fluid. However, one of the downsides is that it makes everything a little less responsive, which can be a problem in many games for obvious reasons. So I fiddled with some settings and noticed that VSync affects 'responsiveness' (what I'm calling it; akin to input lag) when 120Hz is enabled.

(Notes: everything stays above 60FPS at all times. HDMI is connected to port #2, the fast one. When 120Hz is off, game mode is enabled on the TV. Tests were conducted in L4D2.)


Test #   120Hz   VSync   Results
1        on      on      Very smooth, but laggy
2        off     off     Normal, responsive
3        on      off     Very smooth and responsive (better than test 1; responsiveness on par with tests 2 and 4)
4        off     on      Identical to test 2, as far as I can tell

I don't understand why turning VSync off makes 120Hz more responsive. I figured the TV only accepts a 60Hz signal anyway, so a frame rate above that should have no impact. Or is the frame interpolation simply more accurate somehow?

What am I missing?

Thanks.
 
Your TV isn't truly running "120Hz" from the input (although this is a debated issue); it's running at 60Hz when attached to your computer. Frame interpolation and refresh rate are two very different things.

Refresh rate (this is only a rough explanation for LCDs, since they're judged on response time rather than refresh rate): "The refresh rate (most commonly the "vertical refresh rate" or "vertical scan rate" for CRTs) is the number of times in a second that display hardware draws the data. This is distinct from the measure of frame rate in that the refresh rate includes the repeated drawing of identical frames, while frame rate measures how often a video source can feed an entire frame of new data to a display."
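
To make the distinction concrete, here's a quick Python toy (made-up numbers, nothing measured from that Samsung): a panel refreshing at a fixed 60Hz showing a 30FPS source still refreshes 60 times a second; it just draws each frame twice.

REFRESH_HZ = 60   # panel refreshes per second (assumed fixed)
SOURCE_FPS = 30   # new frames the source delivers per second

new_frames = 0
for refresh in range(REFRESH_HZ):                  # one second of refreshes
    frame = refresh * SOURCE_FPS // REFRESH_HZ     # source frame shown at this refresh
    prev = (refresh - 1) * SOURCE_FPS // REFRESH_HZ
    if refresh == 0 or frame != prev:
        new_frames += 1                            # this refresh carried new data

print("refreshes per second:", REFRESH_HZ)    # refresh rate: 60
print("new frames per second:", new_frames)   # frame rate: 30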

Frame interpolation: "Motion interpolation is used in various display devices such as HDTVs and video players, aimed at alleviating the video artifacts introduced by framerate conversions in fixed-framerate displays such as LCD TVs. Films are recorded at a frame rate of 24 frames per second (frame/s) and television is typically filmed at 25, 50, 30 or 60 frames per second (the first two being PAL, the other two NTSC). Normally, when a fixed-framerate display such as an LCD screen is used to display a video source whose framerate is less than that of the screen, frames are simply duplicated as necessary until the timing of the video matches that of the screen, which introduces a visual artifact known as judder, perceived as "jumpiness" in the picture. Motion interpolation intends to remedy this by generating intermediate frames that make animation more fluid. This feature has received criticism from many consumers because it makes special effects in movies appear less realistic and decreases the quality of the cinematography of the film."
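
The judder bit is easy to see with numbers. A small Python sketch of the standard 24FPS-on-60Hz case (generic pulldown math, nothing specific to this set): frames get held for alternating 3- and 2-refresh spans, so motion steps unevenly, and interpolation replaces those repeated frames with synthesized in-between ones.

REFRESH_HZ = 60
SOURCE_FPS = 24

# Count how many refreshes each film frame occupies over one second.
holds = [0] * SOURCE_FPS
for refresh in range(REFRESH_HZ):
    frame = refresh * SOURCE_FPS // REFRESH_HZ   # frame current at this refresh
    holds[frame] += 1

print(holds[:8])                                   # [3, 2, 3, 2, ...] uneven cadence
print([h * 1000 / REFRESH_HZ for h in holds[:4]])  # hold times in ms: 50.0, 33.3, ...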

Frame interpolation introduces input latency because the TV has to buffer frames: it can't generate the in-between frame until the next real frame has already arrived, which increases the amount of work completed before a picture is displayed. It's the pitfall of this technology for gaming, and it's why it sucks to leave it on when you game.
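
Back-of-the-envelope in Python (assumed figures, not measurements from any TV): the floor on the added delay is about one input frame period for the hold-back, plus whatever the interpolation processing itself costs.

INPUT_HZ = 60.0      # signal the TV receives over HDMI
PROCESS_MS = 20.0    # hypothetical processing time for the interpolator

frame_period_ms = 1000.0 / INPUT_HZ          # ~16.7 ms at 60Hz
added_lag_ms = frame_period_ms + PROCESS_MS  # hold-back + computation

print(f"minimum added input lag: ~{added_lag_ms:.1f} ms")   # ~36.7 ms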
 
Chances are that when VSync is off, screen tearing makes the motion interpolation algorithms ineffective, so the TV falls back to repeating frames; it spends less time on the calculations, which reduces lag. On the flip side, it shouldn't be much smoother than 60Hz, unless this mode also affects the panel driver in some other way.
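
A toy timing model of the tearing part (illustrative numbers only, in Python): with VSync off and the game rendering faster than the panel scans out, buffer flips land partway through a scanout, so each refresh is a composite of two rendered frames split at a tear line rather than the clean whole frames an interpolator wants to match between.

REFRESH_HZ = 60.0    # panel scanout rate
RENDER_FPS = 90.0    # hypothetical uncapped game frame rate
LINES = 1080         # vertical resolution of the panel

scan_period = 1.0 / REFRESH_HZ
render_period = 1.0 / RENDER_FPS

for refresh in range(5):
    start = refresh * scan_period
    end = start + scan_period
    # buffer flips that land inside this scanout window
    first = int(start / render_period) + 1
    flips = [k * render_period for k in range(first, int(end / render_period) + 1)]
    rows = [round((t - start) / scan_period * LINES) for t in flips]
    print(f"refresh {refresh}: tear line(s) near row(s) {rows}")

The tear line wanders from refresh to refresh (a row near 0 just means the flip landed at the start of the scanout), so consecutive refreshes never give the interpolator two clean frames to compare.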
 