Trend: TV Input lag continues to decrease

Have you guys noticed that this year's TVs are posting some record-setting low input lag numbers?

Using Rtings.com as reference:
In 2015, most TVs had input lag of around 30-40+ ms. In 2016, we saw it drop to 20-30+ ms. This year, figures just over 10 ms have become common, even in HDR mode.

For 2018, with the stupidly fast HDMI 2.1 interface, can TVs drop down to gaming-monitor levels of input lag? Find out on the next episode of Dragon Ball Z...
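
To put those numbers in perspective, here's a quick back-of-the-envelope comparison against the length of a single refresh. The lag values are just the ballpark ranges above, not measurements of any particular set:

# Rough comparison of input lag vs. frame time at 60 Hz
# (lag values are the ballpark ranges quoted above, not measurements of any specific TV)
REFRESH_HZ = 60
frame_time_ms = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh at 60 Hz

ballpark_lag_ms = {
    "2015-era TV": 35,
    "2016-era TV": 25,
    "2017-era TV": 15,
    "gaming monitor": 5,
}

for name, lag in ballpark_lag_ms.items():
    frames = lag / frame_time_ms
    print(f"{name}: {lag} ms (~{frames:.1f} refreshes behind at {REFRESH_HZ} Hz)")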
 
I've had a 40" TV as a monitor for the last 3 years. Just can't go back. Any 4K screen smaller than 40" is dumb, but now, as the OP mentions, we're finally getting large screens with low input lag.

Maybe one day, they'll make a screen as good as a CRT in this regard, lol, as if...
 
I wouldn't call it low, but it has gotten lower. If they can get it to 10 ms or less, then I'd be more impressed.
 
Maybe one day, they'll make a screen as good as a CRT in this regard, lol, as if...

I think people forget the brilliance of the simple design that went into the old analog displays. It was all circuitry - no logic chips were needed. Oh sure, there was some digital signal processing, but it was mostly used to store saved settings for circuits like focus, convergence, and geometry. Digital displays added a whole new level of complexity to the design that slowed things down.

EDIT: Even CRT monitors with digital-to-analog converters thrown in the mix (Xbox One > HDFury > CRT Monitor, for example) are faster than the fastest digital displays, because the only thing the converter is doing is converting the digital signal into analog RGB, and the CRT takes care of the rest. The digital-to-analog conversion only adds a couple of milliseconds, I believe. /EDIT

I think eventually we'll get to the point where digital displays are fast enough, but that may still take a while.
 
That is great to hear. This is the reason I bought an NEC 46-inch monitor/TV meant for businesses. It had VERY, VERY low input lag.

Been eyeing a 4K TV. Maybe 2018 is the year I finally get one.
 
I think people forget the brilliance of the simple design that went into the old analog displays. It was all circuitry - no logic chips were needed. Oh sure, there was some digital signal processing, but it was mostly used to store saved settings for circuits like focus, convergence, and geometry. Digital displays added a whole new level of complexity to the design that slowed things down.

EDIT: Even CRT monitors with digital-to-analog converters thrown in the mix (Xbox One > HDFury > CRT Monitor, for example) are faster than the fastest digital displays, because the only thing the converter is doing is converting the digital signal into analog RGB, and the CRT takes care of the rest. The digital-to-analog conversion only adds a couple of milliseconds, I believe. /EDIT

I think eventually we'll get to the point where digital displays are fast enough, but that may still take a while.

They've been able to make low-lag panels for a good decade, no problem; the problem is that they choose not to bother, for obvious cost savings. As you noted, CRTs are inherently superior in this regard: cost-cutting a CRT screen could never increase input lag. For LCDs, however, it's a 'feature'.
 
They've been able to make low-lag panels for a good decade, no problem; the problem is that they choose not to bother, for obvious cost savings. As you noted, CRTs are inherently superior in this regard: cost-cutting a CRT screen could never increase input lag. For LCDs, however, it's a 'feature'.

Out of curiosity, what actually causes high input lag? I've always wondered if it's because TV manufacturers use software implementations instead of hardware ones (i.e., a general-purpose DSP programmed to handle the video-processing logic versus a purpose-built video-processing chip).
 
Probably due to overuse of processing, if motion interpolation is anything to go by, since it's a very heavy load and creates ridiculous input lag. They are most likely optimizing their processing better nowadays.
 
I picked up an open-box Samsung 65" KS8000 at Best Buy at a price I just could not pass up. It has an input lag of 22 ms, and I absolutely love it for gaming at 4K. I can't wait to see what 2018 brings for TV technology.
 
Out of curiosity, what actually causes high input lag? I've always wondered if it's because TV manufacturers use software implementations instead of hardware ones (i.e., a general-purpose DSP programmed to handle the video-processing logic versus a purpose-built video-processing chip).

Yes, it's all the picture processing that TVs do that causes lag. At 1080p and below things weren't too bad, but at 4K it's a much bigger problem, since the sheer number of pixels requires hefty hardware to do all this full-screen processing.

Typically LG are the worst for input lag. It's not even a consideration for them; their focus is squarely on image quality. Samsung/Sony tend to be the best for TVs, and were unbeatable back when they had a joint panel-making partnership. Sadly, those days are gone, along with the best LCD panel made to date, the mighty PLS!

My old Sony 40" 1080p screen has a PLS panel and it's tough to beat. I'm still waiting...
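
Just to put rough numbers on the "sheer number of pixels" point (plain arithmetic, nothing vendor-specific):

# Why full-screen processing gets expensive at 4K: pixel throughput per second
resolutions = {"1080p": (1920, 1080), "4K UHD": (3840, 2160)}
fps = 60

for name, (w, h) in resolutions.items():
    pixels = w * h
    throughput = pixels * fps
    print(f"{name}: {pixels / 1e6:.1f} MP per frame, {throughput / 1e6:.0f} MP per second at {fps} fps")

# 4K is exactly 4x the pixels of 1080p, so every per-pixel processing pass (scaling,
# noise reduction, motion interpolation, etc.) either costs 4x the work or needs
# hardware with 4x the throughput to finish within the same frame budget.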
 
Yes, it's all the picture processing that TVs do that causes lag. At 1080p and below things weren't too bad, but at 4K it's a much bigger problem, since the sheer number of pixels requires hefty hardware to do all this full-screen processing.

Typically LG are the worst for input lag. It's not even a consideration for them; their focus is squarely on image quality. Samsung/Sony tend to be the best for TVs, and were unbeatable back when they had a joint panel-making partnership. Sadly, those days are gone, along with the best LCD panel made to date, the mighty PLS!

My old Sony 40" 1080p screen has a PLS panel and it's tough to beat. I'm still waiting...
I posted the following in a previous thread; check out the info on the TCL P607 series, with 15 ms input lag and 4:4:4 chroma:

TCL P607 series;

http://www.rtings.com/tv/reviews/tcl/p607

http://www.avsforum.com/tcl-55-p-series-model-55p607-4k-hdr-roku-tv-with-dolby-vision-first-look/

http://referencehometheater.com/review/tcl-p-series-uhd-tv-review/

The 55" is available now, whereas the 50" and 65" should be available in the coming days or weeks.

Best Buy had the model # changed to the P605 to avoid price matching (it is the same TV).

It has a 72-zone local dimming backlight, deep native contrast, and low input lag (~15 ms) for both 4K and 1080p gaming, and both resolutions will be 4:4:4 on a PC.
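
For anyone wondering why 4:4:4 at 4K is worth calling out, here's a rough sketch of the raw pixel bandwidth involved (active pixels only; it ignores blanking intervals and link encoding overhead, so real HDMI figures are higher):

# Rough active-pixel bandwidth for 4K60 at different chroma subsampling levels
# (ignores blanking and link encoding overhead, so actual HDMI requirements are higher)
w, h, fps, bits_per_sample = 3840, 2160, 60, 8

samples_per_pixel = {
    "4:4:4": 3.0,   # full-resolution luma + both chroma channels
    "4:2:2": 2.0,   # chroma halved horizontally
    "4:2:0": 1.5,   # chroma quartered (typical for TV/video sources)
}

for mode, spp in samples_per_pixel.items():
    gbps = w * h * fps * spp * bits_per_sample / 1e9
    print(f"{mode}: ~{gbps:.1f} Gbit/s of active pixel data")

Full 4:4:4 is what matters for PC desktop use, since subsampled chroma makes small coloured text look smeared.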
 
G-Sync monitors have lower latency than a CRT if you are using V-Sync to avoid screen tearing.
I don't know how anyone finds screen tearing acceptable.

Good point! V-Sync can add a ton of lag depending on the game.
 
Yup. I recently bought a Sony CRT for arcade games, and it's laggier in MAME with V-Sync on than my G-Sync setup is.

Motion blur on LCDs still sucks cock, though.
 
G-Sync monitors have lower latency than a CRT if you are using V-Sync to avoid screen tearing.
I don't know how anyone finds screen tearing acceptable.

For me, the input latency from V-Sync fundamentally ruins games to the point that they're not worth playing. It's just that simple.


I tried a G-Sync screen but then discovered an inherent flaw and had to return it:

Basically, my go-to game is Guild Wars 2, a game where framerates in big events will be a choppy 30-40 fps on any PC. However, with a 60 Hz screen and V-Sync off, my mouse and UI are still smooth as butter, so whilst the game rendering is choppy, my experience isn't, as I'm able to quickly click on things around the screen at full speed.

Not so with G-Sync on. In this case, as the framerate fluctuates a lot, so too does your mouse movement! It was a massive shock to discover, and it made the game frustrating and annoying to play. It didn't take a minute to spot, and it's obvious from the tech when you think about it, and yet you don't read about it anywhere?

That's probably investor bias, imho... :whistle:


Anyway, I picked up a Sony Bravia KD55XD800 on Prime Day for £600! It arrives at the weekend; I'll report back on all the nitty-gritty. 30 ms input lag from what I hear, so I'll see how that goes...
 
For me, the input latency from V-Sync fundamentally ruins games to the point that they're not worth playing. It's just that simple.
I mean, there are ways of reducing the latency from V-Sync, like reducing the flip queue size to 1 and using a framerate limiter, but it's true that it is going to add latency since it has to buffer frames.
I just don't consider screen tearing an acceptable alternative.
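
For what it's worth, here's a toy sketch of the framerate-limiter idea: cap the render loop slightly below the refresh rate so the flip queue never fills up and V-Sync back-pressure stays low. Purely illustrative - in practice you'd use the game's built-in limiter or an external tool rather than anything like this:

import time

TARGET_FPS = 59.7                  # just under a 60 Hz refresh
FRAME_BUDGET = 1.0 / TARGET_FPS    # seconds allowed per frame

def render_frame():
    pass  # stand-in for the game's simulation + rendering work

for _ in range(600):               # run roughly ten seconds' worth of frames
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        # Wait out the rest of the frame budget so frames are never produced
        # faster than the display consumes them.
        time.sleep(FRAME_BUDGET - elapsed)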

I tried a G-Sync screen but then discovered an inherent flaw and had to return it:
Basically, my go-to game is Guild Wars 2, a game where framerates in big events will be a choppy 30-40 fps on any PC. However, with a 60 Hz screen and V-Sync off, my mouse and UI are still smooth as butter, so whilst the game rendering is choppy, my experience isn't, as I'm able to quickly click on things around the screen at full speed.
Not so with G-Sync on. In this case, as the framerate fluctuates a lot, so too does your mouse movement! It was a massive shock to discover, and it made the game frustrating and annoying to play. It didn't take a minute to spot, and it's obvious from the tech when you think about it, and yet you don't read about it anywhere?
It depends on the type of games that you play.
Games which use a mouse cursor and raw input are a special case, because raw input renders the cursor at your refresh rate.
Since G-Sync synchronizes the refresh rate to the framerate, if the framerate is bad the mouse input will be bad too.
I would disable G-Sync for that one game, instead of avoiding G-Sync altogether, since G-Sync monitors typically still use high-refresh-rate panels.

I'm not sure how feasible it would be, but a "High refresh G-Sync" mode might be the solution to something like that.
So if your game is running at 30-40 FPS on a 240 Hz panel, it would update the screen at, say, 5x the framerate (150-200 Hz) to keep the raw mouse input updating quickly, instead of at 30-40 Hz.
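
A rough sketch of what that logic could look like - to be clear, this isn't how any actual driver works, it's just the arithmetic of picking the largest whole multiple of the game's framerate that still fits under an assumed panel maximum:

PANEL_MAX_HZ = 240  # assumed panel maximum refresh rate

def refresh_for(fps):
    """Pick the largest whole multiple of the framerate that fits under the panel max."""
    multiple = max(1, int(PANEL_MAX_HZ // fps))
    return fps * multiple, multiple

for fps in (30, 40, 60, 144):
    hz, n = refresh_for(fps)
    print(f"{fps} fps -> refresh at {hz} Hz ({n}x), so a raw-input cursor updates at {hz} Hz")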

Anyway, I picked up a Sony Bravia KD55XD800 on Prime Day for £600! It arrives at the weekend; I'll report back on all the nitty-gritty. 30 ms input lag from what I hear, so I'll see how that goes...
You'll probably hate it, since that's almost as much lag as V-Sync adds to a CRT.
 
Not so with G-Sync on. In this case, as the framerate fluctuates a lot, so too does your mouse movement! It was a massive shock to discover, and it made the game frustrating and annoying to play. It didn't take a minute to spot, and it's obvious from the tech when you think about it, and yet you don't read about it anywhere?

That's probably investor bias, imho... :whistle:

Surprises me too; Nvidia issues always seem to get a pass.

http://www.overclock.net/t/1597089/...o-99-so-vsync-never-kicks-on/10#post_25094298
 
Basically, my go-to game is Guild Wars 2, a game where framerates in big events will be a choppy 30-40 fps on any PC. However, with a 60 Hz screen and V-Sync off, my mouse and UI are still smooth as butter, so whilst the game rendering is choppy, my experience isn't, as I'm able to quickly click on things around the screen at full speed.

Not so with G-Sync on. In this case, as the framerate fluctuates a lot, so too does your mouse movement! It was a massive shock to discover, and it made the game frustrating and annoying to play. It didn't take a minute to spot, and it's obvious from the tech when you think about it, and yet you don't read about it anywhere?

That makes no sense; even in that kind of situation you'll still have a better experience with G-Sync. I play a lot of games at 30-40 fps and it's way better than with V-Sync off on a 60 Hz (or higher) screen. I have played GW2 as well as other MMOs, so I have first-hand experience with this exact scenario. It's smoother, responsive (no one will ever notice the extra 2-3 ms it might have vs. V-Sync off), and of course tearing-free. You might want to read some in-depth testing and explanations, such as http://www.blurbusters.com/gsync/gsync101-range/
Of course, with G-Sync you'll become much more sensitive to any smoothness/lag issue that was always there but just wasn't noticeable before.
 
That makes no sense; even in that kind of situation you'll still have a better experience with G-Sync. I play a lot of games at 30-40 fps and it's way better than with V-Sync off on a 60 Hz (or higher) screen. I have played GW2 as well as other MMOs, so I have first-hand experience with this exact scenario. It's smoother, responsive (no one will ever notice the extra 2-3 ms it might have vs. V-Sync off), and of course tearing-free. You might want to read some in-depth testing and explanations, such as http://www.blurbusters.com/gsync/gsync101-range/
Of course, with G-Sync you'll become much more sensitive to any smoothness/lag issue that was always there but just wasn't noticeable before.
It's because it's a cursor-based game, and raw mouse input renders the cursor at the refresh rate instead of the game's framerate.
So on a 240Hz display with G-Sync disabled, the cursor updates at 240 FPS regardless of framerate.
When you enable G-Sync, the refresh rate is synchronized with the framerate, so if the game drops to 30 FPS, the cursor also updates at 30 FPS.
The solution is to disable G-Sync for that game's profile, not to ditch G-Sync displays - unless that's the only thing you play.
 