Christopher Nolan Declares War on Motion Smoothing, Other Crappy TV Settings

Can they just spearhead getting rid of crappy 60Hz input PCBs? Manufacturers clearly misrepresent the way LCDs work. 120Hz shouldn't be a thing unless the set accepts a signal from a device at that rate. It's marketing BS and a racketeering charge waiting to happen.
There are many native 120Hz panels out there. I have two, an OLED and a QLED (although to your point, the QLED claims 240Hz lol). If you think you are getting 120Hz on a budget panel, well, you deserve what you get.
 
I like the '120Hz' feature on my LG 3D TV. It says 120 on the box, but doesn't actually have the feature. (Also, I totally watched two movies in 3D with the glasses, but it was cheap, so.)

It would be nice if some day soon we could avoid 3:2 pulldown for 24fps sources. Thankfully I usually can't tell, but every once in a while I watch something where the judder is annoying. But that requires real 120Hz or variable refresh, both of which cost money.
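As a toy illustration of the pulldown issue mentioned above (my own sketch, not from any TV's firmware): fitting 24 film frames into 60 display refreshes forces an alternating 3-then-2 repeat cadence, so frames stay on screen for unequal times, which is exactly the judder being complained about.

```python
# Hypothetical sketch of 3:2 pulldown: mapping 24 fps film frames
# onto a 60 Hz refresh sequence. Odd/even frames get unequal screen
# time (3 vs 2 refreshes), which is the source of judder.

def pulldown_32(film_frames):
    """Repeat frames in a 3:2 cadence so 24 fps fills a 60 Hz display."""
    out = []
    for i, frame in enumerate(film_frames):
        repeats = 3 if i % 2 == 0 else 2
        out.extend([frame] * repeats)
    return out

one_second = pulldown_32(list(range(24)))  # one second of film
print(len(one_second))        # 60 refreshes
print(one_second[:5])         # [0, 0, 0, 1, 1] -- frame 0 shown 3x, frame 1 shown 2x
```

A true 120Hz or variable-refresh display avoids this because 120 divides evenly by 24, so every frame can be shown for the same duration.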
 
i-do-declare.jpg


& agree.
 
I don't know what Netflix and Prime do to their video files, but most of it looks just fine on a 60Hz PC monitor.
 
Proper frame interpolation is an extremely intensive process, so whatever these TVs are doing is certainly a cheap hack.

And it really shows: the interpolation fluctuates wildly. Sometimes it hits the native 52Hz of my panel, sometimes it doesn't. Utter garbage.
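To make the "cheap hack" point concrete, here is a minimal sketch (my own toy example, not what any TV actually runs) of the cheapest possible form of motion smoothing: inserting a frame that is just the average of its neighbours. Real interpolators estimate per-pixel motion vectors, which is vastly more expensive, and that estimation is where the artifacts come from.

```python
# Toy frame interpolation: double the frame rate by inserting a
# blended (pixel-averaged) frame between each pair of real frames.
# Frames are modeled as flat lists of 8-bit pixel values.

def blend_midframe(frame_a, frame_b):
    """A frame halfway between two frames, by averaging pixel values."""
    return [(a + b) // 2 for a, b in zip(frame_a, frame_b)]

def interpolate_2x(frames):
    """Insert one blended frame between each pair of input frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(blend_midframe(a, b))
    out.append(frames[-1])
    return out

# Two 1x4 'frames': a dark one and a bright one.
print(interpolate_2x([[0, 0, 0, 0], [200, 200, 200, 200]]))
# [[0, 0, 0, 0], [100, 100, 100, 100], [200, 200, 200, 200]]
```

Even this naive version has to touch every pixel of every frame; motion-compensated interpolation additionally searches for matching blocks between frames, which is why doing it well in real time is genuinely hard.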
 
Actual 60Hz content? Great.
Motion interpolation? Seems like I should be having a seizure from the stuttering mess on screen. It boggles my mind how people can't see it, but it really does look like a stuttering slideshow. There's nothing "fluid" about it.

A friend of mine once told me about some media player plug-in that used the GPU to handle it on a PC... Uninstalled that shit after about 30 seconds.

This almost reminds me of when widescreen TVs first became common and people would always stretch the damn image on 4:3 content. Even with a car commercial on screen, holding up a dinner plate to show that the wheels were oval because of the stretching, some people just swore up and down they couldn't see it.
 
Actual 60Hz content? Great.
Motion interpolation? Seems like I should be having a seizure from the stuttering mess on screen. It boggles my mind how people can't see it, but it really does look like a stuttering slideshow. There's nothing "fluid" about it.

A friend of mine once told me about some media player plug-in that used the GPU to handle it on a PC... Uninstalled that shit after about 30 seconds.

This almost reminds me of when widescreen TVs first became common and people would always stretch the damn image on 4:3 content. Even with a car commercial on screen, holding up a dinner plate to show that the wheels were oval because of the stretching, some people just swore up and down they couldn't see it.

Funny, my parents didn't like stretching because of this reason. Instead they zoomed in, cutting off the top and bottom of the screen. Trying to watch it and knowing you were missing shit was hilarious.
 
I've always thought part of the reason these TVs have motion interpolation settings is that they never put proper 24Hz support into them, so to avoid frame judder they make it look fake instead. (Yes, I know there are other "benefits".)
 
The motion interpolation on my Sony XBR 65X900E is pretty cool. However, you have to reduce the display refresh rate or else it tries to interpolate too many frames and causes artifacts.

It's really good for some content. I enjoy the effect when watching the TV show Eureka, because that show is naturally fun and goofy. It's also awesome for nature documentaries (e.g., Planet Earth).

But for some movies or TV shows I turn it off. It makes certain types of movies look unrealistic, or perhaps too playful. For instance, I prefer watching Fincher's films at their native 24Hz.

There's seldom such a thing as universally bad or universally good. All y'all who only prefer 24Hz are old farts. Framerate (including interpolation) should depend on the type of content being shown.
 
Funny, my parents didn't like stretching because of this reason. Instead they zoomed in, cutting off the top and bottom of the screen. Trying to watch it and knowing you were missing shit was hilarious.
Yeah, those people existed too. For whatever reason, letterboxes on a 4:3 TV didn't bother them (or at least they'd never say anything), but put boxes on the sides and people just went dumb.

About the stretch thing, though, I wasn't talking about "old people", rather people in their late teens and early 20s at the time.

Getting back on topic with that nasty motion smoothing junk... I've seen people aged 15-70 who claim it looks fine or somehow "better". Even trying to watch sports with any sort of motion interpolation just looks like a mess.
 
People who actually like it or cannot tell the difference are probably the same group of tasteless imbeciles who like pineapple on pizza.
 
I do partially agree that many settings should be off by default. Let the user decide; I adjust my devices according to the content being displayed. How about they make 24Hz TVs? They could market them as director's edition displays. I'm sure there's a market for them, and people like Nolan (and their egos) could add their signatures to them for certification.

I do, however, have issues with the notion of 24fps filmmaking. At this point it's a totally nostalgic throwback to how film started over a century ago and has no real need or relevance now. 24 frames looks like crap once you've become accustomed to 60 or higher. It's not art, just lazy. I know there's a physical limitation for film in that 60 frames would require 2.5x as much film stock to shoot a movie, so obviously there's a financial issue there as well as physical storage. On the digital side, it's also obvious from the number of 2K DIs that most companies are not even close to 8K digital mastering, let alone 4K/60fps, so the technology, while available, isn't affordable or being adopted en masse by the makers. Thankfully, companies like Seagate/WD have done amazing jobs with huge platters at increased speeds, so on that side of things storage is pretty affordable for even the most modest of filmmaking budgets.
 
As lostin3d said above, 24fps is mostly nostalgia. Switching to high resolution and high frame rate requires every part of production to up its game: makeup and prosthetics become more obvious, set design and details become more important, and mistakes show up much more easily.

As an example, the first time I watched The Walking Dead on a 4K screen I could see which actors wore contact lenses, the moulage looked even more plastic, and some set elements that were supposed to be cement were obviously painted plywood.

What I really want, though, is for filmmakers to stop shooting action handheld and at least go for a Steadicam mount. Shakycam sucks; when I run, the whole world doesn't shake violently, but apparently it does in movies.
 
Maybe if filmmakers started shooting in 120fps we wouldn't have this problem. Death to 24fps bullshit.
 
If only Nolan would pay as much attention to audio as he does to video. The clipped 5.1 audio mess in his movies is unacceptable; Dunkirk, for example.
 
I don't know what Netflix and Prime do to their video files, but most of it looks just fine on a 60Hz PC monitor.
From my experience so far on my 4th-gen Apple TV, all the movies and shows I have watched on Netflix play at their native framerate. I wasn't able to experience this until earlier this year, when Apple finally released an update allowing the 4th-gen Apple TVs to output native framerates on TVs that support it (the Apple TV used to convert everything to a 60Hz format). Now, whenever a show or movie comes on, my TV's display output changes according to its framerate. So far I have seen 24Hz, 25Hz, 50Hz, and 60Hz content on Netflix with my TV (Samsung 51" F8500 plasma). While the Apple TV did a pretty decent job doing its own pulldown conversions when it forced everything to 60Hz, it wasn't perfect. Now, watching everything in its native output format, the experience is even better and the motion seems more natural.
 
I despise it, but mainly because it no longer looks good with video games. There was a time when it made console games fake 60fps and actually looked okay doing it, back in the Xbox 360 era. Now it looks like complete trash. I'm assuming it has something to do with frame timing, or games interpolating just to hit 30fps. That, and the input lag has gone through the roof with it on.

For movies and TV it has always looked and worked horribly.
 
There are many native 120Hz panels out there. I have two, an OLED and a QLED (although to your point, the QLED claims 240Hz lol). If you think you are getting 120Hz on a budget panel, well, you deserve what you get.

Agreed where price is concerned. However, the panel itself can be 120Hz on many TVs while cheaper input PCBs and gimped HDMI specs mean the set can only accept a 60Hz native input signal, even on expensive high-end HDTVs. You almost have to do a ton of research, and in some cases use wonky workarounds, to make it work. Currently there are very few HDTVs that will do it; Samsung's QLED 120Hz FreeSync input is only on select models. Honestly, OLED for the price absolutely should accept anything you can throw at it, but a lot of the early models were gimped to keep them affordable, and they cut corners where they could. Ironically, I found a few that can support 1080p 120Hz but can't or won't accept QHD 120Hz for some odd reason.

Now that the consoles support FreeSync and can output higher 1080p refresh rates, I can see some manufacturers catering to that market. The biggest pitfall to watch out for is chroma subsampling: on current standards, 120Hz 4K isn't really possible without DisplayPort unless you lower the chroma subsampling to 4:2:2 or even 4:2:0 on HDMI 2.0b, at least until HDMI 2.1 is widely adopted.
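A rough back-of-envelope check of why 4K120 forces chroma subsampling on HDMI 2.0 (my own approximation: it ignores blanking intervals and treats HDMI 2.0's usable payload as roughly 14.4 Gbit/s, since the 18 Gbit/s headline figure includes 8b/10b coding overhead):

```python
# Approximate uncompressed video data rate vs. the HDMI 2.0 payload.
# 4:4:4 8-bit = 24 bits/pixel; 4:2:0 8-bit = 12 bits/pixel.

def data_rate_gbps(width, height, fps, bits_per_pixel):
    """Raw pixel data rate in Gbit/s, ignoring blanking."""
    return width * height * fps * bits_per_pixel / 1e9

HDMI20_PAYLOAD = 14.4  # Gbit/s, approximate usable rate after coding overhead

full = data_rate_gbps(3840, 2160, 120, 24)  # 4K120 at 4:4:4, 8-bit
sub  = data_rate_gbps(3840, 2160, 120, 12)  # 4K120 at 4:2:0, 8-bit

print(round(full, 1), full <= HDMI20_PAYLOAD)  # ~23.9 Gbit/s -> doesn't fit
print(round(sub, 1),  sub <= HDMI20_PAYLOAD)   # ~11.9 Gbit/s -> fits
```

So full-chroma 4K120 blows well past the link budget, while halving the chroma data at 4:2:0 squeaks under it, which matches the tradeoff described above; HDMI 2.1's much higher rates remove the constraint.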

When I bought my HDTV I had them hook up a laptop I brought, to see over HDMI what the set was capable of; sadly, DisplayPort still isn't adopted that often. I ended up buying a 49-inch QLED HDR Samsung 4K for $800 direct (personally the smallest I could buy; I don't like oversized TVs, it defeats the purpose of higher resolutions). So you can buy something decent for a reasonable price, but anything under the $1k range is typically not a great idea unless you are on a hard budget. Personally, I'd rather spend $1k on a monitor.
 
That, and the overzealous sharpening that causes fringing, plus the heavy noise reduction.
 
Auto-dimming is another annoying feature. My TV has it, but you can't turn it off. So whenever a dark scene shows up, it becomes so dark you can't see anything.
 
Some TVs can auto-detect the content and turn it on and off accordingly. My father-in-law's newish 4K TV does this, so I don't bother with it much; it's on for sports and off for Netflix. I also find it less offensive than early versions of the effect (like on my own TV).
 
My biggest problem with "smooth motion" is that it's only "smooth" for short bursts, then you get stuttering and weird framerate inconsistency. Very distracting.
 
I recently switched from a DLP projector to a Sony 4K LCD TV and had to enable the motion smoothing. With the motion smoothing off any high-contrast edges would appear to strobe/stutter/wobble their way across the screen, very distracting. Upon further inspection I can see the same effect on smaller LCD screens but I've never had an LCD as big and high contrast as the TV. With the motion interpolation turned on slightly the effect is almost eliminated without significantly affecting the feel of the content. Turn it up further and it of course turns into a "too smooth" mess.

I never noticed on the projector but I suspect the projector's color wheel provides enough of a strobe effect to trick my eyes.
 
I'd be happy if the pre-fab "cinema" mode most TVs have turned this shit off, kind of like how game mode disables a lot of the TV's post-processing.
 
The problem is that most people are not videophiles and don't care about getting the best image and sound quality... people are happy watching movies on their iPhones and $300 TVs, and streaming instead of UHD or standard Blu-rays... they also use crappy built-in TV speakers. I currently have an LG C7 OLED and always disable all those crappy motion settings, etc. You should also try to get your display professionally calibrated.
 
Auto-dimming is another annoying feature. My TV has it, but you can't turn it off. So whenever a dark scene shows up, it becomes so dark you can't see anything.
I'm surprised you don't have the option to turn it off. I turned off the local dimming on our TV because it doesn't have enough zones to pull it off properly. If your TV only has a single zone, it's no surprise it doesn't work well.
 
I do partially agree that many settings should be off by default. Let the user decide; I adjust my devices according to the content being displayed. How about they make 24Hz TVs? They could market them as director's edition displays. I'm sure there's a market for them, and people like Nolan (and their egos) could add their signatures to them for certification.

Plenty of TVs do "proper" 24p these days by just repeating each frame 5x. You can't really drive a display at 24Hz, as it flickers too much; even 48Hz is unwatchable. I was last into high-end A/V in the plasma days, and people generally considered Panasonic plasmas fantastic except for their shitty 24p implementation. I still have my Pioneer Kuro, which tripled 24p to 72Hz and still has the best motion resolution with 24p I have ever seen. LCD and OLED have too much motion blur without gimmicks like frame interpolation.
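The frame-repeating trick described above comes down to simple arithmetic (a quick sketch of my own): a display handles 24p cleanly only when its refresh rate is an integer multiple of 24, so every film frame gets equal screen time.

```python
# Which refresh rates give an even 24p cadence? An even cadence
# requires refresh_hz to be an integer multiple of the source fps.

def cadence(refresh_hz, fps=24):
    """Frames-per-refresh multiple, or None if the cadence is uneven."""
    ratio = refresh_hz / fps
    return ratio if ratio == int(ratio) else None

for hz in (60, 72, 120):
    print(hz, cadence(hz))
# 60 -> None (2.5x, needs 3:2 pulldown)
# 72 -> 3.0  (the Kuro's frame tripling)
# 120 -> 5.0 (each frame repeated 5x)
```

This is why the 72Hz Kuro and true 120Hz panels show 24p without judder, while a 60Hz set is stuck with uneven pulldown.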
 
24p is going to be the go-to standard for a long time. Most people prefer it... and I would say the majority of people making movies prefer it. Nolan is hardly alone.

Peter Jackson is the only director who has really bucked the establishment and tried shooting a few movies at 48p... and it was generally panned. People were not fans of the 48p showings of The Hobbit; same complaint, it looked like TV.

The only thing I could really see working would be some sort of on-demand higher frame rate tech, sort of like how Nolan himself blends IMAX footage with different aspect ratios into many of his films. I think a tech that runs 24p as standard with a flag that can bump specific scenes up to 48p would work really well. It would give directors like Nolan another tool for when they have, say, a car chase where they want an ultra-smooth Matrix-like effect on purpose, and then drop back to 24p.

I was one of the lucky ones who got to see The Hobbit at 48fps in 3D. It was the most amazing movie-watching experience I've ever had. For the first time ever, 3D didn't give me a headache.
 
OK Nolan, why don't you start filming at 48/72/120fps instead of 24fps, which was only chosen because it was the absolute minimum frame rate needed to record audio on real film media? How about we stop using 3-5K cameras and start introducing film grain again, which limits frame-to-frame resolution back down to 600p.
 