This one’s a real question, aimed specifically at readers who:
- Have LCD HDTVs with 120Hz or 240Hz refresh rate and the option of interpolating new frames (which can go by any number of “smooth” or “blur-resisting” or similar names).
- Actually watch HDTV movies, preferably from Blu-ray but maybe even upconverted from DVD.
Here are the questions:
- Do you use the frame-interpolation option?
- If you’re a movie buff, do you find that its “video-like” look harms your appreciation of the movie?
Here’s the thing. We don’t have an HDTV yet. When we get one, it will almost certainly be an LED-backlit LCD model, which also means it will almost certainly have at least a real 120Hz refresh rate (and either a 240Hz rate or the “pseudo-240” fast-switching backlighting option).
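For the curious, here’s a rough sketch (in Python, purely for illustration) of what that interpolation option is doing in principle: 24 film frames per second have to fill 120 panel refreshes per second, so the set either repeats each frame five times (leaving the judder alone) or manufactures in-between frames. Real TVs use motion-estimated interpolation, not the simple cross-fade shown here, so treat this as a toy model of where the extra frames come from, not how any actual set works.

```python
# Toy model of 24 fps film on a 120 Hz panel: repeat frames (judder kept)
# versus inserting blended in-between frames ("smoothing").
import numpy as np

FILM_FPS = 24
PANEL_HZ = 120
STEP = PANEL_HZ // FILM_FPS  # 5 panel refreshes per film frame

def repeat_frames(frames):
    """No smoothing: show each film frame 5 times; the original judder stays."""
    return [f for f in frames for _ in range(STEP)]

def interpolate_frames(frames):
    """'Smoothing': insert blended in-between frames so motion looks more even."""
    out = []
    for a, b in zip(frames, frames[1:]):
        for i in range(STEP):
            w = i / STEP                     # weights 0, 0.2, 0.4, 0.6, 0.8
            out.append((1 - w) * a + w * b)  # simple blend of neighboring frames
    out.extend([frames[-1]] * STEP)          # hold the final frame
    return out

if __name__ == "__main__":
    # Two dummy 2x2 grayscale "frames" stand in for real video frames.
    frames = [np.zeros((2, 2)), np.ones((2, 2))]
    print(len(repeat_frames(frames)), "panel refreshes without smoothing")
    print(len(interpolate_frames(frames)), "panel refreshes with smoothing")
```

Either way the panel gets 120 images per second; the argument is entirely about whether those extra images should be copies of the film frames or invented in-betweens.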
The home theater and A/V magazines I read mostly have reviewers who believe that the judder in film (the fact that, at 24 frames per second, film motion isn’t actually smooth; you’re perceiving that choppiness, at least subconsciously) is part of what makes it film: that smoothing out the judder by adding interpolated frames somehow damages the flick, turning it into video.
That’s not a universal view–and I’m less than fully convinced that every director and director of photography really *wants* a flickery movie. Sometimes, yes–I believe that Woody Allen’s b&w films are probably intended to be seen with all the flicker of the original. But many times, I suspect, the director and DP deal with what’s feasible. I feel the same way about the notion that, in all pictures (as opposed to certain stylized pictures), the grain of the film should be visible.
So: How about you? I haven’t actually had the chance to make the decision yet. If (when) I do, I’ll try it both ways on a variety of flicks…but I’d love to gain the experience of those who’re already there.
Oh: If you’re thinking of giving me a sermon on how the creator’s work must be honored, don’t bother–unless you can prove to me that all those directors viewed judder as a positive, not simply the reality of film-based moviemaking. Just say “I would never use the smoothing feature on movies” and let it go at that.
Hi Walt: I put your question to two people, and here are their answers (with a note in parentheses on who each person is):
Person 1 (My dad, who owns an LCD HDTV and is very particular about image quality. He watches quite a few movies, but is mostly a sports buff.):
“I don’t really notice the absence of jitter in old movies now – which I like to watch. That said, I did notice it the first time I watched an older movie that was typically jittery. At first, I wasn’t sure what was different, but then I figured it out. I don’t miss it. I find jitter really annoying – not artistic.”
Person 2 (My co-worker, who runs our library’s Film Theatre. She has worked in the industry for 20 years and has directed films. Our theatre has HD.):
“Flicker can be artistic – but most directors will build it into the movie and not rely on your technology to produce it. I certainly don’t when I direct. I tend to leave smoothing turned on – but I will turn it off in some cases for personal viewing of art house movies.”
Hope this helps.
GeekChic: A great response–thanks!
More responses from others?