Originally Posted by Dan Hitchman
This discussion of 48 fps filmmaking giving the illusion of everything being sped up gives me pause.
Look closely at regular TV: it's ~30 fps in the U.S., and motion in sports and the news looks normal, not sped up at all. I wonder if some of these motion artifacts are due to how current digital cinema projectors handle the 48 fps signal from the digital files. Could some be unintentionally applying a kind of frame interpolation rather than preserving the native 48 fps data? I ask because 24 fps movies actually do look sped up with motion interpolation on TV sets that have this feature, because the set is adding frames that were never there in the first place. That's different from simply doubling or tripling frames to reduce flicker. Digital projectors, after all, are just giant TVs.
If a digital projection system is handling the 48 fps cadence correctly, motion should look smoother than native 24 fps film and yet not sped up. Remember, too, that 48 fps support was added to many of these projectors' firmware as an afterthought; they weren't designed with HFR in mind. Could this be the culprit?
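For reference, the two techniques mentioned above are easy to tell apart in code. A minimal Python sketch, purely illustrative (toy numbers standing in for whole frames; a real interpolator estimates motion between frames rather than simply averaging them):

def double_frames(frames):
    # Flicker reduction: show each original frame twice.
    # No new image content is created.
    out = []
    for f in frames:
        out.extend([f, f])
    return out

def interpolate_frames(frames):
    # Motion interpolation: synthesize an in-between frame for each
    # pair of neighbors (here, a naive average of the two).
    out = []
    for a, b in zip(frames, frames[1:]):
        out.extend([a, (a + b) / 2])
    out.append(frames[-1])
    return out

frames = [0, 10, 20, 30]           # toy values standing in for images
print(double_frames(frames))       # [0, 0, 10, 10, 20, 20, 30, 30]
print(interpolate_frames(frames))  # [0, 5.0, 10, 15.0, 20, 25.0, 30]

Doubling just repeats what was shot; interpolation invents frames that were never shot, which is where the 'soap opera' look comes from.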
If you stopped and thought about it, you'd see the flaw in the logic/premise above.
Whether it's native 48fps OR creative frame interpolation, NEITHER one actually speeds up the motion. Here's a simple way to think of it:
If there's an actor running across a 100-unit-wide screen (feet, meters, or inches; the unit doesn't matter) and it takes them 2 seconds to traverse the width of the screen, they're traveling 50 units per second, and with a standard 24fps movie, that takes 48 frames.
Whether it's playing at 48fps OR with FI, it may take 96 frames or even more to display that scene, BUT the actor still takes exactly the same 2 seconds to traverse the screen, and they're STILL traveling across it at 50 units per second. There are just more frames in that 2-second span.
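To make that arithmetic concrete, here's a quick Python sketch of the same scene (the 120fps row is just a hypothetical interpolated output rate):

# An actor crosses a 100-unit screen in 2 seconds. Only the number of
# frames capturing that motion changes with frame rate; the speed doesn't.
screen_width = 100.0   # units (feet, meters, inches; doesn't matter)
crossing_time = 2.0    # seconds

for fps in (24, 48, 120):
    frames = fps * crossing_time          # frames spent on the crossing
    speed = screen_width / crossing_time  # on-screen speed, units/sec
    step = screen_width / frames          # distance moved per frame
    print(f"{fps:>3} fps: {frames:.0f} frames, {speed:.0f} units/sec, "
          f"{step:.2f} units per frame")

# Output:
#  24 fps: 48 frames, 50 units/sec, 2.08 units per frame
#  48 fps: 96 frames, 50 units/sec, 1.04 units per frame
# 120 fps: 240 frames, 50 units/sec, 0.42 units per frame

More frames means smaller steps between frames (hence smoother motion and less judder), but the speed across the screen never changes.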
Apparently, the increase in frames (and the reduced blur?) causes some people's minds to interpret what they see as 'sped up'. When I FIRST tried Frame Interpolation years ago, for the first 3-5 minutes my mind told me what I was seeing was 'weird' and different from what I was used to seeing and therefore expecting. I could definitely see where the term 'Soap Opera Effect' came from, as the movie went from the typical 24fps film look I'd grown accustomed to, to looking like a shot-on-video 'The Making Of...' documentary about the movie.
For me, though, it only took about 5 minutes to adjust to the different look and truly appreciate the huge difference in smoothness vs. judder (and motion blur), and I haven't looked back since.
Having been used to FI for years, and having watched both the HFR and 24fps versions of The Hobbit, I had no adaptation problems with either, and as mentioned, there was no sped-up motion; one was just far clearer and smoother. Besides, logically speaking, if anything were actually sped up, the movie's running time would shorten. It doesn't. It's just how some minds interpret the difference in what they're viewing.
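That running-time point is the same arithmetic again. With a made-up frame count for a hypothetical 162-minute feature: runtime = total frames / fps, and HFR doubles both.

frames_24 = 24 * 9720        # a hypothetical 162-minute feature at 24fps
frames_48 = frames_24 * 2    # the same feature at 48fps: twice the frames
print(frames_24 / 24 / 60)   # 162.0 minutes
print(frames_48 / 48 / 60)   # 162.0 minutes -- identical running time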