Originally Posted by Joe Bloggs
backlight scanning it will strobe/flicker.
Yes, if it's below the flicker fusion threshold, so it will always remain a problem for 60Hz sources. (Which is not applicable to my computer-specific scanning backlight project, which will always run above 60Hz native -- unless you're one of the people sensitive to 120Hz flicker, which some people indeed are.)
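For anyone who wants to play with the numbers, here's a minimal back-of-the-envelope sketch (Python). It assumes the simple model that display-forced motion blur is roughly eye-tracking speed times persistence, and it assumes a nominal flicker fusion threshold of ~75Hz; none of these specific numbers come from the discussion above, and flicker sensitivity varies a lot per person.

[code]
# Minimal sketch, assuming blur ~ tracking speed x persistence.
FLICKER_FUSION_HZ = 75.0  # assumed nominal value; individual sensitivity differs

def sample_and_hold_display(refresh_hz, tracking_speed_px_per_s):
    """Sample-and-hold: the frame persists for the whole refresh period."""
    persistence_s = 1.0 / refresh_hz
    return tracking_speed_px_per_s * persistence_s  # blur width in pixels

def strobed_display(refresh_hz, strobe_pulse_ms, tracking_speed_px_per_s):
    """Strobed/scanning backlight: the frame is only visible during the pulse."""
    persistence_s = strobe_pulse_ms / 1000.0
    blur_px = tracking_speed_px_per_s * persistence_s
    flicker_visible = refresh_hz < FLICKER_FUSION_HZ
    return blur_px, flicker_visible

# Example: eye-tracking a 960 px/s pan
print(sample_and_hold_display(60, 960))   # ~16 px of display-forced blur
print(strobed_display(60, 2.0, 960))      # ~2 px of blur, but 60Hz strobing may be visible
print(strobed_display(120, 2.0, 960))     # ~2 px of blur, above the assumed threshold
[/code]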
Here's what Wikipedia says about blurring in the frame rate section:
I know proper motion blur isn't the same as blur from the display, but Wikipedia still makes a valid point (though I don't think it's totally correct, especially the first sentence, since they can look more fluid than 24fps film with a 180-degree shutter). But a bit of blur (when the frame/display rate isn't high enough to go without any, short of strobing) will allow for more realistic, smoother-looking motion, including in games.
Yes, but keep in mind that it depends on the context of what is "natural". Natural motion blur is motion blur that the eyes do themselves; not the display doing it for you, unless it's for a good reason intended by the situation, etc. There are many situations, especially video games and computer use, where you want to let the eyes add the motion blur for you instead. Real life has no frames per second (you could technically consider it infinite frames per second), so there's no motion blur except what the human brain adds for you. That's 100% natural. So I would posit: don't let the display force blur upon you (e.g. by a display limitation such as LCD) in the specific *situations* that warrant it -- e.g. first-person 3D shooter games, scrolling/parallax games (Nintendo-style motion, including nostalgic and modern variants), etc.
For games you don't need artificial GPU motion blur unless it's for dramatic effect (fortunately, it is often a setting that can be turned off). Sometimes it's only there to help make motion look more natural in low-framerate situations, but it's inappropriate for many other situations when your system is powerful enough. Yes, GPU motion blur can benefit 30fps@60Hz situations by lessening the double-image effect. Other times it's artistic, reserved for certain moments of drama, like motion blur that only shows up in superspeed mode or when you're injured in a video game. That's a fun use of software-controlled motion blur. By eliminating motion blur from the display, complete control over motion blur is left to the source (software control; video game director intent) and the destination (the human vision system and how it adds blur for you), taking blur out of the display equation and eliminating the display as the weak link between director intent and user intent. Of course, it's fair to let the display be configurable for motion blur (as you say) via the various interpolation/scanning modes.
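To make the GPU motion blur point concrete, here's a rough illustrative sketch (Python) of accumulation-style motion blur: averaging several renders taken at sub-frame times so a fast-moving object smears across its motion path instead of appearing as two sharp copies at 30fps@60Hz. The 1D "scanline" renderer and the sample count are invented purely for illustration; real engines typically do this as a post-process using per-pixel velocity, but the averaging idea is the same.

[code]
# Illustrative sketch only: a 1-pixel object moving across a 1D "scanline".
def render(scanline_width, object_x):
    """Render a single bright pixel on an otherwise black scanline."""
    frame = [0.0] * scanline_width
    frame[int(object_x) % scanline_width] = 1.0
    return frame

def motion_blurred_frame(scanline_width, x_start, x_end, samples=8):
    """Average several renders taken at sub-frame times between x_start and x_end."""
    accum = [0.0] * scanline_width
    for i in range(samples):
        t = i / (samples - 1)                 # 0..1 across the frame's exposure
        x = x_start + t * (x_end - x_start)   # object position at this sub-sample
        for px, value in enumerate(render(scanline_width, x)):
            accum[px] += value / samples
    return accum

# A 30fps frame shown twice at 60Hz covers twice the motion per frame;
# blurring across that span smears the object instead of showing two sharp copies.
print(motion_blurred_frame(16, 2, 6))
[/code]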
Also, the intent can depend on the director.
Some directors intend film to have the "24fps" look, with built-in motion blur.
Other directors intend the fast live-action look (e.g. Red Bull air races, soccer, NASCAR, etc.).
So it all depends on what the intent is. Etc.
Also, as Einstein says, it's all relative -- what defines "natural"? Natural real life has no frames per second. For motion, the concept of "frames" and "frame rate" is an artificial invention, dating from the zoetropes of the mid 19th century and the first film experiments of the late 1880s and early 1890s. As of right now, there's no practical way to convey motion without the artificial invention of "frames per second"; no practical medium in the last 150 years gives you infinite frames per second (which would eliminate the artificial invention of frame rate). Motion blur added by the display, for you, is also artificial -- there is certain material (and there are situations) where I prefer to let the human vision system do all the motion blur, rather than the display forcing motion blur upon you. Again, it depends on the material and conditions.

The lack of display-forced motion blur makes FPS adventures much more immersive and natural for SOME people in SOME games. There are some videogame aficionados who stick to CRT precisely for that reason of immersiveness: you get natural human-eye-provided motion blur, with zero motion blur forced upon you by the display itself. Often the strobe effect is the lesser evil and more natural (at a sufficiently high refresh rate, since the strobing disappears once you're above the flicker fusion threshold). Of course, not all gamers like it that way -- some prefer the flicker-free feel of a display, just as some people are super-sensitive to DLP rainbows while others are not. What's more natural for you: the lack of display-forced motion blur?? The lack of strobe effect?? The lack of motion interpolation?? The lack of latency?? Pick your poison... It's all relative.
But yes, I agree with you. It's often natural to add motion blur, but it's not always natural in all situations. It all depends on the material and what the director intended, etc. Natural is all relative. There's no possible globally agreed definition of "natural" for video, no "one size fits all" for all purposes, because there's no such thing as an infinite-framerate, perfect-holodeck holographic recording of real life indistinguishable from real life, etc.
Now, back to the context of the original topic: "how do we make 30fps look more natural at 60Hz" (despite fps and Hz being an artificial way of delivering images to human eyes). Adding more artificial motion blur can help mask the double-frame effect, yes (unless you hate blur). Adding motion interpolation can help, yes (unless you hate interpolation and/or input lag). Avoiding 30fps@60Hz entirely, if the game is available at full frame rate on another platform such as PC, can help, yes (unless you don't have the platform, or can't afford it). Quite a choice of options, and which method is most "natural" -- each with its own separate 'unnatural' disadvantages -- is all relative here, too.
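As a footnote on the interpolation option, here's a deliberately crude sketch (Python) of the simplest possible 30fps-to-60Hz conversion: blending adjacent source frames. Real TV motion interpolation is motion-compensated and far more sophisticated; this toy version (frames as flat lists of pixel values, names invented for illustration) only shows why synthesized in-between frames trade the double-image effect for their own artifacts, such as softness/ghosting and processing lag.

[code]
def blend(frame_a, frame_b, weight=0.5):
    """Weighted average of two frames (each frame is a flat list of pixel values)."""
    return [(1 - weight) * a + weight * b for a, b in zip(frame_a, frame_b)]

def upconvert_30_to_60(frames_30fps):
    """Emit each source frame followed by a crude blend toward the next one."""
    output = []
    for current, following in zip(frames_30fps, frames_30fps[1:]):
        output.append(current)                     # original 30fps frame
        output.append(blend(current, following))   # synthesized in-between frame
    output.append(frames_30fps[-1])                # last frame has no successor to blend with
    return output

# Example: three 30fps "frames" of a 4-pixel scanline become five 60Hz frames.
print(upconvert_30_to_60([[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]))
[/code]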