Originally Posted by 8mile13
"filmic'' means a number of things, of importance here seems descriptions like ''pertaining to or characteristic of motion pictures'' ''related to films or cinema'' suggesting that film ( 24fps blu-ray flatscreen movies) should look like the movies in the cinema, for which the movie people, specifically directors, are reponsible. It is they who determin how a movie should look and it is no coincidence that they are the ones who are most fond of that look.
Why shouldn't the movie people determin how a movie should look i ask you? It is their job
...and make no mistake, when the time comes and 48/60fps replaces 24fps they will be the ones in charge...
Again: when you display 24fps material at 24Hz, it looks significantly different from what most people are used to.
Displaying it at higher refresh rates, as today's brighter displays require, destroys the motion quality: it goes from being completely fluid to being a juddering mess. Scenes where the camera pans horizontally are just awful to watch if you're trying to follow an object through the frame.
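To make that concrete, here's a minimal sketch (Python; the 60Hz refresh rate and the pan speed are assumed example numbers) of why an uneven cadence falls out of showing 24fps on a 60Hz display:

[code]
# Why 24fps on a 60Hz display judders: 60/24 = 2.5 refresh cycles per film
# frame, but each frame must be held for a whole number of cycles, so the
# hold time alternates between 2 and 3 cycles (the classic 3:2 pulldown).
REFRESH_HZ = 60
SOURCE_FPS = 24

cadence = []
cycles_shown = 0
for frame in range(1, SOURCE_FPS + 1):
    # Total refresh cycles that should have elapsed by the end of this frame.
    target = frame * REFRESH_HZ // SOURCE_FPS
    cadence.append(target - cycles_shown)
    cycles_shown = target

print(cadence[:8])                               # [2, 3, 2, 3, 2, 3, 2, 3]
print({1000 * c / REFRESH_HZ for c in cadence})  # approx. {33.3, 50.0} ms

# A pan moving a constant 10px per film frame still advances in equal 10px
# steps, but each step stays on screen for an uneven 33ms/50ms rhythm, so
# an eye tracking the pan at constant speed sees the image lurch.
[/code]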
This is not something that is done by choice: no modern display supports a native 24Hz output. The only way I've found to do it is to use a CRT with black frame insertion, because inserting black frames on a CRT behaves differently from other display types (it's not "switching to black"; the display simply doesn't attempt to draw anything for the black frames).
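For what it's worth, the schedule involved is easy to describe. Here's a minimal sketch of the general idea, assuming a display refreshing at 120Hz (which divides evenly by 24); it illustrates the kind of cadence I mean, not my exact CRT setup:

[code]
# Hypothetical BFI schedule approximating a native 24Hz presentation on a
# 120Hz display: light each film frame for one refresh cycle, then insert
# black for the remaining four, so every frame flashes exactly once.
REFRESH_HZ = 120
SOURCE_FPS = 24
cycles_per_frame = REFRESH_HZ // SOURCE_FPS    # 5 refresh cycles per frame

schedule = (["frame"] + ["black"] * (cycles_per_frame - 1)) * 2
print(schedule)
# ['frame', 'black', 'black', 'black', 'black', 'frame', 'black', ...]
# Each frame gives ~8.3ms of image followed by ~33.3ms of darkness; that
# 24Hz flash is exactly the "severe flicker" tradeoff mentioned below.
[/code]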
If you shoot 24p, you get a bad portrayal of the source material by default on modern displays (and projectors); it's not something that directors get to choose.
Unfortunately this has been done for so long now that it's become "tradition".
What people consider to be the "traditional" look of film is now this bastardized presentation at ≥48Hz rather than its intended look.
Most directors aren't going to risk shooting at higher framerates because it costs more, because they don't want that to be what their film is known for, and because there's a very vocal minority that is strongly against technological progress. Unfortunately, a lot of that minority seems to be people who work in the industry.
If the film fails, it doesn't matter why: it will be known as a failure because they took a risk on shooting HFR. It doesn't matter that The Hobbit was a bad film with terrible CG to begin with; a huge amount of the blame was placed on it being released in HFR.
Movies are already a huge gamble, so why take the risk if it's going to cost more money and you're going to have a lot of the people you're working with complain about it?
I'm thankful that there are at least some directors that are pushing things forwards and shooting HFR despite all the negativity towards it.
When there is literally no good way to display 24p source material without a significant flaw, I just don't understand the fondness people have for it. No matter what display you're using, it is going to have severe flicker, severe judder, or severe interpolation artifacts, or some combination of the three.
There is no fix for this other than shooting at a high enough framerate that it can be shown on a low-persistence display without significant flicker (ideally 72fps or higher).
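To put rough numbers on that tradeoff, here's a small sketch; the flicker-fusion threshold is an assumed ballpark (it varies with the viewer and with display brightness), not a measured value:

[code]
# Sketch of the flicker/judder/blur tradeoff described above,
# under assumed thresholds.
FLICKER_FUSION_HZ = 70   # assumed ballpark; varies by person and brightness

def strobed(source_fps):
    """Low-persistence display flashing each frame exactly once."""
    if source_fps >= FLICKER_FUSION_HZ:
        return "fluid motion, no visible flicker"
    return f"fluid motion, but severe {source_fps}Hz flicker"

def sample_and_hold(source_fps, refresh_hz):
    """Flicker-free display holding each frame until the next one."""
    if refresh_hz % source_fps:
        return "uneven cadence: judder (e.g. 3:2 pulldown at 60Hz)"
    return "even cadence, but persistence blur on tracked motion"

for fps in (24, 48, 72):
    print(f"{fps}fps | strobed: {strobed(fps)} | "
          f"held at 144Hz: {sample_and_hold(fps, 144)}")
# Only 72fps clears the assumed fusion threshold, which is why something
# around 72fps is the point where low-persistence display becomes viable.
[/code]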