AVS Forum

Frame & refresh rate from the movie set to the tv set?

1360 Views 4 Replies 3 Participants Last post by  Joe Bloggs
Hey everyone, I've searched through the forums to try to find an answer to this, but I've found conflicting posts. Basically I'm shopping for a new TV and I'm wondering whether to get a 120Hz or 60Hz set. Without going into opinions on the virtues of each, I'm curious about the actual factual stuff.


I know that most movies are filmed at 24 frames a second. A lot of people say that because of this, 120Hz sets display Blu-ray movies better than 60Hz sets, because the source doesn't need 3:2 pulldown.


I've heard other people say that although the movies are recorded at 24fps, edited at 24fps and projected in theaters at 24fps, they are actually stored on Blu-ray at 1080p 30fps. And alongside this, I've heard people say that Blu-ray players actually output at 30fps over 60Hz.


So with this in mind, if the 3:2 pulldown has been hard-coded onto the disc, it would surely make the 'pulldown issue' a moot point when it comes to deciding on a TV? Again, I don't want to go into the other pros/cons of each; I just want to stick to the actual technical facts of Blu-ray movies originally filmed at 24fps. I know that films on cable TV are actually 60Hz interlaced.


Does anyone know what format the movies are stored in and what most Blu-ray players output? Also, as a sidenote: assuming that Blu-ray movies are stored at 30fps and output by players at 60Hz, do 120Hz TVs then successfully reverse the pulldown, so that the frames in a 1-second sequence get converted from AAABB to AAAAABBBBB?
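(For anyone who finds the cadence notation confusing: 3:2 pulldown shows each pair of 24fps film frames for 3 then 2 of the 60Hz refreshes, so 2 frames fill 5 refreshes and 24 x 5/2 = 60. A quick Python sketch, purely illustrative, function name is just made up here:)

```python
# 3:2 pulldown: show each 24fps frame for alternately 3 then 2
# of the 60Hz refreshes, so 2 film frames fill 5 refreshes.
def pulldown_32(frames):
    out = []
    for i, frame in enumerate(frames):
        repeats = 3 if i % 2 == 0 else 2
        out.extend([frame] * repeats)
    return out

print("".join(pulldown_32("AB")))        # -> AAABB
print(len(pulldown_32(["f"] * 24)))      # 24 frames -> 60 refreshes
```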

Quote:
Originally Posted by leeuk321 /forum/post/20771536


I've heard other people say that although the movies are recorded at 24fps, edited at 24fps and projected in theaters at 24fps, they are actually stored on Blu-ray at 1080p 30fps.

Not true. In fact, Blu-ray doesn't even support 1080p30.

Quote:
And alongside this, I've heard people say that Blu-ray players actually output at 30fps over 60Hz.

They can output at 60Hz, yes. But since around the second generation(?) of Blu-ray players, they have been able to send 1080p24 straight from disc to screen, if the TV supports that input (and a 120Hz model should, with appropriate frame duplication - just turn off the "smoothing" interpolation).
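(The frame duplication on a 120Hz panel is trivial arithmetic: 120/24 = 5, so every 24p frame is simply shown five times, with no uneven cadence. A hypothetical sketch, just to illustrate:)

```python
# 5:5 duplication: 120Hz / 24fps = 5, so each film frame is
# displayed 5 times - an even cadence, no 3:2 judder.
def duplicate_55(frames, panel_hz=120, source_fps=24):
    factor = panel_hz // source_fps   # 5 for a 120Hz panel
    return [f for frame in frames for f in [frame] * factor]

print("".join(duplicate_55("AB")))    # -> AAAAABBBBB
print(len(duplicate_55(["f"] * 24)))  # 24 frames -> 120 refreshes
```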

Quote:
So with this in mind, if the 3:2 pulldown has been hard-coded onto the disc, it would surely make the 'pulldown issue' a moot point when it comes to deciding on a TV?

That's only the case for movies that were shot on film and put on Blu-ray at 1080i60, which are a minority - typically low-budget or foreign releases. The standard format for movies on Blu-ray is 1080p24. You can check this thread for in-depth technical details on specific releases.


I believe that answers your other questions too.
Most films are stored on Blu-ray at 23.976fps. Some are at 24.000fps or 1080/50i, and at least one film was at 1080/60i.

Blu-ray players can output at 60Hz (59.94?), and most recent Blu-ray players can also output at 24Hz (and/or 23.976Hz?). European ones (and some in the US) can also output at 50Hz.
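(If anyone wonders where the odd 23.976 figure comes from: it's the NTSC-adjusted rate, 24 x 1000/1001, just as 59.94 is 60 x 1000/1001 - so the 3:2 ratio between them works out exactly. A quick check with exact fractions:)

```python
from fractions import Fraction

film = Fraction(24000, 1001)   # ~23.976 fps, NTSC-adjusted film rate
ntsc = Fraction(60000, 1001)   # ~59.94 Hz, NTSC field/refresh rate

print(float(film))   # 23.976023976...
print(ntsc / film)   # exactly 5/2 - the 3:2 pulldown ratio
```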
Thanks for your replies, it's cleared a lot of things up. So if I understand it correctly, the movies are stored at 24p, some players can then output them at 24p, and then 120Hz TVs duplicate the frames to give 5:5 pulldown.


I've heard that some 120Hz TVs will actually display with a 6:4 pulldown if they are fed a 60Hz signal (essentially duplicating the 3:2 pulldown from the player). I wonder if this is unavoidable, or if some TV sets can reverse the 3:2 pulldown successfully from a 60Hz signal and display it with a 5:5 pulldown?


Also, is it possible for a TV set to display, for example, 48Hz as well as 120Hz? I don't mean 'accept' a 48Hz signal, but actually display it at that rate. Or would that involve a completely different set of electrical gadgets and wizardry inside the TV to display at multiple refresh rates?

Quote:
Originally Posted by leeuk321 /forum/post/20774844


Also, is it possible for a TV set to display, for example, 48Hz as well as 120Hz? I don't mean 'accept' a 48Hz signal, but actually display it at that rate.

I think some plasmas can show 24Hz content at 48Hz (though some people say they flicker at that rate).


But if an LCD TV with a constantly-on backlight could output at 48Hz as well as 120Hz, and both modes were displaying 24p content with the same response time, I don't think there'd be any difference in what was shown - or if there were a difference, I don't think there'd be any picture-quality advantage in an LCD HDTV outputting 24p content at 48Hz instead of 120Hz. Though with The Hobbit films being shot at around 48fps, there would be an advantage in a TV that could display them at a multiple of 48fps (in 3D too), and a source that could output them like that.
Quote:
Or would that involve a completely different set of electrical gadgets and wizardry inside the tv to display at multiple refresh rates?

I don't see why it would need much more complicated hardware in the TV to support multiple output refresh rates - but maybe it's just simpler for the TV manufacturers to output at one refresh rate, and a higher refresh rate probably allows better picture processing (e.g. smoother frame interpolation), as well as a bigger number to use in marketing.


Also, for displaying 3D with shutter glasses, you'd need a refresh rate higher than 48Hz, otherwise you'd get a lot of flicker (since when displaying 24p content, one side of the glasses would be blanked for at least 1/24th of a second, I think).


I do think they're also restricted by the HDMI standard (which has a limited set of 'standard' refresh rates, and where switching between rates can cause a glitch/delay). I know you're talking about output refresh rates, but what rates they can accept is related.