Originally Posted by sneals2000
In the US, most scripted drama (apart from soaps) and many sitcoms are shot on film or 24p video at 23.98 frames per second (from now on abbreviated to 24).
No US broadcaster broadcasts 24p natively; instead they have to broadcast at 60Hz (59.94Hz in reality). SD and 1080i HD broadcasters use interlaced 60Hz; 720p broadcasters use progressive 60Hz. Interlaced 60Hz is sometimes described as 30fps (29.97fps), BUT because of interlace each frame is made up of two entirely independent fields sent 1/60th of a second apart, consisting of even and odd lines. They DON'T have to be two halves of the same source frame though - which is important.
24p material is thus shown at 60Hz using 3:2 pulldown (usually). This means that one 24p film frame is shown for 3/60 of a second (3x 720/60p frames, or 3x fields in 480/60i or 1080/60i) and the next 24p film frame is shown for 2/60 of a second (2x 720/60p frames, or 2x fields in 480/60i or 1080/60i).
This pull-down is added either during the telecine transfer (common in 480/60i SD production) OR the film is transferred at 24p and the 3:2 pulldown added on playout to a 60i or 60p VTR when the broadcast master is created. (Telecine is just the name of the piece of equipment that scans film frames to video in real-time. Non-real time transfer devices are more commonly known as film scanners. Telecine DOESN'T mean 3:2 in the broadcast world - in Europe 2:2 is the format used by telecines...)
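To make the cadence concrete, here's a toy sketch (not any real telecine or broadcast tool) that maps 24p frames onto a 60-fields-per-second sequence. The frame labels A/B/C/D and the function name are just illustrative:

```python
# Toy model of 3:2 pulldown: map 24p film frames onto 60i fields.
# A, B, C, D are the conventional labels for one pulldown cycle.

def pulldown_32(film_frames):
    """Return (field_content, parity) tuples at 60 fields/s.

    Each film frame alternately contributes 3 or 2 fields, so
    4 film frames (1/6 s of 24p) become 10 fields (1/6 s at 60 Hz).
    """
    fields = []
    counts = [3, 2]  # the 3:2 cadence
    for i, frame in enumerate(film_frames):
        for _ in range(counts[i % 2]):
            # fields alternate top/bottom parity in the output stream
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

seq = pulldown_32(["A", "B", "C", "D"])
print(len(seq))               # 10
print([f for f, _ in seq])    # ['A','A','A','B','B','C','C','C','D','D']
```

Note how some adjacent field pairs come from different film frames (e.g. a B field next to a C field) - those "dirty" frames are exactly why the two fields of a 60i frame need not be halves of the same source frame.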
This means that in 720p you are sending 3 frames more than you need to per 3:2 cycle (i.e. per pair of film frames), and in 480i/1080i 1 field more than you need to. (In fact many MPEG2 and H264 encoders detect this redundancy and don't waste bandwidth sending redundant fields/frames, BUT this isn't universal.)
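The same redundancy can be stated per second rather than per cycle - this is just back-of-envelope arithmetic restating the numbers above:

```python
# How much of a 24p-in-60Hz broadcast is pure repetition, per second.
unique = 24                                  # original film frames per second

sent_60p = 60                                # 720/60p sends 60 frames/s
redundant_60p = sent_60p - unique            # 36 repeated frames/s

sent_60i = 60                                # 480/60i and 1080/60i send 60 fields/s
needed_fields = unique * 2                   # 48 unique fields/s
redundant_60i = sent_60i - needed_fields     # 12 repeated fields/s

# Per 3:2 cycle (two film frames): 5 frames sent vs 2 needed in 60p,
# and 5 fields sent vs 4 needed in 60i - the "3 frames" / "1 field" above.
print(redundant_60p, redundant_60i)          # 36 12
```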
SO - when people record TV shows shot at 24fps but broadcast at 720/60p or 1080/60i (or 480/60i), they often try to reverse the 3:2 pulldown so that you end up with a file containing just the original 24p frames, rather than the entire broadcast stream (complete with redundant, repeated frames). Often 1080/24p material created from 1080/60i with 3:2 removed is then scaled to 720p AIUI.
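For the simple 60p case, reversing the pulldown amounts to collapsing runs of repeated frames. This is a deliberately naive sketch: it assumes bit-exact repeats, which broadcast video never has after compression - real inverse-telecine filters do fuzzy field matching instead:

```python
# Naive inverse telecine for 60p: collapse runs of identical frames
# back to the original 24p sequence. Assumes exact duplicates, which
# only holds for this toy data, not for real (noisy, compressed) video.

def ivtc_60p(frames_60p):
    out = []
    for f in frames_60p:
        if not out or f != out[-1]:
            out.append(f)
    return out

broadcast = ['A','A','A','B','B','C','C','C','D','D']  # one 3:2 cycle
print(ivtc_60p(broadcast))  # ['A', 'B', 'C', 'D']
```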
HOWEVER this isn't always done perfectly AIUI (I'm not a downloader)
It can also create issues where:
1. The 3:2 cadence isn't consistent (which can happen when film content is edited in the 60Hz domain rather than in the 24Hz domain)
2. 60i or 60p motion is mixed with 24p content - such as rolling/crawling captions - or some special effects. (The latter was the case with Star Trek: TNG - which was shot at 24p on film, but the effects were added in the 60i domain)
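Case 1 above is detectable in principle by measuring run lengths of repeated frames - an edit made in the 60Hz domain leaves an odd-length run that defeats a pattern-locked pulldown remover. A toy illustration:

```python
# Sketch: spot a broken 3:2 cadence via run lengths of repeated frames.
# A clean cadence gives runs of 3,2,3,2,...; an edit in the 60Hz domain
# can cut a run short (or merge runs), breaking the pattern.
from itertools import groupby

def run_lengths(frames):
    return [len(list(g)) for _, g in groupby(frames)]

clean  = ['A','A','A','B','B','C','C','C','D','D']
edited = ['A','A','A','B','C','C','C','D','D']   # one 'B' repeat lost at an edit

print(run_lengths(clean))    # [3, 2, 3, 2]
print(run_lengths(edited))   # [3, 1, 3, 2] -> cadence break at 'B'
```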
ALSO - if you are watching 24p content you really need to watch it on a TV with 24p connections - or at 60Hz if you can cope with 3:2 judder. You REALLY don't want to try watching it at 50Hz (*)
(*) XBox Media Center (a hack for the original Xbox) has a neat 4% speed-up option that will replay 24p content at 25p with 2:2 pulldown, outputting at 50Hz, which works really well for 24p SD content I believe.
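The arithmetic behind that 4% trick, for anyone curious - playing 24 fps material at 25 fps shortens the runtime and (unless the player pitch-corrects) raises the audio pitch slightly:

```python
# The "4% speed-up" of playing 24p at 25 fps with 2:2 pulldown on 50Hz.
import math

fps_in, fps_out = 24, 25
speedup = fps_out / fps_in                 # ~1.0417, i.e. ~4.2% faster
runtime_min = 60 / speedup                 # a 60-minute episode lasts 57.6 min

# Without pitch correction, audio rises by ~0.71 of a semitone.
pitch_shift = 12 * math.log2(speedup)

print(round(speedup, 4), round(runtime_min, 1), round(pitch_shift, 2))
# 1.0417 57.6 0.71
```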
I'll have a look at that clip tonight or tomorrow on a 24p display and see how it looks. One downside of 24p capture is that it suffers from temporal aliasing, so motion has to be well controlled to not break up into separate images. Most camera operators are very skilled at this - but some special effects artists are less experienced (though most are fine)
Great info. There's a lot I've been learning in the last few days, purely out of interest if nothing else. I certainly don't have a production setup or anything close, but my TV should be able to cope with 24/25/30/50 and 60Hz video. It's a Pioneer PDP-428XD - you don't have that exact model in the US, but it is capable of displaying the following:
Video signals supported
720 (1440) x 576i@50 Hz
720 x 576p@50 Hz
1280 x 720p@50 Hz
1920 x 1080i@50 Hz
720 (1440) x 480i@59.94 Hz/60 Hz
720 x 480p@59.94 Hz/60 Hz
1280 x 720p@59.94 Hz/60 Hz
1920 x 1080i@59.94 Hz/60 Hz
1920 x 1080p@24 Hz
1920 x 1080p@50 Hz
1920 x 1080p@60 Hz
The PC used to output the MKV via HDMI can use the following video resolutions
1) 1280x720 - 50 Hz (interlaced)
2) 1280x720 - 60 Hz
3) 1920x1080 - 24 Hz (interlaced)
4) 1920x1080 - 25 Hz (interlaced)
5) 1920x1080 - 30 Hz (interlaced)
6) 1920x1080 - 50 Hz (interlaced)
7) 1920x1080 - 60 Hz
Now, this list is what Windows tells me the modes are (Screen Properties --> Settings --> Advanced Button --> Adapter Tab --> List All Modes Button). However, the TV tells me that it has a 720p input on setting (1), and 1080p on settings (3) and (6). I trust the TV rather than Windows and, to be honest, it also makes sense - i.e. Blu-ray @ 24 fps is progressive, and both 50 Hz settings would indicate progressive sources (at least, that's what I think).
The most likely setting is (3) 1920x1080/24, but I still see judder.
The particular clip looks like a "virtual" pan - i.e. computer generated.