IMO, the type of display and/or video processor you're using plays an important role in 24p (film or tape) judder. 24p-based material is delivered over 1080i by using 2-3 pulldown: adding TV fields (1/60-sec half-frames) and slowing the speed slightly so 24 frames per second fits the 1080/60i (30i) rate that stations and other sources use. Displays/processors with good reverse 2-3 pulldown can weave each 24p frame back together from only the original TV fields (half-frames). Then, if the frames are displayed at even multiples of 24 fps, such as 48, 72, 96, or 120 fps, judder (from displaying the 'extra' TV fields) shouldn't be visible. Not many displays offer such 2-3 reversal with even-24p-multiple display yet. Instead, most display 24p-based material at 1080/60p (or a scaled resolution other than 1080), which still has the extra TV fields.
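The 2-3 cadence and its reversal can be sketched in a few lines. This is just an illustration of the field bookkeeping under ideal conditions (clean cadence, no edits breaking the pattern); each field is labeled here by the 24p frame it came from, and the function names are mine, not from any real deinterlacer:

```python
# Sketch of 2-3 (3:2) pulldown and its reversal, assuming an unbroken cadence.
# Each 24p frame becomes alternately 2 then 3 interlaced fields (60 fields/sec);
# a processor with good reverse pulldown can weave the original frames back
# together from those fields alone.

def pulldown_2_3(frames):
    """Expand 24p frames into a 60i field sequence: A,A, B,B,B, C,C, D,D,D ..."""
    fields = []
    for i, frame in enumerate(frames):
        repeats = 2 if i % 2 == 0 else 3
        fields.extend([frame] * repeats)
    return fields

def reverse_pulldown(fields):
    """Recover the original 24p frames by collapsing the 2-3 cadence."""
    frames, i, repeats = [], 0, 2
    while i < len(fields):
        frames.append(fields[i])
        i += repeats
        repeats = 5 - repeats  # alternate 2, 3, 2, 3 ...
    return frames

fields = pulldown_2_3(['A', 'B', 'C', 'D'])
print(fields)                    # ['A','A','B','B','B','C','C','D','D','D']
print(reverse_pulldown(fields))  # ['A','B','C','D']
```

Note that 24 frames map to 60 fields (24 x 2.5 = 60), which is why the alternating 2/3 pattern works out exactly over each pair of frames.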
But 1080/60p is correct for non-24-based material from TV cameras. That includes live 1080/60i broadcasts and typical travelogues/documentaries videotaped at 1080/60i (30i). With these sources, which match the broadcast rate, 1080i's two 1/60-sec TV fields per 1/30-sec TV frame can simply be deinterlaced (to 30 fps) and each frame displayed twice (60p, which differs significantly from capturing images originally at 1080/60p). That's for fixed-pixel displays. On 1080i CRT-based displays operating in interlace mode, 1080i material is displayed as 1/60-sec TV fields, just as it's delivered, and the eyes/brain merge each pair of fields into a 1/30-sec TV frame. The fast 1/60-sec refresh rate conveys rapid motion well and minimizes flicker.
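For same-frame fields, that deinterlacing step is just a "weave": interleave the two half-height fields back into one full frame, then repeat each frame to refresh at 60p. A minimal sketch, assuming both fields came from the same 1/30-sec capture (video-rate material, not 24p); rows here are plain lists standing in for scanlines:

```python
# Minimal "weave" deinterlace sketch for 1080i fields that share a capture
# instant, followed by the display-twice step a fixed-pixel 60p panel uses.

def weave(top_field, bottom_field):
    """Interleave two half-height fields into one progressive frame."""
    frame = []
    for top_row, bottom_row in zip(top_field, bottom_field):
        frame.append(top_row)     # odd scanlines (1, 3, 5, ...)
        frame.append(bottom_row)  # even scanlines (2, 4, 6, ...)
    return frame

def display_60p(frames_30fps):
    """Repeat each woven 30 fps frame twice to refresh at 60p."""
    out = []
    for frame in frames_30fps:
        out.extend([frame, frame])
    return out

top = [[1, 1], [3, 3]]      # lines 1 and 3 of a toy 4-line image
bottom = [[2, 2], [4, 4]]   # lines 2 and 4
print(weave(top, bottom))   # [[1, 1], [2, 2], [3, 3], [4, 4]]
```

This is why repeating a woven 30 fps frame at 60p differs from true 60p capture: both refreshes show the same moment in time, so motion is still sampled only 30 times a second.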
My 1080i CRT RPTV displays the extra 2-3 pulldown TV fields of 24p-based programming, since it displays all HD at 1080/60i. Judder seems to vary, although I suspect it would take direct A-B comparisons with a display using 2-3 pulldown removal (and 24p-even-multiple display) to confirm how much judder is present. One program series, Discovery's "Sunrise Earth," is videotaped at 24p and seems moderately blurred (mostly static scenes) on my 1080/60i CRT display; grass blades, twigs, distant rocks, etc., aren't very crisp looking, although the color and scenes are nevertheless great looking.
That's in comparison with programming taped at 1080/60i, which appears very crisp since it's recorded, broadcast, and viewed at 60 TV fields per second on my screen. Motion also appears smoother with such programming, captured every 1/60-sec and sometimes labeled 'wow' HD--provided inadequate bit-rate delivery and excessive filtering from multicasting or other HD processing don't rob resolution and cause blocking artifacts.
Movies captured at 24 fps (~1/48-sec exposure) show widely varying judder and blurriness when displayed at 1080/60i on my CRT RPTV. Since "Sunrise" scenes are nearly static, it seems necessary to differentiate between judder caused by the extra TV fields of 2-3 pulldown and the added judder from movement. (I've also noticed stuttering-type motion artifacts caused by a malfunctioning cable-STB MPEG-2 decoder, and some fixed-pixel displays might be decoding/deinterlacing 1080i poorly.) But lots of additional elements have to be factored in with films. The quality of prints optically telecined to tape/disks varies widely. Cinematographers deliberately filter their cameras for dramas and use selective focus (blurring backgrounds) to concentrate attention. Motion and camera movements may be restricted for 24-fps capture, or the camera 'overcranked' at higher fps when less blurring of rapid motion is desired or for slow-motion sequences. TV productions can risk using original negatives for telecine rather than prints, yielding crisper HD images, since rapidly made TV production film segments aren't as 'precious' as major feature-film segments.
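Two quick calculations behind the numbers above, for anyone who wants to check them. The ~1/48-sec figure comes from the conventional 180-degree film shutter (exposure = half the frame period), and the pulldown judder comes from successive 24p frames being held for unequal times on a 60 Hz display:

```python
# Assumes the standard 180-degree film shutter convention and a 60 Hz
# display clock; the function name is mine, for illustration only.

def exposure_time(frame_rate, shutter_angle=180.0):
    """Per-frame exposure: (shutter_angle / 360) / frame_rate, in seconds."""
    return (shutter_angle / 360.0) / frame_rate

print(round(1 / exposure_time(24)))  # 48 -> the ~1/48-sec figure for 24 fps film

# With 2-3 pulldown on a 60 Hz display, successive 24p frames are held for
# alternately 2 and 3 field periods -- an uneven cadence seen as judder.
field = 1 / 60
print(round(2 * field * 1000, 1), round(3 * field * 1000, 1))  # 33.3 50.0 (ms)

# At an even multiple such as 72 Hz, every frame gets exactly 3/72 = 1/24 sec.
print(3 / 72 == 1 / 24)  # True
```

The same evenness holds at 48, 96, or 120 Hz (2, 4, or 5 refreshes per film frame), which is why even-24p-multiple display avoids cadence judder.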
So, to pinpoint judder-type artifacts, either a descriptive comparison in these forums with what others have seen might be needed, or home comparisons with other display methods: fixed-pixel at 60p with/without 2-3 pulldown reversal, or, say, 72 fps with 2-3 reversal. -- John