Originally Posted by cybrsage
Yeah, eventually we will have the ability to move past the roots of NTSC - especially now that everything is digital. The move from analog to digital would have been a horrific time to do it, and I'm glad we did not. Now that analog TVs are all but gone, we no longer have to worry about FM radio interference and therefore no longer need to drop slightly below 60Hz and 24Hz to fix the problem. I think we can do it with true 4K when it comes out, after HDMI 2.0 is created. I hope so, at least. All new material can be 24/60Hz while the sets also support the older 23.976/59.94 standard. As long as the sets support both, the media does not have to.
You make some good points. I'd probably be OK with a 24/60 system. A 120Hz display would handle both of those with ease. But you know some filmmaker is gonna want some other "oddball" frame rate that doesn't easily fit that scheme (like 48p, which doesn't divide evenly into 120Hz). So it would be nice to have a system that's flexible enough to accommodate some other rates.
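Just to illustrate the divisibility point (a quick Python sketch; the candidate rates here are my own picks, not anything standardized):

```python
from math import lcm  # Python 3.9+

# Hypothetical display refresh rates and content frame rates (Hz).
display_rates = [120, 240]
content_rates = [24, 30, 48, 60]

for d in display_rates:
    fits = [c for c in content_rates if d % c == 0]
    misses = [c for c in content_rates if d % c != 0]
    print(f"{d} Hz display: shows {fits} evenly, judders on {misses}")

# Smallest refresh rate that evenly handles every listed content rate:
print("minimum common rate:", lcm(*content_rates))  # 240
```

So 120Hz covers 24/30/60 but trips over 48p (120/48 = 2.5), and you'd need 240Hz to get all four cleanly.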
23.976p, 24p and 25i/p all essentially compete for the same space, though, namely traditional film content. So it would be sort of nice to effectively phase one or two of those rates out, and be able to watch traditional film material at its native 24 fps rate without any speed-ups, slow-downs, or 3:2 judder... now that we're no longer limited by the frequencies of power generators and radio signals, as you point out above.
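For anyone unfamiliar with where the 3:2 judder comes from, here's a little Python sketch of the pulldown cadence (my own toy function, just to show the uneven frame holds):

```python
# 3:2 pulldown: 24 film frames/sec mapped onto 60 video fields/sec.
# Alternate frames are held for 3 fields, then 2 fields (3+2 = 5
# fields per 2 frames, so 24 frames become exactly 60 fields).

def pulldown_fields(num_frames):
    fields = []
    for i in range(num_frames):
        hold = 3 if i % 2 == 0 else 2
        fields.extend([i] * hold)
    return fields

print(pulldown_fields(4))  # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
print(len(pulldown_fields(24)))  # 60 fields for 24 frames = 1 second
```

The uneven 3-field/2-field alternation is exactly the "judder" - some frames sit on screen 50% longer than others.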
There's a lot of native 25i (50-fields/sec) PAL/SECAM legacy content that wouldn't easily fit into a 24/60 scheme though. I suspect Benny Hill, AbFab, Upstairs/Downstairs, As Time Goes By, and most other video-based Britcoms are probably 50 fields/sec. And it would be nice to be able to watch those at their native rate as well. So maybe some sort of hybrid 100/120Hz or 96/100/120Hz system would work better?... Not really sure though.
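Running the same divisibility check over both the film and PAL families shows why no single low refresh rate covers everything (again just my own back-of-envelope Python, using the integer rates; the 23.976/29.97/59.94 variants are these divided by 1.001):

```python
from math import lcm

rates = [24, 25, 30, 50, 60]

# Which content rates divide evenly into each candidate refresh rate?
for d in (100, 120, 600):
    print(d, "Hz handles", [r for r in rates if d % r == 0])

print("universal rate:", lcm(*rates))  # 600
```

100Hz only covers 25/50, 120Hz only 24/30/60, and you'd need a 600Hz panel to do all five natively - which is presumably why the hybrid multi-rate sets exist.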
Digital audio is still a bit of a mystery to me, btw, so I don't fully understand how digital recordings can be sped up or slowed down to 23.976p or 25i/p without some loss in audio quality. If the audio source is analog (i.e. an optical or tape track), then I suspect it is/was fairly easy to compensate by adjusting either the physical speed of the playback/recording device (i.e. the telecine), or the sampling rate when converting to digital formats. If the audio source is digital to begin with, though, and captured/recorded at a certain number of samples/sec, then surely it has to be resampled to accommodate the slightly different video frame rates. Maybe the sampling rates are so high on professional audio gear, though, that the difference is undetectable when downsampled to 48 kHz?
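The ratios involved are at least easy to work out exactly (quick Python sketch; the pitch-shift figure assumes a plain speed-up with no pitch correction applied):

```python
from fractions import Fraction
import math

# PAL speed-up: 24 fps film played at 25 fps.
speedup = Fraction(25, 24)
print(float(speedup))       # ~1.0417, i.e. about 4.2% faster

# 48 kHz audio sped up with the video effectively becomes 50 kHz,
# so getting back to 48 kHz means a 50000 -> 48000 resample:
print(48000 * speedup)      # 50000

# Uncorrected pitch shift in cents (100 cents = 1 semitone):
print(round(1200 * math.log2(float(speedup)), 1))  # ~70.7 cents sharp

# NTSC telecine slow-down: 24 -> 23.976 fps is exactly 1000/1001:
print(Fraction(24000, 1001) / 24)  # 1000/1001, about 0.1% slower
```

So the PAL speed-up is a much bigger change (~0.7 of a semitone) than the NTSC 0.1% slow-down, which may be part of why the latter is so hard to hear even after resampling.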
Edited by ADU - 6/16/13 at 11:39am