Another great interview. Having industry people is enlightening.
I'm concerned that the consumer electronics industry will not be as patient with the evolution of UHD as it was with HD and will cut back, but on the other hand UHD panels may become the standard as they get easier to make.
Many current displays have a wider gamut than Rec. 709, which requires color management. Display firmware upgrades could include a future UHD gamut increase and possibly HDR.
As mentioned, cinema 4K (4096 wide) and UHD (3840 wide) have an aspect ratio mismatch, so vertical scaling is probably required along with the horizontal. Since this would be done on the creation side, professional scaling would be used. Add distribution encoding and I doubt it's an issue.
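A rough way to see the mismatch (a hypothetical sketch; the helper name is my own, the resolutions are the standard DCI 4K and UHD containers):

```python
# Fit a DCI 4K (4096x2160) frame into a UHD (3840x2160) raster while
# preserving aspect ratio: scale both axes by 3840/4096, then letterbox.

def fit_to_uhd(src_w, src_h, dst_w=3840, dst_h=2160):
    """Uniformly scale to the destination width; return scaled height and total letterbox."""
    scale = dst_w / src_w
    scaled_h = round(src_h * scale)
    bars = dst_h - scaled_h  # total black-bar height (split top/bottom)
    return scaled_h, bars

print(fit_to_uhd(4096, 2160))  # -> (2025, 135): image lands at 3840x2025 with 135 lines of bars
```

In practice a mastering house would crop or scale per title, but the arithmetic shows why some vertical adjustment is unavoidable.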
A common misconception is that 25/30 fps interlaced video has less smooth motion than higher frame rates such as 48 fps in cinema or the 50p/60p standards. With 2:1 interlace the motion is updated at field rate, twice the frame rate: 25i and 30i update at 50 Hz and 60 Hz respectively, and the shutter exposure time is the field duration.
The "film look" involves not only the slower frame rate but also the typical 1/2-frame shutter exposure time: 24 fps usually uses 1/48th of a second. This shorter shutter time decreases motion blur but adds a stuttered look (judder). In live-style video the frame duration and shutter time are usually nearly equal, so judder is minimized. At 100+ fps the issue of motion blur should be greatly diminished, though it will be interesting if someone decides that a faster shutter time makes it look more cinematic.
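The arithmetic above can be sketched like this (illustrative only; the function names are mine):

```python
# Shutter exposure from frame rate and shutter angle: (angle/360) / fps.
def exposure_time(fps, shutter_angle=180.0):
    return (shutter_angle / 360.0) / fps

# 24 fps at a 180-degree (half-frame) shutter gives the 1/48 s exposure.
assert abs(exposure_time(24) - 1 / 48) < 1e-12

# Interlaced motion updates happen at field rate, twice the frame rate.
def field_rate(frame_rate):
    return 2 * frame_rate

print(field_rate(25), field_rate(30))  # -> 50 60 (Hz)
```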
Judder can add distance to the storytelling, which can aid suspension of disbelief. Higher frame rates may require a filmmaker to change their visual storytelling methodology. If that does become popular, one might say soap operas led the way, and "Soap Opera Effect" could be positive instead of pejorative. The increased frame rates TVs can display via interpolation do not change the original shutter exposure time, which means there is a mismatch between frame rate and motion blur. It may look too smooth.
JPEG2K requires a high bit rate because it's an intraframe DWT (Discrete Wavelet Transform) encoding scheme. Besides cinema, JPEG2000 is also sometimes used for TV backhaul by companies such as FOX. JPEG2K can offer better quality, while the MPEG varieties offer better efficiency. MPEG encoding might use longer GOPs for high frame rates, and motion compensation may get easier since less changes between frames. JPEG2K, by contrast, may require much additional bandwidth for high frame rates.
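A toy model of why intraframe coding scales worse with frame rate (my own illustration with made-up bit budgets, not measured codec data):

```python
# Intraframe (e.g. JPEG 2000): every frame is coded independently, so the
# bitrate grows roughly linearly with frame rate.
def intraframe_bitrate(bits_per_frame, fps):
    return bits_per_frame * fps

# Long-GOP MPEG-style: one expensive I-frame amortized over (gop_len - 1)
# cheaper predicted frames; lengthening the GOP at high fps keeps the
# average bits per frame down.
def long_gop_bitrate(i_bits, p_bits, gop_len, fps):
    avg_bits = (i_bits + p_bits * (gop_len - 1)) / gop_len
    return avg_bits * fps

# Doubling fps doubles the intraframe bitrate...
print(intraframe_bitrate(1_000_000, 60) / intraframe_bitrate(1_000_000, 30))  # -> 2.0
# ...while doubling the GOP along with the fps grows the long-GOP bitrate
# by only about 1.57x with these illustrative numbers.
print(long_gop_bitrate(1_000_000, 100_000, 24, 60) / long_gop_bitrate(1_000_000, 100_000, 12, 30))
```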
Anything else? I think I should change to decaf.
Last edited by TVOD; 06-14-2014 at 08:29 PM.