Joe Kane at Samsung QLED/HDR10 Summit

(Photo: Joe Kane)

Video guru Joe Kane gave a fascinating presentation at the Samsung QLED/HDR10 Summit two weeks ago. He started by reminding us that content creators were caught completely by surprise by the emergence of 4K/UHD TVs. He also argued that, for studios to be interested, a new video system needs to offer more than simply increased resolution.

Joe maintains that UHD offers the opportunity to start from scratch. With black and white, color, and even HD, there was a single standard based on the behavior of CRT TVs, and new display technology had to look like CRT in order to conform to that standard. UHD allows for new possibilities and capabilities; anything is fair game.

However, HDMI imposed a limitation of its own. According to Joe, UHD resolution was originally going to be 4096×2160, the same as commercial 4K, but because of HDMI bandwidth constraints, it was dropped to 3840×2160, which is about 6% fewer pixels. (This resolution does have one advantage: it’s exactly twice the horizontal and vertical resolution of 1920×1080 HD, making it easy to upscale HD to UHD.)
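The pixel math is easy to check; here’s a quick sketch in Python:

```python
# Compare commercial 4K (DCI) with consumer UHD pixel counts.
dci_4k = 4096 * 2160      # 8,847,360 pixels
uhd    = 3840 * 2160      # 8,294,400 pixels
hd     = 1920 * 1080      # 2,073,600 pixels

reduction = 1 - uhd / dci_4k
print(f"UHD has {reduction:.1%} fewer pixels than DCI 4K")   # ~6.2%

# UHD is exactly 2x HD in each dimension, so upscaling HD is a clean 2:1 ratio.
print(3840 / 1920, 2160 / 1080)                              # 2.0 2.0
print(f"UHD has {uhd / hd:.0f}x the total pixels of HD")     # 4x
```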

Regarding color gamut, many presentations at the summit cited BT.2020 as a gamut to strive for using quantum dots and RGB lasers. However, Joe pointed out that it really doesn’t work as a display gamut and should not be pursued as such. RGB laser projectors can in principle reach the full BT.2020 color gamut, but in practice they use multiple wavelengths for each primary to avoid speckle and observer metamerism, which reduces the gamut. As a result, Joe and many others view BT.2020 as a “container” that holds a smaller gamut. (This is the same idea behind ACES, the Academy Color Encoding System, which is now being used as a production, mastering, and archiving container.)
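To make the “container” idea concrete, here is a rough sketch comparing the CIE 1931 xy triangle areas of BT.709 and DCI/P3 against BT.2020, using their published primary chromaticities. Triangle area in xy is a crude, non-perceptual yardstick, but it shows how much of the BT.2020 container today’s display gamuts actually fill:

```python
# Published CIE 1931 xy chromaticities of the red, green, and blue primaries.
GAMUTS = {
    "BT.709":  [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI/P3":  [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "BT.2020": [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

def triangle_area(pts):
    """Shoelace formula for the area of a gamut triangle in xy space."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1*(y2 - y3) + x2*(y3 - y1) + x3*(y1 - y2)) / 2

bt2020_area = triangle_area(GAMUTS["BT.2020"])
for name, pts in GAMUTS.items():
    ratio = triangle_area(pts) / bt2020_area
    print(f"{name:8s} covers ~{ratio:.0%} of the BT.2020 xy area")
```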

Joe advocates that UHD should completely replace HD, but the new system should include 1080p and 720p as well as 2160p, and even higher pixel counts such as 5120×2160 (2.37:1) and 7680×4320. It should allow 4:4:4 RGB color, source bit depths up to 16 bits, more options for the EOTF and grayscale, and frame rates up to 120 fps.

Supported color gamuts should include BT.709, DCI/P3, Adobe RGB, and 1953 NTSC, as seen in the CIE chart above. (Joe included that last one because quantum-dot companies often compare QD color capability to NTSC, even though no one actually uses it.) He’d also like to see support for several different grayscales, including D65 (the standard everyone calibrates to these days), D55 (best for watching B&W content), D60 (a better fit for digital-cinema content), and D50 (used in print photography). In addition, several EOTFs (electro-optical transfer functions) should be supported, including BT.1886 (gamma 2.4), PQ, HLG, and gamma 2.6 to match digital cinema.
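Two of those EOTFs are straightforward to write down. The sketch below implements a plain gamma-2.4 power function (the core of BT.1886, ignoring its black-level lift terms) and the SMPTE ST 2084 PQ curve, each mapping a normalized code value to display light:

```python
# Normalized signal in [0, 1] -> display light.

def gamma_2_4(signal, peak_nits=100.0):
    """Simple power-law EOTF (BT.1886 with zero black-level lift)."""
    return peak_nits * signal ** 2.4

# SMPTE ST 2084 (PQ) constants, defined as exact rationals in the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_eotf(signal):
    """ST 2084 PQ EOTF: normalized code value -> absolute luminance in cd/m^2."""
    p = signal ** (1 / M2)
    y = max(p - C1, 0.0) / (C2 - C3 * p)
    return 10000.0 * y ** (1 / M1)

# The same mid-gray code value lands very differently on the two curves.
print(gamma_2_4(0.5))   # ~18.9 cd/m^2 on a 100-nit SDR display
print(pq_eotf(0.5))     # ~92 cd/m^2 absolute
```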

So, how can one system support multiple sets of primary colors, grayscales, peak luminance levels, EOTFs—in fact, multiple versions of HDR? Joe pointed out that we’ve seen this problem before, when digital projectors were being installed in theaters before the DCI standards were established. The answer then—and now—is metadata.
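What that metadata might carry is easy to imagine: the container gamut, the gamut actually used in mastering, the white point, the EOTF, the mastering display’s luminance range, and so on. The sketch below is purely illustrative; the field names are hypothetical and not taken from any actual standard.

```python
from dataclasses import dataclass

# Hypothetical metadata payload -- illustrative field names only.
@dataclass
class ContentMetadata:
    container_gamut: str        # e.g. "BT.2020" (the container)
    mastering_gamut: str        # e.g. "DCI/P3" (what the content actually uses)
    white_point: str            # e.g. "D65", "D60", "D55", "D50"
    eotf: str                   # e.g. "PQ", "HLG", "BT.1886", "gamma2.6"
    max_luminance_nits: float   # peak of the mastering display
    min_luminance_nits: float   # black level of the mastering display
    bit_depth: int              # source bit depth, up to 16
    frame_rate: float           # up to 120 fps in Joe's proposal

example = ContentMetadata(
    container_gamut="BT.2020",
    mastering_gamut="DCI/P3",
    white_point="D65",
    eotf="PQ",
    max_luminance_nits=1000.0,
    min_luminance_nits=0.005,
    bit_depth=16,
    frame_rate=60.0,
)
```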

As an example, Joe pointed to the Samsung SP-A900 projector, which he helped design years ago. To calibrate it, you measure its native primary colors and white point, then input that information into the projector. Using that info, the projector calculates what it needs to do to get from its native capability to the desired capability. Ideally, the display’s native primaries lie outside the target gamut, so they can be “pulled in” to the target by mixing small amounts of the other two primaries into each one. The SP-A900 only did this with the primary colors and grayscale, but measuring and altering the native EOTF would be possible as well. (SpectraCal is working with LG to implement a similar process using its CalMan calibration software and LG OLED TVs.)
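To illustrate the kind of math involved, here is a minimal sketch of the usual linear-algebra approach: build an RGB-to-XYZ matrix from the measured native primaries and white point, build another for the target gamut, and combine them into a single 3×3 correction the display applies to incoming linear RGB. This is a generic gamut-mapping sketch, not the SP-A900’s actual firmware algorithm, and the “native” measurements below are made up for illustration.

```python
import numpy as np

def rgb_to_xyz_matrix(primaries_xy, white_xy):
    """Build a linear RGB -> XYZ matrix from xy primaries and a white point."""
    def xy_to_xyz(x, y):
        return np.array([x / y, 1.0, (1 - x - y) / y])
    P = np.column_stack([xy_to_xyz(*p) for p in primaries_xy])  # unscaled primaries
    W = xy_to_xyz(*white_xy)
    S = np.linalg.solve(P, W)   # scale each primary so R=G=B=1 produces the white point
    return P * S                # column-wise scaling

# Target: BT.709 primaries with a D65 white point.
target = rgb_to_xyz_matrix([(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
                           (0.3127, 0.3290))

# "Measured" native primaries of a hypothetical display (made-up numbers),
# deliberately wider than the target so they can be pulled in.
native = rgb_to_xyz_matrix([(0.690, 0.300), (0.230, 0.720), (0.140, 0.050)],
                           (0.3100, 0.3300))

# Matrix applied to incoming linear RGB: target RGB -> XYZ -> native RGB drive values.
correction = np.linalg.inv(native) @ target

# Pure red in the target gamut now picks up small amounts of the other
# native primaries -- the "pulling in" Joe described.
print(correction @ np.array([1.0, 0.0, 0.0]))
```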

Joe’s final point was to advocate for a single-master approach to content creation. Studios would need to create only one master file that retains the full capability of the capture device, and the content would be stored as 16-bit half-float RGB. (“Half float” is a digital-number format that represents greater detail in the shadows and highlights without requiring inordinate amounts of storage. It also allows negative numbers, which can be useful in large containers like XYZ and ACES.) This master file would be delivered to consumers, and metadata would tell each display how to convert what’s in this container to what’s needed by the display. One point of concern here is that metadata can be lost in the signal path during mastering, a problem that must be fixed before Joe’s vision can be realized.
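To see what half float buys you, NumPy’s float16 is the same IEEE 754 half-precision format: the steps between representable values are tiny near black, there is headroom far above 1.0 for highlights, and negative numbers are allowed. A quick sketch:

```python
import numpy as np

# np.float16 is IEEE 754 half precision -- the "half float" format Joe described.
print(np.spacing(np.float16(0.001)))   # step size near black: about 1e-6 (very fine)
print(np.spacing(np.float16(1.0)))     # step size near reference white: about 0.001

# Values above 1.0 (specular highlights) and negative values (useful for
# out-of-gamut colors in big containers like XYZ/ACES) are both representable.
print(np.float16(12.5), np.float16(-0.25))

# Maximum representable half-float value -- plenty of highlight headroom.
print(np.finfo(np.float16).max)        # 65504.0
```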