Joe Kane/Samsung HDR Demo at CE Week 2015

High dynamic range took center stage at CE Week as Joe Kane discussed his vision for the future of television content and reproduction.

At CE Week in New York last month, video guru Joe Kane presented a wonderful demonstration of high dynamic range (HDR) several times during the show. (Thanks to John Bishop of Bishop Audio Services for the photo above of Joe doing his thing at CE Week.) He started by emphasizing that the rules of video—up to and including HD—are based on CRT (cathode-ray tube) capabilities. Any new display technologies had to look like CRT—even though CRT displays were disappearing—and if they offered greater capabilities, they had to be dumbed down to match the limits of CRT.

The transition to Ultra HD began with the same limitations. TV manufacturers started making what amounted to HDTVs with four times as many pixels, but the content-creation community was left out at first, so there was no standardization of other aspects of image quality, such as dynamic range and color gamut. As a result, content creators continued to use standard dynamic range with a peak brightness of 100 nits, power-law gamma, and the BT.709 color gamut—the same specs as HD, which are more or less based on CRT capabilities—to create UHD content. And UHDTVs must be “reined in” to conform to these specs, even if they can do more, to reproduce that content as the creator saw it in the mastering process.

Then, in August 2012, the opportunity to evolve beyond CRT in both content and reproduction appeared in the form of the ITU-R (International Telecommunication Union-Radiocommunication Sector) BT.2020 specification. This spec includes a much wider color gamut, new equations for converting RGB to YCbCr, 4:4:4 and 4:2:2 color subsampling, and other enhancements that go way beyond the capabilities of any CRT.
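
To make one of those changes concrete, here's a minimal Python sketch of the BT.2020 non-constant-luminance RGB-to-YCbCr conversion. The luma coefficients are the ones published in the spec; the function name and the normalized 0–1 values (with quantization to 10- or 12-bit integer levels omitted) are just my own illustration.

```python
# Minimal sketch of the BT.2020 non-constant-luminance RGB -> YCbCr conversion.
# The luma coefficients come from the spec; everything else here is illustrative.
# Inputs are gamma-encoded R'G'B' values normalized to 0..1.

KR, KG, KB = 0.2627, 0.6780, 0.0593   # BT.2020 luma coefficients

def bt2020_rgb_to_ycbcr(r: float, g: float, b: float) -> tuple:
    y = KR * r + KG * g + KB * b
    cb = (b - y) / (2 * (1 - KB))     # divisor = 1.8814
    cr = (r - y) / (2 * (1 - KR))     # divisor = 1.4746
    return y, cb, cr

print(bt2020_rgb_to_ycbcr(1.0, 1.0, 1.0))   # white -> y ≈ 1.0, cb ≈ 0, cr ≈ 0
```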

However, BT.2020 is not perfect by any means. For example, it says nothing about high dynamic range or an EOTF (electro-optical transfer function)—the function currently served by gamma—though it does include specs for 10- and 12-bit coding, which is vital for any implementation of HDR. According to Joe, its biggest contribution is opening the door to reconsider everything, to put all parameters on the table, paving the way to a vastly improved set of standards for the next generation of video content and display—and beyond.

Many experts believe that HDR requires a new EOTF, which determines how a display responds to the brightness information in a video signal. Gamma just doesn’t cut it—even with 12-bit resolution, banding between adjacent brightness values is visible at low light levels. The best-known candidate for a new EOTF is called PQ (Perceptual Quantizer), which was introduced by Dolby as part of its Dolby Vision HDR system and has now been standardized by SMPTE (Society of Motion Picture and Television Engineers) as ST 2084.
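
For the curious, here's a minimal Python sketch of the ST 2084 (PQ) EOTF and its inverse, using the constants published in the SMPTE spec; the function names and the little demo loop are mine, not anything from Joe's presentation.

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF and its inverse.
# The constants are the published ST 2084 values; function names are mine.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875
PEAK_NITS = 10000.0        # PQ is anchored to an absolute 10,000-nit peak

def pq_eotf(code: float) -> float:
    """Convert a normalized PQ code value (0..1) to absolute luminance in nits."""
    p = code ** (1 / M2)
    return PEAK_NITS * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def pq_inverse_eotf(nits: float) -> float:
    """Convert absolute luminance in nits to a normalized PQ code value (0..1)."""
    y = (nits / PEAK_NITS) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for nits in (0.1, 100, 1000, 10000):
    print(f"{nits:>7} nits -> PQ code {pq_inverse_eotf(nits):.4f}")
```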


With 10-bit resolution, BT.1886 gamma and the PQ curves are above the Barten Ramp Threshold throughout the practical brightness range, which means that banding will be visible in the image. BTW, as you can see, there are 10,000-nit and 1000-nit PQ curves in this and the following graph, which reflects the fact that Dolby was trying out different peak-brightness levels early in the development process. But as I learned from Joe and Dolby, PQ is now tied to a peak brightness of 10,000 nits, as discussed below.


With 12-bit resolution, BT.1886 gamma is above the Barten Ramp Threshold at luminance levels below about 8 nits, which means that banding will be visible in low-luminance images. The 12-bit PQ curves are below the Barten Ramp Threshold, which means no banding will be visible at any luminance level.

One of Joe’s most interesting points—and something I hadn’t known before—was that PQ uses a fixed reference for peak brightness, whereas gamma is relative to whatever the display is capable of. The peak brightness of PQ is defined as 10,000 nits, 100 times the current standard of 100 nits for peak brightness in the mastering process. This is way beyond the capabilities of any current display technology—except for a custom display built by Dolby that focuses all the light from a digital-cinema projector onto a 24″ screen!

Dolby’s own Pulsar HDR LCD monitor has a peak brightness of 4000 nits, but it’s liquid-cooled (!), and the company’s second-generation HDR monitor maxes out at 2000 nits. Samsung’s SUHD HDR-capable TVs can reach 1100 nits, but only in small areas of the image—with a full-screen white field (or, say, a scene dominated by bright snow), the peak brightness drops to around 300 nits in order to avoid drawing an inordinate amount of power. These TVs are not unlike plasma in this regard.

Because PQ uses a fixed peak brightness of 10,000 nits, content graded for a peak brightness of 1000 nits uses only about 80% of the range of brightness values, while grading for a peak brightness of 300 nits (typical for current OLED TVs) uses only about 60% of the brightness range—code values above those points are limited to the grading peak’s light output. As a result, Joe would prefer a relative EOTF; as he puts it, HDR is really about contrast, not absolute light output.


If content is graded for a peak brightness of 1000 nits using PQ—which is likely, at least for now—brightness values above about 80% will be limited to 1000 nits, since PQ is tied to a peak brightness of 10,000 nits. If the content is graded for a peak brightness of 300 nits—a common peak brightness for OLED—values above about 60% will be limited to 300 nits.
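
As a rough, self-contained illustration of that clipping behavior, the ST 2084 inverse EOTF can tell you where a given grading peak lands in the absolute PQ code range; depending on rounding, the results come out in the same neighborhood as the round figures above.

```python
# Where a given grading peak lands in the absolute PQ code range, using the
# ST 2084 inverse EOTF (constants from the spec; the rest is illustrative).
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_code(nits: float) -> float:
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for peak in (300, 1000, 4000):
    print(f"grading to {peak:>4} nits uses roughly the bottom {pq_code(peak):.0%} of the range")
```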

Joe also talked about bit depth, pointing out that moving from 8 to 10 bits increases the number of brightness code values by a factor of 4. In BT.2020, black is defined as 64, nominal white is 940, and peak white is 1019, though the spec also says that video data can exist between 4 and 1019. (Values 0-3 and 1020-1023 are reserved for timing data.) Interestingly, 8-bit black is defined as 16 and white is 235, both of which are exactly one quarter of the 10-bit values. There is a self-imposed upper limit of 235 in most 8-bit content, and there are discussions of imposing an upper limit of 940 in the 10-bit world. As Joe notes, it is quite ironic that HDR content might not use all the dynamic range available to it in the signal. He would prefer the nominal-white value to be 1019, allowing more of the range to be used routinely.
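
A quick back-of-the-envelope check of those numbers (just arithmetic on the levels quoted above; the variable names are mine):

```python
# Quick arithmetic on the code values discussed above.
print(2 ** 10 // 2 ** 8)   # 4: 10 bits offers four times as many code values as 8 bits

# 8-bit video levels and their 10-bit counterparts from the paragraph above
black_8, white_8 = 16, 235
black_10, white_10 = 64, 940
assert black_10 == 4 * black_8 and white_10 == 4 * white_8   # exactly one quarter, as noted
```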


The range of brightness values using 10 bits is four times the range of 8 bits.

Another critical aspect of HDR is color—not necessarily a wider gamut (though the wider P3 gamut is used in almost all currently available HDR content), but more saturated colors at higher (and lower) brightness levels than standard dynamic range can manage. As the overall brightness increases, colors become less saturated and converge on white; similarly, as the overall brightness decreases, the colors converge on black. With higher dynamic range, colors can remain saturated over a wider range of brightness levels.


As a display approaches its maximum or minimum brightness, the color gamut converges on white or black. With high dynamic range using more than 8 bits, colors remain saturated at higher and lower brightness levels than 8-bit standard dynamic range can manage. For example, this means the sky can remain more blue if it’s a brighter part of the image.

Since everything is “on the table” as we make the transition to UHD, Joe advocates what he calls a container approach—that is, capture and store as much information as possible (highest dynamic range, greatest bit depth, widest color gamut, etc.) in a standardized “container” format, and then derive what the current display technology can accommodate as the delivered content. When display technology improves, new metadata can be applied to the existing container so information can be extracted to more closely fit the characteristics of the new technology.
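
As a purely illustrative sketch of that container idea (the field names and the naive hard clip are my own invention, not part of any standard and not Joe's actual proposal), the delivery step might look something like this:

```python
from dataclasses import dataclass

# Illustrative only: metadata describing what the "container" holds, plus a
# trivial derivation step for a less-capable display.

@dataclass
class ContainerMetadata:
    mastering_peak_nits: float   # e.g., 10000 for a full-range PQ grade
    bit_depth: int               # e.g., 12
    gamut: str                   # e.g., "BT.2020"

def derive_for_display(sample_nits: float, meta: ContainerMetadata,
                       display_peak_nits: float) -> float:
    """Derive a displayable luminance from a full-range container value.

    This is just a hard clip at whichever peak is lower; a real delivery
    pipeline would apply a proper tone-mapping curve driven by the metadata.
    """
    return min(sample_nits, meta.mastering_peak_nits, display_peak_nits)

meta = ContainerMetadata(mastering_peak_nits=10000, bit_depth=12, gamut="BT.2020")
print(derive_for_display(2000.0, meta, display_peak_nits=1000.0))   # 1000.0
```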

In addition to all this great info, Joe also demonstrated still photos and video he had captured in HDR, displaying them on a Samsung UN65JS9500. The photos had been shot on a Nikon D800E DSLR with a native resolution of about 8K and a dynamic range of about 14 stops, which is equivalent to 14 bits. The video footage was shot on a Sony PMW-F55 digital-cinema camera with a native resolution of 4K and a dynamic range of 14 stops.
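
The stops-to-bits equivalence is simple doubling arithmetic; a quick sketch:

```python
# A stop is a doubling of light, so n stops of dynamic range spans roughly a
# 2**n contrast ratio, which is why 14 stops maps onto about 14 bits of linear code.
stops = 14
print(f"{2 ** stops:,}:1")   # 16,384:1
```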

Joe doesn’t know the spectral sensitivity of the D800E or F55, though he says the F55 is probably very similar to the F65, whose spectral sensitivity is shown below. He’s convinced that the D800E can capture at least Adobe RGB and probably more, but Nikon won’t reveal that info, so he intends to profile the camera himself.


The Sony F65 digital-cinema camera has a very wide spectral sensitivity, much larger than P3. The F55 probably has much the same sensitivity.

Of course, all this material had to be downconverted for the 10-bit/P3 Samsung TV, but it still looked amazing. However, there were some differences among the various pieces of material. For example, the photos from the D800E were downsampled from 8K to UHD, and they looked sharper than the native 4K footage, demonstrating that it’s better to capture at a higher resolution than the final delivery format rather than to capture at the final resolution to begin with. And one of the shots from the F55 was captured at 1080p/60 and upscaled, which looked softer than the rest of the footage that was captured at 2160p/24.

After Joe’s presentation, representatives from Samsung showed clips from Exodus and The Maze Runner in 4K HDR—the same files that are available to download on M-Go—from a UHD Video Pack server on the JS9500 and in HD SDR from an Oppo BDP-83 Blu-ray player on a UN65JU7100 sitting just below the JS9500. The engineer with the remotes was a wizard at getting the two sources synchronized! The HDR versions were clearly better, with greater contrast and richer colors.

After we all got back from CE Week, I visited Joe at his house, where he has a Samsung UN85S9 and UN78JS9500 set up in his living room. Of course, the S9 is an 8-bit UHD panel and not capable of displaying HDR, but the JS9500 is fully capable of HDR with a 10-bit panel and a wide color gamut (WCG) that encompasses over 90% of the P3 gamut. Sitting in a chair positioned exactly 1.5 times the screen height from the center of the JS9500, I watched a UHD clip from Oblivion graded in HDR and P3 color and then in SDR and 709 color, using two TV presets, one calibrated for each version.


Joe Kane’s living room has two Samsung flat panels, a UN85S9 (left) and UN78JS9500. Notice he has placed chairs directly in front of each set at the optimal distance for UHD resolution.

In that orientation and at that distance, my objections to curved screens disappeared, replaced by a complete sense of immersion (though no one else could experience it that way at the same time). The HDR/P3 clip was stunningly gorgeous, while the SDR/709 clip looked much duller with less color. Even played sequentially rather than side by side, the difference was obvious—the HDR image put the SDR version to shame. I can’t wait for HDR/WCG content and displays to become commonplace!