HDR10+ at CES 2018

Most TV manufacturers now embrace Dolby Vision high dynamic range (HDR) as well as HDR10 and HLG. The only major exception is Samsung, which has steadfastly refused to adopt Dolby Vision in its TVs and UHD Blu-ray players. Then, about a year ago, Samsung introduced a new format called HDR10+, which adds dynamic metadata to HDR10.

Why is this important? HDR10 uses static metadata, which specifies only a single maximum and a single average luminance level (MaxCLL and MaxFALL) for the entire program (movie, TV show, etc.). It’s certainly better than standard dynamic range, but if the program contains some scenes that are very bright and others that are very dark—which, of course, most programs do—they won’t necessarily look their best. The display applies tone mapping based on the static metadata, which doesn’t take the specific characteristics of each scene into account.

By contrast, Dolby Vision uses dynamic metadata, which represents the maximum and average luminance levels for each scene—or even each frame—of a program. As a result, the display applies tone mapping to each scene based on its specific characteristics, optimizing how each one looks on screen.
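
To make the difference concrete, here’s a minimal Python sketch of the idea. The 500-nit display, the scene peaks, and the linear compression are all hypothetical, and real displays use smooth roll-off curves rather than a straight rescale, but it shows how a single program-wide maximum forces a dark scene to be dimmed for no reason:

```python
# A minimal sketch of static vs. dynamic tone mapping. The 500-nit display,
# the scene peaks, and the linear compression are all hypothetical; real
# displays use smooth roll-off curves rather than a straight rescale.

DISPLAY_PEAK = 500.0  # assumed peak luminance of the target display, in nits

def tone_map(luminance, metadata_peak):
    """Compress highlights only if the reported peak exceeds the display's."""
    if metadata_peak <= DISPLAY_PEAK:
        return luminance  # the scene already fits; pass it through untouched
    return luminance * (DISPLAY_PEAK / metadata_peak)

# A hypothetical program with one dark scene and one bright scene (peak nits).
scene_peaks = {"dark scene": 120.0, "bright scene": 1000.0}

# Static metadata (HDR10): one program-wide maximum governs every scene.
max_cll = max(scene_peaks.values())

for name, peak in scene_peaks.items():
    static = tone_map(peak, max_cll)  # same curve applied to the whole program
    dynamic = tone_map(peak, peak)    # per-scene peak from dynamic metadata
    print(f"{name}: static -> {static:.0f} nits, dynamic -> {dynamic:.0f} nits")
```

In this sketch, the single program-wide value squeezes the dark scene’s 120-nit highlights down to 60 nits even though the display could have shown them exactly as mastered; with per-scene metadata, only the bright scene is compressed.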

For any HDR format to work, the content, the source device, and the display must all support it. For example, the UHD Blu-ray specification requires HDR10 for all HDR content, while Dolby Vision and Technicolor HDR are optional. Studios and streaming providers with HDR content started out using HDR10, but several now offer Dolby Vision as well. And of course, the display must also support the chosen format.

If Dolby Vision results in a superior HDR image, why won’t Samsung adopt it? The only reason I can think of is that Dolby charges a licensing fee to use Dolby Vision. It can’t be that much for a large company, since Sony, Vizio, TCL, Hisense, Philips, Oppo, Apple, and others have implemented Dolby Vision in their products. Still, Samsung refuses to support Dolby Vision.

Instead, Samsung came up with a different solution: It added dynamic metadata to HDR10, calling it HDR10+. What’s the difference between HDR10+ and Dolby Vision? HDR10+ is an open, royalty-free standard—its dynamic metadata is now codified as SMPTE ST 2094-40—that requires no licensing fee to implement. Another difference is that Dolby Vision content starts out with 12-bit precision, while HDR10+ starts out with 10-bit precision. In addition, Dolby claims that Dolby Vision is graded manually by trained experts, while HDR10+ is graded automatically, though I haven’t verified that for myself. And even if it is true, I don’t know that one method is necessarily better than the other.
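
For a sense of what that bit-depth difference means, here’s a short Python sketch using the SMPTE ST 2084 (PQ) transfer function that underlies both formats. The constants are the published ST 2084 values; normalizing code values across the full range is a simplification (real video signals reserve some codes), but it shows how 12 bits cut the luminance jump between adjacent code values to roughly a quarter of the 10-bit step:

```python
# SMPTE ST 2084 (PQ) EOTF constants, as published in the standard.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(code, bits):
    """Convert an integer code value at the given bit depth to nits.

    Full-range normalization is assumed for simplicity; real video
    signals use a narrower code range.
    """
    e = code / (2 ** bits - 1)  # normalized signal, 0..1
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

# Compare the luminance step between adjacent code values at mid-signal.
for bits in (10, 12):
    mid = 2 ** (bits - 1)  # the halfway code value (roughly 92 nits in PQ)
    step = pq_eotf(mid + 1, bits) - pq_eotf(mid, bits)
    print(f"{bits}-bit: code {mid} = {pq_eotf(mid, bits):.1f} nits, "
          f"next code adds {step:.3f} nits")
```

Each added bit roughly halves the step between adjacent luminance levels, which is the usual argument for 12-bit grading: smaller steps mean less risk of visible banding in smooth gradients.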

When I first heard about HDR10+, I was concerned that it wouldn’t take off unless many other content creators and hardware manufacturers got on board, especially since Dolby Vision was already well entrenched in the marketplace.

Now, it seems that my initial concern was unfounded. In September 2017, Samsung, Panasonic, and 20th Century Fox announced an HDR10+ certification and logo program. Also, Amazon Prime Video became the first streaming provider to deliver content in HDR10+, including its entire HDR library. Warner Bros. Home Entertainment has also announced that it will encode content in HDR10+.

At CES 2018, I learned that Panasonic and Samsung will introduce TVs and UHD Blu-ray players that support HDR10+. Even better, the Blu-ray Disc Association announced that HDR10+ has been added to the list of optional HDR formats that can be included on UHD Blu-ray discs.

Here’s another important tidbit I learned at CES. Before the show, I had thought that the dynamic metadata of HDR10+ required HDMI 2.1 to convey it from an external device, such as a UHD Blu-ray player, to a display. However, Samsung maintains that HDR10+ metadata can be conveyed via HDMI 2.0—at least, the “important metadata.” Apparently, this limited metadata amounts to only 24 KB per frame and can be encoded in the video InfoFrames, though HDMI 2.1 will be required to send the complete set of HDR10+ metadata. The company demonstrated this with a 2017 UHD Blu-ray player (with updated firmware) connected to a 2018 QLED TV, playing content that had been encoded with HDR10+.
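
As an illustration of that constraint, here’s a Python sketch that packs a hypothetical per-frame metadata record and checks it against the 24 KB budget cited above. The field names, sizes, and layout are invented for the sketch; the actual SMPTE ST 2094-40 payload is structured differently:

```python
# Purely illustrative: pack a hypothetical per-frame dynamic-metadata record
# and check it against the 24 KB per-frame budget cited above. The field
# names, sizes, and layout here are invented; the actual SMPTE ST 2094-40
# payload is structured differently.
import struct

FRAME_BUDGET = 24 * 1024  # bytes per frame, per Samsung's stated limit

def pack_frame_metadata(max_nits, avg_nits, knee_points):
    """Serialize one frame's metadata (hypothetical big-endian layout)."""
    payload = struct.pack(">HHB", max_nits, avg_nits, len(knee_points))
    for x, y in knee_points:  # anchor points of a per-frame tone curve
        payload += struct.pack(">HH", x, y)
    return payload

meta = pack_frame_metadata(1000, 180, [(256, 240), (512, 430), (768, 490)])
assert len(meta) <= FRAME_BUDGET, "too large to carry over HDMI 2.0"
print(f"{len(meta)} bytes used of the {FRAME_BUDGET}-byte frame budget")
```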

With content creators (20th Century Fox, Warner Bros., Amazon) and hardware manufacturers (Samsung, Panasonic) committed to supporting HDR10+ in their products, it seems likely that others will follow, making HDR10+ a viable alternative to Dolby Vision. And lest you think this is another format war, I see the situation as similar to various audio formats, such as Dolby Digital and DTS, peacefully coexisting. As long as hardware devices support multiple formats—which they already do—I see no reason why HDR10+, Dolby Vision, and the other HDR formats can’t coexist as well.