As the comments in my recent article about HDR-capable displays clearly illustrate, the definition of what makes a display "HDR-capable" is not entirely clear. So here is some further exploration of the subject.

I'll start with "HDR-compatible," which I define as being able to accept an HDR-encoded signal via HDMI. It says nothing about the display's ability to actually render an HDR image with higher peak luminance, lower black level, and wider color gamut than standard dynamic-range displays can achieve.

But even "HDR-compatible" is somewhat fuzzy; for example, an HDMI 2.0a port is required to accept an HDR10 signal with its static metadata, but HDMI 2.0 is sufficient for Dolby Vision, which embeds its dynamic metadata in the signal itself. The metadata describes the characteristics of the image being transmitted—peak luminance, color gamut, etc.—and the display uses it to enable its HDR mode (if it has one) and adjust the image data to fit the display's capabilities, a process called "tone mapping." If the display doesn't know what to do with the metadata, it will be ignored and the signal will (hopefully) be displayed in standard dynamic range with BT.709 color.

Then there's "HDR-capable," which I define as being able to accept an HDR signal and render the image on the screen with higher peak luminance, lower black level, and wider color gamut than standard dynamic-range displays can achieve. Of course, this means the display must be able to interpret the HDR metadata used in HDR10, Dolby Vision, or both and tone map the image data to its particular capabilities.

Here's where things get really fuzzy! Displays have different peak-luminance, black-level, and color-gamut capabilities, so what are the minimum requirements for an HDR image? At CES this year, the UHD Alliance published its Ultra HD Premium specification, which attempts to answer this question, at least for displays that carry Ultra HD Premium certification.



As you can see in the Ultra HD Premium specs above, the display, content, and distribution must have a pixel resolution of 3840x2160—no problem there, since all HDR displays use that resolution. The next display spec is somewhat less clear: The color bit depth must be "10-bit signal" (actually, 10 bits per color or 30 bits total for red, green, and blue). That means the display must be able to accept a 10-bit signal, which is used by all HDR content at this point. But does it mean the display's imaging panel must have native 10-bit precision? Probably not.

Some manufacturers, such as Sony and Samsung, do not reveal the bit depth of the imaging panels in their TVs. They say that an 8-bit panel with good image processing (including something called "dithering") can potentially render a better HDR image with less banding than a 10-bit panel with poor processing. That's probably true, but if the panel is 8-bit, I contend that it's inherently limited when it comes to HDR. After all, 8 bits gives you 256 steps from lowest to highest luminance no matter what the processing is, while 10 bits gives you 1024 steps, which means inherently less banding and a far greater number of distinct colors. In this case, more is better in my book.
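For a concrete sense of the gap, here is a quick back-of-the-envelope comparison in Python. It counts signal steps only and says nothing about any particular panel's processing or dithering.

```python
# Gradations available per channel at each bit depth, and the resulting
# number of addressable R/G/B combinations.
for bits in (8, 10):
    steps = 2 ** bits      # 256 for 8-bit, 1024 for 10-bit
    colors = steps ** 3    # combinations across red, green, and blue
    print(f"{bits}-bit: {steps} steps per channel, {colors:,} addressable colors")

# 8-bit: 256 steps per channel, 16,777,216 addressable colors
# 10-bit: 1024 steps per channel, 1,073,741,824 addressable colors
```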

Speaking of color, the Ultra HD Premium spec calls for "BT.2020 color representation" throughout the signal chain—in other words, the color gamut represented in the signal must conform to BT.2020 (actually, use BT.2020 as a "container," though the actual colors used in the content can be anything within that boundary). But there are no currently available consumer displays that can render the full BT.2020 color gamut, so the spec says the display must be able to reproduce more than 90% of the digital-cinema gamut known as DCI/P3, which is itself a smaller range of colors than BT.2020. Tone mapping, here we come!



But 90% of what, exactly? The area of the P3 triangle in the CIE diagram (shown above)? The coordinates of the red, green, and blue points (whatever 90% of a set of coordinates might be)? Assuming it's the area of the P3 triangle, different displays' red, green, and blue points may be off by different amounts, yet all of their triangles could still exceed 90% of the P3 triangle's area while rendering somewhat different colors on the screen.
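Here is a small sketch of one plausible reading: compare the area of a display's red/green/blue triangle on the CIE 1931 xy diagram to the area of the P3 triangle. The P3 primaries below are the standard published coordinates; the example_display primaries are hypothetical. Note that a simple area ratio doesn't even check whether the display's triangle lies inside P3, which is exactly the kind of ambiguity I'm describing.

```python
def triangle_area(points):
    """Area of a triangle from three (x, y) chromaticity points (shoelace formula)."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]            # R, G, B primaries
example_display = [(0.670, 0.325), (0.280, 0.660), (0.152, 0.062)]   # hypothetical TV

coverage = triangle_area(example_display) / triangle_area(DCI_P3)
print(f"Gamut area relative to DCI-P3: {coverage:.1%}")   # about 90.8%
```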

The last spec is labeled "High Dynamic Range," and it includes the peak luminance and black level as well as the EOTF (electro-optical transfer function), which determines how much light the display outputs in response to different brightness values in the signal. The EOTF is specified as SMPTE ST 2084, also called PQ (Perceptual Quantizer). This is unambiguous; both HDR10 and Dolby Vision use PQ as the EOTF, so the display must be able to properly interpret and render it.
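For the curious, here is the ST 2084 EOTF itself, sketched in Python. The constants come straight from the published spec; the function converts a normalized PQ code value into absolute luminance in nits.

```python
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(code_value):
    """Return display luminance in nits for a PQ-encoded value in [0, 1]."""
    n = code_value ** (1 / M2)
    y = (max(n - C1, 0.0) / (C2 - C3 * n)) ** (1 / M1)
    return 10000.0 * y   # PQ is defined up to an absolute 10,000 nits

print(round(pq_eotf(0.5), 1))   # ~92.2 nits; half the code range is nowhere near half the light
print(round(pq_eotf(1.0), 1))   # 10000.0 nits
```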

Notice that there are two peak-luminance/black-level specs: more than 1000 nits and less than 0.05 nits OR more than 540 nits and less than 0.0005 nits. (Interestingly, the higher-luminance tier corresponds to a contrast ratio of 20,000:1, while the lower-luminance tier corresponds to a contrast ratio of more than 1,000,000:1!) The spec does not say why there are two tiers, but they obviously apply to LCD and OLED TVs, respectively. As you probably know, LCD TVs can achieve much higher peak luminance than OLED TVs, while OLEDs can achieve much lower black levels. An unfortunate omission is a specific range for projectors, which can't reach anywhere near 540 nits of peak luminance on most screens, much less 1000.
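The arithmetic behind those two contrast ratios is straightforward:

```python
# The two Ultra HD Premium luminance tiers expressed as contrast ratios.
lcd_tier = 1000 / 0.05     # the high-peak-luminance tier (typical of LCD)
oled_tier = 540 / 0.0005   # the deep-black tier (typical of OLED)
print(f"{lcd_tier:,.0f}:1  vs  {oled_tier:,.0f}:1")   # 20,000:1  vs  1,080,000:1
```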

What about displays that do not carry the Ultra HD Premium certification? In my view, to be considered HDR-capable, they need to be able to accept and understand HDR-encoded signals (with their associated metadata) and render them with higher peak luminance and wider color gamut than current HD content calls for—100 nits peak luminance and BT.709 color gamut. (The current HD system does not specify a black level, and LCD TVs are getting pretty good at deep blacks, especially FALD (full-array local dimming) designs, but I would hope that HDR-capable sets have even deeper blacks than standard dynamic-range sets.) How much greater should the peak luminance and color gamut be? A factor of 2? 5? 10? That's a totally open question, and we're likely to see displays touted as HDR with capabilities all over the map.

One big problem is trying to decide if a display is HDR-capable based on the manufacturer's specs. As I mentioned earlier, some companies do not reveal the bit depth of their imaging panels, and as we all know, peak-luminance and contrast-ratio specs are notoriously inflated using measurement techniques that do not correspond to real-world viewing. This is why Ultra HD Premium certification is valuable—displays undergo extensive testing by independent facilities to determine if they meet the specs, so there's no need to rely on a manufacturer's claims.

Some have suggested that edgelit LCD TVs inherently do not qualify as HDR-capable, but that is not true. For example, quite a few 2016 Samsung edgelit LCD TVs are Ultra HD Premium certified, so this is a non-issue. What's important is whether the display, in response to HDR signals, can render an image with sufficiently greater dynamic range and color gamut than the current SDR standard specifies.

Other than the question of what defines "sufficiently," I'm convinced that an HDR-capable display—in conjunction with HDR content—can offer a much better viewing experience than anything we've seen up to now. That's why I created the list of HDR-capable displays in the first place, and why I encourage shoppers to seriously consider getting one of those models instead of a display that cannot render HDR images from HDR sources.