Is Dolby Vision a "Must Have Feature" for Your Next TV?
Today’s 4K and 8K TVs offer exceptional picture quality thanks to the combination of high resolution and HDR with wide color gamut. There are currently four HDR formats found on consumer TVs: HDR10, HDR10+, Dolby Vision, and HLG. However, not all TVs support all four formats.
Arguably, the quality available to consumers at home has surpassed that of commercial movie theaters, thanks to the capabilities of modern flat-panel displays. For example, HDR content for home viewing is mastered to a peak luminance of either 1000 or 4000 nits (typically 4000 nits), giving it a lifelike appearance when shown on a compatible display.
The most common flavor of HDR is HDR10. This format is used by streaming services, video games, and Ultra HD Blu-ray. HDR10 provides 10-bit color for smooth gradation, but it relies on static metadata. That means if a TV cannot match the monitor on which the content was mastered, some sort of compensation is required. With static metadata, TVs take a one-size-fits-all approach to filling in performance gaps.
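To make "static metadata" concrete, here is a minimal Python sketch of the kind of information an HDR10 stream carries once for the entire program (mastering display characteristics per SMPTE ST 2086, plus content light levels). The class and field names are my own invention for illustration, not from any real API:

```python
from dataclasses import dataclass

@dataclass
class HDR10StaticMetadata:
    """Illustrative only: one set of values for the whole program,
    which is why a TV must apply one-size-fits-all compensation."""
    mastering_peak_nits: float  # peak of the mastering display, e.g. 1000 or 4000
    mastering_min_nits: float   # black level of the mastering display
    max_cll_nits: float         # brightest single pixel anywhere in the program
    max_fall_nits: float        # brightest frame-average light level in the program

# A typical 4000-nit master; these same numbers apply to every scene,
# dark or bright, for the program's full running time.
meta = HDR10StaticMetadata(
    mastering_peak_nits=4000.0,
    mastering_min_nits=0.0001,
    max_cll_nits=3800.0,
    max_fall_nits=900.0,
)
```

Dynamic-metadata formats such as Dolby Vision and HDR10+ instead carry values like these on a per-scene (or per-frame) basis, which is what lets the TV adapt as the content changes.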
Dolby Vision has emerged as the most popular alternative to HDR10. This format is capable of handling 12-bit color, for ultra-smooth gradation. Moreover, it employs dynamic metadata that helps the TV optimize content based on its capabilities, scene by scene. So, if the TV is not able to match a mastering monitor, it will still make the most of the content that it’s fed. Dolby Vision is available on a wide variety of televisions, including those sold by LG, Sony, TCL, Hisense and Vizio… but not Samsung (which relies on HDR10+, a format that also includes dynamic metadata).
Ever since it was released, I have seen enthusiasm for Dolby Vision grow. Its adoption by major streaming services including Netflix, Amazon and Vudu increases its appeal. This is particularly true of streaming originals, such as the UHD shows offered by Netflix and Amazon; the fidelity of that material is movie quality.
Arguably, if the TV is good enough, the advantage of dynamic metadata “melts away.” However, there is no consumer retail television that can output 4000 nits peak luminance while in a calibrated “Movie Mode.” So, currently, all TVs need some sort of help to determine how to tonemap the material. With regular HDR10, this is accomplished with an algorithm that takes its best guess at how to handle the discrepancy. With dynamic metadata, whether it's Dolby Vision or HDR10+, there is no guesswork involved on the part of the TV. Having said all that, the algorithms that translate vanilla HDR10 have gotten quite good. Consequently, the days of HDR10 content looking too dark are largely behind us.
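The tonemapping trade-off described above can be sketched in a few lines of Python. This is purely illustrative: the knee point and curve are invented, and real TVs use far more sophisticated, proprietary rolloff curves. The point is only that static metadata forces the TV to assume the worst-case mastering peak for every scene, while per-scene metadata lets a dim scene pass through untouched:

```python
def tonemap_static(nits, mastering_peak=4000.0, display_peak=800.0):
    """One-size-fits-all rolloff: the TV assumes the program-wide
    mastering peak (static metadata) and compresses all highlights
    above a fixed knee. Knee choice here is a made-up example."""
    knee = display_peak * 0.75
    if nits <= knee:
        return nits  # shadows and midtones pass through unchanged
    # squeeze everything from the knee up to the mastering peak
    # into the remaining display headroom
    excess = (nits - knee) / (mastering_peak - knee)
    return knee + excess * (display_peak - knee)

def tonemap_dynamic(nits, scene_peak, display_peak=800.0):
    """With dynamic metadata the TV knows each scene's actual peak:
    if the scene fits within the display, no compression is needed."""
    if scene_peak <= display_peak:
        return min(nits, display_peak)  # show the scene as mastered
    return tonemap_static(nits, mastering_peak=scene_peak,
                          display_peak=display_peak)
```

Feed both functions the same 2000-nit highlight and the dynamic version renders it brighter, because it scales against the scene's actual peak rather than the worst-case 4000-nit master.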
With all that said, here is my simple yes/no question: Is Dolby Vision a "Must Have Feature" for your next TV?
Editor, AVS Forum
Last edited by imagic; 03-13-2019 at 02:51 PM.