Originally Posted by mozmo
I think you're referring to dual layer streams, which involves using 2 HEVC encoders/decoders. I don't know of any SOC outside of the VS10 that can decode 2 4k HEVC streams, nor are the space/bandwidth requirements feasible.
All the dolby vision streaming encodes we have right now are single layer, 1 HEVC main 10 encode/decode with post process upconvert to 12bit. ...
Once again, in my opinion you are misreading and jumping to the wrong conclusion.
VS10 is not a SoC.
There are several Dolby Vision profiles: 12-bit PQ or 10-bit PQ, single layer or dual layer, using 8-bit AVC, 8-bit HEVC, or 10-bit HEVC.
It's up to each "service provider" to decide which profile to use.
For example, the BDA has decided to use the 12-bit PQ dual-layer profile.
"Dolby Vision
The Dolby Vision video stream is composed of a BDMV HDR [aka CTA HDR10] video stream and a Dolby Vision enhancement layer video stream. The enhancement layer is an HEVC video stream with embedded Dolby Vision metadata. The Dolby Vision video signal is characterized by the followings:
• color primaries: BT.2020 with non-constant luminance
• EOTF(Electro-Optical Transfer Function): SMPTE 2084
• Bit depth: 12bit (combination of BDMV HDR video stream and Dolby Vision enhancement layer)
• Enhancement layer video stream: 1920x1080 resolution, same frame rate with the BDMV HDR
video stream, 100Mbps or lower together with the BDMV HDR video stream"
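To make the "combination of BDMV HDR video stream and Dolby Vision enhancement layer" idea concrete, here is a toy sketch of how a 10-bit base layer plus a small enhancement-layer residual can reconstruct a 12-bit value. This is only an illustration of the general base-plus-residual principle; the actual Dolby Vision reconstruction uses proprietary prediction/mapping functions and metadata, not this simple shift-and-add, and the function name is made up for the example.

```python
# Toy illustration only -- NOT the real Dolby Vision reconstruction math,
# which uses proprietary prediction functions driven by embedded metadata.
# It just shows how a 10-bit base + a residual can carry a 12-bit signal.

def reconstruct_12bit(base_10bit: int, residual: int) -> int:
    """Shift the 10-bit base up to the 12-bit range and add the residual."""
    value = (base_10bit << 2) + residual   # 10-bit -> 12-bit, plus correction
    return max(0, min(4095, value))        # clamp to the 12-bit code range

# A 12-bit source code value of 2749, quantized to 10 bits, loses 2 bits:
source = 2749
base = source >> 2                 # 687: what the 10-bit base layer carries
residual = source - (base << 2)    # 1: what the enhancement layer restores
assert reconstruct_12bit(base, residual) == source
```

The point of the dual-layer design is exactly this: the base layer alone is a valid 10-bit HDR10 stream for ordinary players, while Dolby Vision players add the enhancement layer back to recover the full 12-bit signal.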
Universal audio/video receivers (Dolby Atmos, DTS:X, etc.) have already been achieved. Universal HDR TVs (HDR10, Dolby Vision, HEVC HLG HDR, VP9-HLG / VP9-PQ YouTube HDR, Dynamic HDR) are still needed.
Push for universal HDR TV!