
21 - 35 of 35 Posts

Registered · Joined · 2,070 Posts · Discussion Starter · #21
That's a clever test, thanks. The m2ts fails to play back (black screen) on my X800M2, but the mp4 plays in full glory (RPU and FEL).
If you incorporate the m2ts into a BDMV folder structure with tsMuxeR, it'll work from USB - that's the thing with dedicated Blu-ray players - you kind of have to trick them a bit.

As @DaMacFunkin says, none of them will ever support DV in MKV - that's a completely different process to that used by discs and needs to be coded for specifically and outside of the Dolby SDK.

The only way with MKV is with media players where such constraints don't really exist... (or are ignored!)
 

Registered · Joined · 2,070 Posts · Discussion Starter · #23
Dolby Vision TV-Led vs Player-Led Comparison: Which is Better? - YouTube
Is it true that this one-to-one comparison with an Oppo 203 and a Sony TV proves that TV-led is better than player-led? This is the second time I've read or seen that Dolby Vision processing can be done better in the TV than in the player...

(Don't fall asleep when the teller of the story almost gets lost in his own story 😂)
Strictly speaking, no, it's not true, but that's not the whole story - it depends on the hardware involved.

Assuming we're talking about a Full Enhancement Layer source and a player capable of processing the enhancement layer (e.g. a UHD Blu-ray player), player-led combines the base and enhancement layers to form the 12-bit Dolby Vision signal and applies the RPU. This is then output as one static tone-mapped 12-bit signal using the static PQ (perceptual quantiser) gamma curve, the parameters of which are defined in the Dolby block of the TV's EDID. (Some TVs will also accept 10-bit, and the spec allows that, but I'm not sure why you'd want to do that!)

From there, the TV has very little work to do - the signal has already been combined and dynamically tone mapped to the static curve in the player, so for all intents and purposes it's like processing a standard HDR10 signal in 12 bits - apply the right colour primaries and gamma and the job's done.
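The static PQ curve mentioned above can be computed directly - the constants below are the published SMPTE ST 2084 values (nothing Dolby-proprietary); a minimal sketch in Python:

```python
# SMPTE ST 2084 "PQ" EOTF - the static curve the player targets when it
# outputs the tone-mapped 12-bit signal. Constants as published in the spec.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_eotf(code: float) -> float:
    """Map a normalised PQ code value (0..1) to absolute luminance in nits."""
    p = code ** (1 / m2)
    y = max(p - c1, 0.0) / (c2 - c3 * p)
    return 10000.0 * y ** (1 / m1)
```

Full-scale code maps to 10,000 nits, and the curve packs most of its code values into the darker range, which is the whole point of PQ.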

With TV-led, that recombination of the two layers and the tone mapping using the RPU happen in the TV, after the two layers have been tunneled down the HDMI cable in an 8-bit RGB wrapper. This is why we have player-led Dolby Vision at all - Sony buggered up their first tranche of TVs - they just weren't powerful enough to do this despite already being in the marketplace, so a little hack based on tech designed for gaming was quickly repurposed to save face (and a lot of money!)
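The "8-bit RGB wrapper" works because the numbers line up: two 12-bit samples are exactly 24 bits, the size of one 8-bit RGB pixel. The actual Dolby bit layout is proprietary - this is just a hypothetical packing to show the arithmetic:

```python
def pack_pair(a: int, b: int) -> tuple[int, int, int]:
    """Pack two 12-bit samples (24 bits) into one 8-bit RGB pixel (24 bits).
    Illustrative bit layout only - the real Dolby mapping is proprietary."""
    bits = ((a & 0xFFF) << 12) | (b & 0xFFF)
    return ((bits >> 16) & 0xFF, (bits >> 8) & 0xFF, bits & 0xFF)

def unpack_pair(r: int, g: int, b: int) -> tuple[int, int]:
    """Recover the two 12-bit samples on the TV side."""
    bits = (r << 16) | (g << 8) | b
    return ((bits >> 12) & 0xFFF, bits & 0xFFF)
```

To the HDMI link (and to anything in the chain that isn't DV-aware) the tunneled frames just look like ordinary 8-bit RGB video.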

What Vincent missed is that there are two separate devices processing the Dolby Vision in his tests - both, I believe, made by MediaTek, but the Oppo's is much older and, to be fair, not that great, so there are bound to be image differences.

To be fair to him though, I can't really think of a fair way to make the comparison, because you're always going to be faced with two different processors and, I dare say, proprietary image processing (although manufacturers aren't supposed to apply it to Dolby Vision). I guess a fairer test would be a Sony TV and a Sony player, but I'm not sure which SoCs they each use - it'd be interesting to see though.

Initially Dolby had quite a firm grip on ensuring Dolby Vision displays provided the same or similar image quality to that seen from the Dolby master regardless of manufacturer, but as happened to THX for different reasons, that has all been washed down the rusty drain of compromise and market forces. Good job too, imho!
 
Like · Reactions: compingharry

Registered · Joined · 168 Posts
[quote of markswift2003's post #23 above]
I am so completely non-technical, but my logic struggles with "combined signals and static PQ gamma curve". Do things stay dynamic using one processor in the player? Or does the signal get decoded in the TV with static info and dynamic info? (I'll try not to bother you after this and get my head around it - maybe by making a manual for personal use to keep my head clear...) And thank you, of course... (y)
 

Registered · Joined · 2,070 Posts · Discussion Starter · #25
[quote of the previous post]
At some point in either process the dynamic metadata has to be translated into changes in brightness in the image.

When that happens in the player, the resultant signal is tone mapped before it hits the TV.

When that happens in the TV, the signal is effectively tone mapped at the screen.

The result in both cases is the same.
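A toy way to see why the result is the same either way: the tone-mapping arithmetic is the same code, and only where it runs differs. The curve below is entirely made up - real Dolby Vision tone mapping is proprietary - but it shows the idea of dynamic (per-scene) metadata driving the mapping:

```python
def tone_map(nits: float, scene_max: float, display_max: float) -> float:
    """Hypothetical per-scene roll-off: linear below a knee, then compress
    the scene's range down to the display's peak. 'scene_max' is the
    dynamic metadata; it changes scene to scene."""
    if scene_max <= display_max:
        return nits  # scene already fits the panel, pass through
    knee = display_max * 0.75
    if nits <= knee:
        return nits
    # linearly compress [knee, scene_max] into [knee, display_max]
    t = (nits - knee) / (scene_max - knee)
    return knee + (display_max - knee) * t
```

Run this in the player and you send the TV an already-mapped signal; run it in the TV and the raw layers travel down the cable first. Same function, same picture.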
 
Like · Reactions: compingharry

Registered · Joined · 980 Posts
[quote of markswift2003's post #23 above]
Thanks for your tests! In addition to this, take a look at my Oppo DV tests (2 posts).
 

Registered · Joined · 216 Posts
[quote of markswift2003's post #23 above, in part]
So I tried to understand all this... The "first" Sony TVs - for example the Sony AG9 OLED - with LLDV/player-led, do they correctly display FEL DV if fed by a source that can process it (like a Blu-ray player, the X700 or something else)?
Or do you mean that they fcked up with these TVs, which therefore display it incorrectly / don't show the enhancement layer in FEL content?
 

Registered · Joined · 2,070 Posts · Discussion Starter · #29
[quote of the previous post]
It's not whether the TV can display the FEL, it's whether the player can process the FEL - so yes, with a Blu-ray player that supports player-led DV (like the X700), the FEL is correctly processed in the player and the output is a single-layer (i.e. single video stream) 12-bit Dolby Vision signal (base and enhancement layers combined).

Where it gets complicated is if you use a Blu-ray player that only supports TV-led DV with a Sony TV that only supports player-led DV.

This is the point at which Dolby's credibility fell away, since they'd always marketed DV as "use any DV player on any DV display and you'll see exactly the same image", and because of Sony's cockup that ceased to be true. But as I say, it did us a great favour in the real world.
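For the curious, the layer recombination can be sketched in a few lines. This is a deliberately simplified stand-in - the real Dolby composer also applies polynomial/MMR mappings from the RPU, all omitted here:

```python
def combine(bl10: int, fel_residual: int) -> int:
    """Toy composer: upshift the 10-bit base-layer sample to 12 bits and
    add the (signed) enhancement-layer residual, clamping to 12-bit range.
    Illustrative only - the real RPU-driven mapping is far more involved."""
    val = (bl10 << 2) + fel_residual
    return max(0, min(val, 4095))
```

The point is simply that the output is one 12-bit stream: once this has run in the player, the TV never sees two layers at all.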
 
Like · Reactions: box4m

Registered · Joined · 168 Posts
I am just starting to read @Sledgehamma's review of the Z9X and immediately I get confused.
"...plus the dynamic formats HDR10+ and Dolby Vision". Earlier I read that HDR10 is used for the base layer of DV, which I think of as static in my mind. But I also read from @markswift2003 that maybe HDR10+ is going to be used as the BL, which is dynamic?... So in my mind: a dynamic base layer plus dynamic enhancement?
 

Registered · Joined · 2,070 Posts · Discussion Starter · #31
[quote of the previous post]
Depends on the content - Dolby Vision can have either HDR10 or HDR10+ as a base layer.

As far as the actual video is concerned, HDR10 and HDR10+ are identical - the only difference is the metadata. And with Dolby Vision, regardless of whether the base is HDR10 or HDR10+, that metadata is not used when the stream is processed as Dolby Vision - it uses its own metadata in the RPU.
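To make the metadata distinction concrete, here's a rough sketch - the field names are illustrative, loosely following the real specs, not actual parser output:

```python
from dataclasses import dataclass

@dataclass
class StaticHDR10:       # one record for the whole title
    max_cll: int         # brightest pixel in the stream (nits)
    max_fall: int        # brightest frame-average light level (nits)

@dataclass
class DynamicHDR10Plus:  # per-scene records, ST 2094-40 style
    scene_peaks: list    # peak luminance per scene (nits)

def metadata_for_dv(rpu, base_layer_metadata):
    """Sketch of the point above: when a stream is played as Dolby Vision,
    the base layer's own metadata (static HDR10 or dynamic HDR10+) is
    ignored in favour of the RPU. 'rpu' is just a stand-in object here."""
    return rpu if rpu is not None else base_layer_metadata
```

So a "dynamic base layer" doesn't change anything for DV playback - the dynamic behaviour you see comes from the RPU either way.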
 
Like · Reactions: compingharry

Registered · Joined · 57 Posts
so yes, with a BluRay player that supports Player Led DV (like the x700) the FEL is correctly processed in the player
Sadly, all UHD BD players have issues in player-led DoVi mode even when the FEL is processed correctly - issues demonstrated in Vincent's video.
The Oppo has it, the X700 has it, the Panasonic has it.
The Shield, for example, doesn't have such issues, even though it ignores the FEL.
I'm sure the Dune and Zidoo don't have it either.
Probably because of a newer DoVi decoder. Who knows besides Dolby? :)
 

Registered · Joined · 216 Posts
[quote of the previous post]
The Zidoo also ignores the enhancement layer, like the Shield. The Dune I don't know, but I'm guessing it also ignores it since it's the same chip? Might be wrong.
 

Registered · Joined · 168 Posts
[quote of the previous post]
Maybe I did my homework right: the video essence of the EL is not used, but the dynamic metadata carried with it - the RPU - is still used for tone mapping :giggle:
 

Registered · Joined · 2,070 Posts · Discussion Starter · #35