Originally Posted by losservatore
Dolby Vision beats HDR10 on OLEDs. After all, our OLEDs do great with the best HDR format (Dolby Vision).
I think one important detail that he's sort of getting wrong is that neither HDR10 nor DV actually specifies a tone curve, per se. HDR10's static metadata specifies the maximum nits of the disc, and the correct nits for each pixel are determined based on that. How the TV handles nits outside its own range is entirely up to the TV itself, as you can see when he changes the dynamic contrast settings.
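To make that concrete, here's a rough sketch of two ways a TV might handle out-of-range nits under HDR10's static metadata. The function names, the knee fraction, and the linear roll-off are all my own illustration, not any manufacturer's actual algorithm:

```python
# Hedged sketch: two plausible ways a TV could map HDR10 pixel values
# (absolute nits on disc) onto a panel with a lower peak brightness.
# The "knee" and the linear roll-off are illustrative assumptions only.

def hard_clip(pixel_nits: float, tv_max_nits: float) -> float:
    """Simplest approach: anything above the panel's peak just clips."""
    return min(pixel_nits, tv_max_nits)

def static_rolloff(pixel_nits: float, tv_max_nits: float,
                   mastering_max_nits: float, knee: float = 0.75) -> float:
    """Track the source 1:1 up to a knee, then compress everything from
    the knee up to the disc's static maximum into the remaining headroom."""
    knee_nits = knee * tv_max_nits
    if pixel_nits <= knee_nits:
        return pixel_nits  # within range: reproduce exactly
    excess = (pixel_nits - knee_nits) / (mastering_max_nits - knee_nits)
    return knee_nits + excess * (tv_max_nits - knee_nits)
```

Because that choice is left entirely to the TV, two sets with the same peak brightness can render the same HDR10 disc quite differently.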
Dolby Vision is similar with its metadata, but slightly different. For one, it specifies the maximum nits on a per-scene or per-frame basis. If that were the only difference, then with dynamic contrast set to low, those two TVs should have looked basically identical aside from very subtle banding differences. The reason Dolby Vision looks so different is that it also includes metadata about the physical capabilities of the display used to master the disc, and the TV has built-in information about its own capabilities. The Dolby Vision software in the TV dynamically determines a tone curve intended to show the details visible on the mastering display while adapting them to fit within the capabilities of the TV you're using, at both the low and high ends of the spectrum.

Nobody but Dolby really knows exactly how that process works, but I would imagine the shadows, midtones, and brightest highlights get a steeper curve to retain contrast in the most important areas, while the medium highlights may be flattened somewhat to compensate. For scenes that fit within the nit range of the TV, it would be a flat curve that would likely match the HDR10 presentation, as long as both modes were calibrated properly. That's just a guess, but it's at least one way you could retain detail in shadows, midtones, and highlights without the ugly ringing/halo effects you get in some HDR tonemapping solutions.
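For what that guess might look like in practice, here's a toy version where the roll-off is scaled to the per-scene maximum from the dynamic metadata instead of to a single disc-wide value. Again, this is just an illustration of the idea; Dolby's actual math is proprietary:

```python
# Toy dynamic tone curve: flat (1:1) when the scene fits the panel,
# otherwise compress only the span the scene actually uses. The knee
# fraction and linear compression are assumptions for illustration.

def dynamic_tone_map(pixel_nits: float, scene_max_nits: float,
                     tv_max_nits: float, knee: float = 0.75) -> float:
    if scene_max_nits <= tv_max_nits:
        # Scene fits entirely within the panel's range: flat curve,
        # which should match a properly calibrated HDR10 presentation.
        return pixel_nits
    knee_nits = knee * tv_max_nits
    if pixel_nits <= knee_nits:
        return pixel_nits  # shadows/midtones tracked 1:1 to keep contrast
    # Squeeze the knee-to-scene-max span into the remaining headroom,
    # so a 1,000-nit scene is compressed far less than a 4,000-nit one.
    excess = (pixel_nits - knee_nits) / (scene_max_nits - knee_nits)
    return knee_nits + excess * (tv_max_nits - knee_nits)
```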
So I don't think Dolby Vision really has a steeper overall "gamma" curve like he kept suggesting. He calibrated both modes to display the same nit ranges at the same levels, which means the "gamma" should technically be the same as long as the image fits within the TV's capabilities. I think the reason it looks that way is the compression of relative nit ranges in the shadows, highlights, and midtones: because the curve becomes steeper in those ranges to retain detail, the image looks more like it would if the TV had a steeper gamma setting. The closer the scene's max nits get to the TV's, the less pronounced this effect should be, since tonemapping becomes less aggressive the less it's needed. Although even within the TV's max nits, there may still be some difference in the curve to account for the difference between the mastering display and the TV.
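You can see that last point in the toy function above: on a hypothetical 800-nit panel, a 700-nit highlight passes through untouched when the scene peaks at 800 nits, but gets pulled down once the scene peaks well above the panel.

```python
print(dynamic_tone_map(700, scene_max_nits=800, tv_max_nits=800))   # 700.0
print(dynamic_tone_map(700, scene_max_nits=4000, tv_max_nits=800))  # ~605.9
```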