Originally Posted by donaldk
Darin, for perception 100 nits is halfway to 4K, 5K, perhaps even 10K nits, so giving half the electronic signal to each half should work well; that is what an HDR algorithm developer pointed out to me. All the HDR systems work with a linear curve on the low end, resembling traditional gamma, and a logarithmic curve on the upper part. The difference lies in how they handle the transition/break.
Thanks. That makes some sense to me based on electronic levels (I'll try to look up some curve data later), but I'm trying to understand how this relates to CR.
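For anyone who wants to check the "halfway" claim, here is a quick sketch (mine, not from the thread) of the SMPTE ST 2084 (PQ) inverse EOTF in Python. With the curve defined against its 10,000 nit ceiling, 100 nits lands at roughly 50% of the code range:

Code:
# ST 2084 (PQ) constants
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_signal(nits):
    """Normalized PQ code value (0..1) for an absolute luminance in nits."""
    y = nits / 10000.0              # PQ is defined against a 10,000 nit ceiling
    y_m1 = y ** m1
    return ((c1 + c2 * y_m1) / (1 + c3 * y_m1)) ** m2

print(pq_signal(100))     # ~0.508 -> 100 nits sits near 50% of the PQ signal range
print(pq_signal(10000))   # 1.0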
With SDR the halfway point for signals is 50% video level (ignoring above-reference white for simplicity), or about 20% luminance with a typical power gamma. With a 2000:1 on/off CR projector the top half of the electronic curve gets about 5:1 of the total CR range (100% white down to ~20% luminance) and the bottom half of the curve gets the remaining ~400:1 (2000 divided by 5). Even with the bottom half getting the bulk of the range, it is the half where normal human vision is likely to notice poor CR the most.
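The arithmetic for that split, assuming a simple 2.2 power gamma (my assumption; BT.1886 would shift the numbers a little):

Code:
gamma = 2.2
on_off_cr = 2000.0

halfway_luminance = 0.5 ** gamma           # ~0.22 of peak white at 50% video level
top_half_cr = 1.0 / halfway_luminance      # ~5:1 from 100% white down to ~22%
bottom_half_cr = on_off_cr / top_half_cr   # ~400:1 left below the halfway point

print(top_half_cr, bottom_half_cr)         # ~4.6  ~435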
Using an extreme example to show how bad things could get, now try the same thing with a 10,000 nit peak and the same 2000:1. If the electronic halfway point is 100 nits, the top half of the curve gets 100:1 (10,000 / 100), which leaves only 20:1 (2000 / 100) for the bottom half of the electronic curve, even though I think that is the part that really needs the most CR.
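Same calculation for the extreme HDR case:

Code:
peak_nits = 10000.0
halfway_nits = 100.0
on_off_cr = 2000.0

top_half_cr = peak_nits / halfway_nits     # 100:1 for the top half of the signal
bottom_half_cr = on_off_cr / top_half_cr   # only 20:1 left for the bottom half

print(top_half_cr, bottom_half_cr)         # 100.0  20.0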
In such a case a person could decide how to spread the error out so that they don't crush all the shadow detail, but they would need to deviate from what the HDR EOTF calls for in order to do that. Basically, you need to rob Peter to pay Paul when there isn't enough range to get the whole curve right.
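As a rough illustration of that trade-off (my own sketch, not any particular standard's tone mapping): if the display's usable black floor is higher than what the EOTF asks for, you can either clip to the floor, which crushes nearby shadow steps together, or deviate from the EOTF and compress the low end so the steps stay distinct at the cost of tracking accuracy:

Code:
def clip_to_floor(target, floor):
    # track the EOTF exactly, but everything below the floor lands on the floor
    return max(target, floor)

def rolloff_to_floor(target, floor, knee):
    # illustrative deviation: remap [0, knee] into [floor, knee] so shadow
    # steps survive, at the cost of no longer matching the EOTF down low
    if target >= knee:
        return target
    return floor + (knee - floor) * (target / knee)

floor, knee = 0.05, 0.5   # hypothetical values, in relative luminance
for t in (0.0, 0.02, 0.05, 0.1, 0.5):
    print(t, clip_to_floor(t, floor), round(rolloff_to_floor(t, floor, knee), 3))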
The above is much of the reason that I think high native or zone on/off CR is necessary to do HDR the way Dolby envisioned. HDR makes real on/off CR even more important than before IMO, and I thought it was already important for SDR (along with other factors that matter for great picture quality).
On the flip side, I think ANSI CR should carry even less weight for HDR than it did for SDR, since average luminance levels as a percentage of the top white level should be much lower with HDR content than with SDR content: if you normalize both "gamma" curves to their own 100% and plot them from 0% to 100%, the HDR curve falls below the SDR curve at every point except 100 and 0.
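A quick check of that relative-curve claim (assuming the full 10,000 nit PQ range and a 2.2 SDR gamma; a 1,000 nit grade narrows the gap but the HDR curve still sits lower in between):

Code:
# ST 2084 (PQ) constants
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_relative(signal):
    """PQ EOTF output as a fraction of the 10,000 nit peak."""
    e = signal ** (1 / m2)
    return (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

def sdr_relative(signal):
    return signal ** 2.2

for s in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(s, round(pq_relative(s), 4), round(sdr_relative(s), 4))
# e.g. at 50% signal: PQ ~0.009 of peak vs SDR ~0.22 of peak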