Originally Posted by Ftoast
Thanks to both of you for that polite and easily understood explanation..that makes a lot more sense why banding isn't ridiculously prevalent for high CR displays.
With contrast differences making a smaller impact, what size of increase would it take beyond current highCR displays to make banding a problem with good sources? Is native CR over 50,000:1 even possible, let alone possible with bright LCD?
Well, there's still a fundamental problem with measuring "true black" as it relates to CR: dividing by zero (which is unmeasurable anyway, though some argue otherwise) is either undefined or conceptually infinite. Neither is particularly useful. But to directly answer your last question: I can't imagine any LCD ever blocking enough light to achieve 50,000:1 native, no matter how high the top end is. That's keeping FALD out of the discussion, since it confuses the calculation all by itself by shutting off the light behind a region.
But besides that, strictly for HDR and banding issues, I think it's best not to think first in terms of CR, but in terms of the range. Two reasons:
1. The high and low levels (range) allow you to calculate the CR, not the other way around.
2. The range (and bit depth) lets you calculate the bands and their neighboring values (assuming you know the curve applied; they're not linear).
In both cases, you start with the range: it's the raw output of light, and how many steps you have available between the min and max of that output, that determine the banding effect. The CR is something you calculate only after you know the range.
Of course, we're specifically leaving out band-mitigating algorithms like spatial dithering for this discussion.
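To make the "range first, CR second" point concrete, here's a minimal sketch. It assumes a simple gamma-2.2 transfer curve (real HDR uses PQ, which allocates codes differently, so the numbers are illustrative only) and made-up min/max light levels; both the function names and the 0.05/1000 nit figures are my own assumptions, not anyone's real panel data.

```python
# Sketch: starting from the range (min/max light output) and bit depth,
# derive the CR and the luminance step between adjacent code values.
# Assumes a gamma-2.2 curve; real HDR (PQ) is steeper near black.

def luminance(code, bits, lo_nits, hi_nits, gamma=2.2):
    """Map an integer code value to output luminance in nits."""
    v = code / (2**bits - 1)          # normalize code to 0..1
    return lo_nits + (hi_nits - lo_nits) * v**gamma

def step_sizes(bits, lo_nits, hi_nits):
    """Luminance difference between each pair of adjacent codes."""
    levels = [luminance(c, bits, lo_nits, hi_nits) for c in range(2**bits)]
    return [b - a for a, b in zip(levels, levels[1:])]

lo, hi = 0.05, 1000.0                 # hypothetical range, in nits
cr = hi / lo                          # CR falls out of the range: 20000:1
steps = step_sizes(10, lo, hi)        # 10-bit panel -> 1023 steps
print(f"CR = {cr:.0f}:1, {len(steps)} steps")
print(f"smallest step: {min(steps):.6f} nits, largest: {max(steps):.3f} nits")
```

Note the asymmetry: the CR is a single number derived from the two endpoints, while the per-step sizes (what you'd actually see as bands) depend on the endpoints, the bit depth, and the curve together.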