Assume that we agree that, for digital connections between source and display device:
a) video levels should be maintained
b) video is intended to end up within the 16-235 interval
c) stuff that ends up below 16 should NOT be visible.
d) stuff that ends up above 235 SHOULD be visible
e) to keep this discussion cleaner, let's pretend we have perfect D65 tracking wherever we set the contrast
For the sake of THIS discussion, please refrain from discussing points a) to c) or e). Point d) is the one I'm after here. Fair enough, I've paused my player in lots of scenes where I'd expect to find points of maximum white, and sure enough, a contrast setting so high that values above 235 can't be distinguished makes these scenes look bad. In my experience, resolution above 235 is clearly needed, and so often that setting my display's maximum white to match 235, although it gives a nice "punchy" image, is easy to spot: clouds start to look like marshmallows, etc. So point d) is as clear as a) to c), I think.
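To make the "marshmallow" effect concrete, here's a tiny numerical sketch (just 8-bit code values, nothing display-specific, and purely my own illustration) of what happens when contrast is pushed so that code 235 already hits the display's peak white: every step from 235 up to 254 collapses into a single level, which is exactly the flattening you see in bright clouds.

```python
# Purely a numerical sketch of highlight clipping, not a display model.
import numpy as np

highlight_ramp = np.arange(230, 255, dtype=np.uint8)  # a gentle "cloud" gradient, 230..254
clipped = np.minimum(highlight_ramp, 235)             # contrast set so that 235 = peak white

print(f"distinct levels before clipping: {len(np.unique(highlight_ramp))}")  # 25
print(f"distinct levels after clipping:  {len(np.unique(clipped))}")         # 6 (230..235)
```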
Getting to the real point then: how MUCH above 235 _needs_ to be distinguishable? There is data above 235 quite often, but is there EVER any data above, say, 240? 245? 250? I'm not after the "should" here, I'm after the reality-checked value to shoot for.
I know that you "should" be able too distinguish values all the way up to 254, but maybe you'll end up reserving the maximum white of your display for values that are only present in that one scene in that one film that you don't watch very often?
I can live with better contrast in 99% of my movies even if the other 1% look worse. If it's more like 80/20, I could not. Anyone got hard numbers or experience to share here? Thanks.
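For anyone who wants to reality-check their own material, here is a rough sketch of how the whiter-than-white statistics could be gathered. It assumes ffmpeg is on the PATH, that the clip's decoded width and height are passed in correctly, and that "-pix_fmt gray" hands back the 8-bit luma plane without rescaling it to full range; that last assumption is worth verifying against a known test pattern before trusting the percentages.

```python
# Rough sketch: count 8-bit luma values above reference white (235) across all
# frames of a clip, to see how often whiter-than-white data actually occurs.
import subprocess
import numpy as np

def count_wtw(path, width, height, thresholds=(235, 240, 245, 250)):
    # Decode to raw 8-bit luma and stream it over a pipe.
    cmd = ["ffmpeg", "-v", "error", "-i", path,
           "-f", "rawvideo", "-pix_fmt", "gray", "pipe:1"]
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
    frame_bytes = width * height
    totals = {t: 0 for t in thresholds}
    frames = 0
    while True:
        buf = proc.stdout.read(frame_bytes)
        if len(buf) < frame_bytes:
            break
        luma = np.frombuffer(buf, dtype=np.uint8)
        frames += 1
        for t in thresholds:
            totals[t] += int(np.count_nonzero(luma > t))
    proc.stdout.close()
    proc.wait()
    pixels = frames * frame_bytes
    for t in thresholds:
        share = totals[t] / pixels if pixels else 0.0
        print(f"pixels > {t}: {totals[t]} ({share:.6%} of {frames} frames)")

count_wtw("clip.mkv", 1920, 1080)  # hypothetical file name and resolution
```

Run over a handful of representative discs, the per-threshold percentages should give a feel for whether anything meaningful ever lives above 240 or 245, which is exactly the 99%-versus-1% question.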