Originally Posted by mskreis
ChromaPure now has the option to auto-calibrate HDR. I thought I would give it a try. One of the initial steps is setting brightness for HDR. Tom states, though, that this may differ from SDR. How should I set brightness specifically for HDR?
That depends on what you mean by 'brightness', and on what display you are working with.
If you mean peak luminance, this can be whatever you want it to be, depending on the performance of your display. I imagine most people just run at or close to the maximum their display can produce, though some may back off from that if it has a negative impact on other aspects of the image, such as contrast.
If you mean black level, this is set in the usual way using an appropriate black level test pattern (I find the R Masciola test disc best for this).
If you mean average picture brightness (which is set initially by the Display Max Light setting), this is the trickiest of all, and something I have a problem with. There is no set standard for calibrating it, and most people do it by eye using real material, comparing either against the approximate levels of the same scene on the SDR Blu-ray (not always appropriate, because the two are mastered differently) or against a reference display such as a flat panel with high peak luminance (more useful when calibrating a projector).
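To illustrate why the SDR disc is only a loose reference and not a target, here is a minimal sketch (my own, nothing to do with ChromaPure) comparing what an SDR display and an HDR display would produce for the same percent-stimulus pattern. I'm approximating SDR as a pure 2.4 gamma with a 100-nit peak, which is an assumption; the PQ function is the standard ST 2084 EOTF.

```python
def sdr_luminance(stimulus, peak_nits=100.0, gamma=2.4):
    """Approximate SDR luminance (nits) for a relative stimulus 0.0-1.0."""
    return peak_nits * stimulus ** gamma

def pq_luminance(stimulus):
    """ST 2084 (PQ) EOTF: absolute luminance (nits) for a stimulus 0.0-1.0."""
    m1 = 2610 / 16384          # 0.1593017578125
    m2 = 2523 / 4096 * 128     # 78.84375
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 4096 * 32      # 18.8515625
    c3 = 2392 / 4096 * 32      # 18.6875
    e = stimulus ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

for pct in (20, 40, 60):
    s = pct / 100.0
    print(f"{pct}% window: SDR ~{sdr_luminance(s):5.1f} nits, "
          f"PQ target ~{pq_luminance(s):5.1f} nits")
```

At 40% stimulus the SDR display lands around 11 nits while PQ calls for roughly 32, so matching the SDR picture level by eye will not put you on the PQ target.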
I am surprised no approximate standard has been established yet for setting this kind of average brightness within the nit-for-nit range. I would have thought, for example, that a 40 IRE windowed pattern should measure at roughly the same level on most displays, since it should fall within the nit-for-nit range on most of them. It might be an interesting exercise if we all compared that measurement.
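For what it's worth, the maths does suggest a common target exists within the nit-for-nit range: PQ is an absolute EOTF, so each stimulus level maps to a fixed luminance. Reading '40 IRE' as a 40% PQ stimulus (an assumption on my part, since IRE isn't strictly defined for HDR), the target works out to roughly 32 nits on any display still tracking PQ at that level. A quick sketch of the targets:

```python
def pq_nits(stimulus):
    """ST 2084 (PQ) EOTF: the absolute nit target for a stimulus 0.0-1.0."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = stimulus ** (1 / m2)
    return 10000.0 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

# Targets for 10-70% windows; any display still tracking PQ at a given
# level should measure close to these regardless of make or peak output.
for pct in range(10, 80, 10):
    print(f"{pct:3d}% stimulus -> {pq_nits(pct / 100.0):8.2f} nits")
```

Comparing measured values against these targets would show exactly where each display's tracking departs from nit-for-nit.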
EDIT: I see you have added to your post. Yes, for black level use a black-level pattern in the normal way, but with an HDR input. As I say, I prefer flashing patterns for black level, which is why I like the R Masciola patterns.