Does a meter need to read up to 1000 nits?
I haven't really seen any HDR calibration done, and I'm a little confused about the process.
I have read that HDR content can reach up to 1000 nits in a scene. If we're calibrating a display with pure white at 120 nits for room viewing, how is it ever going to hit 1000 nits? Or is the calibration process different for HDR? Thanks.