Originally Posted by Joe Bloggs
Wouldn't it probably be regraded anyway for the HDR home release? I thought cinema HDR was not as high nits as LCD consumer HDR displays.
You are correct that a theater grade is different from a home grade, HDR or SDR, because of the difference in luminance. When the home grade is made depends on who is involved. For the newest titles, it is becoming common for all grades to be created at the same time. But titles that were graded for theatrical HDR before home HDR existed were most likely re-graded for the home release.
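Just to put rough numbers on that luminance difference: a theatrical grade targets a projector peaking around 48 nits with a 2.6 gamma, while a home SDR grade targets roughly 100 nits with a 2.4 gamma, so the same code value implies a different brightness on each. A toy Python sketch (my own simplified math, not anyone's actual pipeline):

```python
# Toy comparison of theatrical vs. home SDR display targets.
# Assumed reference values: DCI cinema projection peaks around 48 nits
# with a 2.6 gamma; home SDR is mastered to ~100 nits with a 2.4 gamma.

def dci_theatrical_nits(code_value: float) -> float:
    """Luminance of a normalized code value (0-1) on a 48 nit DCI projector."""
    return 48.0 * code_value ** 2.6

def home_sdr_nits(code_value: float) -> float:
    """Luminance of the same code value on a 100 nit home SDR display."""
    return 100.0 * code_value ** 2.4

for cv in (0.25, 0.5, 0.75, 1.0):
    print(f"code {cv:.2f}: theater {dci_theatrical_nits(cv):6.2f} nits, "
          f"home {home_sdr_nits(cv):6.2f} nits")
```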
Do they always/usually go back to the original negative though? Or do they often just do an HDR grade on the (SDR) digital intermediate? Also, lots of times I think they don't scan the original negative (I'm guessing for those that aren't new releases). But what about CGI shots - surely lots won't have been originally rendered in HDR. Basically, if the editor/grader is pushing buttons/sliders to increase/decrease brightness/nit values to whatever he feels like, it's not really real HDR. Real HDR would be more like a camera capturing with known nit/lux levels and the display outputting the same (live broadcast HDR is probably more "real" HDR) - though in reality we don't need/want the extreme dynamic ranges real life is capable of. I personally don't think we need to keep going to higher and higher nit levels for LCDs, UHD BDs (which are limited in the nit levels they are allowed to use for a few years), etc. I'd prefer they didn't grade for really high contrast/dynamic ranges; making stuff really bright doesn't necessarily equal better quality or a great UHD BD to watch. Plus, the higher the scene nit level, the more the judder/strobing of 24 fps will be noticed (basically HDR emphasises the judder - so it really needs to be combined with HFR).
A digital intermediate is neither SDR nor HDR. It is basically the edited-together version of the original source material (at whatever resolution they chose to work in, 2K or 4K). The color grade does not apply to the DI; it applies to the deliverables (end products). So when they color grade, they use all the available color and dynamic range of the source to create all the final versions of the film. And today that can be dozens of versions when you take into account all the different combinations of 2D, 3D, SDR, HDR, proprietary big-screen formats, home formats, regional versions, various resolutions, etc...
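If it helps to see "one source, many deliverables" concretely, here's a rough Python sketch of how the same source luminance can be encoded for an SDR deliverable versus an HDR10-style one using the SMPTE ST 2084 PQ curve. The PQ constants are the published ones; the 100 nit SDR clip and the straight gamma encode are just simplifications for illustration:

```python
# Sketch: the same source luminance feeds two different deliverables.
# PQ (SMPTE ST 2084) constants, as published in the standard:
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def encode_pq(nits: float) -> float:
    """HDR10 deliverable: PQ code value (0-1) for an absolute luminance in nits."""
    y = min(nits, 10000.0) / 10000.0
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

def encode_sdr(nits: float) -> float:
    """SDR deliverable: relative code value with a 2.4 gamma, clipped at 100 nits."""
    y = min(nits, 100.0) / 100.0
    return y ** (1 / 2.4)

# The grade decides where source highlights end up in each version.
for source_nits in (1, 50, 100, 500, 1000, 4000):
    print(f"{source_nits:5d} nits -> SDR {encode_sdr(source_nits):.3f}, "
          f"HDR10 (PQ) {encode_pq(source_nits):.3f}")
```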
CGI is also format agnostic. It is rendered in neither SDR nor HDR. It is rendered to match the original source, then it is all graded together in the end.
If a movie is shot on film but edited using a DI, the original negatives are scanned, usually at a high resolution and bit depth. The only instance where the negatives probably aren't scanned is when the film is finished photochemically instead of digitally. Christopher Nolan is the only big director who still finishes on film nowadays, so his final print is the version that gets scanned.
For older films, if it was previously scanned at a high enough resolution and bit depth, that scan can be used to create a real HDR grade. If it was not, it will need to be re-scanned, preferably at 16-bit and 4K or higher. And those usually are not scans of negatives, but of the highest quality finished version they can find.
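Just for a sense of scale, a back-of-the-envelope calculation (assuming a common 4K full-aperture scan size of 4096 x 3112 with three 16-bit channels, which is an assumption on my part) shows why those scans are not cheap to store:

```python
# Back-of-the-envelope size of a 16-bit 4K film scan.
# Assumed frame size: 4096 x 3112 (a common full-aperture scan),
# 3 channels, 2 bytes per sample.
width, height, channels, bytes_per_sample = 4096, 3112, 3, 2

frame_bytes = width * height * channels * bytes_per_sample
print(f"per frame: {frame_bytes / 1e6:.1f} MB")                 # ~76 MB
print(f"per second (24 fps): {frame_bytes * 24 / 1e9:.1f} GB")  # ~1.8 GB
print(f"per 2-hour feature: {frame_bytes * 24 * 7200 / 1e12:.1f} TB")
```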
If someone talks about going back to the source or the original negative, it most likely means they want to use higher quality/resolution than what they originally worked in and preserved in the DI. Basically, a remaster. They would most likely re-scan the negatives, or go back to the original raw digital camera negatives if they are available.
I'm not sure I follow you on the "real HDR" camera stuff, but I assure you that most current professional cameras are capturing real HDR. As for the level it is mastered at, it is usually limited by the monitor they use to master the HDR grade, which is typically either Sony's 1000 nit OLED or Dolby's 4000 nit LCD.
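In practice, "limited by the mastering monitor" just means the grade can't place highlights above what that display can show, and that peak ends up reflected in the HDR10 mastering metadata. A rough sketch with made-up numbers (real grading tools use a soft highlight roll-off rather than a hard clamp):

```python
# Rough sketch: highlights in the grade get limited to the mastering
# display's peak, and the resulting levels feed the HDR10 metadata.
MONITOR_PEAK_NITS = 1000.0  # e.g. a 1000 nit OLED mastering monitor

def limit_to_monitor(pixel_nits):
    """Clamp graded pixel luminance to the mastering display's peak."""
    return [min(n, MONITOR_PEAK_NITS) for n in pixel_nits]

# Hypothetical per-pixel luminance (in nits) for a single graded frame:
frame = [0.01, 120.0, 480.0, 950.0, 2600.0]
limited = limit_to_monitor(frame)

# For a whole title, MaxCLL is the brightest pixel across all frames and
# MaxFALL is the highest frame-average; both are computed here for this
# one made-up frame just to show the idea.
max_cll = max(limited)
max_fall = sum(limited) / len(limited)
print(f"MaxCLL ~ {max_cll:.0f} nits, MaxFALL ~ {max_fall:.0f} nits")
```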