Originally Posted by EvLee
It's not just Disney that does this. Many HDR releases of older titles are regrades on top of an existing SDR master. The quality of that regrade can vary quite a bit depending on how much time is provided, as well as the quality of the source material. There are a lot of tools and tricks colorists have at their disposal, but if the colorist has to fight limitations in the source (that might also be present even if they went back to a film scan) then they can only do so much to the image before it starts to fall apart. In addition to that, some directors and franchises are very protective of the original look of the movie. In those cases, the colorist may be able to do more but they have to dial it back to stay within the boundaries established by the client.
As for working from EXR files, even that doesn't necessarily guarantee a better result. Usually they are the best source to work from. However, if the renders weren't being carefully checked during production, the files can contain all sorts of inconsistencies in the FX. Some lights could be far brighter than others, some effects may contain artifacts... there are all sorts of things that would be masked in SDR and overlooked if the production is in crunch mode. Then when you open up the EXR file and finally look at it in HDR, you just go holy #$*!!! how do we fix this? On top of that, those EXR files have a good chance of missing other fixes that were made later during post-production, so then you have to track those down too, recreate them, and hope none were missed. Going back to the EXR does not always make the remastering easier. Speaking from experience.
Totally agree on both points. On the SDR-to-HDR grade point, you can absolutely deliver fantastic results with the right source, WHEN enough attention is dedicated to it. An SDR source (including film) often still retains a lot of the detail you see revealed in HDR highlights, rolled off instead of clipped. Whenever that's the case, if you know what you're doing and have the right tools, you can seriously bring out a ton of extra highlight information. Heck, even with a pure 8-bit SDR source, you can still bring out a ton of information (although it obviously creates banding). Here are a couple of examples I've done.
First, from SOLO, is this shot in SDR:
After carefully manipulating curves, I was able to achieve this much highlight detail:
Then here's another shot, from Endgame:
Then my manipulated version:
If I can extract this much highlight detail out of an 8-bit source (compressed, at that), then imagine what a studio could do with a proper 16-bit source with WCG. The problem is, you actually have to dedicate time to carefully analyzing and manipulating every single scene. A lot of these cheap jobs apply one filter to the whole movie, and often even use the same filter across multiple movies. You really need to dedicate time to a project to get good HDR: on a scene-by-scene level at the very least, and really, on a shot-by-shot level for best results. You need a talented colorist experienced with HDR who knows what good HDR should do, and you need to give them the same level of dedication you would give the color grading process before any theatrical release. If you're not willing to put in the time and money to make it look great, it's a wasted opportunity, and we'll probably just end up seeing a proper remaster years down the line.
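To make the "rolled off instead of clipped" idea concrete, here's a minimal sketch of why highlight recovery works at all. It assumes a simple Reinhard-style shoulder curve as the SDR roll-off; that curve and its parameters are my own illustrative choice, not what any studio or grading tool actually uses. The point is just that a roll-off is invertible, so compressed highlight detail above the shoulder can be spread back out for an HDR grade, whereas hard-clipped detail is gone for good.

```python
import numpy as np

def sdr_rolloff(linear, shoulder=0.6):
    """Hypothetical SDR shoulder: pass values through below the
    shoulder, compress values above it toward 1.0 (Reinhard-style)."""
    k = 1.0 - shoulder
    d = np.maximum(linear - shoulder, 0.0)      # excess above the shoulder
    compressed = shoulder + k * d / (d + k)     # asymptotically approaches 1.0
    return np.where(linear <= shoulder, linear, compressed)

def invert_rolloff(sdr, shoulder=0.6):
    """Inverse of the curve above: expand the compressed highlight
    region back out. Input above the shoulder must be < 1.0."""
    k = 1.0 - shoulder
    e = np.clip(sdr - shoulder, 0.0, k - 1e-6)  # clip avoids division by zero at 1.0
    expanded = shoulder + k * e / (k - e)
    return np.where(sdr <= shoulder, sdr, expanded)
```

Quantization is the catch the post mentions: with an 8-bit source, the whole compressed highlight region above the shoulder occupies only a handful of code values, so expanding it stretches those few steps apart, which is exactly where the banding comes from.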
And none of what I did above results in highlight detail that looks unnatural. When viewed at the correct exposure, those are VERY bright highlights, and yet they look perfectly natural. It annoys me so much when people say things like "the highlights aren't overdone," because what they're usually doing is excusing crap highlights, not suggesting the highlights are of an appropriate dynamic range. Sure, it's possible to crank things up beyond what's appropriate for the scene, but I've found that's pretty dang rare.