AVS Forum

1 - 9 of 9 Posts

Registered · 7,111 Posts · Discussion Starter #1
We need to talk about HDR ...

Creative Intent
There are many ways to use HDR, but at the extreme ends there's a subtle approach that stays close to what we're used to and an aggressive approach that delivers eye-popping, nearly blinding video. HDR offers creators a more sophisticated canvas (or, if you will, an enhanced paint box), and some consciously choose to use it in a very restrained way. Here's what Charles Bunnag, one of the colorists/finishing artists on the team that did The Mandalorian, had to say about this:

"It was done with true HDR, with a creative intent to not go super bright. Whether someone thinks this is bright enough or not, they are welcome to their opinion. My main issue with this article and similar critiques I have seen lately is that they are taking guesses and presenting them as fact. They could have simply reached out to someone and asked. This leads me to another concern I have been encountering lately: using only technical numbers to judge and guide creative intent. While art and technology have a tight relationship in our field, having a deep understanding of technology and technical processes does not equate to making well-informed artistic decisions. In this case, it seems like what matters to the "not HDR enough" group is hitting certain numbers first, rather than what supports the best narrative and creative intent.

I also find the argument that the image "doesn't go bright enough" for their tastes as strange as saying a sound mix isn't truly surround sound because the surrounds are not active all of the time and blasting at full volume."


These comments reflect my concern with DTM being the goal. We are quick to shoot down how some manufacturers choose to approach HDR on projectors if it is NOT DTM. But in such hard-headed zeal, are we actually advancing a solution that does an end run around creative intent?
 

Registered · 136 Posts
A small log for your fire.

Hollywood colorists KNOW that consumer gear can't reproduce the entire colorspace/brightness gamut. They use a special OLED monitor that sells for around $30K, and they get enough life out of one to do about two movies. Not practical even for the wealthier portion of the population. DTM done well is about mapping the artist's intent into what your hardware at home can produce.

For example, in The Martian there is a scene where Matt Damon drags himself back into the hab and closes the door. After a lot of experimentation and fooling around with DTM, and finally getting a Lumagen, I believe the mastering intent was that the harsh Martian environment is overly bright, enough to almost hurt your eyes in contrast to the dim, slightly dirty hab entrance. What's the MaxCLL for the scene? Who knows (OK, there are several people who know), and who cares: the glare is proportional to what my PJ can produce, and the contrast of light to dark tells the tale. Before I had the DTM feature in the Lumagen, I was using a curve loaded into the PJ, and while it provided a watchable experience, I feel the Lumagen DTM better translates the artistic intent. The outside is brighter; the inside is appropriately dim and dirty. I spent a lot of time trying to get the inside looking white before I realized it's not supposed to be spaceship-pristine white, it's supposed to be a little dirty. In the context of the film, it says: I'm better, but not completely safe. I still have half an antenna in my gut.

Projectors are a tough situation because the brightness varies with screen size and distance. Of course, calibration is required to get the picture sorted before any type of DTM is going to look right. With a fixed display, the manufacturer knows precisely what the display can do, and they can hard-code the proper parameters for the EOTF to produce the right brightness at the right bit value in the video. With projectors, there has to be some kind of configured/adjusted brightness mapping.
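For context on what that hard-coded EOTF actually is: HDR10 signals are encoded with the SMPTE ST 2084 "PQ" curve, which maps a normalized code value to an absolute luminance. A minimal sketch of that standard curve (constants come from the ST 2084 spec; this is an illustration, not any manufacturer's firmware):

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal value E' in [0, 1] -> absolute nits.
# The constants below are the ones defined in the ST 2084 specification.
M1 = 2610 / 16384         # 0.1593017578125
M2 = 2523 / 4096 * 128    # 78.84375
C1 = 3424 / 4096          # 0.8359375
C2 = 2413 / 4096 * 32     # 18.8515625
C3 = 2392 / 4096 * 32     # 18.6875

def pq_eotf(e: float) -> float:
    """Decode a normalized PQ code value (0..1) into luminance in nits."""
    p = e ** (1 / M2)
    num = max(p - C1, 0.0)   # clamp so near-black values don't go negative
    den = C2 - C3 * p
    return 10000.0 * (num / den) ** (1 / M1)
```

A full-scale signal decodes to the format's 10,000-nit ceiling, and a code value of about 0.508 decodes to roughly 100 nits, which is exactly why a projector topping out around 100-200 nits can't apply this curve as-is and has to remap (tone map) it into its own range.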

Final thing I will say about DTM: I know at least one Hollywood colorist who uses a Lumagen with DTM the same way an audio mastering engineer would use an audiophile 2-channel setup, to validate what the content should look like with DTM on a properly set up residential display. Unfortunately, I don't have experience with the DTM included in the more current projectors (mine is 3 years old), only with what Lumagen brings to the table, and that's why I keep referencing it. Done properly, as in my description of the Martian scene, I feel that DTM helps my projector convey the artistic intent by mapping what I see into the closest approximation my video chain can produce of what the director/colorist were looking at when they color-corrected the movie.
 

Registered · 960 Posts
Quote:
Hollywood colorists KNOW that consumer gear can't reproduce the entire colorspace/brightness gamut. [...]
The fact that you have to use the words "believe" and "feel" pretty much sums up the problem. And I don't mean that in a snarky way.

But OK, let's assume that The Martian goes for realism and that we have a good idea of what things would actually look like on Mars. What do you do with titles that are very much not meant to look realistic? E.g. Fury Road, Suspiria, Pitch Black?

If the goal indeed is accurate reproduction at home of what the content creators intended, then HDR is a complete and utter failure, DTM (of any caliber) or no DTM.
 

Registered · 136 Posts
Quote:
The fact that you have to use the words "believe" and "feel" pretty much sums up the problem. [...]
No snark taken. I absolutely hear you on my qualitative analysis; I specifically intended to make a qualitative argument. To me, the director and colorist are creating an interpreted and qualitative experience. Would The Matrix be the same if it were bright and warm instead of a pervasive institutional green? Hell no! Should Mad Max be more orange than yellow (or the other way around)? Who knows, but it's bright and candy-colored. I will never forget the first time I saw a master's painting. It was a portrait of a man by Rembrandt. It seemed to glow with light and life even though it was only oil and canvas. I know I saw the vision as Rembrandt intended that day, and it was powerful.

I don't think HDR is a complete failure, but it's a hot freaking mess. This isn't the first time the consumer electronics industry has created a mess. Ever had HDMI sync issues? Confused about whether an HDMI cable is 18Gbps capable, has the Ethernet pins wired, has the ARC pins wired? Yeah, that's a disaster too... many other examples all over the place. You would think they would have put a simple resistor matrix or chip in the connector to signal to the electronics what the cable can do... problem solved... but nope...

The best thing that can happen is for a certification to take hold that forces the mastering environments, the sources, the displays, and the source material to a single standard. Once that happens, a pixel viewed by the director and the colorist will look the same as the pixel we see. I don't see that happening to a perfect level any time soon. The closest we get is probably an LED or OLED display with Dolby Vision. You're not going to get the entire color space out of any modern display technology right now. LED can hit the max brightness value but can't get dark enough. OLED has immeasurable blackness but can't get bright enough without reducing the panel's lifespan to uselessness. Projectors have both problems: can't get bright enough, can't get dark enough.

The best experience for "matching" should be a panel-based display in conjunction with something like Dolby Vision. The manufacturer knows what the panel can do, so they can build a correct EOTF to map source color X to displayable color Y within the brightness range of the scene. Dolby Vision content is mastered to Dolby's HDR spec, and it includes per-scene brightness metadata to let the display do the best job possible of matching the mastering environment. That's as close as you're going to get to what a director would be looking at. Sony demonstrated this a few years ago with their Z9, possibly at CEDIA. They had the Z9 and one of those "disposable" mastering monitors side by side, playing the same HDR content. I wasn't there, but an AV dealer friend was, and he said the Z9 was 99% of what the mastering monitor was capable of, at 1/3 the cost and 10x the lifespan.

Perhaps in a few years we will get displays that map the whole brightness gamut. For now I have a D-ILA projector with Lumagen DTM, a 940E Sony LED with Sony/Dolby Vision DTM, and a 2019 OLED with LG/Dolby Vision DTM. If the rocks are a little more orange or yellow in Mad Max, does it change what I feel? Honestly, no; relatively speaking it is bright, and it sells that there is something WRONG with this world. The best thing a director can shoot for is to make you feel something with their work. I'm not getting the Pantone chart out to grade the color. It's a fantasy, and enjoying that fantasy is what the director wants. Is it a Rembrandt? No, but it's a hell of a lot cheaper to own. Do you think the projector at the movie theater, run by the pimple-faced teenage projectionist, is fully calibrated and compensated for the brightness of the light source? Nope. You're lucky if the lens is clean. The experience those of us posting in the $3K+ projector forum get is probably consistently better and closer than what everyone else gets. Don't get me started about movie theater audio. The only theaters I feel are competent are the AMC Dolby theaters; everything else I've experienced of late is garbage from a sound and picture perspective.

I feel like the displays I own give me an experience the director would be happy with. To your point, if it's different, what the hell can I do about it? Not much beyond complaining. If there is a better standard or mechanism for truly matching the source mastering, I will be right on that bandwagon. My plan for the foreseeable future is to enjoy the absolute hell out of all the HDR content I can get my hands on.
 

Registered · 2,135 Posts
I am not sure what you are trying to argue for here.

Creative intent goes out the window when you are trying to play thousands-of-nits content on a display that in most cases isn't even capable of 100 nits.

It is not possible to show the image in the original intent so we must do the best we can to replicate it as close to the original intent as possible given the physical limitations of the display.

That is exactly what DTM is trying to do. If it fails to stay true to the original intent, then it's simply poor DTM.

The question is not DTM vs. no DTM; it's good DTM vs. bad DTM. DTM is the best tool we have so far for attempting to portray the original intent on a physically limited display, and it has the feature set needed to make the most of the physical characteristics of the display we have to use. If a particular implementation of DTM does a poor job of that, it's a fault of the implementation, not of the DTM toolset itself.

When you play an HDR movie on your projector, should the DTM make it most closely resemble how it looks on a 700-nit OLED? Or how it looks in the DCI theater format? Or how it looks on the studio's official SDR release?

DTM is the tool in all of these cases. The final look you are targeting is the question. The content has to be tone-mapped one way or another since the display does not reach the nits that the content is mastered for.

Static tone-mapping leaves dynamic range on the table which is a giant waste when we have so little dynamic range to work with. DTM allows a much more efficient use of that limited dynamic range.
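To make the "leaves dynamic range on the table" point concrete, here is a toy piecewise-linear tone map (my own illustration with made-up numbers, not any product's actual algorithm). A static curve has to budget for the mastering peak of the whole title; a dynamic curve can budget for the measured peak of the current scene:

```python
def tone_map(nits_in: float, source_peak: float, display_peak: float) -> float:
    """Map [0, source_peak] nits into [0, display_peak] nits.

    Pass-through below a knee at 75% of display peak, then linear
    compression of the remaining highlights. Toy model for illustration.
    """
    if source_peak <= display_peak:
        return min(nits_in, display_peak)   # everything fits: 1:1, no compression
    knee = 0.75 * display_peak
    if nits_in <= knee:
        return nits_in
    # squeeze (knee, source_peak] into (knee, display_peak]
    return knee + (nits_in - knee) * (display_peak - knee) / (source_peak - knee)

# A 120-nit highlight on a hypothetical 150-nit projector:
static  = tone_map(120, source_peak=1000, display_peak=150)  # curve fixed at the 1000-nit master peak
dynamic = tone_map(120, source_peak=200,  display_peak=150)  # curve tracks the 200-nit scene peak
```

With the static 1000-nit target, the 120-nit highlight lands near 113 nits because headroom stays reserved for highlights this scene never uses; with the scene-aware 200-nit target it lands near 116 nits, and any scene whose peak fits under the display's peak passes through completely untouched.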
 

Registered · 8,818 Posts
The Mandalorian is a terrible example; it never goes over 200 nits. If you have 200 nits, or very close to it, MadVR at least will display it exactly 1:1, nit for nit, as it was intended, up to your entered peak nits. So that's one instance where using DTM will actually get you way closer to the real director intent than using, for example, a fixed curve with a 1000-nit clipping point. Not to mention entire scenes are likely nowhere near 200 nits, so again, the only true way to watch it nit for nit is to use DTM, which is a 'smart' solution.
 

Registered · 9,016 Posts
JVC RS4500 | ST130 G4 135" | MRX 720 | MC303 MC152 | 6.1.4: B&W 802D3, 805D3, 702S2 | 4x15 IB Subs
Quote:
These comments reflect my concern with DTM being the goal. [...]
Do you think creative intent is to have the video be too dark to enjoy?
 

Registered · 8,818 Posts
However, some valid points exist in the articles provided, and in the comments by Deakins, and this is one reason why I am generally against some of the contrast-boosting algorithms in DTM.

I am very aware of not maxing out the scopes just 'because'. Perhaps there is room for an algorithm to somewhat 'pad' the tone response of a shot down, to still leave some of that purposefully intended headroom above the peak in the shot. It would need to be smart enough to identify that some shots are, on purpose, trying not to max out the scopes. That should be possible without impacting the other advantages of DTM.

This is really only relevant for shots with intended headroom in the scopes, though. If we ignore that for a moment and don't treat the scopes as the full story, the fact that we are not even achieving the intended peak nits means this: if we use DTM and, according to your projector, the scopes are full, but on screen you are still mostly reaching the intended master nits (at least closer than with a static curve), then I think DTM is technically correct, and the headroom that was intended (up to, say, a 1000-nit peak) is not relevant.
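One hypothetical way to sketch that 'padding' idea (purely my own illustration, not a description of any shipping DTM): rather than always stretching a shot's measured peak to the display's maximum, scale the output target by the fraction of the mastering range the shot actually used, so a deliberately restrained shot stays restrained:

```python
def dtm_target_peak(shot_peak: float, master_peak: float, display_peak: float,
                    preserve_headroom: bool = True) -> float:
    """Pick the output peak (nits) a DTM maps a shot's measured peak to.

    Conventional choice: use the full display peak for every shot.
    Headroom-preserving choice: keep the shot's peak at the same fraction
    of the display range that it occupied in the mastering range.
    """
    if not preserve_headroom:
        return display_peak
    fraction_used = min(shot_peak / master_peak, 1.0)
    # never boost a shot above the display's capability or above the master
    return min(display_peak * fraction_used, shot_peak, display_peak)

# A deliberately dim 300-nit shot in a 1000-nit master, on a 150-nit display:
aggressive = dtm_target_peak(300, 1000, 150, preserve_headroom=False)  # fills the display
restrained = dtm_target_peak(300, 1000, 150)                           # keeps intended headroom
```

Under these assumed numbers, a shot graded to only 300 of the master's 1000 nits would map to roughly 45 nits on a 150-nit display instead of being stretched to the full 150, preserving the relative headroom the colorist left above it.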

That's why Blade Runner 2049, even though quite dim, never looks like someone is slamming contrast down my throat; it's still a very relaxed and natural viewing experience.

Very high-nits films are a whole other story.
 

Registered · 7,111 Posts · Discussion Starter #9
Quote:
However some valid points exist in the articles provided, and the comments by Deakins [...]
Great points...

This is where I'm wondering if there is value in continuing to include/consider what someone like Sony is choosing to do. Are they just being lazy for not following the JVC DTM path? Or are they making different choices which benefit some parts of the PQ over others?

Even with my OLED, when watching a DV source, I have the option to stay strictly with what Dolby Vision wants to do, or I can enable HDR/DTM and experiment with Low, Mid, and High dynamics for different effects. Each setting will take the source and manipulate different parts of the PQ in ways that look very different from strict DV processing. In some ways, it is not unlike playing with SDR on our JVC projectors and switching between the A, B, C & D gamma options. I always liked what the B gamma did for many types of source material, the more contrasty look it added, despite knowing it had to be less "accurate".

I suppose my desire is to have the same options with HDR content: 1) a solid reference standard like DV, and 2) the option to make informed choices to drift away from that standard for effect.

A benefit of any higher-dynamic-range display type is that you can manipulate the PQ in greater ways without too easily adversely impacting other parts of the curve, because you're not living on the edge of the display's reproduction limitations.
 