Consumer HDR was invented because Blu-ray was mastered to a peak luminance of 100nits, when flat panels had started to go up to 800-1000nits. For these panels, being able to keep the same reference white around 100nits and push highlights up to 1000nits+ means more dynamic range: they have MORE dynamic range available than the former format offered (0-1000nits vs 0-100nits).

Merry Christmas!
Just so I'm following...
I understand that you have tried to find a way to use HDR because that will retain the most color saturation. But I'm wondering about your most recent best/compromise settings (70 nits) and contrast. I had inferred, or hoped, that while your latest settings didn't get highlights as bright as your previous setting (100 nits), they were still brighter than you would get from SDR. In other words, my question is: are you managing to get a somewhat more dynamic/contrasty image with your current HDR settings vs a regular SDR setting? Or are the only benefits you see related to color saturation?
I'm just hoping we could still get even a bit of the HDR contrast effect on our projector.
With our projectors, in a bat cave, we can't sustain much more than 50nits for reference white, so we were already remapping the 0-100nits of Blu-ray to 0-50nits. BUT we have great native contrast within that range, say 100,000:1 with our RS500 (it will be more or less depending on each setup, but let's keep this as an artificial number for the discussion). With Blu-ray, we're mapping 0-100nits into 0-50nits and we get 100,000:1 native on/off (at least twice as much with the DI). Our black levels, with a reference white of 50nits, are very low, thanks to the 100,000:1 native on/off.
The only thing we needed was the Dolby Vision cinema HDR standard, which has reference white at 48nits and peak white at 108nits. But that will never happen, so no need to lament about it.
We have to deal with HDR content mastered for flat panels (either OLED with deep blacks and 600nits, or LED with not-so-good blacks but up to 1000nits). This means we have to map the same reference white (it's not formally defined, but it's around 100nits) to around 50nits, just the same. Anything more than that, at least in a bat cave, is too much for your eyes (too straining) if sustained over a long period of time. The question is what we do with the highlights (anything above 100nits in the content, given that content white at 100nits is displayed at around 50nits anyway).
We can try to get up to around 150-200nits peak brightness. For those who can, this means significantly LESS native contrast and a significantly ELEVATED black floor. We'll still map 0-100nits into 0-50nits, and then we can compress 100-1000nits+ into whatever remains, say 50-150nits in most cases.
You have to realise that we're talking about highlights. Most of the content is mastered EXACTLY like Blu-ray, between 0-100nits (which ends up between 0-50nits in our bat caves).
In order to map the highlights into 50-150nits, you have to open the iris (which LOWERS the native on/off) and, for most of us, use high lamp, which raises the black floor even more.
This means that we end up with LESS contrast in order to show some elusive highlights (mostly specular highlights: chrome, sun effects, some bright lights, etc.). Chasing the highlights means killing the black floor and reducing the effective contrast, hence the dynamic range.
This is the main difference with panels. With panels, HDR lets you use formerly unused dynamic range (the part between 100-1000nits). With projectors, it KILLS contrast. For example, if I open my iris fully to get around 200nits peak brightness in high lamp, my contrast drops from more than 100,000:1 to around 40,000:1, with a much higher black floor. HDR done the JVC way (high lamp, iris open) means LESS dynamic range, not more, at least for those who can display SDR with reference white at 50nits with the iris not fully open. The closer you can set the iris to reach around 50nits, the more dynamic range you gain.
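To put numbers on that trade-off, here's a quick back-of-the-envelope check in Python, using the example figures above (these are illustrative values only; real numbers vary per setup):

```python
def black_floor(peak_nits, on_off_contrast):
    """Black level in nits = peak luminance / native on/off contrast."""
    return peak_nits / on_off_contrast

# SDR-like setting: iris nearly closed, ~50 nits peak, ~100,000:1 native on/off
sdr_black = black_floor(50, 100_000)   # 0.0005 nits

# "Chase the highlights" setting: iris open + high lamp, ~200 nits, ~40,000:1
hdr_black = black_floor(200, 40_000)   # 0.005 nits

# The black floor rises 10x while peak brightness only rises 4x.
print(f"SDR black floor: {sdr_black} nits")
print(f"HDR black floor: {hdr_black} nits")
print(f"Black floor elevation: {hdr_black / sdr_black:.0f}x")
```

The point of the arithmetic: opening the iris buys a 4x brighter peak at the cost of a 10x brighter black floor, which is why the net dynamic range goes DOWN.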
This is why, instead of trying to chase highlights, we have two solutions to regain this lost dynamic range:
1) Use SDR BT2020, which forgoes the highlights completely and converts, say, 0-1000nits (or more) to the usual 0-100nits, which we then display within our usual 0-50nits. One main upside: the return of the black floor and the full native on/off, so MORE contrast than HDR. One main downside: desaturation of the WCG (we might be getting a bit more than rec-709, but not that much, and definitely not the full WCG that we're getting with HDR).
2) Go the other route I suggested recently, which is to keep HDR, and so keep the full WCG, but map it into the configuration that gives us the most native on/off, so very close to our SDR settings. That way, you map the usual 0-100nits in the content (where most of the information lies anyway) into our 0-50nits, BUT you use whatever headroom you can get, while still keeping a decent black floor, to map the rest.
In my case, I use an iris setting of -13 in HDR instead of -14 in SDR, which gives me reference white around 50nits, and I use 50-70nits to display the highlights (100-1500nits in the content, as I clip around 1500nits to save brightness).
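The mapping I'm describing can be sketched as a simple piecewise curve. This is a toy model with linear segments for illustration only (in practice the shape comes from the projector's gamma controls, and the highlight roll-off is not linear):

```python
def map_hdr_to_display(content_nits,
                       knee_in=100.0, knee_out=50.0,
                       clip_in=1500.0, peak_out=70.0):
    """Toy piecewise tone map for the compromise described above:
    0-100 nits of content -> 0-50 nits on screen (where most content lives),
    100-1500 nits of highlights -> compressed into 50-70 nits,
    anything above 1500 nits is clipped at the 70 nits peak."""
    if content_nits <= knee_in:
        # Linear segment: the same 2:1 scaling we already use for SDR Blu-ray.
        return content_nits * (knee_out / knee_in)
    if content_nits >= clip_in:
        return peak_out
    # Compress the 100-1500 nits highlight range into the remaining 50-70 nits.
    t = (content_nits - knee_in) / (clip_in - knee_in)
    return knee_out + t * (peak_out - knee_out)

# Examples:
# map_hdr_to_display(50)   -> 25.0 (mid-tones track SDR)
# map_hdr_to_display(100)  -> 50.0 (reference white)
# map_hdr_to_display(1500) -> 70.0 (clip point)
```

Note how the entire 100-1500nits highlight range gets squeezed into just 20nits of headroom: that's the price of keeping the iris at -13 and the black floor low.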
This compromise gives me:
- Black floor and low-APL scenes almost as good as what I get in SDR rec-709; I just lose the DI, and my black floor is only slightly elevated.
- I get close to the native on/off I have in SDR, say around 90,000:1. Much better than the 40,000:1 I would get with the JVC-recommended HDR settings, and better than the 60,000:1 I would get with the iris at -7.
- Full BT2020 (at least the portion that's used at the moment, so around DCI-P3), so more vibrant colors than rec-709 or SDR BT2020, especially in the high-luminance parts of the picture: the sky is more blue, fire is more red, instead of being closer to white in SDR BT2020 and much closer to white in rec-709.
- The higher resolution of UHD Blu-ray, even through e-shift.
- More highlights than on Blu-ray: on Blu-ray, most of the time there is nothing between 235 and 255, whereas with UHD Blu-ray we know there is something above 100nits, even if it's not all the time and not most of the content.
- Much less banding than Blu-ray thanks to the 10-bit color depth.
- The full immersive soundtrack (my main reason for upgrading at this stage, given all the limitations in projectors).
Overall, it's not perfect, but I would call it SDR+ instead of what we get otherwise, which is HDR-. I think I'll need a Radiance Pro or Dolby Vision with the DI to be fully happy, because only that will let us keep all the benefits while retaining an accurate calibration, the full dynamic range our projectors are capable of, and the black levels that are why many of us bought a JVC.
The big difference with the Radiance Pro approach is that we can keep all the benefits of HDR without any of the drawbacks:
- We can decide how much we're ready to raise the black floor, which gives us our peak brightness. Because we get the DI back, we could choose to open the manual iris more than with my latest compromise.
- We can map the whole HDR content into this dynamic range using their shaping LUT.
- We can keep the DI because the Radiance converts to SDR before the PJ displays the content.
- Because the Radiance knows your peak white value, it can map the content much more precisely than the player can.
- You can adjust this mapping to your preference regarding where to clip, as there is no standard for that.
- You can calibrate accurately to known standards like BT2020 and BT1886 or power gamma 2.4.
This means that, in theory, we can get close to my latest compromise, but with better calibration accuracy for both gamut and gamma, and we get the DI back, so a near-perfect fade to black (with more or fewer artifacts depending on how much you open the iris).
In the meantime, my latest compromise is the best I've found to date, despite the somewhat inaccurate calibration and the loss of the DI.
I realise it's counter-intuitive, but I hope the explanation above helps explain why HDR on projectors, without a DI active in HDR, is kind of one step forward and two steps backward. My compromise gives one step forward and half a step backward, which is still half a step forward.
The thing that annoys me most, apart from the loss of the DI (I can live with that with the manual iris at -13), is the lack of accuracy in the HDR calibration. I can tune it to something that looks correct and that I like, but I'm not sure how accurate it is. I know it's not completely out of whack, but I have to lower dark gamma, for example, to prevent the black floor from rising a little more than I'd like. I'm not crushing blacks, but I know that as a result my low end is slightly too dark.
This "shooting in the dark" regarding calibration is what I hate most about HDR, but the PQ (picture quality) improvements vs SDR are too good to turn down, and that's without mentioning the AQ (audio quality) improvements (an immersive track most of the time not present on the Blu-ray).
My apologies for asking naive questions. Hope you won't mind me asking. What is DG? In Gamma D, I see Picture Tone (PT), Dark Level (DL) and Bright Level (BG). What's DG?

What I call DG is Dark Gamma, PT is Picture Tone and BG is Bright Gamma.
Also, can you please recommend what Contrast/Brightness need to be for Super White Input?
Thx and happy Holidays.