I just did quite a lot of interesting exploration to see if my hypothesis is correct, and I think it is.
HDR for any given frame actually has a lower ADL as far as your display device is concerned. I will show you how.
I pulled a pretty good frame from LUCY. This is the kind of shot that wows me in HDR: it has extremely bright highlight elements against darker mid tones, so there can appear to be tremendous range to this image in HDR.
First, here is the shot from the SDR Blu-ray. Look at the waveform. The display device sees this whole image and maps it to its capabilities using gamma. Think of the waveform in its entirety as a container. In SDR this is very easy to get right: you only have to set your white and black clipping points, and you can be sure your display will map everything properly, so everybody using a given gamma setting (I am going to use 2.4) should be looking at an image with the same ADL.
By the way, this shot in SDR has an ADL of 2.96%.
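For anyone who wants to check my numbers, here is roughly how I am computing ADL, written out as a small Python sketch. The assumptions are mine: ADL taken as the average linear light output as a percentage of peak white, an 8-bit full-range luma frame, and the display tracking pure power gamma 2.4. This is not any tool's exact implementation, just the idea.

```python
# Minimal sketch of the ADL math I'm using (my assumptions: ADL = average
# linear light as a percentage of peak white, 8-bit full-range luma, pure
# power gamma 2.4 on the display). Not any tool's exact implementation.
import numpy as np

def sdr_adl(luma_8bit: np.ndarray, gamma: float = 2.4) -> float:
    """luma_8bit: 2-D array of 8-bit luma code values (0-255)."""
    signal = luma_8bit.astype(np.float64) / 255.0   # normalised video signal
    linear = signal ** gamma                        # light the display actually emits, relative to peak
    return float(linear.mean() * 100.0)             # percent of peak white
```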
Now, here is the shot pulled right from the UHD Blu-ray with nothing at all done to the image:
This shot, raw as it sits on the disc, has an ADL of 2.26% if you view it at gamma 2.4. But we don't view it this way; this is just for giggles.
Looking at the waveform, though, and thus the size of the container, which has a ceiling of 10,000 nits, we can see most of the content sits much lower. But since we don't have a 10,000-nit display, we don't clip to that; instead we need to look at what setting a 1,000-nit clipping point might do, which is probably what most LCDs and OLEDs will be doing...
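To make the 'container' point concrete, here is a sketch of the PQ (SMPTE ST 2084) transfer function the UHD disc is encoded with. The constants are the published ST 2084 ones; the helper name and the example code value are just my own illustration.

```python
# PQ (SMPTE ST 2084) EOTF: turns a 10-bit HDR code value into absolute nits.
# The container tops out at 10,000 nits even though almost no content, and no
# consumer display, gets anywhere near that.
import numpy as np

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code_10bit: np.ndarray) -> np.ndarray:
    e = (code_10bit.astype(np.float64) / 1023.0) ** (1.0 / M2)
    return 10000.0 * (np.maximum(e - C1, 0.0) / (C2 - C3 * e)) ** (1.0 / M1)

# A 1,000-nit highlight sits at code ~769 of 1023, i.e. the top quarter of the
# waveform is reserved for levels above 1,000 nits.
print(pq_to_nits(np.array([769])))   # ~1000 nits
```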
So here is the same image tone mapped to a 1,000-nit clipping point; now our container ceiling is 1,000 nits max, which is much more realistic. Most of the content sits even lower now. Note: if you have a 1,000-nit display and simply set your clipping point as such, this image will look correct. I am using a Samsung QLED right now, and I just put the backlight on max, which is about 1,000 nits, and sure enough it looks like it should.
But more interestingly, guess what: this shot now has an ADL of only 0.58% as far as a true 1,000-nit display, LED LCD or OLED, is concerned.
The tone mapping uses gamma 2.2 as a baseline, so I calculated the ADL based on that, but the math should be correct, since the tone mapping I just used (MadVR) is designed to have the display in 2.2 gamma mode and let it do the rest.
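So the 0.58% figure is just the same ADL math as before, only with the display in gamma 2.2 mode and a 1,000-nit peak. A rough sketch of that calculation (again my own simplification, not MadVR's internals):

```python
# ADL for the 1,000-nit tone-mapped frame: the display is being driven in
# gamma 2.2 mode and clips white at ~1,000 nits, so ADL is the average emitted
# light as a percentage of that 1,000-nit peak. My simplification, not MadVR's math.
import numpy as np

def tonemapped_adl(luma_8bit: np.ndarray, gamma: float = 2.2,
                   peak_nits: float = 1000.0):
    linear = (luma_8bit.astype(np.float64) / 255.0) ** gamma
    adl_percent = float(linear.mean() * 100.0)      # relative to peak, which is what the display 'sees'
    average_nits = adl_percent / 100.0 * peak_nits  # the same number in absolute terms
    return adl_percent, average_nits                # for this frame: ~0.58%
```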
So what we know from this: the ADL depends on the 'container' of information the display device is dealing with and the associated gamma, and tone mapping will greatly impact this number. What I can tell you with confidence is that if the general level of one image, including shadows and mid tones, is brighter overall than another, it is going to have a higher ADL. SDR is more 'flat' and everything is bunched together, so to your display device, and I can prove this with math, it absolutely has a higher ADL. At least I have just shown this to be the case with a true 1,000-nit display.
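As a rough worked example of why, assuming the shadows and mid tones land at similar absolute brightness in both grades: a 100-nit element is 100% of an SDR display's range but only 10% of a true 1,000-nit display's range, and a 1-nit shadow is 1% of the SDR peak but only 0.1% of the HDR peak. Average that across a frame and the HDR version is bound to come out with the lower ADL, as a fraction of what the display can do, even though the scene content is the same.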
But what about projectors? Let's assume you have a pretty bright one, since you mentioned that; let's assume you can get 150 nits on a large screen, since 'cinema' is what we are chasing.
Then let's look at a VERY well dynamically tone mapped version of that shot for a 150-nit peak display.
This has an ADL of 2.81%.
Now we see that the image is using the WHOLE container. BUT it's punchy: there is still a good deal of depth between the brightest elements and the darkest, so when viewed at about 150-200 nits this should look stunning and notably superior to the SDR version, no doubt about it. But as you can see, interestingly, unless we push the tone mapping harder, as far as your display is concerned it has a very similar ADL to the SDR version; it does NOT have a higher ADL.
I will note, not to step on him, that I doubt Dave's DLP Harpervision comes close to this. Unless Dave is making a new version of Harpervision for every scene in the film, this tone mapping is going to be quite superior. It is doing pretty much exactly what Dolby Vision does: it knows precisely how bright each frame is and adapts itself to maximise the rendition of dynamic range in each scene, so as not to throw away any potential for that famous HDR punch. Except this adds a little twist: it will discard the very, very brightest pixels and tone map to a hair under them, making that impact even better. Read this post for more info on what it is doing:
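For anyone curious, here is a very loose sketch of that per-frame idea. This is NOT MadVR's or Dolby Vision's actual algorithm, only the concept: measure how bright the frame really is, throw away the very brightest pixels, and map the rest so it lands just under the display's peak. The percentile and the simple scale-and-clip mapping are my own placeholder choices.

```python
# Very loose sketch of the per-frame idea described above (NOT MadVR's or
# Dolby Vision's actual algorithm): measure the frame, ignore the very
# brightest pixels, and map the rest to land just under the display's peak.
import numpy as np

def per_frame_tonemap(frame_nits: np.ndarray, display_peak: float = 150.0,
                      ignore_top_percent: float = 0.05) -> np.ndarray:
    """frame_nits: the frame already decoded from PQ into absolute nits."""
    frame_peak = np.percentile(frame_nits, 100.0 - ignore_top_percent)
    frame_peak = max(float(frame_peak), display_peak)   # never stretch a dark frame past 1:1
    scaled = np.minimum(frame_nits / frame_peak, 1.0)   # brightest kept pixel -> 1.0
    return scaled * display_peak                        # fill the 150-nit container
```

The real thing uses a proper roll-off curve rather than a hard clip, but the point is the same: the brighter a frame's actual content, the more of the container it is allowed to use.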
Lastly, let's look at the same shot, but tone mapped the way we used to do HDR with a single value for the whole film. I used to use roughly this setting on my JVC with custom curves before Dynamic HDR Tone Mapping and Dynamic Clipping came around.
This is a target of 350 nits, which is probably quite similar to how the Oppo used to do tone mapping by entering a value. I haven't seen the new Panasonic version, but it's probably not too far off, though I think its target changes throughout the movie, if I'm not mistaken, so it's a bit more dynamic. Point being, MOST people with projectors are probably looking at something resembling the below if they have decent light output and are running things like curves.
This shot has an ADL of 1.77%.
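The structural difference from the dynamic version above is just that the target scaling is fixed for the whole film rather than measured per frame. Very roughly, something like this (the 4,000-nit source peak is only an example figure, and this is not the Oppo's or Panasonic's actual curve):

```python
# Rough contrast with the per-frame sketch above (again, not any product's
# actual curve): a static map assumes one fixed source peak for the entire
# film, so a frame whose real content tops out much lower than that gets
# squeezed into the bottom of the 350-nit container instead of filling it.
import numpy as np

def static_tonemap(frame_nits: np.ndarray, assumed_source_peak: float = 4000.0,
                   display_peak: float = 350.0) -> np.ndarray:
    scaled = np.minimum(frame_nits / assumed_source_peak, 1.0)   # same fixed scale every frame
    return scaled * display_peak
```

That squeeze is presumably why this version's ADL (1.77%) comes out below the dynamically mapped 150-nit version (2.81%).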
Summary: HDR does NOT have a higher ADL than SDR; it is not brighter as far as your display is concerned.
It would be better to have a display device with higher contrast all round (or stronger native contrast at low ADL) than one which favours high ADL levels.
Most LCOS projectors will have a contrast ratio of about 4,000:1 (up to about 6,000:1) at 2% ADL, by the way. Most DLPs are probably around 1,600:1? (Kris?)