I'm far from an expert in HDR and am trying to learn as I go here. That thinking has been shown to be incorrect: you really don't need large brightness increases for projection. Let's just say Kris Deering "opened my eyes," and shadow detail and dark scenes look much better as a result. In fact, HDR should not look wildly different from SDR, except that the color looks better and everything looks more realistic. It's taken five years of HDR/4K evolution to get here, though. I was watching at 45 foot-lamberts not too long ago!
My understanding is that HDR has two components, and together they aim to occupy a greater percentage of what human vision can resolve. One direction is the wide color gamut (WCG), enlarging the area bounded by the RGB triangle; the other is the intensity range between black and white (brightness) throughout that expanded gamut.
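To put a rough number on the "enlarging the RGB triangle" part: the primaries of Rec.709 (the SDR gamut) and BT.2020 (the WCG container used for HDR) are published xy chromaticity coordinates, so the triangle areas can just be computed. This is only a crude proxy for color volume (perceptual comparisons are usually done in other spaces, and real displays rarely cover all of BT.2020), but it shows the direction of the expansion:

```python
# CIE 1931 xy chromaticities of the red, green, blue primaries
REC_709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
BT_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def triangle_area(primaries):
    """Shoelace formula for the area of the RGB triangle in xy space."""
    (x1, y1), (x2, y2), (x3, y3) = primaries
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

ratio = triangle_area(BT_2020) / triangle_area(REC_709)
print(f"BT.2020 triangle is about {ratio:.1f}x the area of Rec.709")
```

On those coordinates the BT.2020 triangle comes out to roughly 1.9x the xy area of Rec.709, which is the extra chromaticity real estate the WCG side of HDR is reaching for.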
If you go strictly by what is encoded, the brightness spec falls way short of what projector technology can deliver. From all I can read and understand on the subject, that's where dynamic tone mapping (DTM) comes in: it takes the encoded data and remaps it to make the best use of whatever any given projector is capable of. Most of what I'm reading says one component of the data is assigned to the WCG and the other is increased brightness, mainly reserved for highlights.
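The remapping idea can be sketched with a simple knee-and-roll-off curve. This is purely illustrative, not any manufacturer's actual DTM (real implementations analyze the content scene by scene and adjust the curve dynamically, often per ITU-R BT.2390-style methods); the `peak_display` and `knee` values here are made-up assumptions for a dim projector:

```python
def tone_map(nits, peak_display=100.0, knee=0.75):
    """Map encoded scene luminance (nits) to a limited display peak.

    Illustrative static roll-off, not any product's actual DTM:
    values below the knee point pass through linearly, so shadows
    and midtones are untouched; highlights above it are compressed
    with a Reinhard-style curve so they still differentiate from
    one another instead of clipping to flat white.
    """
    knee_nits = knee * peak_display
    if nits <= knee_nits:
        return nits  # shadows/midtones preserved as-is
    # headroom left on the display for everything above the knee
    headroom = peak_display - knee_nits
    excess = nits - knee_nits
    # Reinhard-style compression: approaches peak_display but never exceeds it
    return knee_nits + headroom * excess / (excess + headroom)
```

With these assumed numbers, a 4000-nit encoded highlight lands just under the 100-nit display peak instead of blowing out, while a 50-nit midtone passes through unchanged, which is the basic trade a projector-side tone mapper is making.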
If this can be done without extra brightness, using only the WCG, that is the first I have heard of it, and I would agree it could make for an even more film-like picture.
I don't know. I think even flat-panel TVs use some form of DTM based on room light levels. In the old days, a TV's brightness control made everything between black and white uniformly brighter; I gather DTM is a much more advanced form of that control.
Learn something every day is my motto.