With the exception of crowd-sourcing, this is pretty much what HDR does on these sets already. It does not simply clip at the peak brightness of the set. The peak brightness is input into the formula and a "custom" gamma curve is produced.

Probably not the right thread, but this post reminded me I've been meaning to bring this up...
For movies that are "mastered wrong" for the new tech, are there any recommended 4K-compatible, HDR-compatible converters/filters available or in the works?
Let's start with a simple example: content mastered at a 4000-nit peak, coupled with a display capable of 500 nits max. My understanding is that, simplistically put, anything coded at 500 nits or above is displayed identically -- i.e. at the display's maximum.
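To make that concrete, here's a minimal sketch of that naive behavior (purely illustrative; the function name and nit values are made up, and real sets apply a tone-mapping curve rather than a hard clip):

```python
# Hypothetical hard-clip mapping: everything above the display's peak
# brightness renders identically.

DISPLAY_PEAK_NITS = 500.0   # assumed display capability
MASTER_PEAK_NITS = 4000.0   # assumed mastering peak

def clip_to_display(nits: float) -> float:
    """Naive mapping: clamp coded luminance to the display peak."""
    return min(nits, DISPLAY_PEAK_NITS)

# A 600-nit highlight and a 4000-nit highlight become indistinguishable:
print(clip_to_display(600.0))   # 500.0
print(clip_to_display(4000.0))  # 500.0
```

So with a plain clip, all highlight detail between 500 and 4000 nits is lost, which is exactly the problem a converter/filter would try to address.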
I'm imagining a converter/filter device that you put between the player and the TV (or between the player and the AV receiver) with two modes: Analyze and Filter. You run the movie/content in Analyze mode, during which the device observes the luminance range of the mastered content. Once that completes, you switch to Filter mode, with parameters set for a max output of 500 nits and perhaps a "conversion curve" (simple linear; exponential; linear in the low blacks; etc.).
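A rough two-pass sketch of that Analyze/Filter idea might look like the following. Everything here is hypothetical: the function names, the "knee" curve, and the sample frame data are all mine, just to show the shape of the approach.

```python
# Pass 1 (Analyze): observe the luminance range of the mastered content.
def analyze(frames):
    """Return the peak coded luminance (nits) across all frames."""
    return max(max(frame) for frame in frames)

# Pass 2 (Filter): build a mapping from content nits to display nits.
def make_filter(content_peak, display_peak, curve="linear"):
    if curve == "linear":
        # Simple linear scale of the whole range.
        return lambda nits: nits * display_peak / content_peak
    elif curve == "knee":
        # Pass lows/mids through unchanged; compress only the highlights.
        knee = display_peak * 0.8
        def f(nits):
            if nits <= knee:
                return nits
            # Compress [knee, content_peak] into [knee, display_peak].
            return knee + (nits - knee) * (display_peak - knee) / (content_peak - knee)
        return f
    raise ValueError(f"unknown curve: {curve}")

# Illustrative frame data: lists of per-pixel luminance values in nits.
frames = [[0.05, 120.0], [950.0, 3900.0]]
peak = analyze(frames)                       # 3900.0
tonemap = make_filter(peak, 500.0, curve="knee")
print(tonemap(120.0))    # lows pass through: 120.0
print(tonemap(3900.0))   # peak maps to display max: 500.0
```

The "knee" option is one way to keep shadow and midtone detail intact while still differentiating highlights, instead of clipping everything above 500 nits to the same value.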
The converter/filter need not use a constant (set of) equation(s) throughout; different scenes, or sections of scenes, could be adjusted differently, both to get the "best rendering of scene X" and to smooth a jarring transition from scene X to scene Y.
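One simple way to soften scene-to-scene transitions (again just a sketch; the scene peaks and the smoothing factor are invented) would be to blend each scene's measured peak with its predecessor before building that scene's curve:

```python
# Hypothetical per-scene smoothing: blend each scene's measured peak
# with the previous smoothed value so adjacent curves don't jump.

scene_peaks = [1200.0, 4000.0, 300.0]  # illustrative per-scene peaks (nits)

def smooth_peaks(peaks, alpha=0.5):
    """Exponential-style blend: alpha weights the current scene's raw peak."""
    out = [peaks[0]]
    for p in peaks[1:]:
        out.append(alpha * p + (1 - alpha) * out[-1])
    return out

print(smooth_peaks(scene_peaks))  # [1200.0, 2600.0, 1450.0]
```

Each scene's conversion curve would then be built from its smoothed peak rather than its raw one, trading a little per-scene accuracy for less visible jumping between scenes.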
Taking it up a notch, such a device should probably offer saving and loading of Analysis Profiles and Filter Configurations via USB. With those facilities, there's a nice crowd-sourcing opportunity to improve the viewing experience for all popular published content.
In the case of HDR10, this mapping is fixed across the entire movie; in the case of Dolby Vision, which is a form of "dynamic HDR", it can change from scene to scene, just as you suggest.