Thanks, that's helpful! So we have two different scenes here: Murder wants a slower brightness reaction time, HP wants a faster one. Could your idea of changing the brightness reaction time depending on histogram changes help here? How much does the histogram change for the HP and Murder scenes?
(Of course one problem with the histogram change idea is that the amount of change will differ from frame to frame, so I may have to adjust the brightness reaction times constantly during the same scene.)
Murder stays under 1% of lum changes
The Harry Potter scene has at minimum 5% luminance change, where it would need to react in 0.1s to hide the artifacts, but the bigger changes are just not caught by the scene detection (max 9.70%).
The Predator's intro stays under 2-3% (2.69% max), where it would need to react in 5s to completely hide the artifacts and the slow luminance drop.
But if the brightness/darkness delay is immediate, the Harry Potter scene changes a lot (~7% luminance change overall for the few frames that are causing the issue), while the Predator's intro stays calm (under 2% most of the time), so in theory it should adapt?
Also, could it be worse to add a delay before madVR reacts?
I mean... when the nits peak is increasing/decreasing progressively, madVR doesn't do anything during this delay and then has to adapt faster to catch up.
For the same reaction time, if it reacts immediately, it has more time to adapt, hence less visible brightness change from frame to frame?
For "darkness", if the reaction time is slow enough not to notice the increase in brightness (let's say 5s), adding a delay will just make us lose image dynamics, without any benefit?
For me, "luminance" and "darkness" are equally important for the dynamic HDR.
Something exponential like this to be safe?
y = reaction time [s]
x = luminance change [%]
y = 12.5 * e^(-0.644 * x)
luminance change -> reaction time
0% -> 12.5s
2.5% -> 2.5s
5.0% -> 0.5s
7.5% -> 0.1s
10.0% -> 0.02s (0s)
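To sanity-check that the curve really produces the table above, here's a minimal Python sketch (the function name is mine, not madVR's):

```python
import math

def reaction_time(lum_change_pct: float) -> float:
    """Reaction time in seconds for a given luminance change in percent,
    from the exponential curve y = 12.5 * e^(-0.644 * x)."""
    return 12.5 * math.exp(-0.644 * lum_change_pct)

# Reproduce the luminance change -> reaction time table
for x in (0.0, 2.5, 5.0, 7.5, 10.0):
    print(f"{x:4.1f}% -> {reaction_time(x):.2f}s")
```

At 0% it gives 12.5s, and by 10% it has already decayed to ~0.02s, i.e. effectively instant.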
Just adapting with the latest tests.
Edit:
This scene from Mother! needs an immediate reaction time not to blow out highlights: https://www.mediafire.com/file/g7wbtvg2qta8bdn/Mother!_scene_1.mkv/file
0.05s is not fast enough (358nits target for a measured peak of 707nits), but probably acceptable.
Problem: the luminance graph does not change much in this scene, so my solution will not work better than the current one. This is the best compromise I came up with:
brightness delay: no delay
brightness reaction time: 0.1s (causes minor brightness jumps, but slower reaction times blow out highlights too much)
darkness delay: no delay
darkness reaction time: 5s (slow enough not to notice the increase in brightness, fast enough not to cause a lot of dynamic loss)
To test one of the minor brightness jumps (with default darkness values), Jurassic World: Fallen Kingdom at 00:18:19.
Edit2:
A possible solution to improve and protect the current algo (for brightness only): apply a slow reaction time when the nits peak change is small and a fast one when it is big?
Or more precisely, when the difference between the current target and the measured peak is small/big?
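As a rough sketch of that idea (all names and threshold values here are hypothetical illustrations, not anything madVR actually does): interpolate the brightness reaction time between a slow and a fast setting based on how far the measured peak is from the current target.

```python
def brightness_reaction_time(target_nits: float, measured_peak_nits: float,
                             slow: float = 5.0, fast: float = 0.1,
                             small_diff: float = 50.0, big_diff: float = 300.0) -> float:
    """Pick a reaction time (seconds) from the gap between the current
    target and the measured peak: slow when the gap is small, fast when
    it is big, linear in between. Thresholds are made-up examples."""
    diff = abs(measured_peak_nits - target_nits)
    if diff <= small_diff:
        return slow
    if diff >= big_diff:
        return fast
    # Linear interpolation between the slow and fast reaction times
    t = (diff - small_diff) / (big_diff - small_diff)
    return slow + t * (fast - slow)

# The Mother! case: 358 nits target vs 707 nits measured peak is a big
# gap, so this scheme would switch to the fast reaction time.
print(brightness_reaction_time(358, 707))
```

With something like this, calm scenes (Predator's intro) would keep the slow, invisible adaptation, while sudden peaks (Mother!, Harry Potter) would get the near-immediate reaction they need.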