It doesn't work for me with the fade in at the beginning of HP and the Deathly Hallows part 2 (http://www.mediafire.com/file/qtb3v8qb0ghqqwk/brightness_jump.mkv/file ).
It would also be nice to have it for fade-outs, I think.
For instance, for the BvS scene at 02:11:13 to 02:11:19. When the green spear goes in/out of frame, a fade in/out is detected, but the target is too slow to adapt.
Well, it's an important question how we define "fade in" and "fade out". I know the OSD currently reports "fade in" and "fade out" sometimes, even in the middle of a scene. And that might lead us to believe that madVR could/should react faster to brightness changes in such a situation. However, the big problem is that such a fade in/out detection could produce false alarms. E.g. if there's an explosion, it could be detected as a fade in. Or if a lamp slowly dims down and then goes off, it could be detected as a fade out. But are these really fade in/out situations? Not really.

Edit: This sample contains the brightness speeds issue (not adapting immediately) during fades in, followed by multiple false positives caused by Metric1 and "ignored" by Metric2: http://www.mediafire.com/file/i11tmqyvrbr4ulq/fade_in_%2B_false_positives.mkv/file
I don't see why it would be a problem to immediately adapt to the "ideal" target in this case, or why it would produce any bad effect, but I admit I also have a hard time imagining what it would look like.

Of course it's a valid question whether the limitation to true black screens as a fade-in starting point is necessary or not. Maybe we can consider any fluid brightening of the image to be some sort of "fade in" and then immediately adjust image brightness? However, I wonder if that wouldn't produce visible artifacts in some scenes. Also, currently the fade in/out detection just looks at the overall image APL. What happens if one half of the image gets much brighter, causing a fade in to be detected, while the other half of the image actually gets a bit darker at the same time? Do we still want to adjust to the increased APL immediately?
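The half-brighter/half-darker concern can be made concrete with a small sketch (plain Python with hypothetical helper names, not madVR code): the global APL rises sharply even while one half of the frame is getting darker, so a detector that only looks at the overall APL would call this a fade in.

```python
def apl(pixels):
    """Average picture level of a list of luma values (0-255)."""
    return sum(pixels) / len(pixels)

def split_apl(frame, width):
    """APL of the left and right halves of a row-major frame."""
    left, right = [], []
    for i, p in enumerate(frame):
        (left if i % width < width // 2 else right).append(p)
    return apl(left), apl(right)

# Tiny 4x2 test frame: the left half darkens (40 -> 20) while the
# right half brightens strongly (60 -> 160).
prev = [40, 40, 60, 60,
        40, 40, 60, 60]
curr = [20, 20, 160, 160,
        20, 20, 160, 160]

print(apl(prev), apl(curr))   # global APL jumps from 50.0 to 90.0
print(split_apl(curr, 4))     # (20.0, 160.0): the left half actually got darker
```

A per-region check like `split_apl` would be one cheap way to tell such cases apart, though whether that's worth doing is exactly the open question above.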
I don't think the brightness speeds can be fast enough for fade ins/outs, and it looks to me like adapting the speeds based on the Metric1 value could easily cause issues elsewhere.

However, maybe this is also a topic we should postpone until we look at brightness reaction speeds? I had the idea that brightness reaction speeds could generally be increased if Metric1 is high. This would affect not only fade ins/outs, but other situations as well. So maybe such a logic would already suffice to solve fade ins/outs completely?
Yes, we can easily fix the brightness jumps by telling madVR there is no scene change if Metric2 is below 2, for instance.

FWIW, the fade-in mis-detections in those 2 samples you uploaded are probably easy to fix with the new Metric1+Metric2 combination options available in the next build.
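The "no scene change if Metric2 is below 2" guard could look something like this (an illustrative sketch with made-up function and parameter names, not madVR's actual code):

```python
def is_scene_change(metric1, metric2,
                    m1_threshold=20.0, m2_floor=2.0):
    """Hypothetical guard: whatever Metric1 says, refuse to flag a
    scene change while Metric2 stays below a small floor (here 2).
    During a fade, Metric1 can spike but Metric2 stays low, so this
    suppresses the brightness jump."""
    if metric2 < m2_floor:
        return False
    return metric1 >= m1_threshold

print(is_scene_change(45.0, 1.5))   # high Metric1 during a fade -> False
print(is_scene_change(45.0, 12.0))  # both metrics high -> True
```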
It wouldn't be a problem to adapt immediately on fade-outs.

Edit: Just retested a sample you sent me earlier called "WB fades.mkv" (not sure if I named it that way or you did). It's the Warner Brothers logo fading in and then out again. At some point I had made some changes to fix this, but with the latest build there's a brightness jump again during the fade-out. Not sure how to fix this. In an older build I had increased the Metric1 threshold when I detected a fade-out. Maybe I should do that in the new build as well? But should I also increase the Metric2 threshold in that situation? I'm not sure...
Well, one big problem coming to my mind right now is that a "fade in" is only a "fade in" if it lasts for several frames, and I don't know that in advance if I don't look several frames ahead. So if I don't know that there's a fade in or fade out going on, then I can't adjust the APL immediately. But if after maybe 5 frames I detect that we have a fade in/out, and at that moment start adjusting the APL immediately, we'd get a brightness jump once more. I'd have to adjust the APL immediately right with the first frame of the fade in/out to avoid a brightness jump.

I don't see why it would be a problem to immediately adapt to the "ideal" target in this case, or why it would produce any bad effect, but I admit I also have a hard time imagining what it would look like.
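The lookahead dilemma can be sketched in a few lines: a fade-in can only be confirmed after several frames of steadily rising APL, so by the time the detector fires, the fade has already been running for that many frames (hypothetical names and thresholds, just to illustrate the timing problem):

```python
from collections import deque

def looks_like_fade_in(apl_history, min_frames=5, min_step=1.0):
    """Hypothetical check: call it a fade-in only if the APL rose by
    at least `min_step` on each of the last `min_frames` frames.
    That means `min_frames` of delay before we know, which is exactly
    the problem described above: reacting only then causes a jump."""
    if len(apl_history) < min_frames + 1:
        return False
    recent = list(apl_history)[-(min_frames + 1):]
    return all(b - a >= min_step for a, b in zip(recent, recent[1:]))

history = deque(maxlen=16)
for frame_apl in [0, 2, 5, 9, 14, 20]:   # steadily brightening frames
    history.append(frame_apl)

print(looks_like_fade_in(history))  # True, but only 5 frames after the fade began
```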
Finding a real-world scenario would help to check whether there is any issue and how it could be improved, IMO.
I don't think the brightness speeds can be fast enough for fade ins/outs, and it looks to me like adapting the speeds based on the Metric1 value could easily cause issues elsewhere.
Well, my thinking was that we wouldn't actually increase brightness speed for high Metric1 values, but rather slow it down for low Metric1 values. But anyway, that's a test for later...

I don't think the brightness speeds can be fast enough for fade ins/outs, and it looks to me like adapting the speeds based on the Metric1 value could easily cause issues elsewhere.
Well, one big problem coming to my mind right now is that a "fade in" is only a "fade in" if it lasts for several frames, and I don't know that in advance if I don't look several frames ahead.
It looks like it could be interesting, but it's only safe if we can look ahead. So maybe we can get back to this in some time, when madVR is capable of seeing into the future.

But in the middle of a scene, it's hard to find the exact starting point of a fade-in if I don't look many frames ahead.
We can totally block some false positives right at line 1 if "sub of prev frame" is enabled before any evaluation is done.

1) "disable sub of prev frame": Currently the previous frame is subtracted before evaluating any of the 3 new lines of options. Is that the right way to handle this? Or maybe the previous frame should only be subtracted when evaluating line 3 of the new options? I'm not sure.
Applying it to both removes more false positives without hurting anything more, so... yes for both.

2) "disable sub of prev frame": I assume this should apply to both Metric1 and Metric2, correct? Or should this maybe only apply to one of them?
If you don't want any sudden brightness jump during fade ins/outs, why don't you just ignore the metrics when a fade in/out is detected?

3) There can be a "scene change" false positive when there are strong brightness changes multiple frames in a row, e.g. during a fade out. The "sub of prev frame" feature probably solves that. But maybe we should do something extra to make sure there's no false positive scene change detection during fade ins and fade outs, to avoid a sudden brightness jump? If so, what's the best approach? E.g. should we increase the thresholds in line 3 of the new options by 25% during a fade in/out? Should we also change the thresholds in lines 1 and 2?
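The 25% idea from point 3 is simple enough to sketch (hypothetical names; how the line-1/2/3 thresholds are actually wired up inside madVR is not shown here):

```python
def scene_change_thresholds(base_m1, base_m2, in_fade, boost=1.25):
    """Hypothetical: raise the line-3 thresholds by 25% while a fade
    in/out is active, so a false-positive scene change is less likely
    to fire mid-fade and cause a brightness jump."""
    if in_fade:
        return base_m1 * boost, base_m2 * boost
    return base_m1, base_m2

print(scene_change_thresholds(20.0, 4.0, in_fade=False))  # (20.0, 4.0)
print(scene_change_thresholds(20.0, 4.0, in_fade=True))   # (25.0, 5.0)
```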
Yes, you're right. It's a bit better like this already:

May I ask why you're using a threshold of only 4 at line 3 for Metric2? It seems very low. Shouldn't you use a threshold there which really hits the mark where Metric2 properly separates real scene changes from non-scene-changes? If you don't like Metric2 for line 3 too much, wouldn't it be better to choose a higher weight for Metric1 instead of low-balling the Metric2 threshold? I don't really know, though, just wondering...
Yes, in "Inferno", during the talk in the airport (while their talk on a white background, some passengers pass in front of the camera, with sudden changes of brightness). Now I can't remember the time, tonight when I go home checkDoes anyone have some example(s) where a brightness jump is clearly visible?
Thanks for your feedback!

I've arranged the results in descending order for both false positives and real scene cuts in this image (including whether they were a miss or not from the final round of metric 2 tests):
I think(?) the current line 2 thresholds will have to be changed, and/or the line 2 logic might be better served by setting it to "and" instead of "or".
For instance, at present it is 20 *or* 20 by default, which is indicated by the blue line running horizontally.
As you can see, it would automatically flag most scenes that we tested for false positives as false positives (Row 2 - 47 false positives). That can't be good.
If it were 20 *and* 10 for instance, then only 9 would automatically be flagged as false positives (which would be better than the 10 that were flagged as misses based on metric 2 alone from Neo's results).
For real scene cuts:
Instead of all of those above the blue line being "automatically" picked up as a scene cut (higher than 20), we would automatically miss another 17.
However, that's not to say that these remaining 17 will be misses; rather, they're just not automatically flagged as a real scene change.
Metric 2 seems to be a bit higher for those real scene cuts that are missed (i.e. not automatically flagged) than the corresponding metric 2 stats for false positives, so maybe decrease the metric 1 weighting to compensate, and hopefully that would sweep up the remaining misses and correctly flag them as real scene cuts?
The red lines indicate a threshold of 30 for metric 1 and 10 for metric 2 for the second line, and the thick red lines on the right of the image just show those real scenes that would not be automatically flagged as a real scene cut (I was just experimenting to see the difference between 20 and 10 vs. 30 and 10 for line 2).
(I picked those numbers to experiment with because the average for metric 1 is 27.8 for false positives / and 33.67 for real scene cuts and the average for metric 2 is 5.7 for false positives / and 14.1 for real scene cuts. So I wanted to see the effect of something around the average).
Of course, if you want to be more cautious for line 2, then you could raise the thresholds further in addition to switching to *and*.
I'm sure that there must be some fancy logic formula that could be used in Excel to calculate these inc weighted thresholds. I've just no idea how to do it.
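For what it's worth, the AND-vs-OR comparison doesn't need a fancy Excel formula; a few lines of Python against the averages quoted above already show the effect (illustrative only: the function name is made up, and the metric values are just the rough averages mentioned in this post):

```python
def line2_flags(metric1, metric2, m1_thr, m2_thr, use_and):
    """Hypothetical line-2 check: flag a clear scene change when the
    metrics beat their thresholds, combined with AND or OR."""
    if use_and:
        return metric1 >= m1_thr and metric2 >= m2_thr
    return metric1 >= m1_thr or metric2 >= m2_thr

# Rough averages from the spreadsheet discussion above:
fp = (27.8, 5.7)     # typical false positive: high metric 1, low metric 2
cut = (33.67, 14.1)  # typical real scene cut: both metrics high

print(line2_flags(*fp, 20, 20, use_and=False))  # OR 20/20: false positive wrongly flagged
print(line2_flags(*fp, 20, 10, use_and=True))   # AND 20/10: false positive rejected
print(line2_flags(*cut, 20, 10, use_and=True))  # AND 20/10: real cut still flagged
```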
I don't work with the Excel sheet anymore, because everything was tested with "disable sub of prev frame" enabled.

@Neo-XP, what are your thoughts on that? I know you like M1 more, but maybe it would be worth giving Fer15's suggestion a try to see how it would work?
Fair enough. I kinda forgot about the "disable sub of prev frame" option. Obviously, unchecking the "disable sub of prev frame" option will generally lower Metric1 and Metric2 values, so the current Excel thresholds are clearly too high.

I don't work with the Excel sheet anymore, because everything was tested with "disable sub of prev frame" enabled.
Also, it may be a good method to find out which algo produces fewer misses, but it reflects neither the numbers after the sub of prev frame, nor where false positives can be important or not, and likewise for real cuts.
To set appropriate thresholds, I think we should now concentrate more on real issues (visible artifacts) and work from there, rather than just looking at (mostly) non-representative values.
Would it maybe make sense to retest all the Excel scenes with Excel once more, but with "disable sub of prev frame" unchecked?
That might give us a means to do some number crunching to find a good starting point for optimal thresholds? And then we could fine tune the thresholds from there?
It can't hurt to try, I guess.

Thoughts?
It would be kind of redundant IMO, and not useful (maybe even doing more harm than good) if we find good values for "line 3".

Oh yes, and how do you like Fer15's suggestion to modify "line 2" of the new options to use AND instead of OR as the logical operator? Meaning, we would judge a situation to be a clear scene change only if both Metric1 and Metric2 are above the line 2 thresholds at the same time? That said, if that happens, isn't it likely that "line 3" will come to the same conclusion? So maybe line 2 is not all that useful at all?