Quote:

Originally Posted by **markus767** @ **andyc56**
Would it be possible to add a check that any filter change that MSO wants to apply during the automated optimization run doesn't add any excess phase group delay (although it would improve the frequency response smoothness, i.e. reduce the RMS error)?

Not without completely rewriting the optimization code.

If I understand you correctly, you're thinking of optimization as a successive refinement of a given (single) solution according to some algorithm, something like:

1) Next solution guess = some function of the current solution guess

2) Repeat (1) until done.

This is how e.g. "steepest descent" (gradient) optimizers work, as a generalization of Newton's method; such methods are sometimes called "greedy algorithms". But that type of optimizer often fails on practical problems whose objective function has many local minima (a multi-modal objective function).
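As a toy illustration (my own sketch, not MSO code), here's a one-dimensional multi-modal objective where plain gradient descent simply converges to whichever minimum's basin it happens to start in:

```python
import math

# A simple multi-modal objective (my own example, not MSO's actual
# objective function): global minimum at x = 0, with additional
# local minima created by the cosine ripple.
def f(x):
    return x * x + 10.0 * (1.0 - math.cos(x))

def df(x):
    return 2.0 * x + 10.0 * math.sin(x)

def steepest_descent(x, lr=0.01, steps=2000):
    """Greedy refinement: each new guess is a function of the current one."""
    for _ in range(steps):
        x -= lr * df(x)
    return x

print(steepest_descent(1.0))  # starts near the global minimum, finds x close to 0
print(steepest_descent(5.0))  # starts in a local basin, gets "stuck" near x = 4.9
```

The second call demonstrates the premature-convergence failure: once inside a local basin, a greedy algorithm has no mechanism for climbing back out.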

The failure involves premature convergence at a local minimum, where the optimizer gets "stuck" and can't get out. By contrast, global optimizers need a way to explore the solution space, with potential solutions wandering about like ants on the "surface" (really a hyper-surface in a space of dimension N+1, where N is the number of parameters being varied).

In the case of the differential evolution used by MSO, this is accomplished by keeping an inventory of 100 solution guesses. They're initially randomly generated, except for one, which contains the filter parameters from the UI. For each guess vector, the objective function value is computed and saved with the guess. So at any optimizer pass there is not one "current solution guess", but 100 of them. On each pass, a new guess is generated for each solution vector by combining existing vectors in various ways. If the newly-generated guess vector at a given position in the collection of 100 vectors has a better objective function value than the current one, the current one gets replaced; otherwise it stays the same. This is done for each of the 100 guess vectors in the current generation (the collection of 100 guesses) to form the next generation.
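That generational scheme can be sketched roughly as follows. This is a generic DE/rand/1/bin implementation under my own assumptions, not MSO's actual code; the `seed_vector` parameter stands in for MSO seeding one member with the filter parameters from the UI:

```python
import random

def differential_evolution(objective, bounds, pop_size=100, F=0.5, CR=0.9,
                           generations=200, seed_vector=None):
    """Generic DE/rand/1/bin sketch: a whole generation of guesses
    evolves at once, rather than a single current guess."""
    dim = len(bounds)
    # Initial generation: random guesses within the parameter bounds
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    if seed_vector is not None:
        pop[0] = list(seed_vector)  # e.g. the filter parameters from the UI
    scores = [objective(v) for v in pop]  # objective value saved with each guess

    for _ in range(generations):
        for i in range(pop_size):
            # Form a new guess by combining three other existing vectors
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[a][k] + F * (pop[b][k] - pop[c][k]) for k in range(dim)]
            # Crossover with the current vector, then clip to bounds
            trial = [m if random.random() < CR else x
                     for m, x in zip(mutant, pop[i])]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            # Replace this member only if the new guess scores no worse
            s = objective(trial)
            if s <= scores[i]:
                pop[i], scores[i] = trial, s

    best = min(range(pop_size), key=scores.__getitem__)
    return pop[best], scores[best]
```

The best member of the final generation is what would get plotted and used.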

So there is not a current solution guess and a next solution guess, but rather a current generation and a next generation. The best solution of the 100 is what gets plotted and used. I suppose it might be possible to calculate the excess group delay for each individual guess in the generation, and only replace a guess if the new one also has better (or at least no worse) excess group delay. But that raises many questions, especially about efficiency, and about other algorithm details. What's the criterion? The rate of change of excess group delay (spikes)? Also, in this algorithm, the connection between "the best solution of the current generation" and "the best solution of the next generation" is not necessarily direct at all.
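In principle, the replacement rule mooted above amounts to a modified acceptance test. This is purely a sketch of the idea, with a hypothetical excess-group-delay number standing in for whatever criterion (spike detection, rate of change, etc.) would actually be chosen:

```python
def accept_trial(trial_error, trial_egd, current_error, current_egd):
    """Hypothetical acceptance rule for one population member:
    the new guess must improve the RMS error AND must not worsen
    the excess-group-delay metric (whatever form that metric takes)."""
    return trial_error <= current_error and trial_egd <= current_egd

# Improves RMS error without worsening group delay -> accepted
assert accept_trial(0.8, 2.0, 1.0, 2.0)
# Improves RMS error but adds excess group delay -> rejected
assert not accept_trial(0.8, 3.0, 1.0, 2.0)
```

The efficiency concern is visible here: every trial vector in every generation would need its group-delay metric computed before the acceptance test could run.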

It's an interesting idea, but I really don't intend or want MSO to be my life's work. Quite the contrary, in fact. I'm sitting here with 4 partially built subs in my garage and none in my system, like the proverbial cobbler with no shoes. That's why there haven't been any changes in MSO in a while, and won't be until later this summer when I finish the subs and other home projects.