Originally Posted by splotten
So from what I can gather from all of your posts, phase shift is definitely not the same as a time delay,
The two are closely related. A time delay that is constant across frequency introduces a readily predictable phase shift that varies with frequency.
For example, a time delay of 1 millisecond at all relevant frequencies introduces 90 degrees of phase shift at 250 Hz, 180 degrees at 500 Hz, and 360 degrees at 1,000 Hz, since phase in degrees is 360 x frequency x delay.
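If anyone wants to check the arithmetic, here is a minimal sketch in Python (function and variable names are my own, just for illustration) of that delay-to-phase relationship:

    # Phase shift (degrees) produced by a constant time delay,
    # using phase = 360 * frequency * delay.
    def phase_shift_deg(freq_hz, delay_s):
        return 360.0 * freq_hz * delay_s

    delay = 0.001  # 1 millisecond
    for f in (250, 500, 1000):
        print(f"{f} Hz: {phase_shift_deg(f, delay):.0f} degrees")
    # Prints 90 degrees at 250 Hz, 180 at 500 Hz, 360 at 1,000 Hz.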
Most analog circuits produce a phase shift that inherently varies with frequency in a way that differs from that produced by a straight broadband delay.
a phase shift circuit will introduce phase shifts into the audio signal itself.
As long as the phase shift is significant at audible frequencies. For example, a circuit could cause 90 degrees of phase shift at 1 GHz. Scaling linearly with frequency, that would be 9 degrees at 100 MHz, 0.9 degrees at 10 MHz, 0.09 degrees at 1 MHz, 0.009 degrees at 100 kHz, and 0.0009 degrees at 10 kHz. Obviously the phase shift in the audio band is negligible for practical purposes.
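To make that scaling explicit, here is a quick Python sketch assuming, as in the example above, that the phase falls off linearly with frequency below the 1 GHz reference point (the reference values are the hypothetical circuit from the example, not a real device):

    # Phase shift assuming linear scaling with frequency,
    # anchored at 90 degrees at 1 GHz.
    def phase_deg(freq_hz, ref_freq_hz=1e9, ref_phase_deg=90.0):
        return ref_phase_deg * freq_hz / ref_freq_hz

    for f in (1e8, 1e7, 1e6, 1e5, 1e4):
        print(f"{f:>12,.0f} Hz: {phase_deg(f):.4f} degrees")
    # At 10 kHz this gives 0.0009 degrees -- negligible in the audio band.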
This might or might not be audible according to various posts found elsewhere, but is IMO best avoided if possible.
Whether a phase shift is audible depends on its magnitude, the frequency of the signal, and whether the phase shift is applied equally to all audible sources.
The better solution seems to be a true time shift to align the phase of the sub and speakers. This is readily available in the receiver.
In general, probably true.
It seems better to rely on the delay function in your AVR, as it is likely to be the cleaner implementation of the effect.
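As a rough illustration of what the AVR's distance/delay setting is doing, here is a Python sketch (names and example distances are my own, and it ignores any processing latency inside the sub) that converts a path-length difference into the delay the receiver would apply:

    SPEED_OF_SOUND = 343.0  # m/s at roughly room temperature

    def alignment_delay_ms(sub_dist_m, speaker_dist_m):
        # Delay to apply to the nearer source so arrivals line up.
        return abs(sub_dist_m - speaker_dist_m) / SPEED_OF_SOUND * 1000.0

    # Example: sub 2.5 m away, mains 3.5 m away -> about 2.9 ms of delay.
    print(f"{alignment_delay_ms(2.5, 3.5):.1f} ms")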