Originally Posted by adam2434
Is the distortion from the tower speakers themselves, or from room interactions that are mitigated with sub integration?
Seems like comparative distortion in bass would depend on the specific towers and sub. Theoretically, one could have very high performance full-range speakers and a crappy sub. Wouldn't it be possible that the sub would produce more distortion than the towers? I imagine that at a certain level of sub performance (and associated cost), you cross over into lower distortion vs. typical full-range speakers.
Distortion shouldn't spike just because you're crossing the towers over to a sub (or subs). So since he measured an in-room 18 Hz response but gave some distortion percentages, I would assume he was running without a sub, or with a sub crossed at 60 Hz.
If you have a solid pair of towers and a less-than-capable sub, and you're pushing the volume up and up, then yes, most likely the sub will produce more distortion and tap out before the towers do.
You don't "cross over" into lower distortion, though. I don't really understand what you're trying to say there.
You cross speakers over to subs because practical towers cannot reproduce frequencies in the LFE (or ULF) range with any authority, among many other reasons. If anything, there should be less distortion around the crossover point, because those frequencies are rolled off in the towers (how steeply depends on the order of the crossover) while the sub picks up where the speakers leave off.
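To put rough numbers on how the crossover order affects that rolloff, here's a quick sketch assuming a textbook Butterworth high-pass; real speaker-management filters (Linkwitz-Riley, etc.) differ slightly, and the 80 Hz crossover is just an example frequency:

```python
import math

def hp_attenuation_db(f, fc, order):
    """Attenuation in dB of an nth-order Butterworth high-pass
    with corner frequency fc, evaluated at frequency f.
    Magnitude-squared response: |H(f)|^2 = 1 / (1 + (fc/f)**(2*order))
    """
    return 10 * math.log10(1 + (fc / f) ** (2 * order))

# At the crossover point itself the towers are only 3 dB down:
print(round(hp_attenuation_db(80, 80, 4), 1))   # 3.0

# One octave below an 80 Hz crossover, a 2nd-order (12 dB/oct)
# vs. a 4th-order (24 dB/oct) slope:
print(round(hp_attenuation_db(40, 80, 2), 1))   # 12.3
print(round(hp_attenuation_db(40, 80, 4), 1))   # 24.1
```

So with a steeper crossover the towers see far less low-frequency energy, which is why distortion tends to drop rather than rise at the handoff.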
However, distortion can increase if you cross the towers over below their tuning point. So if a tower is tuned to 60 Hz and you cross it over to a sub at 40 Hz, cone excursion will spike below tune, because a ported cabinet stops loading the driver below its tuning frequency.
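To illustrate why the 40 Hz crossover doesn't protect that 60 Hz-tuned tower: even a steep 4th-order high-pass is barely attenuating the signal just below the port tune, exactly where the unloaded cone's excursion rises fastest. A sketch (same Butterworth assumption as above; the 60 Hz tune and 40 Hz crossover are the example's numbers):

```python
import math

def hp_attenuation_db(f, fc, order):
    """Attenuation in dB of an nth-order Butterworth high-pass at frequency f."""
    return 10 * math.log10(1 + (fc / f) ** (2 * order))

# 4th-order (24 dB/oct) high-pass at 40 Hz, evaluated below a
# hypothetical 60 Hz port tune -- the region where the port has
# unloaded the driver:
for f in (55, 45, 35):
    print(f, "Hz:", round(hp_attenuation_db(f, 40, 4), 1), "dB down")
```

At 45 Hz the towers are only about 1.4 dB down, so they're still getting nearly full power in a band where the port no longer controls the cone; hence the excursion (and distortion) spike.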