Originally Posted by d-rail34
I'm not sure where you're getting your information on sub calibration settings using the AVR's room correction software. What's being suggested here has nothing to do with weak signals from the receiver.
I'm not saying anything about room correction software. All I'm talking about is increasing the signal from the receiver and turning the sub's amp down in order to get the exact same effective volume.
Correct me if I'm wrong, but if you have the subwoofer trim set to -10 dB on your receiver, isn't the signal to the subwoofer 10 dB less than what it would be if you had the trim set to 0? How is that not a weaker signal?
Nor have there been suggestions to turn down the sub via the AVR's trim level. It's the complete opposite of that.
And yet people on this thread are suggesting that it's a good idea to run at -10 dB trim, which means that the subwoofer is turned down by 10 dB at the receiver, right?
The whole idea is to give yourself plenty of headroom on the AVR to bump the trim level up 3-5 dB hot if you want to...which most people want to. And thus far I haven't heard of anybody complaining about damaging their subs by following this method...EVER! Myself included, as that's where I run my sub.
Okay, great, set up your receiver and subwoofer however you want. If you want to be able to occasionally turn up the subwoofer volume on your receiver, great. In that case you will need to run with some negative trim normally. I don't see how that's better than using the knob on the back of the subwoofer but it's your equipment.
Where you adjust your sub's gain level to achieve a -9/-10 dB trim level post calibration is going to vary depending on where your sub is placed and how far from the MLP it is. I've never heard of anyone having to turn their gain to max in order to achieve this. Although I have heard of people running their subs at max gain and adjusting the trim to compensate, that's a rare few. Even Ed Mullen of SVS told them that it wouldn't hurt anything so long as you're not overdriving the sub's driver.
I never said anything about max gain. But it's a well-understood property of amplifiers that distortion increases as you turn up the gain. You don't have to turn an amp to max in order to get distortion.
So the scenario we're talking about is that somebody is running at -10 dB trim on their receiver. Okay great. They could achieve the same result by setting the trim on the receiver to 0 and then turning the subwoofer amplifier's gain down by the same 10 dB, i.e. to one tenth of the power. (Remember that a 10 dB increase requires 10x the power.)
So you get a stronger signal from the receiver (10x the power, or about 3.16x the voltage amplitude = a better signal-to-noise ratio) going to the subwoofer, the subwoofer's amp only has to deliver a tenth of the power, and you end up with the same effective subwoofer volume. The only practical difference is that if you want to increase the volume of the sub, you have to stand up and turn a knob instead of pressing a button on your remote. I know which I would choose, is all I'm saying.
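To make the dB arithmetic behind this concrete, here's a quick sketch (the function names are mine, just for illustration): a change of X dB corresponds to a power ratio of 10^(X/10) and a voltage-amplitude ratio of 10^(X/20), which is why +10 dB means 10x the power but only about 3.16x the amplitude.

```python
def db_to_power_ratio(db):
    """Convert a decibel change to a power ratio."""
    return 10 ** (db / 10)

def db_to_voltage_ratio(db):
    """Convert a decibel change to a voltage (amplitude) ratio."""
    return 10 ** (db / 20)

# Raising the AVR trim from -10 dB to 0 dB is a +10 dB change at the receiver:
print(db_to_power_ratio(10))    # 10.0  -> ten times the power
print(db_to_voltage_ratio(10))  # ~3.16 -> about 3.16x the voltage amplitude

# To keep the same overall listening level, the sub's own gain must come
# down by the same 10 dB, i.e. to one tenth of the power:
print(db_to_power_ratio(-10))   # 0.1
```

The two changes cancel (-10 dB at the sub offsets +10 dB at the receiver), which is the whole point: same volume, hotter line-level signal, less work for the sub's amp.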