Originally Posted by master0068
Well this is why you shouldn't assume.
By safe, we mean for the RECEIVER.
Why would anything above 0db relate to SAFE for hearing, when it isn't indicative of actual volume?
People are trying to figure out the technical aspects of going above 0db for an amplifier, and if it causes dangerous stress to the amplifier.
There is no 0 dB "for an amplifier." There's only 0 dB when it's referenced to something. For example, all digital recording is referenced to 0 dBFS ("full scale," the hottest signal you can encode) and goes down from there. You cannot exceed 0 dBFS in a digital recording because the system can't go there: no bits left at the top of the range to encode anything louder.
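To put a number on that, here's a quick sketch (assuming 16-bit PCM, where full scale is a peak sample value of 32767) of how a sample's level is expressed relative to 0 dBFS:

```python
import math

FULL_SCALE = 32767  # peak value for 16-bit signed PCM (assumption)

def dbfs(sample: int) -> float:
    """Peak level of a sample value relative to digital full scale."""
    return 20 * math.log10(abs(sample) / FULL_SCALE)

print(round(dbfs(32767), 1))  # 0.0 -> full scale, no headroom left
print(round(dbfs(3277), 1))   # -20.0 -> a typical test-tone level
```

A sample can never be larger than FULL_SCALE, which is exactly why nothing can sit above 0 dBFS.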
But 0 dB on a receiver master volume control has no meaning until and unless it is calibrated to mean something. Otherwise it's merely a number. Once you calibrate, whether you are boosting or cutting the previous stage's output at any given volume level depends on the way the gain stages are set up in the receiver, and on how the calibration comes out.
Again, many modern receivers with auto-setup calibrate so that reference equals zero on the master volume. Without the reference calibration, the number on the master volume means nothing, either with respect to actual sound level or the power being output. You could of course test it yourself using suitable test tones and learn how loud 0 on your receiver is with a -20 or -30 dBFS test tone, and refer that back to movie "reference." But unless you designed the receiver, or have studied the schematic closely, you are unlikely to be able to guess when the overall gain (versus the level of the input source) goes from cut to boost in your system. The more sensitive the amp input, the less preamp level you need to achieve any given output.
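The arithmetic of a reference calibration is simple. A sketch, assuming the common convention of a -20 dBFS test tone calibrated to 85 dB SPL at the listening position with the MV at 0:

```python
REF_TONE_DBFS = -20  # calibration test-tone level (assumed convention)
REF_SPL = 85         # target dB SPL at the listening position, MV = 0

def spl_at_mv0(signal_dbfs: float) -> float:
    """SPL a signal of a given dBFS level produces at MV = 0,
    once the receiver is calibrated to reference."""
    return REF_SPL + (signal_dbfs - REF_TONE_DBFS)

print(spl_at_mv0(0))    # 105 -> a full-scale peak
print(spl_at_mv0(-30))  # 75  -> a quieter passage
```

Note this tells you SPL only; as discussed below, it says nothing about how many watts the amp is producing to get there.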
My Denon calibrates to reference = 0 on the MV. So with a -20 dBFS pink noise test tone, I'd get 85 dB from each speaker at my listening position. Let's say I take my Paradigms out (they're rated about 95 dB at one watt at one meter) and replace them with highly efficient horn-based speakers (say 105 dB at one watt at one meter) and recalibrate. Calibrated, my new speakers will be just as loud as the old ones with any given program (or test) signal with the MV set at 0 or -10 or -20. The point, and the result, of the calibration is that SPL at the listening position stays the same relative to the master volume level (for any given program or test input). But the power the amplifiers put out to reach those levels has been cut by a factor of 10 when I calibrated for the new, more efficient speakers. So 0 on the MV might mean 200 watts for a 105 dB peak with the old speakers, but only 20 watts with the new.
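The factor-of-10 power cut falls straight out of the 10 dB sensitivity difference. A rough sketch (it ignores distance loss and room gain, so absolute watt figures won't match in-room numbers like the 200 W above; the ratio is the point):

```python
def watts_for_spl(target_spl: float, sensitivity: float) -> float:
    """Amplifier watts needed for target_spl (dB SPL at 1 m), given
    speaker sensitivity in dB SPL at 1 W / 1 m. Free-field, no
    distance loss or room gain -- a deliberate simplification."""
    return 10 ** ((target_spl - sensitivity) / 10)

old = watts_for_spl(105, 95)    # 95 dB/W/m speakers
new = watts_for_spl(105, 105)   # 105 dB/W/m speakers
print(old, new, old / new)      # 10.0 1.0 10.0 -> tenfold power cut
```

Every 10 dB of extra sensitivity cuts the required amplifier power by 10x, and every 3 dB roughly halves it.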
By itself, zero on the MV simply tells us nothing about power output. Which is why, if you listen loud and are not certain you have plenty of power, it makes sense to listen carefully for distortion to make sure you're not going to kill anything. Most modern receivers can be expected to go into thermal shutdown before they are damaged by normal program material. But that's plenty annoying, if cheaper to deal with than frying output transistors.