Originally Posted by infoman1
With a pre-out voltage of 1.0v, could it be used with an external amp with 29db gain structure and an input sensitivity of 1.6v? Not sure of the math but it looks like it would cut the amp output by almost half??
Another thing I have noticed in all of these receivers with pre-outs. Few if any have pre-outs with a high enough output to effectively drive a high wattage amp. Onkyo, Sony, Yamaha all put out a paltry 1.0V. Seems like the sweet spot is around 100 watts.
It would definitely limit your amplifier output. Power scales with the square of voltage, so 1.6 volts would produce approx 2.6 times the power that 1 volt produces.
So if the amp puts out about 250 watts into 8 ohms (which I'd assume given that gain), you only get about 100 watts with an input voltage of 1 volt.
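Here's a quick sanity check of those numbers in Python, assuming a 29 dB gain amp driving an 8 ohm load (both figures from the quoted post):

```python
# Sanity check: output power vs. input voltage for a fixed-gain amp.
GAIN_DB = 29.0     # amp voltage gain in dB (from the quoted specs)
LOAD_OHMS = 8.0    # assumed speaker load

voltage_gain = 10 ** (GAIN_DB / 20)   # roughly 28.2x

def output_watts(input_volts):
    """Power delivered into the load for a given input voltage."""
    out_v = input_volts * voltage_gain
    return out_v ** 2 / LOAD_OHMS

print(round(output_watts(1.6)))  # ~254 W at the full 1.6 V sensitivity
print(round(output_watts(1.0)))  # ~99 W from a 1.0 V pre-out
```

So a 1.0 V pre-out really does leave you at roughly 100 watts out of a possible 250, just as the original post suspected.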
As for many receivers having a low pre amp voltage, Gene over at Audioholics has mentioned that before.
By my calculations, you need a 32 dB gain to hit 200 watts into 8 ohms with 1v of input.
With a gain of 31 dB, you need 1.1 volts.
With a gain of 30 dB, you need 1.3 volts.
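Those three figures fall out of the same math. Here's a short Python sketch, assuming a 200 watt target into an 8 ohm load:

```python
import math

LOAD_OHMS = 8.0       # assumed load
TARGET_WATTS = 200.0  # target output power

def input_volts_needed(gain_db, watts=TARGET_WATTS, load=LOAD_OHMS):
    """Input voltage required to hit the target power at a given dB gain."""
    out_v = math.sqrt(watts * load)      # 40 V for 200 W into 8 ohms
    return out_v / 10 ** (gain_db / 20)  # divide by the voltage gain ratio

for db in (32, 31, 30):
    print(db, "dB ->", round(input_volts_needed(db), 2), "V")
# 32 dB -> 1.0 V, 31 dB -> 1.13 V, 30 dB -> 1.26 V
```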
If you want the math, these are your stock formulas -
Voltage gain given dB gain -
10 raised to the power of [(dB gain) / 20]
Necessary voltage for x watts of output into 8 ohms -
sqrt(x * 8)
Given the output voltage needed to hit your amp's rated power, and given the gain listed in the amp's specs, you can calculate input sensitivity, or the voltage needed to drive the amp to full power -
(output voltage needed for full power) / (voltage gain)
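The stock formulas above translate directly into a few helper functions. A minimal sketch in Python (the 250 W / 29 dB example values are assumptions, not from any particular amp's spec sheet):

```python
import math

def voltage_gain(db_gain):
    """Voltage gain ratio from dB gain: 10^(dB / 20)."""
    return 10 ** (db_gain / 20)

def volts_for_watts(watts, load_ohms=8.0):
    """Output voltage needed for x watts into the load: sqrt(x * R)."""
    return math.sqrt(watts * load_ohms)

def input_sensitivity(rated_watts, db_gain, load_ohms=8.0):
    """Input voltage that drives the amp to its rated power."""
    return volts_for_watts(rated_watts, load_ohms) / voltage_gain(db_gain)

# Example: a 250 W / 8 ohm amp with 29 dB of gain
print(round(input_sensitivity(250, 29), 2))  # ~1.59 V sensitivity
```

which lands right at the 1.6 V sensitivity from the quoted specs.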