Originally Posted by SeekingNirvana
Question on input sensitivity.....
If an amp's specs show 1.25 Vrms for 800w into 8 ohms, what changes if it went to 4000w into 4 ohms bridged?
I understand that it will take 1.25 Vrms of input to reach full output at 8 ohms, but how much does the Vrms change when you ask the amp to put out 4000w at 4 ohms?
An amp doesn't "put out watts." It only produces voltage. The wattage is always calculated for a given load impedance using the formula P = E² / R.
E = Electrical Potential (voltage), R = Resistance (load impedance), P = Power (watts)
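As a quick check of the formula, here is a short Python sketch (the voltages are illustrative values, not from any particular amp's spec sheet):

```python
def power_watts(e_rms: float, r_ohms: float) -> float:
    # P = E^2 / R, where E is RMS output voltage and R is load impedance.
    return e_rms ** 2 / r_ohms

# The same output voltage delivers more power into a lower impedance:
print(power_watts(80.0, 8))  # 800.0 W into 8 ohms
print(power_watts(80.0, 4))  # 1600.0 W into 4 ohms
```

Notice that the amp itself did nothing different between the two lines; only the load changed.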
If you look at the Measuring Amplifiers thread that is stickied in the DIY forum, you will see only output voltage being measured, not watts.
If an amp is spec'd for 1.25 Vrms, then that means that it takes 1.25 volts of input with the gain at maximum for the amp to output its maximum voltage. You can input more voltage, but must reduce the gain so the math is the same in the end. Trying to increase the output voltage beyond its max is what causes clipping. For example, with 32x of gain you get this:
1.25v x 1 (Gain Control) x 32 (Output Gain Stage) = 40v
Remember that the Output Gain Stage varies from amp to amp, but is always a fixed number.
If your input voltage maximum is 5 volts (measured with a volt meter), then where do you put the gain control?
5v x Gain x 32 = 40v
5v x .25 x 32 = 40v. So you end up with the same maximum output voltage, but have had to reduce the gain control.
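The gain chain above can be sketched in Python (the 32x output stage is the hypothetical value used in this example; your amp's fixed gain will differ):

```python
def output_voltage(v_in: float, gain_control: float, output_stage: float = 32.0) -> float:
    # Output voltage = input voltage x gain-control setting x fixed output stage gain.
    return v_in * gain_control * output_stage

print(output_voltage(1.25, 1.0))   # 40.0, i.e. 1.25 V input with the gain control at maximum
print(output_voltage(5.0, 0.25))   # 40.0, i.e. 5 V input with the gain control turned down
```

Both inputs land on the same maximum output voltage; only the gain-control setting changes.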
A number like .25 isn't very helpful for setting the gain control, because gain controls are usually labeled in dB. In my example, it is easier to take the measured input voltage (5v) and divide it by the required input voltage (1.25v). In this case the voltage ratio is 4. If you use a Voltage to Decibel calculator, you will find that a ratio of 4 equals about 12 dB. Turn down the gain control by 12 dB and you will be at the correct setting. You can also use the Crown Audio dB Voltage Ratio Calculator.
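The dB figure comes from the voltage ratio, and the conversion is 20 times the base-10 log of that ratio. A minimal sketch:

```python
import math

def attenuation_db(v_measured: float, v_required: float) -> float:
    # dB change for a voltage ratio: 20 * log10(measured / required).
    return 20 * math.log10(v_measured / v_required)

print(round(attenuation_db(5.0, 1.25), 1))  # 12.0 dB of gain reduction needed
```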
How does the ohm rating factor into this? An amplifier can sustain its highest output voltage into the highest impedance. With an 8 ohm load, an amp can typically output more voltage than with a 4 or 2 ohm load, because lower impedances draw more current and cause the power supply rails to sag. Regardless of the ohm load, if the amp is spec'd for 1.25 Vrms for maximum rated voltage, then it can never output more voltage than its maximum rating.
JL Audio has a School of Sound: Short Course in Audio PDF that is very helpful to read. On page 42 it shows a 9-step procedure for setting proper gain levels using a voltmeter. It is intended for car audio, but the principles are the same.
Some amps show their Output Gain Stage as a multiplication factor (gain factor) and some as a decibel increase. The CV-5000 shows 36 dB. Other amps might show 30x. Use these formulas to convert voltage gain expressed as a multiplication factor to voltage gain expressed in dB and vice versa:
Gain (dB) = 20 * log10 (Gain Factor)
Gain Factor = ANTILOG10 ( Gain in dB / 20 ) [e.g. ANTILOG10 ( 36 / 20 ) = ?]
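Both conversions can be written as small Python helpers (ANTILOG10 is just 10 raised to the power of its argument):

```python
import math

def db_to_factor(gain_db: float) -> float:
    # Antilog base 10: 10 ** (dB / 20) gives the voltage gain factor.
    return 10 ** (gain_db / 20)

def factor_to_db(gain_factor: float) -> float:
    # 20 * log10(factor) gives the gain in dB.
    return 20 * math.log10(gain_factor)

print(round(db_to_factor(30), 1))   # 31.6x voltage gain for 30 dB
print(round(factor_to_db(32), 1))   # 30.1 dB for a 32x gain stage
```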
The CV-5000 is rated at 1.42 Vrms sensitivity and 36 dB of gain.
What is its gain factor? Hint: Antilog10 is the 10^x button on the Windows Scientific Calculator.
What is its maximum output voltage?
You can read more here: All About Gain