Originally Posted by Immoros
Follow-up question: if I were to go with separates, or test vs. an AVR, is it safe to go with a 200 watt/ch amp on speakers rated up to 100w or 120w? Obviously I don't plan to max out the gain.
If we take the typical AVR as being a 100 wpc 2-channel box, a 200 wpc power amp doesn't make sense because of the small difference that the additional 3 dB worth of power makes. If your AVR or separates have a volume control calibrated in dB, make a +3 dB or -3 dB change and see what you think of the difference. I predict it will be audible, but far less than inspirational. Now say to yourself "I'm going to pay $$$$ (fill in the blanks) for this". How do you feel about yourself?
Now, consider the fact that said 3 dB difference would only exist if you were already pushing your AVR right up against clipping. If you aren't clipping out your AVR, then all the additional power in the world won't make any difference!
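To put numbers on that 3 dB point: the level difference between two amplifier power ratings is 10·log10(P2/P1). Here's a quick sketch (the 100 W and 200 W figures are just the hypothetical AVR and amp from above):

```python
import math

def db_difference(p_new, p_old):
    """Level difference in dB between two amplifier power ratings."""
    return 10 * math.log10(p_new / p_old)

# Doubling power, e.g. from a 100 W AVR to a 200 W amp:
print(round(db_difference(200, 100), 2))  # 3.01 dB
```

Doubling power always buys you the same ~3 dB, no matter where you start, which is why the jump from 100 to 200 watts is so underwhelming.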
Now, let's try to get a little serious. See what a +10 dB or -10 dB change sounds like. That should be something like twice as loud or half as loud. It is actually somewhat interesting!
To get a +10 dB loudness advantage over that AVR that you can hear clipping on loud passages, you need a 10 times more powerful amplifier! We are talking a kilowatt!
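Running the same formula backwards shows why the number balloons so fast: the power needed for a given dB gain is P·10^(dB/10). A sketch, assuming the 100 wpc AVR from above as the starting point:

```python
def power_for_gain(p_old_watts, gain_db):
    """Power (in watts) needed to play gain_db louder than p_old_watts."""
    return p_old_watts * 10 ** (gain_db / 10)

print(power_for_gain(100, 3))   # ~200 W: the barely-audible bump
print(power_for_gain(100, 10))  # 1000.0 W: the kilowatt figure
```

Every extra 10 dB costs another factor of ten in power, so a +20 dB headroom upgrade over that AVR would be a 10 kW amp. The math gets silly quickly.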
Now, 1 kW per channel is not rocket science or hyper-expensive - your nearby pro audio dealer can fix you up.
But here's the risk factor - what happens if you actually try to take real-world advantage of that 1 kW-per-channel amp with your 100 or 120 watt speakers? Or, more likely, what happens if you have, errr, a little accident?