Originally Posted by blakeyj08
So I've read through the stickied posts and searched for others, but basically what I've been able to take away is that every brand rates their watts/channel differently, and unless it's HK you don't know what you're getting.
That seems narrow and harsh. In the US, the FTC's amplifier rule governs how power ratings have to be determined, and its test conditions are, if anything, overkill compared to real-world use.
My question is - what kind of power rating should I be looking for in order to push a constant 100W to my 6ohm fronts (5.1 setup)?
Where does this 100 wpc criterion come from?
Is it possible to get an AVR that will do it for <$500?
I keep looking at receivers advertising 100W+/channel, but then it always turns out that's with 2 channels driven, and it drops to around 80W/channel with all driven.
So what? 80 watts is only about 1 dB less than 100 watts, and it takes a whopping 10 dB of power difference to create the impression of just twice as loud.
Have you ever heard an actual 1 dB difference in the power being delivered to a speaker? Remember that over just a few seconds, the power being delivered to a speaker can swing by 60 dB or more.
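If you want to check that arithmetic yourself, the difference between two power levels in decibels is 10 * log10(P1/P2). Here's a minimal Python sketch (the function name is mine, purely for illustration):

```python
import math

def db_difference(p1_watts, p2_watts):
    """Decibel difference between two power levels: 10 * log10(P1/P2)."""
    return 10 * math.log10(p1_watts / p2_watts)

print(db_difference(100, 80))      # ~0.97 dB -- barely audible
print(db_difference(100, 10))      # 10 dB -- roughly "twice as loud"
print(db_difference(100, 0.0001))  # 60 dB -- the kind of swing music makes in seconds
```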
The hidden gotcha is that music is far easier to amplify than steady tones, because it has a constantly varying amplitude. Power supplies, heat sinks, and output transistors are far more stressed by the steady grind of a sine wave.
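The usual way to quantify this is crest factor, the ratio of peak level to RMS level. A sine wave sits at about 3 dB; typical music runs more like 10 to 20 dB, which means far less average power per peak. A rough sketch, using randomly gated noise as a stand-in for music (purely illustrative; real program material varies widely):

```python
import numpy as np

fs = 48000
t = np.arange(fs) / fs

# Steady sine test tone: works the amp at full output every single cycle.
sine = np.sin(2 * np.pi * 1000 * t)

# Crude stand-in for music: gated noise, with lots of quiet between peaks.
rng = np.random.default_rng(0)
music_like = rng.normal(0.0, 0.15, fs) * (rng.random(fs) > 0.5)
music_like /= np.abs(music_like).max()  # match the sine's peak level

def crest_factor_db(x):
    """Peak-to-RMS ratio in dB; higher means less average power per peak."""
    return 20 * np.log10(np.abs(x).max() / np.sqrt(np.mean(x ** 2)))

print(crest_factor_db(sine))        # ~3 dB
print(crest_factor_db(music_like))  # typically well above 10 dB
```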
Music is also constantly varying in frequency content, which means the effective impedance of your speakers is constantly varying too, because speaker impedance is frequency dependent. While your speakers may be rated at a nominal 6 ohms, their impedance probably dips significantly lower at some frequencies, so with real-world music the load on the amplifier can be heavier than the sine waves used in power ratings suggest.
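To put numbers on that: an amplifier behaves roughly like a voltage source, so at a given output voltage the power it must deliver scales inversely with impedance. A toy calculation (the 4 and 3 ohm dips are made-up figures; check your speakers' actual impedance curve):

```python
def power_into_load(v_rms, z_ohms):
    """Power an amp must deliver at a fixed output voltage into a given load."""
    return v_rms ** 2 / z_ohms

v = (100 * 6) ** 0.5  # ~24.5 V RMS: the voltage that puts 100 W into 6 ohms

for z in (6, 4, 3):
    print(f"{z} ohm load: {power_into_load(v, z):.0f} W")
# 6 ohm load: 100 W
# 4 ohm load: 150 W
# 3 ohm load: 200 W
```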
The power supply and heat sinks in even very low cost receivers are usually overkill for music, because advertised ratings have to be based on continuous sine waves, while music is what we actually listen to.