Originally Posted by Brad Sutliff
Thanks for all the help, guys. I appreciate it. Since we're on the topic of watt usage: if I have a 90 wpc receiver and I run speakers with around 90 dB sensitivity near reference levels, how much of that available power am I actually using? Just so I know how overkill it might be.
Short answer: 5 watts at most, if that, at 3-4 ft. You'll probably use a little more for movies because they're usually more dynamic than music.
Long answer: it depends on how far away you sit from your system. I'll give you an example: I run a PA system at a church with an average attendance of about 85 people, seated about 35 feet from the main speakers. We have a Radio Shack MPA-200 100-watt amplifier for the auditorium; the two 8-ohm mains are wired in parallel for a 4-ohm load, which gets them the full 100 watts. My point in saying all this is that at 35 feet away, 100 watts is actually painful.
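You can put rough numbers on this with the standard sensitivity formula: SPL at the listening position is the speaker's sensitivity (dB at 1 W / 1 m) plus 10·log10 of the watts, minus 20·log10 of the distance in meters. Here's a minimal sketch; the 100 W / 90 dB / 35 ft figures just mirror the church PA example, and it ignores room gain and the extra speaker, so real rooms will measure a few dB louder:

```python
import math

FT_TO_M = 0.3048  # feet to meters

def spl_at_distance(watts, sensitivity_db, distance_ft):
    """Rough free-field SPL from one speaker: sensitivity (dB @ 1 W / 1 m)
    plus power gain, minus inverse-square distance loss. Ignores room
    reflections and summing from a second speaker (roughly +3 dB)."""
    distance_m = distance_ft * FT_TO_M
    return sensitivity_db + 10 * math.log10(watts) - 20 * math.log10(distance_m)

# Numbers mirroring the church PA: 100 W into a 90 dB speaker heard at 35 ft
print(round(spl_at_distance(100, 90, 35), 1))
```

That comes out just under 90 dB free-field; add ~3 dB for the second speaker and several more for a reflective room, and "painful" at 35 feet is believable.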
Now keep in mind that the speakers we're using have a sensitivity over 90 dB, so that helps out.
For my system I've determined that a 25-watt amplifier is enough power, based on the fact that my speakers are only 4 1/2 feet away from me.

One thing I can't believe nobody mentioned: you really don't want to run your amplifier at its rated power all the time. A typical class A/B amp runs its hottest near rated power, which will shorten the life of the capacitors. That 100-watt Rat Shack amp runs very hot at 100 watts, so I actually start compressing at around -3 dB to keep it from toasting. For a 10 ft listening distance I would buy a 100-watt receiver so the amplifier isn't toasting while you're using it. Keep in mind this is at reference levels; if you listen at, say, -15 dB, 100 watts is overkill.

My recommendation: with a class A/B amp, double the wattage you think you'll need. With class A, a receiver with maybe 10 watts of headroom is fine, because a class A amp runs cooler the more power you're actually using.
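To sanity-check the headroom advice, you can invert the same formula and ask how many watts a target SPL takes at your seat. A quick sketch, assuming 90 dB sensitivity speakers and the 4.5 ft distance from my setup (free-field, single speaker, no room gain, so it overstates the power you'd really need):

```python
import math

def watts_needed(target_spl_db, sensitivity_db, distance_ft):
    """Watts required for a target SPL from one speaker, assuming
    inverse-square falloff and no room gain or second speaker."""
    distance_m = distance_ft * 0.3048  # feet to meters
    gain_db = target_spl_db - sensitivity_db + 20 * math.log10(distance_m)
    return 10 ** (gain_db / 10)

# Average listening level around 85 dB at 4.5 ft with 90 dB speakers
print(round(watts_needed(85, 90, 4.5), 2))
# Reference-level peaks around 105 dB at the same seat
print(round(watts_needed(105, 90, 4.5), 1))
```

Averages come out well under a watt, while 105 dB peaks land in the tens of watts, which is why "5 watts most of the time" and "buy double the wattage for headroom" can both be right.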