Originally Posted by Floyd Toole
There is not a truly satisfactory answer to your question. The traditional measure of sound level used in system calibration is pink noise band limited to 500 Hz to 2 kHz - all middle frequencies. All else is extrapolated from this. In cinemas the signal is broadband pink noise measured using C weighting (a broadband metric that rolls off below 50 Hz and above 5 kHz). These are of course different, and neither properly weights the very low frequencies.
Eliminating the need to reproduce frequencies below 80 Hz will indeed reduce the power demands on the main speakers, but it is a small fraction of the total bandwidth so we are not talking about a large factor. If you stick with your 200 watt estimate, enjoy the security of having a small amount of headroom - never a bad idea.
Remember, those calculations were for cinema reference sound levels. I know of nobody who uses those levels for movies in domestic rooms. The resulting sound levels are dangerous to hearing and - in my opinion - too darned loud. In some films they are too loud even for cinemas - some turn the volume down as much as 10 dB (1/10 the power!) to keep customers from walking out. A main topic of discussion within SMPTE standards committees is how to tame the "loudness wars" that developed when soundtracks went digital and the full dynamic range could be used without distortion - it has been, encouraged by film executives standing at the back of the dubbing stage shouting "more".
Remember that a barely noticeable volume change of just 3 dB reduces the power demand by a factor of 2.
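That 3 dB figure follows from the standard power-to-level relationship, level change (dB) = 10 * log10(P2/P1). A minimal sketch of the arithmetic, purely for illustration:

```python
import math

def db_change(p_new: float, p_old: float) -> float:
    """Level change in dB produced by changing amplifier power from p_old to p_new."""
    return 10 * math.log10(p_new / p_old)

# Halving the power costs only about 3 dB -- a barely noticeable drop in level.
print(round(db_change(100, 200), 2))  # -3.01

# The 10 dB cinema turn-down mentioned above corresponds to 1/10 the power.
print(round(db_change(20, 200), 2))   # -10.0
```

The same formula run in reverse is why headroom gets expensive fast: every additional 3 dB of peak capability doubles the amplifier power required.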
Thank you for your response. The problem is, if I want to have “peace of mind”, so to speak, it will cost me about $2,000. No small price to pay for something that may or may not make a difference.
I was wondering if I could employ a different method to help determine my needs.
My current receiver outputs 100 watts per channel (full band, 20 Hz-20 kHz, two channels driven). Based on the chart, 100 watts into 4 ohms would deliver approx. a 16 "dB difference". This is about 4 dB lower than what I would require for "peace of mind". So I guess, instead of getting 105 dB peaks at my listening position, this would give me closer to 101 dB peaks, without overloading the receiver.
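If I have the math right, those last 4 dB are what make the upgrade expensive, since required power scales as 10^(dB/10). A rough sketch of the calculation (the 100 W and 4 dB figures are just my numbers from above, not from any chart):

```python
def power_for_db_gain(p_watts: float, db_gain: float) -> float:
    """Amplifier power needed to raise peak level by db_gain dB over p_watts."""
    return p_watts * 10 ** (db_gain / 10)

# Going from 101 dB to 105 dB peaks (+4 dB), starting from a 100 W receiver:
print(round(power_for_db_gain(100, 4)))  # 251

# The full +16 dB the chart credits the receiver with, for comparison:
print(round(power_for_db_gain(100, -16), 1))  # 2.5
```

So closing a 4 dB gap means roughly 2.5x the power, which is why a ~250 W amplifier carries that $2,000 price tag.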
What if I measured the dB at my listening position of one speaker at a volume that closely corresponds to the loudest levels I typically listen at? For example, when I watch a movie, I typically have my receiver’s master volume set somewhere around -15 (sometimes a little lower, sometimes a little higher, depending on the movie).
If I measured this in the right way (e.g. full band pink noise? Actual program material?) and the measurements came in much lower than 101dB, then would that mean I probably don’t need to spend $2,000 on a more powerful amplifier?
Am I on the right track here, or am I way off base?