Originally Posted by Mark Seaton
Originally Posted by MKtheater
When you say +5 dB outside and -10 dB in a live room, are you keeping the volume at reference? Meaning in a live room reference might be -10 on the MV and outside could be +5 dB? What I am saying is: how many people just trust that MV 0 is reference to begin with? Mine is not, on at least three processors and AVRs. It is supposed to be, but I measure much louder than reference at 0 dB.
The above probably inadvertently gets directly at the confusion. Calibrating a system to a reference is simply a gain structure setting, nothing more. "Reference level" simply means that a known recorded signal is intended to produce a known level at the listener. The focus on maximum capabilities comes into play as our recording media has maximum recorded levels, so once the gain structure is defined, a maximum is defined for a given volume setting.
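To put numbers on that gain-structure idea, here is a minimal sketch using the commonly cited film calibration convention (an assumption about any particular room: -20 dBFS band-limited pink noise aligned to 85 dB SPL at the listening position, so full-scale peaks land at 105 dB SPL per main channel):

```python
# Sketch of the gain-structure concept: once a known recorded level is
# aligned to a known SPL, every other recorded level maps to a defined SPL.
# The -20 dBFS -> 85 dB SPL pairing is the common film convention; your
# system's actual alignment may differ.

REF_SIGNAL_DBFS = -20.0   # recorded level of the alignment test signal
REF_SPL_DB = 85.0         # SPL the test signal is calibrated to produce

def spl_for_signal(signal_dbfs: float) -> float:
    """SPL at the listener for a given recorded level, per the alignment above."""
    return REF_SPL_DB + (signal_dbfs - REF_SIGNAL_DBFS)

print(spl_for_signal(0.0))    # full-scale peak
print(spl_for_signal(-20.0))  # the alignment signal itself
```

The point is that "reference" fixes the mapping, not the loudness of any given movie: a quietly mixed scene still plays quietly at reference.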
That said, the subjective loudness of "reference level playback" will be different for every movie. There are general guidelines that recording engineers attempt to follow, but some movies and scenes will be recorded louder or quieter than others as the director and engineer feel is appropriate. Just because you set the MV to 0 dB, that doesn't mean loud scenes in a movie are hitting the maximum possible levels.
When I state +5 dB or -10 dB, I simply mean where the main volume control sits when listening after a similar calibration method (test DVD). I have observed differences between test tones in various processors; sometimes it's the type of signal. There are also all sorts of odd things which happen with dialog normalization, and it's possible some processors don't always handle things the same way, opening more opportunity for internal test-tone calibrations to give differing results. Base-lining with a test DVD helps minimize such differences.
Yup, "reference level" is more or less the standard signal alignment level for a system, so everyone in the production chain has a good idea of the baseline levels in use. Calibrating the end consumer receiver's master volume to that standard is the last link in the alignment chain.
DD Dialnorm will change output levels on a Dolby track. For a specific example, if you use War of the Worlds as a test DVD and play the DD track, its Dialnorm value is -23, which reduces volume by 8 dB. The DTS track does not use Dialnorm, so it plays back with no reduction in level. Which version (DD or DTS) plays back at "reference level" at the identical calibrated master volume setting? Both play back at "reference level" when you come right down to it, even though they do not play back at the same volume, the DTS track being 8 dB louder than the DD track.
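The Dialnorm arithmetic described above can be sketched as follows (assuming the standard Dolby behavior, where a dialnorm of -31 is unity gain and less negative values are attenuated by the difference):

```python
# Dolby Digital dialnorm: -31 means no attenuation; any value closer to
# zero is attenuated by the difference, so dialogue lands at a consistent
# average level across differently mixed tracks.

DIALNORM_UNITY = -31  # dialnorm value that produces no attenuation

def dialnorm_attenuation_db(dialnorm: int) -> int:
    """Attenuation in dB a decoder applies for a given dialnorm value."""
    return dialnorm - DIALNORM_UNITY

print(dialnorm_attenuation_db(-23))  # the War of the Worlds DD track: 8 dB cut
print(dialnorm_attenuation_db(-31))  # unity: no cut
```

This is why the DTS track, which carries no dialnorm, measures 8 dB hotter than the DD track at the same master volume setting.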
In addition, THX receivers assume that a typical DVD uses a DD Dialnorm value of -27 (which yields THX Reference Level), while a straight Dolby Digital receiver assumes a DD Dialnorm value of -31 (which yields Dolby Reference Level). That gives you a 4 dB difference in playback calibration level between those two types of units (THX and DD).
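That 4 dB figure falls straight out of the two assumed dialnorm baselines, a one-liner worth seeing (the -27 and -31 values are the THX and Dolby assumptions stated above):

```python
# The two ecosystems assume different "typical" dialnorm values, so the
# same calibration lands 4 dB apart between a THX unit and a plain DD unit.

THX_ASSUMED_DIALNORM = -27    # THX reference baseline
DOLBY_ASSUMED_DIALNORM = -31  # Dolby reference baseline

offset_db = THX_ASSUMED_DIALNORM - DOLBY_ASSUMED_DIALNORM
print(offset_db)  # dB difference between the two calibration baselines
```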
To make matters worse, not all receivers use 0 dB on the master volume control as the calibrated "reference level" position. On my receiver, -22 dB on the master volume is the calibrated "reference level" position.
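Once you know where reference sits on your own dial, translating a dial reading into "dB relative to reference" is simple, assuming a 1-dB-per-step volume control (an assumption; dial tapers vary by model, and the -22 figure below is just the example from this receiver):

```python
# Convert a master-volume dial reading into dB above/below calibrated
# reference, given where reference sits on this particular unit's dial.

REFERENCE_MV_POSITION = -22.0  # dial position measured to equal reference

def level_re_reference(mv_position: float) -> float:
    """dB above (+) or below (-) reference for a given dial position."""
    return mv_position - REFERENCE_MV_POSITION

print(level_re_reference(-22.0))  # at reference
print(level_re_reference(-32.0))  # 10 dB below reference
print(level_re_reference(0.0))    # well above reference on this unit
```

This is why comparing raw MV numbers between owners is meaningless without knowing each unit's calibrated reference position.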
As you noted about having a base-line, I do check my receiver using straight recorded PCM test signals, the internal receiver test signals, and the THX Optimizer test signals from the Pirates of the Caribbean DVD. On my particular standard DD receiver, all three calibration methods match up as far as master volume calibration is concerned.