Originally Posted by GIEGAR
Are you saying that the 0 dB reference level is set at a fixed/known voltage at the pre amp?
Yes. It is as stated, 1.0 or 1.2 V.
Originally Posted by GIEGAR
Here are extracts from the owner's manuals of a couple of flagship AVRs. The Denon appears to give all the conditions for the rated output quoted, but does not give the maximum output at all. Denon do rate the digital output, though. The Onkyo gives both rated and maximum outputs, but it's unclear (to me) under what conditions these are quoted. I presume the maximum output would be just below clipping of the pre amp output. There's no real question here, just an observation that meaningful info is lacking, even at the top of the range. It's almost to the point of deliberately making it difficult to compare capabilities.
The comparisons between distortion and noise are not invalidated by the 1.0 vs. 1.2 V reference levels. There's only 1.6 dB of level offset between them. There are other factors that will make a bigger difference, especially in noise specs, like what bandwidth was used to measure the noise. Again, there are industry standards, but unless they are stated, one cannot assume what was done.
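To see where that 1.6 dB figure comes from, here is the standard 20·log10 voltage-ratio arithmetic as a tiny Python sketch (nothing here is manufacturer-specific):

```python
import math

def level_offset_db(v_a: float, v_b: float) -> float:
    """Level difference in dB between two voltages (20*log10 rule)."""
    return 20 * math.log10(v_a / v_b)

# Offset between the two common 0 dB reference levels:
print(level_offset_db(1.2, 1.0))  # ~1.58 dB, i.e. roughly 1.6 dB
```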
Originally Posted by GIEGAR
If so, how are voltage adjustments made to account for variables after the pre amp (e.g. power amp gain, speaker sensitivity, listening distance, room characteristics, etc.) to achieve reference calibration? Or have I got this completely balls-up?
There is no direct relationship between rated output and the rest of the playback chain or its calibration; the variables you list are absorbed by the per-channel level trims that auto-calibration sets after measuring each speaker with the microphone, not by the pre-out voltage itself. That said, it would be nice if the rated level were generally relevant to actual use, and usually it is.
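For illustration only, here is a back-of-envelope sketch of how those downstream variables would combine if you did the chain math by hand. All the numbers (29 dB amp gain, 88 dB/2.83 V/m sensitivity, 3 m seat) are hypothetical; a real AVR never computes this, it just measures each channel and stores a trim:

```python
import math

def estimated_spl(preout_v: float, amp_gain_db: float,
                  sensitivity_db: float, distance_m: float) -> float:
    """Rough free-field SPL estimate for one channel.

    preout_v       -- pre-out voltage (e.g. the 1.2 V reference level)
    amp_gain_db    -- power amp voltage gain in dB
    sensitivity_db -- speaker SPL at 1 m for a 2.83 V input
    distance_m     -- listening distance (inverse-square law assumed)
    """
    speaker_v = preout_v * 10 ** (amp_gain_db / 20)   # voltage at the speaker
    spl_1m = sensitivity_db + 20 * math.log10(speaker_v / 2.83)
    return spl_1m - 20 * math.log10(distance_m)       # distance attenuation

# Hypothetical chain: 1.2 V reference, 29 dB gain, 88 dB speaker, 3 m seat
print(round(estimated_spl(1.2, 29.0, 88.0, 3.0), 1))  # ~100.0 dB SPL
```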
Originally Posted by GIEGAR
Do manufacturers choose an approximate 0 dB reference level setting and state that the pre outs can supply at least that level of low-distortion signal?
Well, they choose an exact reference level, and hopefully they state it.
So, I think you can see that the rated output level is not related to the maximum output level, which is all I was trying to say originally.
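To illustrate that distinction with made-up numbers (taken from neither the Denon nor the Onkyo manual): a unit might be rated at 1.2 V for the 0 dB reference while its pre-out stage clips somewhere around 4 V. The two specs answer different questions, and the gap between them is simply headroom:

```python
import math

rated_v = 1.2  # hypothetical rated output at the 0 dB reference level
max_v = 4.0    # hypothetical maximum output, just below pre-amp clipping

headroom_db = 20 * math.log10(max_v / rated_v)
print(round(headroom_db, 1))  # ~10.5 dB of headroom above the rated level
```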