An example of the above: if a receiver has 0.05 percent distortion at 70 watts per channel from 40 Hz to 20 kHz, but in the octave below 40 Hz the distortion rises to 0.3 percent...still at 70 watts...the manufacturer may rate it at 70 watts per channel at 0.05 percent THD from 40 Hz to 20 kHz, or, alternatively, at 70 watts per channel from 20 Hz to 20 kHz at 0.3 percent THD. Both would be accurate specs.
If, as is usual, the distortion is more consistent across the receiver's bandwidth at lower power levels...say, at 60 watts per channel the receiver never exceeds 0.02 percent distortion anywhere from 20 Hz to 20 kHz...then you could rate the receiver's power at 60 watts per channel, 20 Hz to 20 kHz, at 0.02 percent THD. A rough sketch of how those alternative ratings fall out of the same measurements is below.
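To make the spec-juggling concrete, here's a rough Python sketch. The band edges and THD numbers are just the hypothetical figures from the example above, not real measurements: given worst-case distortion per band at a given power level, it reports the THD figure the spec has to carry for whatever bandwidth you choose to quote.

```python
# Hypothetical measurements for the example receiver: worst-case THD (percent)
# per frequency band at two power levels. Made-up numbers matching the example.
thd_at_70w = {
    (20, 40): 0.3,       # the octave below 40 Hz is the weak spot
    (40, 20000): 0.05,   # everything above 40 Hz holds 0.05%
}
thd_at_60w = {
    (20, 40): 0.02,
    (40, 20000): 0.02,
}

def worst_thd(measurements, lo_hz, hi_hz):
    """Worst-case THD over the quoted bandwidth: the spec must cover
    every band that overlaps the range being claimed."""
    return max(thd for (band_lo, band_hi), thd in measurements.items()
               if band_hi > lo_hz and band_lo < hi_hz)

# Two equally accurate ways to rate the same amp at 70 watts:
print(f"70 W, 40 Hz-20 kHz: {worst_thd(thd_at_70w, 40, 20000)}% THD")
print(f"70 W, 20 Hz-20 kHz: {worst_thd(thd_at_70w, 20, 20000)}% THD")
# Or back the power off and quote the full bandwidth at the lower figure:
print(f"60 W, 20 Hz-20 kHz: {worst_thd(thd_at_60w, 20, 20000)}% THD")
```

Running it prints 0.05% for the 40 Hz-20 kHz rating, 0.3% for the full-bandwidth rating at 70 watts, and 0.02% at 60 watts...the same measurements, three defensible spec sheets.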
It depends on which figure the manufacturer would rather emphasize...power output or distortion.
With real program material, to most people, there will probably be no audible difference between 1 percent distortion and 0.02 percent distortion...but one looks far more impressive on paper. The ear is surprisingly tolerant of harmonic distortion at low frequencies...particularly in the two octaves below 100 Hz. Plus, speakers routinely have distortion figures that are orders of magnitude higher than electronics...5 percent THD at 40 Hz at a sound pressure level of 100 dB at one meter would actually be quite good for a small two-way speaker. For an amplifier, that figure would look God-awful.
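For a sense of how far apart those percentages really are, here's a quick back-of-the-envelope conversion to decibels below the fundamental...plain 20·log10 arithmetic, nothing specific to any amp or speaker, and treating the THD percentage as a simple amplitude ratio:

```python
import math

def thd_to_db_below_fundamental(thd_percent):
    """How far the distortion products sit below the fundamental, in dB,
    treating the THD percentage as an amplitude ratio."""
    return -20 * math.log10(thd_percent / 100.0)

for label, thd in [("speaker, 5% THD at 40 Hz", 5.0),
                   ("amp spec, 0.3% THD", 0.3),
                   ("amp spec, 0.02% THD", 0.02)]:
    print(f"{label}: roughly {thd_to_db_below_fundamental(thd):.0f} dB "
          "below the fundamental")
```

That works out to roughly 26 dB down for the speaker versus about 50 and 74 dB down for the two amp figures...which is the "orders of magnitude" gap in a nutshell, and part of why a number that would sink an amplifier's spec sheet is perfectly respectable for a small speaker reproducing bass.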