A plot of a typical amplifier's THD percentage versus power output usually shows a sharp increase in distortion at both ends of the power range: steeply rising distortion at low power levels of less than one watt, and again at the upper end of the amplifier's output capability. Is the rise at the low-power end due to the noise floor becoming significant as a percentage of the output signal, or is there some other explanation?

I'm told that most listening, in home systems with moderately efficient speakers, occurs at levels of around one watt per channel or less. If that's the case, should one be concerned about an amplifier's distortion behavior below one watt? Should this be a critical point to focus on when comparing specifications across amplifiers and receivers, or is it generally inaudible?
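
To make my noise-floor hypothesis concrete, here's a rough back-of-envelope sketch. The numbers are made up purely for illustration (a fixed 100 µV residual noise floor into 8 Ω and a constant 0.01% harmonic distortion); the point is just that if the noise stays fixed while the signal shrinks, the measured THD+N percentage climbs at low output even though nothing about the "real" distortion has changed:

```python
# Sketch of the hypothesis: a fixed residual noise floor becomes a larger
# fraction of the output as power drops, so measured THD+N (%) rises at low
# levels even if the harmonic distortion itself stays flat.
# All values below are assumptions for illustration only.
import math

NOISE_FLOOR_V = 100e-6   # assumed fixed residual noise, volts RMS
HARMONIC_DIST = 0.0001   # assumed constant harmonic distortion (0.01%)
LOAD_OHMS = 8.0

def thd_plus_n_percent(power_watts):
    signal_v = math.sqrt(power_watts * LOAD_OHMS)       # RMS signal voltage into the load
    harmonic_v = signal_v * HARMONIC_DIST                # harmonic residual scales with signal
    residual_v = math.hypot(harmonic_v, NOISE_FLOOR_V)   # RSS sum of distortion and noise
    return 100.0 * residual_v / signal_v

for p in (0.01, 0.1, 1.0, 10.0, 100.0):
    print(f"{p:6.2f} W -> THD+N = {thd_plus_n_percent(p):.4f} %")
```

With those assumed numbers the percentage roughly triples going from 100 W down to 10 mW, which looks a lot like the left-hand rise on the published curves. Whether that's actually the whole story is what I'm asking.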