A few points:
First, distortion is rated at the stated power level--that in no way means it is the distortion you get when the amp is not driven at that level.
For instance, a Texas Instruments chip amp is rated at various points: roughly 0.004% distortion at 5 watts, 0.1% at 60 watts, and 10% at 110 watts.
Distortion in amplifiers is not linear: it starts fairly high at very, very low levels (around 1/100th of a watt), rapidly falls to very low levels, then rises again as the amp nears its limits.
It is not uncommon for an amp to put out 0.01% or less distortion from 1 to 25 watts and stay really low until it gets close to its maximum level (say 100 watts). The more power it puts out, the more distortion it has; ALL amplifiers will hit 10% and higher distortion as you get close to maxing them out.

The rated level of distortion is up to the manufacturer. You can take the exact same amplifier and rate it various ways. Want impressive THD specs? Just rate the output power lower to get those 0.005% figures. Want more impressive power numbers? Just move up the power scale and accept the higher distortion figure. A 100 watt per channel amp at 0.01% distortion can be the same exact amp as one rated for 140 watts per channel at 0.7% distortion, or 200 watts at 10% distortion.
They call those distortion charts "hockey sticks" because they start with medium distortion at very, very low levels, decline rapidly as you approach 1/10th of a watt, stay very low for most of the chart, then kick up rapidly as the amp maxes out.
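If it helps to see the shape, here is a little Python sketch of that hockey-stick curve and the rating game. Every number in it (the noise floor, the clipping point, the hypothetical 115-watt amp) is made up for illustration, not measured from any real amplifier.

```python
import math

def thd_percent(p_watts, noise_w=1e-8, residual_pct=0.004,
                clip_start=100.0, max_w=115.0):
    """Toy THD+N (%) vs output power for a made-up ~100 W amp."""
    # Low power: a fixed residual noise floor dominates, so THD+N falls
    # as output rises (noise/signal voltage ratio = sqrt(Pn/Ps)).
    noise_term = 100.0 * math.sqrt(noise_w / p_watts)
    # Near the rails: clipping distortion climbs steeply.
    clip_term = 0.0
    if p_watts > clip_start:
        clip_term = 10.0 * ((p_watts - clip_start) / (max_w - clip_start)) ** 2
    return noise_term + residual_pct + clip_term

def rated_power(thd_limit_pct):
    """The rating game: highest power where modeled THD stays under a limit."""
    best = 0.0
    for i in range(1, 231):          # sweep in 0.5 W steps up to 115 W
        p = i * 0.5
        if thd_percent(p) <= thd_limit_pct:
            best = p
    return best

for p in (0.01, 0.1, 1, 10, 100, 110, 115):
    print(f"{p:7.2f} W -> {thd_percent(p):6.3f}% THD+N")
for limit in (0.01, 0.7, 10.0):
    print(f"rated at {limit}% THD -> about {rated_power(limit):.0f} W")
```

Same amp, three different "rated" powers depending on which THD point you quote--that is the whole trick behind the spec-sheet numbers.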
For this reason, it is best to get a distortion CHART that shows all the information at multiple impedances--say 8 and 4 ohms--and make your decision from that chart. In professional sound, it is common to state the output at 0.5% distortion; this gives a very high output rating, and generally the limiter will start to engage around there to protect the amplifier (and, to a certain point, the speakers). The actual operating distortion can be much lower, 0.01% to under 0.1% over most of the range--it is just how they are rated. Slam a speaker with 500 watts of power and it will distort far more than 0.5%, so amp distortion does not matter at that level; the speaker will be creating far more distortion, which will mask anything the amp does (as long as you keep the limiter light at bay).
The other interesting thing amps do is generate more power at lower impedances, but that also creates more distortion--another game the manufacturers can play.
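The reason, roughly: an ideal amplifier acts like a voltage source, so the same output voltage into half the impedance delivers twice the power (P = V^2/R). Real amplifiers fall short of that doubling, and supplying the extra current is part of why low-impedance specs usually carry higher distortion. A quick sketch of the ideal case; the voltage figure is just an assumption for illustration:

```python
def power_watts(v_rms, impedance_ohms):
    """Power delivered by an ideal voltage source: P = V^2 / R."""
    return v_rms ** 2 / impedance_ohms

v = 28.3  # ~28.3 V RMS is roughly 100 W into 8 ohms
for z in (8, 4, 2):
    print(f"{z} ohms -> {power_watts(v, z):.0f} W")  # 100 / 200 / 400 W ideal
```

If a real amp's 4 ohm rating is well under double its 8 ohm rating, its power supply or output stage is running out of current--worth checking on the spec sheet.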
In summation, the headline numbers really don't mean anything, so get a THD chart for the amplifier you want at the impedance you plan to use. Yes, a $5 chip can go down to 0.005% distortion at 5 watts of output, so you don't need to spend a huge amount to get the numbers you desire--no point in having a numbers battle when the chart will show you the full story. Try to find charts that do full-bandwidth testing: 20 Hz to 20 kHz for your mains, and 5 Hz to 200 Hz for your subwoofer amplifiers if you want ULF for sci-fi and war movies. Some amplifiers filter out frequencies below 10 to 20 Hz, so a chart that shows the ULF band will guide you to the proper subwoofer amplifier.
For this reason, I don't waste time with amplifier "reviews"; I look for actual amplifier testing with full-bandwidth THD charts at 8 ohms and at whatever the minimum impedance of the amplifier is (1 to 6 ohms depending on the amplifier). Hope this helps you in your quest for amplifiers to meet your needs.