Originally Posted by svtfast
When I read item descriptions they say, " 90 watts each channel etc etc etc."
When I look at the specs, I see this:
Those are simply minimum continuous sine wave (colloquially "RMS") power ratings for various test scenarios. In your example there are four different test scenarios. Each scenario has various test conditions, including the dummy resistive load (e.g. 8Ω) the amp is hooked up to, the bandwidth of the test signal(s) (e.g. 20Hz - 20kHz; 1kHz) applied to the amp's input, the distortion threshold at which the measurement is taken (e.g. 0.08% THD), and the number of associated channels driven simultaneously. Different scenarios generally reflect accepted or mandated power rating conditions in various jurisdictions around the globe.
For AVRs sold in the US market, the manufacturer is obliged to supply (somewhere in their product literature) at least the power rating scenario mandated by your FTC. In your example, those are the top two scenarios, with the only difference being the THD threshold. The manufacturer can then also supply other, less stringent scenarios (e.g. 6Ω; 1kHz; 1% THD; 1 channel) to yield higher power ratings. It's typically this figure that they multiply by the number of amp channels to produce a "headline" power figure for the unit (e.g. 185W x 7 = 1295W!).
When comparing power ratings, I recommend you use the "FTC" ratings as at least they're consistent and mandated. Having said that, I think the rated power figures are among the least
important specs to consider when choosing an AVR. More important are the features and connectivity you want/need and getting yourself as far up the manufacturer's auto EQ/room correction routine "food chain" as your budget will reasonably allow. Once you get to this level, the power rating will generally have taken care of itself.
For a good reality check if/when you begin to obsess about incremental increases in rated power, compare a few by inserting them into this calculator
and seeing the differences in gain expressed as dBW (dB referenced to 1W). For example, the difference between 80W and 90W in your example is only about 0.5dB of gain. That equates to an inaudible difference in SPL.
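If you'd rather not use an online calculator, the same comparison is easy enough to do yourself. Here's a minimal sketch in Python (the 80W/90W figures are just the example from above, not any particular model's specs):

```python
import math

def power_gain_db(p1_watts: float, p2_watts: float) -> float:
    """Gain difference, in dB, between two amplifier power ratings."""
    return 10 * math.log10(p2_watts / p1_watts)

# Example: 80W vs 90W rated output
print(f"{power_gain_db(80, 90):.2f} dB")   # ~0.51 dB - an inaudible difference
# Even doubling rated power only buys ~3 dB
print(f"{power_gain_db(80, 160):.2f} dB")  # ~3.01 dB
```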
Originally Posted by svtfast
My speakers run on 6 ohms so should I pay attention to output on 6 ohms? I guess I need to read up on this.
No, ignore them. Those are simply test scenarios that (while being acceptable somewhere around the world) are designed to extract the maximum possible power rating out of the amps and don't reflect the output of the amps at any given time during actual usage. They reflect unrealistic conditions, in particular a 10% THD threshold that would be totally intolerable when playing program material and would also represent a very hazardous level of distortion for your speakers.
In addition, the 6Ω rating of your speakers is a nominal impedance assigned by the manufacturer to represent the speaker's frequency-variant impedance. For example, below is an impedance trace for a speaker the manufacturer specs as "6Ω nom; 4Ω min." The power dissipated by the speaker rapidly rises and falls with the program material and is a product of the voltage applied across the terminals and the current drawn by the (complex) load it presents to the amp (P = V x I).
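To put rough numbers on that, here's a simplified sketch that treats the load as purely resistive and ignores phase angle; the 20V RMS drive level is just a made-up illustration, not a figure from any spec sheet:

```python
def power_into_load(v_rms: float, impedance_ohms: float) -> float:
    """Approximate power (watts) into a resistive load: P = V * I, where I = V / Z."""
    current = v_rms / impedance_ohms
    return v_rms * current

# Same 20V RMS from the amp, but the speaker's impedance dips with frequency
for z_ohms in (8, 6, 4):
    print(f"{z_ohms} ohm: {power_into_load(20, z_ohms):.0f} W")
# 8 ohm: 50 W, 6 ohm: 67 W, 4 ohm: 100 W - the amp has to deliver more
# current (and power) wherever the impedance curve dips toward its minimum.
```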
Are you scoping out new model Denons? If you don't actually need the features of these new models, you can probably save some money and/or get further up the model lineup with older, superseded models.
Out of interest, what speakers will the AVR be driving?