Originally Posted by Heinrich S
So due to Ohms Law, an amp that claims high current but has a lower wattage is just BS? Like NAD amps compared to Yamaha amps in the same price range. They tend to produce more power in all channels than Yamaha do according to test bench results. I figure that has something to do with the additional current in the NAD, but maybe NAD are just fudging the numbers on the spec sheets so that when they get tested the results look better than they are?
What you say here actually leads to the answer you are looking for, but it is not obvious. As noted before, you can use Ohm's law to arrive at Power = V*V/R. It seems, therefore, that current does not enter the equation and all that matters is the voltage. Indeed, when designing a power amplifier, the first thing you do is decide on the voltage, as that sets the upper limit on how much power you can have. A car audio amplifier, for example, that uses the car's battery voltage of, say, 13 volts as is, will only be able to produce 13 * 13 / 8 = 21 watts into an 8 ohm load. This is why typical car stereos have power in this range. High-power car audio amps get around this limit by having dc-to-dc converters that raise this voltage substantially in order to create, say, hundreds of watts. The car battery can produce a huge amount of current, but that is of no value at its low standard voltage without the dc-to-dc converter to raise it.
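To make the arithmetic concrete, here is a small sketch of the same calculation. It uses the simplified model from this post, P = V²/R, ignoring output-stage losses and RMS-vs-peak distinctions; the 90 V boosted rail is just a made-up example figure.

```python
# Simplified model from the post: P = V^2 / R.
# Ignores output-stage losses and RMS/peak distinctions.

def max_power_watts(voltage: float, load_ohms: float) -> float:
    """Upper limit on power into a resistive load at a given supply voltage."""
    return voltage ** 2 / load_ohms

# Car battery straight into an 8-ohm speaker:
print(max_power_watts(13.0, 8.0))   # ~21 W

# Same load after a dc-to-dc converter boosts the rail to a hypothetical 90 V:
print(max_power_watts(90.0, 8.0))   # ~1012 W
```

The voltage term is squared, which is why boosting the rails pays off so dramatically: raising the voltage by 7x buys nearly 50x the power ceiling.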
So what about the current? As I said, it seems not to matter, since I just computed the power using voltage alone. The answer to that lies in the example you give. A home theater AVR may be rated at 100 watts each with two channels driven, but with all 5 channels driven, the output may drop to 40 watts/channel. Ohm's law and the computation above seem to fail to explain this. After all, if the amp had sufficient voltage to produce 100 watts, and the load is kept constant, how did it suddenly fail to produce that same power when all channels were driven? That is your question, right? The answer is that the voltage did not stay constant. It actually dropped due to lack of current. The power supply had insufficient current to drive all five channels to 100 watts. When it runs out of current, its voltage sags. Once that voltage drops, the formula above predicts that the power output drops right along with it. In other words, in practice the voltage depends on the current draw due to design limitations.
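A toy model makes the sag effect visible. Here the supply is treated as an ideal voltage source behind an internal resistance, so the more channels drawing current, the lower the rail. The open-circuit voltage and resistance values below are entirely hypothetical, picked only so the numbers roughly reproduce the 100 W / 40 W behavior described above:

```python
# Toy model of power-supply sag (hypothetical numbers, illustration only).
# Supply = ideal source behind an internal resistance; more channels
# drawing current means a lower rail, and P = V^2 / R does the rest.

def per_channel_power(channels: int, open_circuit_v: float = 46.0,
                      supply_resistance: float = 2.5,
                      load_ohms: float = 8.0) -> float:
    # Each channel draws roughly V / load_ohms, so the total draw
    # sags the rail: V = V0 - Rs * n * (V / R).
    # Solving for V gives: V = V0 / (1 + n * Rs / R)
    v = open_circuit_v / (1 + channels * supply_resistance / load_ohms)
    return v ** 2 / load_ohms

print(round(per_channel_power(2)))  # ~100 W/channel with 2 channels driven
print(round(per_channel_power(5)))  # ~40 W/channel with all 5 driven
```

Same loads, same amp circuit, yet per-channel power collapses purely because the rail voltage sags under the combined current draw.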
There are actually two solutions to the above. One is to boost the power supply capacity to power all the channels. The other, as Pioneer does, is to use more efficient class-D amplification. A ton of current/power is wasted as heat in traditional amplifiers. That is why those giant heat sinks are there: to keep the output transistors from getting too hot and destroying themselves. Here is a quick graph showing the difference between two classes of amps:
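A quick back-of-envelope sketch shows why amplifier class matters so much for heat. The efficiency figures below are typical ballpark numbers (class AB roughly 50% near full power, class D around 90%); actual figures vary with the design and drive level:

```python
# Why amplifier class matters for heat: everything drawn from the
# supply that doesn't reach the speakers becomes heat.
# Efficiencies are ballpark assumptions, not measured figures.

def heat_watts(output_power: float, efficiency: float) -> float:
    """Supply power minus delivered audio power = heat to dissipate."""
    input_power = output_power / efficiency
    return input_power - output_power

five_channels = 5 * 100.0  # 500 W of audio output, all channels driven

print(round(heat_watts(five_channels, 0.50)))  # class AB: ~500 W of heat
print(round(heat_watts(five_channels, 0.90)))  # class D:  ~56 W of heat
```

Roughly an order of magnitude less heat for the same output, which is why the class-D receiver gets by without the giant heat sinks.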
I know NAD has some class-D designs but have not looked at their AVRs to know whether this is how they get the additional power, or whether it is simple brute force from a beefier power supply.
As I think was noted, "audiophile" products are commonly over-built in this department. Their market is not price-sensitive, and it is easy to simply beef up the various subsystems to have ample current and extra headroom. This also reduces the crosstalk you may get between channels, where pushing one channel hard causes the output of the others to droop.
In AVRs, I am a huge fan of these higher-efficiency amps. People often put these things in cabinets and such, with reduced airflow, and the result is poor reliability. My Pioneer Elite runs cool in a totally enclosed cabinet. I have a fan but never turn it on. It actually produces less heat than my DSL modem! The prior Onkyo I had would heat up this large cabinet so much that the granite top above it got too hot to touch. We are talking about an 8-foot by 3-foot slab of stone getting that hot! The fact that the Pioneer maintains its output across multiple channels is just a bonus.