Originally Posted by Th3_uN1Qu3
Power (Volts x Amps = Watts) dictates the gauge of the wire, not current alone. If you have an 8-ohm, 250 watt speaker, you would use the same cable for a 4-ohm, 250 watt speaker.
The main advantage of a 4 ohm speaker is that vs an 8 ohm it requires half the voltage swing for a given output power. Semiconductors with high current ratings are more reliable and cheaper to make than ones with high voltage ratings, that's why high power speakers are found in 4 ohm and 2 ohm versions only.
That's funny - I have NEVER seen a loudspeaker cable rated for wattage. Maybe you can point out some?
If you want to keep everything else equal, you DO need a larger gauge wire for a lower impedance load. That is because for an equal wattage, more current will be flowing through the cable attached to the lower impedance load.
Since the power lost in the cable is I^2 * R, that extra current results in more signal loss in the cable AND a lower damping factor on the system.
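Here's a quick back-of-the-envelope in Python showing both effects. The 0.2 ohm round-trip cable resistance (roughly 25 ft of 16 AWG zip cord) and the 0.05 ohm amplifier output impedance are assumed figures for illustration only - plug in your own:

import math

R_CABLE = 0.2    # ohms, assumed round-trip cable resistance (~25 ft of 16 AWG)
AMP_ZOUT = 0.05  # ohms, assumed amplifier output impedance
P = 100.0        # watts delivered to the load

for r_load in (8.0, 4.0):
    i = math.sqrt(P / r_load)            # load current for P watts
    p_cable = i ** 2 * R_CABLE           # I^2*R loss dissipated in the cable
    df = r_load / (AMP_ZOUT + R_CABLE)   # damping factor as seen at the speaker
    print(f"{r_load:.0f} ohm load: {i:.2f} A, cable loss {p_cable:.2f} W, damping factor {df:.0f}")

Same cable, same 100 watts: the 4 ohm load draws 5 A instead of 3.54 A, burns twice the power in the cable, and the damping factor is cut in half.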
The advantage of higher impedance shows up in several different types of systems. In audio there are the standard "70V" distributed systems, in which you hook a bunch of high impedance loudspeakers across the line. You don't get the losses across ordinary cable that you would using lower impedance loudspeakers.
The same goes for AC power distribution. By going to higher voltages and less current, the power company has less loss across the cable.
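A rough sketch of the same idea in numbers - assuming a 0.5 ohm cable run and 100 watts delivered, once at 70.7 V line level and once driving a 4 ohm speaker directly (this ignores the step-down transformers a real 70V system uses; the point is the line current):

import math

R_CABLE = 0.5  # ohms, assumed long cable run
P = 100.0      # watts delivered

for label, v in (("70V line", 70.7), ("4 ohm direct", math.sqrt(P * 4.0))):
    i = P / v                 # line current for P watts at this voltage
    loss = i ** 2 * R_CABLE   # I^2*R loss in the cable
    print(f"{label}: {v:.1f} V, {i:.2f} A, cable loss {loss:.2f} W")

That's 1 W lost on the 70V line versus 12.5 W lost feeding the 4 ohm speaker over the same cable.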
Also, a 4 ohm load DOES NOT require half the voltage swing. It requires 70.7% of the voltage that an 8 ohm loudspeaker does, for the same wattage, since V = sqrt(P * R).
For 100 watts, a 4 ohm driver requires 20 V. For 8 ohms at 100 watts it is 28.28 V - NOT 40 volts.
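Anyone can check that arithmetic:

import math

for r in (4.0, 8.0):
    v = math.sqrt(100.0 * r)  # voltage needed for 100 W into r ohms
    print(f"100 W into {r:.0f} ohms -> {v:.2f} V")

print(math.sqrt(4.0 / 8.0))   # 0.707 - the 4 ohm load needs 70.7% of the voltage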
Double the voltage is 6 dB more, not 3.
Voltage (and distance) ratios go as 20 * log10(x1/x2), while wattage goes as 10 * log10(x1/x2).
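And checking the dB figures:

import math

print(20 * math.log10(2))  # doubling voltage: ~6.02 dB
print(10 * math.log10(2))  # doubling power:   ~3.01 dB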