No, it is not a rewrite.
1. A "10W" LED is not going to consume 10 W of power regardless of the voltage applied.
Any LED has a specified forward voltage drop and a recommended operating current (or range of currents). You pick the current you want it to operate at, then compute the resistor value required, for a given supply voltage, to obtain that current.
Let's make up some numbers. Forward voltage drop = 2 V. Recommended current = 100 mA.
Required resistor for a 24 V supply: R = (24 V − 2 V) / 0.1 A = 220 Ω.
Drop the supply to 12 V and the current falls to approximately 45 mA.
Drop the supply to 5 V and the current falls to approximately 13 mA.
So, no, lower voltage does not result in higher current.
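The arithmetic above can be sketched in a few lines of Python (the 2 V drop and 100 mA target are the made-up values from the example, and the forward drop is treated as constant, which is only an approximation for a real LED):

```python
VF = 2.0          # assumed LED forward voltage drop, volts
I_TARGET = 0.100  # assumed target operating current, amps

def series_resistor(v_supply):
    """Resistor needed to set the target current at a given supply voltage."""
    return (v_supply - VF) / I_TARGET

def led_current(v_supply, r):
    """Approximate current once the resistor is fixed (Vf assumed constant)."""
    return (v_supply - VF) / r

R = series_resistor(24.0)
print(R)                                      # 220.0 ohms
print(round(led_current(12.0, R) * 1000, 1))  # 45.5 mA
print(round(led_current(5.0, R) * 1000, 1))   # 13.6 mA
```

As the last two lines show, lowering the supply voltage with a fixed resistor lowers the current.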
2. It is an oversimplification to say that "more current causes more heat". To say it again, the heat produced is proportional to the power dissipated.
So if you double the current but also double the cross-section of the wire (same material, etc.), the heating per unit volume of wire will not change, because the current density, and therefore the power dissipated per unit volume, has not changed.
An extreme case is a superconductor, which has no resistance and thus dissipates no power, and therefore produces no heat.
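Point 2 can be checked numerically. The dissipation per unit volume of a wire is J²ρ, where J = I/A is the current density and ρ is the resistivity; the copper resistivity below is an assumed round value for illustration:

```python
RHO = 1.7e-8  # assumed resistivity of copper, ohm-metres

def power_per_volume(current, area):
    """Dissipation per unit volume of wire: J^2 * rho, with J = I / A."""
    j = current / area  # current density, A/m^2
    return j * j * RHO

p1 = power_per_volume(10.0, 1.0e-6)  # 10 A through 1 mm^2
p2 = power_per_volume(20.0, 2.0e-6)  # 20 A through 2 mm^2
print(p1 == p2)  # True: same current density, same heating per unit volume
```

Doubling both the current and the cross-section leaves the current density, and so the heating per unit volume, unchanged.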
3. To allow a circuit to operate with a 32A breaker instead of a 20A breaker, you must allow the wire (of the same gauge, material, etc.) to dissipate more power and thus generate more heat. This is true whether the circuit is 120V, 240V or 440V. So it must be a difference in the local electrical code that permits the 32A breaker.
The "advantage" of higher voltage is that you can deliver more power at the same current, OR the same power at a lower current.
And you can also be electrocuted more easily.
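The power/current trade-off can be illustrated with P = V × I; the 4800 W load below is an assumed example figure:

```python
def current_for_power(power_w, voltage_v):
    """Current drawn by a load of the given power at the given voltage (P = V * I)."""
    return power_w / voltage_v

LOAD_W = 4800.0  # assumed load, watts
for v in (120.0, 240.0, 440.0):
    print(f"{v:.0f} V -> {current_for_power(LOAD_W, v):.1f} A")
# 120 V -> 40.0 A, 240 V -> 20.0 A, 440 V -> 10.9 A
```

The same load draws half the current at 240 V as at 120 V, which is why higher-voltage circuits can use thinner conductors for the same power.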