You're going to have a lot of signal loss with a length that long.
No, the signal loss for 100 feet will be very small.
Assume the coax is equivalent to 22 gauge wire for both the shield and the center conductor (not quite accurate, but close enough). The round trip resistance for 100 feet will be about 3 ohms. The input impedance of the subwoofer amplifier will be much higher, anywhere from 20 K to 200 K Ohms, which makes the wire's contribution so small it can be ignored.
Treating this as a DC resistance problem, which is fair enough at audio frequencies, we use I = E/R to get the current. Assuming a 1 volt signal, into 20 K Ohms the current will be about 0.00005 amps, and into 200 K Ohms it will be about 0.000005 amps.
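If you want to check those numbers yourself, here is a minimal sketch of that I = E/R step (the 1 volt signal and the 20 K / 200 K impedance figures come from the discussion above):

```python
def load_current(volts, ohms):
    """Ohm's law: current drawn by the amplifier input."""
    return volts / ohms

# 1 V signal into the two impedance extremes mentioned above
for z in (20_000, 200_000):
    print(f"1 V into {z} ohms -> {load_current(1.0, z)} A")
```

Running it prints 5e-05 A for 20 K Ohms and 5e-06 A for 200 K Ohms, matching the figures above.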
Voltage drop is a function of current. To estimate the drop per 100 feet given the load current and wire gauge, the standard copper-wire approximation (AWG 10 is about 0.1 ohm per 100 feet, and resistance multiplies by 10 for every 10 gauge numbers) gives:

VD = IL x 2 x 0.1 x 10^((AWG - 10) / 10)

VD = Voltage drop per 100 feet (Volts)
IL = Current load (Amps)
AWG = Wire gauge

The 0.1 x 10^((AWG - 10)/10) term is the resistance of 100 feet of one conductor; the factor of 2 covers the round trip out through the center conductor and back through the shield.
Skipping over the tedious stuff, it comes down to this: 0.00005 amps through roughly 3 ohms is about 0.00015 volts of drop out of a 1 volt signal. Signal loss (voltage drop) will be trivial.
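Putting the whole calculation together, here is a short sketch using the approximation above (the function name and the worst-case 20 K Ohm load are my choices, not from the original):

```python
def voltage_drop_per_100ft(load_amps, awg):
    """Round-trip voltage drop per 100 ft of a two-conductor run.

    Uses the standard copper-wire approximation: AWG 10 is about
    0.1 ohm per 100 ft, and resistance multiplies by 10 for every
    10 gauge numbers.
    """
    r_one_conductor = 0.1 * 10 ** ((awg - 10) / 10)  # ohms per 100 ft
    return load_amps * 2 * r_one_conductor           # x2 for the round trip

# Worst case from above: 1 V signal into a 20 K ohm input (0.00005 A), 22 AWG
drop = voltage_drop_per_100ft(0.00005, 22)
print(f"Drop per 100 ft: {drop:.6f} V")
```

The round-trip resistance works out to about 3.2 ohms (matching the "about 3 ohms" figure above) and the drop to roughly 0.00016 volts, so even several hundred feet of cable loses only a fraction of a millivolt.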