Originally Posted by olyteddy
AC is used for power distribution because designing a regulated power supply with a wide range of input voltages is easier with an AC input, especially in the early days when they still used transformer-based supplies,
...by which I think you are referring to linear supplies.
I'm under the impression that modern switching power supplies are ordinarily engineered to handle a wider range of inputs than any common transformer-based supply. Stand-alone switching regulators often have published input specs of about 80 to 250 VAC.
Modern amplifier supplies are switching supplies and could probably run just as well on DC.
Maybe the switching regulator circuits I have analyzed are not modern, but I'm pretty sure the first component in the input circuit is a transformer. I have seen some wound to step up and others wound to step down, but in neither case would DC pass through it. On the other hand, linear supplies designed for low-voltage AC input ordinarily begin with a bridge rectifier, which could work with DC input if it were in the right voltage range.
About five years ago, I had to develop a bid for UHF signal distribution to 1,200 residential units in a five-building condo complex. At the time, I could get relatively old, used 60 VAC C-Cor 750 MHz line extender amps for thirty-something dollars each, roughly one tenth of what an individual would pay for new, modern units. So if those old units came with linear supplies instead of switching supplies, they would still be what we would likely be working with here.
Another reason for AC is that by modifying the power company's sine wave to a (nearly) square wave you can send power that is (nearly) as effective as DC. A 90 volt square wave is about the biggest you can push through Class 2 wiring, I believe. The final reason for AC is electrolysis. The cable plant is an eclectic mix of metals (largely aluminum but also copper, zinc, steel, stainless steel and others), so by using AC, damage from electrolysis is minimized. Most modern cable is powered at 90 volts, but usually the input voltage range of equipment I've worked with (30, 60 and 90 volt gear) has allowed as little as 50 to 60% of nominal voltage before there was a problem.
That emboldened type is what I was looking for, because if we eventually get enough info to try to kludge together a bargain-basement priced amplified downlead, there is going to be a hell of a lot of voltage drop over 600 meters of RG-11.
600 meters of RG-11 will lose close to 70 dB at the top of the UHF broadcast band, so if there is no severe input signal strength or differential problem, he'd need no more than three inline amps, but the one nearest the antenna would see a substantially lower supply voltage than the one nearest the residence. Estimating the voltage drop would be a crap shoot, because we likely will not be able to estimate the current draw of the two intermediate or downstream amplifiers with the kind of precision a cable system engineer is held to. Over 600 meters, I'd just drop two or three 30 dB amps into an RG-11 line, with the proximity of the first one to the antenna determined by the range of signal levels coming off the antenna, and then play it by ear/eye from there.
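To put rough numbers on that voltage drop, here is a quick back-of-the-envelope Python sketch. The RG-11 loss, loop resistance, and per-amp current figures in it are placeholder assumptions, not datasheet values, so swap in the real numbers for whatever cable and amps actually get used. (One point from the quoted post worth keeping in mind: a square wave's RMS equals its peak, so a 90 volt quasi-square-wave supply delivers power roughly like 90 VDC, and plain IR-drop arithmetic is a fair approximation.)

# Back-of-the-envelope estimate of RF loss and supply voltage drop along
# 600 m of RG-11 feeding three inline amps, powered from the house end.
# ASSUMED figures (replace with datasheet values for the actual parts):
#   - RG-11 loss near 700 MHz: ~0.115 dB per meter (about 11.5 dB / 100 m)
#   - RG-11 DC loop resistance: ~0.06 ohm per meter (center + shield)
#   - each amp draws ~0.25 A at its power-passing port

CABLE_LEN_M    = 600.0
LOSS_DB_PER_M  = 0.115   # assumed RF loss at the top of the UHF band
LOOP_OHM_PER_M = 0.06    # assumed DC loop resistance
SUPPLY_VOLTS   = 90.0    # quasi-square-wave supply inserted at the house end
AMP_GAIN_DB    = 30.0
AMP_CURRENT_A  = 0.25    # assumed draw per amp

# Three amps spread along the run; distances measured from the house end,
# so the last entry is the amp nearest the antenna.
amp_positions_m = [100.0, 300.0, 500.0]

total_loss_db = CABLE_LEN_M * LOSS_DB_PER_M
total_gain_db = AMP_GAIN_DB * len(amp_positions_m)
print(f"Cable loss: {total_loss_db:.1f} dB, amp gain: {total_gain_db:.1f} dB")

# Each cable segment carries the summed current of every amp farther up the
# line, so the amp nearest the antenna sees the lowest voltage.
positions = sorted(amp_positions_m)   # house end -> antenna end
volts = SUPPLY_VOLTS
prev = 0.0
for i, pos in enumerate(positions):
    amps_beyond = len(positions) - i            # amps fed through this segment
    segment_m = pos - prev
    volts -= segment_m * LOOP_OHM_PER_M * amps_beyond * AMP_CURRENT_A
    print(f"Amp at {pos:>5.0f} m from house: ~{volts:.1f} V")
    prev = pos

With these made-up numbers the amp nearest the antenna still lands in the high 70s of volts, well above the 50 to 60% of nominal floor olyteddy mentions, but double the current draw or use skinnier cable and that margin shrinks fast.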
The OP can buy a nice, cheap Sencore 1453i meter on eBay for $75 (I bought two that way), replace the battery pack for $30 to $40, and take a lot of the guesswork out of it.