Originally Posted by Primestar31
Does anybody know what sort of interference, LTE can cause to signal levels? Not sure if I'm even having any issues I could attribute to possible LTE.
LTE interference is a real issue because the 600, 700, and 800 MHz cell bands used to be part of the UHF TV spectrum, so TV antennas and amplifiers receive them just as well as the channels you actually want.
When you're close enough to a cell tower (1-2 mi or less), the LTE signal is strong enough to overload the amplification in your system, particularly high-gain preamps.
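To see why proximity matters so much, here's a rough back-of-the-envelope sketch using the standard free-space path loss formula. All the numbers (transmit ERP, antenna gain) are illustrative assumptions, not measurements from any real tower:

```python
# Rough estimate of LTE downlink power reaching a preamp input near 700 MHz.
# ERP and antenna gain below are illustrative assumptions.
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (standard formula, distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

erp_dbm = 60.0          # assumed 1 kW effective radiated power LTE sector
antenna_gain_db = 10.0  # assumed TV antenna gain toward the tower (worst case)

for miles in (1, 2, 10):
    km = miles * 1.609
    rx_dbm = erp_dbm - fspl_db(km, 700.0) + antenna_gain_db
    print(f"{miles:>2} mi: roughly {rx_dbm:.0f} dBm at the preamp input")
```

At 1 mi this works out to a few tens of dBm below 0, which is enormous compared with the weak TV signals a high-gain preamp is designed for; every 10x in distance buys you only 20 dB of relief, which is why the problem fades quickly past a few miles.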
In mild cases, it degrades the signal-to-noise ratio, but not enough to cause breakups.
In moderate cases, you'll notice random dropouts.
In severe cases, it will completely wipe out reception of a channel, as the signal-to-noise ratio drops below the decoding threshold.
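The three tiers above fall out of simple power arithmetic: interference adds to the noise floor in linear units, so a strong enough interferer drags the effective SNR under the decode threshold (commonly cited as about 15 dB for ATSC 1.0). A quick sketch, with the signal and noise levels being illustrative assumptions:

```python
# How interference power raises the effective noise floor and lowers SNR.
# Signal/noise levels are illustrative assumptions; ~15 dB is the commonly
# cited ATSC 1.0 decode threshold.
import math

def dbm_to_mw(dbm: float) -> float:
    """Convert dBm to milliwatts so powers can be summed linearly."""
    return 10 ** (dbm / 10)

def sinr_db(signal_dbm: float, noise_dbm: float, interf_dbm: float) -> float:
    """Signal to (noise + interference) ratio in dB."""
    floor_mw = dbm_to_mw(noise_dbm) + dbm_to_mw(interf_dbm)
    return signal_dbm - 10 * math.log10(floor_mw)

signal = -75.0  # assumed TV signal at the tuner
noise = -95.0   # assumed noise floor: a comfortable 20 dB SNR to start

for interf in (-120.0, -95.0, -85.0):
    print(f"interference {interf:>6.0f} dBm -> SINR {sinr_db(signal, noise, interf):.1f} dB")
```

With negligible interference you keep the full 20 dB margin; an interferer equal to the noise floor costs 3 dB (mild); one 10 dB above it drops you under 15 dB (severe) and the channel is gone.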
The easiest way to tell (short of a spectrum analyzer) is simply to install an LTE filter ahead of the first amplification stage and see what difference it makes, if any. In the presence of LTE overload, the signal-meter readings on a TV will usually jump around wildly, whereas normally you'd see only very slight changes from moment to moment.
One example case from my files: 45 mi from both full-power and 15 kW LP stations, running a preamp and UHF/VHF-Hi yagis on the roof, with an LTE tower 75 degrees off-axis, probably 1 mi away. Without an LTE filter, the full-power stations all scanned in at strong (but not stable) levels, some LPs exhibited constant breakups, and some didn't scan in at all. After installing an LTE filter, SNRs improved across the board: the full-power stations are stable, and all LPs scanned in with none breaking up, including a previously missing weaker station now sitting at a stable 17 dB SNR.
Not a universal rule, but I've observed LTE interference worsen in the evening, presumably because the network is more congested then.