Originally posted by hschen
I use amps to compensate for the loss from a lengthy cable run. Since you're an RF guy, may I ask you a stupid question: after how many feet of cable run should we add an amp? And should the amp go at the antenna or near the TV end?
You need to determine the loss of the cable for the length of the run you have. Calculators are available all over the web for different coax types, which have more or less loss depending on the type. Here's one I've used in the past: http://www.ocarc.ca/coax.htm
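If you'd rather do the math yourself than use a calculator, the scaling is simple: published coax loss figures are quoted per 100 feet, so multiply by your run length. A minimal sketch in Python (the 150' run is just an example; the 3.6 dB/100' figure is the one quoted later in this post):

```python
# Published coax loss figures are quoted per 100 feet, so the total
# loss scales linearly with the run length.
def cable_loss_db(length_ft, loss_per_100ft_db):
    """Total cable loss in dB for a run of the given length."""
    return loss_per_100ft_db * (length_ft / 100.0)

# Example: 150' of RG-6 at VHF channel 7 (3.6 dB per 100 ft):
print(f"{cable_loss_db(150, 3.6):.1f} dB")  # -> 5.4 dB
```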
Next, measure the signal at the antenna to see how much gain you need to get that same level at the end of the coax. You also need to know the gain of the amplifier you're using.
Finally, if you amplify the signal at the end of the line you're adding gain to the noise as well. Ideally you want a two-part amp/preamp system, where you add the gain at the sending end (the antenna), and you might need attenuators at the receiving end to null out the line a bit.
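To put numbers on that, here's a minimal gain-budget sketch. The function and the example levels are my own illustration (the 0 dBmV antenna figure is hypothetical, not from any datasheet); it's simple add/subtract arithmetic and deliberately ignores noise figure, which is the real reason the gain belongs at the antenna end:

```python
def tuner_input_dbmv(antenna_level_dbmv, preamp_gain_db,
                     cable_loss_db, pad_db=0.0):
    """Rough level arriving at the tuner: add the gains, subtract the losses."""
    return antenna_level_dbmv + preamp_gain_db - cable_loss_db - pad_db

# Hypothetical example: 0 dBmV at the antenna, a 30 dB preamp, and
# 7.3 dB of cable loss (100' of RG-6 at UHF, per the numbers below):
print(tuner_input_dbmv(0.0, 30.0, 7.3))        # 22.7 dBmV -- likely too hot
print(tuner_input_dbmv(0.0, 30.0, 7.3, 9.0))   # 13.7 dBmV with a 9 dB pad
```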
I had an experience in Orlando, my former home, where my Channel Master 7000 series amp (great for HD, btw) was killing my signal. I was about the same distance from the towers as I am here in Phoenix. I experimented with 6dB of in-line attenuation and finally settled on 9dB to get the signals in order. I didn't have the luxury of a network analyzer at the time to measure the signal at the antenna, so there was some guesswork, but in reality the 30dB of gain from the CM7xxx was just too much.
Here are some basic numbers as an example:
100' of RG-6 at VHF channel 7 (175 MHz) has an inherent loss of 3.6 dB
100' of RG-6 at UHF channel 45 (650 MHz) has an inherent loss of 7.3 dB
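Plugging those two figures into the loss arithmetic above, with the 30dB of preamp gain mentioned in the Orlando story, you can see how much excess gain is left over (a rough sketch; your actual antenna levels will shift these numbers):

```python
# Net gain over 100' of RG-6 with a 30 dB preamp at the antenna end.
for label, loss_db in [("VHF ch 7 (175 MHz)", 3.6),
                       ("UHF ch 45 (650 MHz)", 7.3)]:
    print(f"{label}: {30.0 - loss_db:+.1f} dB net gain")
# -> VHF ch 7 (175 MHz): +26.4 dB net gain
# -> UHF ch 45 (650 MHz): +22.7 dB net gain
```

That leftover gain is exactly what the attenuators in the Orlando example were soaking up.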
Take this into consideration when designing an amp system and you'll have a better-quality signal going into your receiver without overdriving the unit.