Originally Posted by Pete-N2
Since my mother-in-law has the TV on from morning till night, to keep peace in the house I had to get TV reception back FAST! That night I borrowed a small Hi-V/UHF antenna from a friend. As I recall, he bought it to play around with around the time of the digital transition because it was small, cheap and had amazing specs. I don't think it got any good reviews. It's the antenna on the right in the attachment.
I put it in the attic, hanging off the edge of a plastic storage box, aimed somewhere between the two antenna farms. It is about six feet lower and four feet SW of the outdoor antenna. For convenience I ran it through a distribution amp to the TVs. (It has a built-in preamp.)
To my surprise, that antenna works better than the outdoor antenna! All quality levels are 70 to 95%, and SNRs are around 29 dB, the lowest being 26 dB. Channel 7/18, one of the weakest, became one of the strongest. Channel 10/30, one of the strongest, became the weakest. Of course 15/3 doesn't exist.
If it weren't for 15/3, I'd probably make this configuration my permanent solution.
(Trip -- are you going to change your last name?)
I am interested in your comment about us going from one of the weakest to one of the strongest. My understanding is that the signal "strength" reported by consumer tuners/TVs is not really just signal strength but signal quality. It's a single number that represents both RF strength and quality. I have been told this by people who are a lot smarter than me. What that means is you could have a really strong RF signal, but if the quality (S/N, the signal-to-noise ratio) is not up to par, the number will be lower.
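To illustrate the idea, here is a hypothetical sketch of how a tuner might fold SNR into a 0-100 "quality" number. The actual formula is vendor-specific and not published; this assumes a simple linear scale between the approximate 8-VSB decode threshold (~15.2 dB SNR) and an assumed ceiling of 35 dB. Both the mapping and the ceiling are illustrative assumptions, not a spec.

```python
# Assumption: a linear SNR-to-quality mapping; real tuners use
# proprietary (and varied) formulas.

ATSC_THRESHOLD_DB = 15.2   # approximate 8-VSB threshold of visibility
CEILING_DB = 35.0          # assumed SNR at which quality reads 100%

def quality_percent(snr_db: float) -> float:
    """Map an SNR reading (dB) to a 0-100 quality figure (linear sketch)."""
    span = CEILING_DB - ATSC_THRESHOLD_DB
    pct = (snr_db - ATSC_THRESHOLD_DB) / span * 100.0
    return max(0.0, min(100.0, pct))

print(round(quality_percent(29.0), 1))  # an SNR like the readings above -> ~69.7
```

Under these assumed constants, a 29 dB SNR lands near 70% quality, which is in the same ballpark as the numbers reported above, though that agreement is coincidental to the chosen ceiling.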
This year I was able to purchase a Rohde & Schwarz ETL analyzer. It is the "Cadillac" of ATSC test gear. When I started looking at our signal vs. some of the others, what I found concerned me a great deal. We are proud that we met the original FCC HD deadline of May 1, 2002. We made the deadline, on the air with a FULL POWER HD signal. Nobody else in the market did. While this is something to be proud of, 10 years later I found it had its drawbacks. Our exciters were first generation, developed in the mid-90s as ATSC was being invented. The other stations that went on the air later had the advantage of better exciters that produced a better-quality signal.
This DOESN'T MEAN A BETTER PICTURE. It means your receiver has an easier time decoding the signal.
I could go on, but the upshot is I lobbied for purchasing new state-of-the-art exciters. My wish was granted, and we have brand new Harris APEX exciters. They took our S/N from about 28-29 dB to 35-36 dB. That is a 7 dB difference, which is pretty substantial to people in a reception area with a lower RF level. I tend to believe this is what you are seeing.
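For a sense of scale, decibels are logarithmic, so a 7 dB SNR improvement works out to roughly a 5x change in the signal-to-noise power ratio. A quick check:

```python
def db_to_power_ratio(db: float) -> float:
    """Convert a decibel difference to a linear power ratio: 10^(dB/10)."""
    return 10 ** (db / 10)

# A 7 dB SNR improvement is about a 5x power ratio:
print(round(db_to_power_ratio(7.0), 2))  # -> 5.01
```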
The funny thing about all this is I wanted to email Trip to tell him, but I thought nahh. Let's see if he notices.
Well, about 7 hours after these things went on the air, I got an email at work from him wanting to know what we did to the signal.
I was going to include a PowerPoint that shows old vs. new, but my VPN isn't working at the moment. I will try later.