Okay, this is a serious question: Why are digital stations broadcasting at a maximum ERP of 1000kW when analog UHF stations are broadcasting at up to 5000kW? Is it because the "experts" figure that you need a lower powered signal to receive digital? Or are the broadcasters all just agreeing that 1000kW is as high as they want to go?
It seems to me, with all the problems receiving HDTV out there, that a boost in ERP might be in order...
Could it be an FCC regulation to try to control the clutter?
We already have lots of stations overlapping and interfering with each other, due to the doubling of transmitters for the transition, and the government getting rid of a big part of our TV spectrum.
I've seen problems with:
ch 19 in Boston and Adams, MA
ch 30 in Boston and Hartford
ch 57 in Boston and Springfield
etc., etc. More power would just make the situation worse.
What I'd like to see is not more power but, say, twice the number of TV channel slots, with all broadcast towers required to be on mountaintops above 3000 ft elevation. Then we'd all get better TV coverage.
It is partially interference, partially competition. I remember reading about an AM station that broadcast from St. Louis during WWII. It was strong enough that it could be received in Europe by our troops (curvature of the earth and low frequency). Anyway, after the war, other stations around the country started to complain that its signal was encroaching on theirs, and the FCC moved to limit power output.
Or at least that's how I remember it. I'll try to find the article online.
Still looking, but this
Well, digital is supposed to be received with less power, from what I understand. And, going to VHF should help even more.
There's a PBS station 40 miles ENE of me that's only at 49.7kW. The terrain where the tower is located is quite hilly but relatively flat near me. And, their tower is less than 700' HAAT. It's a directional antenna but I'm in a decent spot with a Relative Field Value of about 0.83 and I lock consistently on that station. Even before I went to the stack on my roof. I even lock on it with my attic-based antenna.
FCC estimates that DTV coverage in UHF for the equivalent of analog TV's Grade B contours requires a DTV signal of 41dBuV/meter at your antenna. The Grade B contour for analog TV is based on a UHF signal level of 64dBuV/meter. So from this little exercise, the FCC is estimating that DTV is OK at a 23dB reduction in signal level compared to analog TV.
1 MW vs. 5 MW is a reduction of 7dB. Should work then, huh?
And, yes, too much signal will cause interference with other stations, etc.
Of course, I'm just crunching numbers, this has nothing to do with the real world.
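The number-crunching above can be sketched in a few lines. This is just the arithmetic from the post; the 41 and 64 dBuV/m figures are the FCC planning values quoted there, nothing more:

```python
import math

# FCC planning values quoted in the post (UHF, Grade B equivalent)
analog_grade_b_dbu = 64.0   # analog UHF Grade B contour, dBuV/m
dtv_threshold_dbu = 41.0    # estimated DTV requirement, dBuV/m

# Margin: how much weaker a DTV signal can be vs. analog
field_margin_db = analog_grade_b_dbu - dtv_threshold_dbu
print(field_margin_db)      # -> 23.0 dB

# ERP cut from 5 MW analog to 1 MW digital, in dB
erp_cut_db = 10 * math.log10(5_000_000 / 1_000_000)
print(round(erp_cut_db, 1)) # -> 7.0 dB
```

A 7dB power cut against a 23dB allowance is where the "should work then, huh?" comes from.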
|Originally posted by Man E
|I remember reading about an AM station that broadcast from St. Louis during WWII. It was strong enough that it could be received in Europe by our troops (curvature of the earth and low frequency). Anyway, after the war other stations around the country started to complain that their signal was encroaching or such and the FCC moved to limit power output.
Could you actually be remembering WLW in Cincinnati? Referred to during the war as the "Nation's Station", I believe the FCC allowed it 500kW in order to be heard in Europe. That authority was revoked after WWII and it returned to the clear channel power of 50kW that has always been the limit.
Could be Jon. Your facts are certainly more specific than are mine :D I wasn't able to find any other information. Gee, guess I'll do some work now ;)
DTV and NTSC stations are rated differently: DTV is average power, NTSC is peak (sync tip). The peak/average ratio of NTSC depends very much on brightness; the peak/average ratio of DTV is constant. 1 MW DTV and 5 MW NTSC are very roughly equivalent in average power.
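A quick sketch of why the two ratings come out roughly even. The 20% peak-to-average figure for NTSC below is an illustrative assumption only (the real ratio varies with picture content, as the post notes); the point is just the unit mismatch between peak and average ratings:

```python
import math

# NTSC is rated at peak (sync tip) power; DTV is rated at average power.
ntsc_peak_w = 5_000_000        # 5 MW NTSC peak rating

# ASSUMPTION for illustration: average NTSC power ~20% of peak.
# This ratio actually varies with picture brightness.
ntsc_avg_w = 0.2 * ntsc_peak_w # ~1 MW average

dtv_avg_w = 1_000_000          # 1 MW DTV is already an average rating

diff_db = 10 * math.log10(ntsc_avg_w / dtv_avg_w)
print(diff_db)                 # -> 0.0 dB: comparable average power
```

Under that assumed ratio, the two stations would be putting out about the same average power despite the 5:1 difference in their nameplate ratings.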