
Registered · 6,559 Posts · Discussion Starter · #1
I noticed that the FCC service contours, given in dBu, are lower for full-service stations than for low-power stations: the low-power UHF value is 10 dB higher, high VHF is 12 dB higher, and low VHF is 15 dB higher. Lower numbers result in larger contours and higher numbers in smaller contours. Why are they not the same?

The only reason I can think of is that the smaller contours of low-power stations allow stations to be located closer together, which affords them less interference protection. Full-service stations get a larger area of interference protection. Is that it?

Seems to me that gives the impression that a low-power station's service area is smaller than it really is.
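
For reference, here is a minimal sketch (mine, not from the FCC rules) that tabulates the thresholds implied by the numbers quoted in this thread and shows roughly how much larger a contour gets when the threshold drops. The path-loss exponent is a toy assumption; the FCC actually derives contours from the F(50,90) curves and Longley-Rice, so the radius ratios here are only illustrative.

Code:
# Illustrative only: full-service ATSC noise-limited values (28/36/41 dBu, from the
# table later in this thread) plus the 15/12/10 dB LPTV/Class A offsets noted above.
# The propagation model is a toy power law, NOT the FCC F(50,90) curves.
import math

full_service = {"Lo-VHF (2-6)": 28.0, "Hi-VHF (7-13)": 36.0, "UHF (14+)": 41.0}  # dBu
lptv_offset  = {"Lo-VHF (2-6)": 15.0, "Hi-VHF (7-13)": 12.0, "UHF (14+)": 10.0}  # dB

def radius_ratio(delta_db, path_loss_exponent=2.0):
    # If field strength falls off at 10*n dB per decade of distance, lowering the
    # threshold by delta_db stretches the contour radius by 10**(delta_db/(10*n)).
    return 10 ** (delta_db / (10 * path_loss_exponent))

print(f"{'Band':<14}{'Full svc':>9}{'LPTV/A':>8}{'Delta':>7}{'Radius ratio':>14}")
for band, fs_dbu in full_service.items():
    lp_dbu = fs_dbu + lptv_offset[band]
    print(f"{band:<14}{fs_dbu:>9.0f}{lp_dbu:>8.0f}{lptv_offset[band]:>7.0f}"
          f"{radius_ratio(lptv_offset[band]):>14.2f}")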
 

Registered · 17,697 Posts
Beats me. It's one of my big complaints: LPTV/Class A stations are protected to a lesser threshold than full-service stations, and I've never read a satisfactory reason why.

You might be right about the reason; it is true that they end up closer together and get less interference protection. If so, that suggests the rule is a relic dating to the days when contour-overlap analyses were used to determine whether or not an LPTV station could fit in a given location. The rules, as written, actually still allow contour analyses to be used for LPTV stations, though in practice nobody does that because you can fit stations closer together by using an OET-69-style analysis.
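
To make the contour-overlap idea concrete, here is a toy sketch of that style of check. The call signs, coordinates, and contour radii are entirely hypothetical, and a real study uses the FCC propagation curves and channel-dependent D/U protection ratios rather than circles; the point is just the mechanics: the proposed station fits only if its interfering contour stays clear of the incumbent's protected contour.

Code:
# Toy contour-overlap check: treat each station's protected and interfering
# contours as circles and flag overlap.  All numbers are hypothetical.
from dataclasses import dataclass
import math

@dataclass
class Station:
    name: str
    lat: float             # degrees
    lon: float             # degrees
    protected_km: float    # protected-contour radius (hypothetical)
    interfering_km: float  # interfering-contour radius (hypothetical)

def distance_km(a: Station, b: Station) -> float:
    # Great-circle distance via the haversine formula.
    r = 6371.0
    p1, p2 = math.radians(a.lat), math.radians(b.lat)
    dp = math.radians(b.lat - a.lat)
    dl = math.radians(b.lon - a.lon)
    h = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

def contours_overlap(existing: Station, proposed: Station) -> bool:
    # The proposed station "fits" only if neither station's interfering contour
    # reaches into the other's protected contour.
    d = distance_km(existing, proposed)
    return (d < existing.protected_km + proposed.interfering_km or
            d < proposed.protected_km + existing.interfering_km)

full_svc = Station("WXXX (full service)", 40.0, -75.0, protected_km=90.0, interfering_km=130.0)
lptv     = Station("WYYY-LD (LPTV)",      40.8, -74.2, protected_km=30.0, interfering_km=55.0)
print("Contours overlap:", contours_overlap(full_svc, lptv))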

On RabbitEars, you can manually look at an LPTV/Class A station's protected contour, but by default I show the same noise-limited contour used for full-service stations.

- Trip
 

Registered · 6,199 Posts
FYI: Taking into account the various "Planning Factors" (which included "typical" antenna gain, which is LOWER on LOWER channels, and a THERMAL NOISE FLOOR that IGNORED the higher man-made noise in the VHF bands), OET-69 (both the 6 Feb 2004 and 2 Jul 1997 versions) stipulated "acceptable" field strength contours for analog (Table 1) and DTV (Table 2) with the following dBu differences. Note that VHF (esp. Lo-VHF) was disadvantaged right from the start:

Code:
         NTSC   ATSC   DELTA
Ch 2-6    46     28     18
Ch 7-13   57     36     21
UHF      64+F   41+F    23
where F = 20 log10(615/fc) and fc = UHF channel center frequency in MHz
Note that NTSC uses F(50,50) reliability statistics, whereas ATSC uses F(90,50) reliability statistics in the computerized Longley-Rice ITM prediction program.
Also note that NTSC signals are measured with a PEAK-reading meter whereas ATSC is measured with an AVERAGE-reading meter...if ATSC signals were ALSO measured with a PEAK-reading meter, they would read 6-7 dB HIGHER...and with non-fading Gaussian noise, ATSC works at a MUCH LOWER SNR than NTSC. [A lot of these important details were left out of OET-69.]
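
As a back-of-the-envelope check on how those planning factors turn into the Table 2 numbers, here is a small sketch. The noise figures, antenna gains, and line losses below are my assumptions of typical planning-factor values, not quotes from OET-69; the point is the structure of the calculation (thermal noise + noise figure + required C/N, converted to field strength through the frequency-dependent dipole/antenna factor), and that man-made noise never enters it.

Code:
# Planning-factor arithmetic sketch (noise figure, antenna gain, and line-loss
# values are ASSUMED here, not quoted from OET-69).
import math

KTB_DBM = -174 + 10 * math.log10(6e6)   # thermal noise in a 6 MHz channel, ~ -106.2 dBm
C_N_DB = 15.0                           # assumed required carrier-to-noise ratio

# band: (geometric-mean frequency MHz, noise figure dB, antenna gain dBd, line loss dB)
bands = {
    "Lo-VHF": ( 69.0, 10.0,  4.0, 1.0),
    "Hi-VHF": (194.0, 10.0,  6.0, 2.0),
    "UHF":    (615.0,  7.0, 10.0, 4.0),
}
table2 = {"Lo-VHF": 28, "Hi-VHF": 36, "UHF": 41}   # DTV dBu values from the table above

for name, (f_mhz, nf_db, gain_dbd, loss_db) in bands.items():
    p_min_dbm = KTB_DBM + nf_db + C_N_DB            # minimum received power at the set
    gain_dbi = gain_dbd + 2.15                      # convert dipole-referenced gain to isotropic
    # Field strength needed to deliver p_min through the antenna and downlead:
    # E(dBuV/m) = P(dBm) + 77.2 + 20*log10(f_MHz) - G(dBi) + line loss
    e_dbu = p_min_dbm + 77.2 + 20 * math.log10(f_mhz) - gain_dbi + loss_db
    print(f"{name:7s} ~{e_dbu:4.1f} dBu   (Table 2: {table2[name]} dBu)")

# Man-made noise never appears in this chain, which is the VHF complaint above.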

Due to broadcaster complaints that ACTUAL DTV contours did NOT match NTSC in the VHF (esp. Lo-VHF) band [despite numerous comments to the contrary, FCC/OET had NOT included man-made noise levels in the "Planning Factors"], the FCC allowed "maximization requests," which were based on the calculated percentage of viewers experiencing co-channel interference...with, of course, biased protection for full-service stations. See esp. the FCC Seventh Report & Order with Eighth Further Notice of Proposed Rulemaking (FCC 07-138) and the Eighth Report & Order (FCC 08-72)...among others...
 