Originally Posted by dr1394
The minimum signal level for terrestrial (antenna pointed at the horizon) receivers is:
thermal noise floor + noise figure + required S/N ratio
The thermal noise floor is p = k * t * b, where:

p = noise power (watts)
k = Boltzmann's constant (1.3807 x 10^-23 J/K)
t = antenna temperature (290 K, about 17 C)
b = bandwidth (6 MHz)

p = 1.3807 x 10^-23 * 290 * 6,000,000 = 2.4 x 10^-14 W

Noise power in dBm (10 log p + 30) = -106.2 dBm
If you use a 3 dB preamp at the antenna, then your receiver noise figure will be essentially 3 dB. The required signal-to-noise ratio is around 16 dB, so the minimum signal level is -106.2 + 3 + 16 = -87.2 dBm.
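For anyone who wants to reproduce those numbers, here's a minimal Python sketch of the same calculation (the variable names are mine; the constants come straight from the post):

```python
import math

# Thermal noise floor: p = k * t * b
k = 1.3807e-23   # Boltzmann's constant, J/K
t = 290.0        # antenna temperature, K (about 17 C)
b = 6e6          # channel bandwidth, Hz (6 MHz TV channel)

p = k * t * b                                  # noise power in watts
noise_floor_dbm = 10 * math.log10(p) + 30      # watts -> dBm
print(f"Noise floor: {noise_floor_dbm:.1f} dBm")    # -106.2 dBm

# Minimum usable signal = noise floor + noise figure + required S/N
nf_db = 3.0      # receiver noise figure (3 dB preamp at the antenna)
snr_db = 16.0    # required S/N ratio, per the figure quoted above
min_signal_dbm = noise_floor_dbm + nf_db + snr_db
print(f"Minimum signal: {min_signal_dbm:.1f} dBm")  # -87.2 dBm
```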
To convert from field strength (dBu, i.e. dBµV/m) to received power (dBm), the equation for a 75-ohm antenna is:

P(dBm) = E(dBµV/m) + Gr(dBi) - 20 log F(MHz) - 75.46
where Gr is the gain of the receiving antenna.
The DTV station is on channel 35 (599 MHz), and a typical antenna gain is 14 dBi. At the 41 dBu contour, the received power will be 41 + 14 - 55.55 - 75.46 = -76 dBm. With our -87.2 dBm receiver threshold, the signal will be 11.2 dB above the minimum required.
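And the contour check as a sketch (the helper name field_to_power_dbm is just mine for illustration; the constant is from the formula above):

```python
import math

def field_to_power_dbm(e_dbuv_m, gain_dbi, freq_mhz):
    """dBµV/m field strength -> dBm received power, per the
    75-ohm conversion formula quoted above."""
    return e_dbuv_m + gain_dbi - 20 * math.log10(freq_mhz) - 75.46

# Channel 35 example: 41 dBu contour, 14 dBi antenna, 599 MHz
p_rx = field_to_power_dbm(41, 14, 599)
margin = p_rx - (-87.2)   # headroom over the -87.2 dBm receiver threshold
print(f"Received power: {p_rx:.1f} dBm, margin: {margin:.1f} dB")
# -> Received power: -76.0 dBm, margin: 11.2 dB
```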
Originally Posted by etzeppy
As a follow-up, I'm measuring levels in the range of -60 dBm to -45 dBm for the local TV carriers, and I'm also seeing FM and a few other signals at -20 dBm and higher. At what level would you expect to exceed the receiver's dynamic range and start generating intermodulation products or causing receiver desense?