
# TV Fool Discussion Thread - Page 14

Doesn't it work something like this (plus or minus a few tenths):
antenna output: signal: -65, noise floor: -106, SNR: 41, NM: 25
balun output: signal: -67, noise floor: -106, SNR: 39, NM: 23
downlead output: signal: -77, noise floor: -106, SNR: 29, NM: 13
TV amp: signal -77, noise figure +7 (-99), SNR: 22, NM: 6

Or with pre-amp:
antenna output: signal: -65, noise floor: -106, SNR: 41, NM: 25
balun output: signal: -67, noise floor: -106, SNR: 39, NM: 23
pre/pre-amp: signal: -67, noise figure: +3 (-103), SNR: 36, NM: 20
post/pre-amp: signal: -47, noise: -83, SNR: 36, NM: n/a (20)
downlead output: signal: -57, noise: -93, SNR: 36, NM: n/a (20)
TV amp: signal -57, noise: -93, SNR: 36, NM: n/a (20)

where:
- balun loss: 2 dB
- downlead/insertion loss: 10 dB
- TV noise figure: 7 dB
- pre-amp noise figure: 3 dB
- pre-amp gain: 20 dB
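For anyone who wants to play with these numbers, here's a minimal Python sketch of the bookkeeping in the two tables above. The -106 dBm floor and 16 dB required SNR are the assumptions implied by the tables (NM = SNR - 16); everything is in dB, so gains and losses are simple additions:

```python
# A minimal sketch of the cascade bookkeeping above (all values in dB/dBm).
# Assumptions from the tables: -106 dBm thermal noise floor in a 6 MHz
# channel, and ~16 dB SNR required to decode 8VSB (so NM = SNR - 16).

NOISE_FLOOR = -106.0   # dBm, thermal floor
REQUIRED_SNR = 16.0    # dB, approximate 8VSB decode threshold

def snr_nm(signal_dbm, noise_dbm):
    """SNR and noise margin for a given signal level and effective noise floor."""
    snr = signal_dbm - noise_dbm
    return snr, snr - REQUIRED_SNR

# Chain without a pre-amp:
sig, noise = -65.0, NOISE_FLOOR   # antenna output
sig -= 2.0                        # balun loss
sig -= 10.0                       # downlead loss
noise += 7.0                      # TV tuner NF raises the effective floor
print(snr_nm(sig, noise))         # -> (22.0, 6.0), the last row of table one

# Chain with a 3 dB NF / 20 dB gain pre-amp right after the balun:
sig, noise = -65.0 - 2.0, NOISE_FLOOR + 3.0   # pre-amp NF applied first
sig, noise = sig + 20.0, noise + 20.0         # gain boosts signal AND noise
sig, noise = sig - 10.0, noise - 10.0         # downlead attenuates both
print(snr_nm(sig, noise))         # -> (36.0, 20.0), as in the second table
```

Note how in the pre-amp chain the downlead loss no longer costs any SNR, because signal and amplified noise drop together.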


Hi Tom --- I should say I'm not from "TVfool", but here's my understanding :

For DTV, TVfool 0 dB NM = -90.8 dBm

You can test this ... (again for DTV, not analog). Subtract the NM prediction from the dBm prediction for any given case, and you should always get about -90.8 or -90.9 dBm ....

It's as simple as that, really. I believe the NM numbers are meant to give folks an easier way to interpret the signal strength predictions, and a starting point to calculate their losses (receiver/amp noise figure, line losses, etc.) and gain (antenna gain only), versus using the dBm predictions. A 0 dB NM is based upon the minimum signal level needed to decode DTV ....

So, regarding that minimum signal level, you may ask why ~-90.8 dBm = 0 dB NM? .... There's a reason:

-106.2dBm = Thermal Noise floor involved.

15 dB SNR = theoretical minimum SNR necessary to decode DTV via 8VSB. Note: it looks like Andy may be using about 15.4 dB required SNR, which if I recall correctly is pretty close to the median of how receivers actually performed in FCC tests .....

Why do we need 15 dB SNR (or S/N, or C/N) to decode DTV? Short answer: basically, this is "built into" the system, so to speak. The long answer (interesting info there, though) on why/how this is can be found here:

.... So, if our receive system could have a 0 dB noise figure, and no losses between a 0 dBi gain antenna and the receiver, we could theoretically decode the pretty pictures via 8VSB (ATSC A/53) DTV with a signal level of -91.2 dBm (or -90.8 dBm if we are using the 15.4 dB figure), but not, say, -92 dBm or less .....

But in the "real world" :

We'll have to at least subtract 6~8 dB = typical receiver noise figure. With an LNA (typical NF 1~4 dB), in the right circumstances we could also lower the system NF by several dB .... And then other losses, if not recovered by an amp such as a masthead amp (balun losses, feedline losses, losses from any splitters, etc.) ....

If we aren't using an LNA, TOV (threshold of visibility) for decoding DTV is about -84 dBm or so at the receiver input, but can be several dB less if our system NF is lower than our receiver's NF ....
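If it's useful, the -106.2 dBm floor and the TOV figure above can be reproduced directly from kTB. A quick sketch, assuming T = 290 K, B = 6 MHz, the 15.4 dB required SNR, and a 7 dB receiver NF for the "real world" case:

```python
import math

# Where -106.2 dBm, -90.8 dBm, and the ~-84 dBm TOV figure come from.
# Assumptions: kTB thermal noise (T = 290 K, B = 6 MHz), 15.4 dB required
# SNR, and a typical 7 dB receiver noise figure.

k = 1.38e-23                      # Boltzmann's constant, J/K
noise_w = k * 290 * 6e6           # thermal noise in a 6 MHz channel, watts
noise_dbm = 10 * math.log10(noise_w * 1000)
print(round(noise_dbm, 1))        # -> -106.2

min_ideal = noise_dbm + 15.4      # 0 dB NF, lossless system
print(round(min_ideal, 1))        # -> -90.8  (TVfool's 0 dB NM point)

tov_real = min_ideal + 7.0        # add a typical 7 dB receiver NF
print(round(tov_real, 1))         # -> -83.8, i.e. about -84 dBm at the tuner
```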

Of course, thankfully, we also can ADD the gain of our antenna !

You might also want to see FCC OET Bulletin #69, as it is one place where FCC planning factors for DTV reception are detailed (Specifically of interest regarding above see Table 3 page 3 ) :

http://www.fcc.gov/Bureaus/Engineeri...et69/oet69.pdf

Hope that helps, and of course, Andy, feel free to add any thoughts or corrections ...
Quote:
Originally Posted by tczernec

Why does antenna gain (according to tvfool) increase the NM (assuming NM is a measure of SNR)? Shouldn't higher gain simply mean that both the noise and signal components of the signal are equally amplified, resulting in no improvement in SNR?

Tom

I followed you here, Tom.

The gain of the antenna is added to the NM because it improves the SNR.

The reason why you don't add the preamp gain is because the preamp amplifies the signal and the noise, plus adds a little noise (NF) of its own, making no improvement (even causing harm because of the NF) in the signal-to-noise ratio. A minimum SNR of about 16 dB is needed to maintain lock on the digital signal.

Think of NM as how far above the signal is above ambient noise floor. The smaller the NM, the closer it is to the noise floor, which reduces the SNR.

You can check your own "NM" at your location by inserting an attenuator between your antenna (no amp) and the tuner, as described in the link in my signature. Keep increasing attenuation until the signal just drops out. The amount of attenuation at dropout is your "NM". Just before dropout you will see freezes, tiling, etc. caused by the deterioration of signal quality: the bit errors (BER) increase to the point where the ATSC error correction system (FEC) can no longer handle them, and it gives up, causing loss of lock.

Don't confuse signal strength (power/level) measured in dBm or dBmV with signal quality. Signal quality is related to BER.

The factors that reduce signal quality and cause a higher BER are:

1. Improper signal level: A weak signal will cause a poor signal-to-noise ratio; a signal that is too strong can overload a tuner or preamp. A nearby FM transmitter can also cause overload, which would require an FM trap.
2. Reflections from multipath problems.
3. Impulse noise in the reception area. (It's worse on VHF-Low. Just ask Trip in VA about PBS at home.)

Keep in mind, with digital, signal quality (BER) is just as important as signal strength, especially with multipath problems.
http://www.wowvision.tv/signal_strength_meters_BER.htm
http://www.wowvision.tv/HDglossary.htm Glossary
http://www.wowvision.tv Home

The importance of signal QUALITY is why I like to use a CECB with two signal bars, like the Apex DT502. The initial aim of the antenna can be done with the signal strength bar (or a SLM--signal level meter), but the final aim of the antenna has to be done with a way to monitor BER using the signal quality bar, to reduce the effect of multipath reflections.

When I was testing the DT502 with my CM4221 antenna I got (for my marginal ch13.1 on RF41):
Signal Quality 60%
Signal Strength 55%

I had aimed the antenna with my SLM, but when I rotated the 4221 slightly to the right I got:
Signal Quality 100%
Signal Strength 56%
Note the BIG change in signal quality with only a slight change in signal strength.

Quote:
Originally Posted by IDRick

While I certainly agree that there are far better tools for measurement, the CECB's can be very useful for finetuning antenna location. In my case, I have broadcast towers in two directions (200 and 270 degrees). My DIY 4-bay does a nice job of receiving signal from both directions with proper placement in the attic. The APEX 502 was invaluable for optimizing signal strength of my two lowest channels (1 at 270 and 1 at 200 degrees) and deciding final antenna location/orientation.

Signal quality readings from the APEX also had practical application. My antenna is connected to two TVs and a computer capture card. Anything less than 100 on the signal quality scale results in audio and video dropouts with the capture card. I get marvelous recordings with proper placement of the antenna (optimized signal strength and signal quality = 100). I'm sure top quality equipment could fine-tune my setup even further. I'm satisfied with a $20 tool versus a several-hundred-dollar tool, especially for a one-time use.
Quote:
Originally Posted by rabbit73

The gain of the antenna is added to the NM because it improves the SNR.

Well, the way I look at it, TVfool's predictions are predictions of signal strength, not SNR. The gain of the antenna is added to the NM because it increases signal strength.

SNR will never be higher than what it is as transmitted. For DTV (ATSC A/53), I don't have time to dig up the links/provide the exact info currently, but as I recall the maximum possible would be something just around or over 30 dB SNR. It may often be a bit below the maximum possible depending upon various issues on the station transmit end (such as the EVM - error vector magnitude) ....

Also, on the receive end, anything other than what the receiver can use from the DTV signal is just seen as noise by the receiver ... Uncorrectable multipath = noise -- interference = noise -- and so on and so forth .... anything but the usable/recoverable (by the receiver) portions of the 8VSB signal = noise ... Thus, we can actually have a strong RF signal at the receiver input, but a relatively low (usable by the receiver) SNR, if for example multipath uncorrectable by the receiver is involved ..... Or, OTOH, we can actually have a relatively weak (but very "clean") signal, and a relatively high SNR ....

Quote:

The reason why you don't add the preamp gain is because the preamp amplifies the signal and the noise

Yes ...

Quote:

plus adds a little noise (NF) of its own, making no improvement (even causing harm because of the NF) in the signal to noise ratio.

See the section under "mathematical principles" here :

http://en.wikipedia.org/wiki/Tower_Mounted_Amplifier

You might also enjoy reading this :

Quote:

A minimum SNR of about 16dB is needed to maintain lock on the digital signal.

Close enough.

Quote:

You can check your own "NM" at your location by inserting an attenuator between your antenna (no amp) and the tuner, as described in the link in my signature. Keep increasing attenuation until the signal just drops out. The amount of attenuation at dropout is your "NM". Just before dropout you will see freezes, tiling, etc. caused by the deterioration of signal quality: the bit errors (BER) increase to the point where the ATSC error correction system (FEC) can no longer handle them, and it gives up, causing loss of lock.

Yes, yes, yes ... Except, do keep in mind such a measurement will be affected by issues such as any multipath (if present) uncorrectable by the receiver, intermodulation products, interference, etc., whereas the TVfool predictions aren't ....

Quote:

Don't confuse signal strength (power/level) measured in dBm or dBmV with signal quality. Signal quality is related to BER.

The factors that reduce signal quality and cause a higher BER are:

1. Improper signal level: A weak signal will cause a poor signal-to-noise ratio; a signal that is too strong can overload a tuner or preamp. A nearby FM transmitter can also cause overload, which would require an FM trap.
2. Reflections from multipath problems.
3. Impulse noise in the reception area. (It's worse on VHF-Low. Just ask Trip in VA about PBS at home.)

Keep in mind, with digital, signal quality (BER) is just as important as signal strength, especially with multipath problems.

Those are some of them + absolutely ....

It really doesn't help that many of these receivers' meters are labeled "signal strength" ....
Thanks everyone for the great responses - I'm going to digest everything and do a few calculations. It makes sense now that the antenna gain should be added, since you're receiving more signal but not actually receiving more noise. Rather, the noise portion of the signal is just the ambient thermal noise floor. My TV tuner card actually outputs SNR, so it'll be interesting to see what I get when the setup goes live in late May, especially with an interesting ridge of trees at approximately antenna height in my LOS less than 200 feet away.

Thanks again for all the detailed responses!

Tom
Possible correction for Andy:

WPCW-DT's projected reception in the Pittsburgh area appears to be using the CP's power levels and not the full post-transition license power.

I noticed in the KMZ files you have WPCW listed at 2.5 kw. They're supposed to go full-power almost right away whenever they do transition.

It would be helpful to see WPCW's future reception based on the 30 kw level from the full license, as opposed to the 2.5 kw CP level.

Also, I'm guessing the online tool is using the same power level, since it lists WPCW down with out-of-market and LP stations in terms of dB.
Several great responses have been left already, so I'll try not to repeat things too much. Here's a simplified "visual" representation of the relationship between signal and noise. I don't know if this helps any, but there's a chance this might make things clearer for some people...

In the following graphs, the progression moves from left to right.

- The (dark blue) line running across the graph represents the TV signal level.
- The red hashed area at the bottom represents the thermal noise floor (ambient noise that is naturally present everywhere).
- The different gray sections indicate the part of the system that is causing signal or noise levels to go up or down. The alternating grays are only for visual clarity and have no significance beyond that.
- The entire chain can be thought of as two parts: 1) stuff that happens between the transmitter and your address; and 2) stuff that happens between "the air" at your address and your TV.

Noise Margin is best thought of as a Signal to Noise Ratio (SNR). In the following graphs, the Noise Margin is the difference between the signal line (dark blue) and the noise floor (black line on top of hashed marks). I'll explain the distinction between NM and SNR toward the end of this post.

Let's first take a look at a simple example. This first chart represents what happens when there is a very strong line-of-sight signal coming from the transmitter, and all that is needed is a simple indoor antenna.

1) For the given transmitter's ERP (assuming adjustments for the broadcaster's antenna pattern are already included), the line-of-sight propagation loss in this example is minimal, so a very high Noise Margin is available "in the air" at the given address.

2) Since this example is assuming an indoor antenna, there's going to be some signal loss due to building penetration.

3) The indoor antenna will have a little bit of gain, so that improves the signal level slightly. This also improves NM and SNR because antennas pick up the signal without raising the noise.

4) The cable between the antenna and the TV will have a little bit of loss associated with it (depends on cable quality and length).

5) As the signal enters the TV, the tuner circuitry has a Noise Figure associated with it. This indicates how much the tuner itself will degrade the signal before it reaches the part of the logic that actually decodes and displays the picture. This Noise Figure must be taken into account when estimating the amount of "usable" signal that actually gets to the decode/display circuitry.

6) When the net residual Noise Margin is greater than 0, then the TV signal can theoretically be decoded and watched. In reality, it's better to have a buffer of at least 5 to 10 dB residual NM so that the setup does not show blocking artifacts and/or drop-outs at every little fluctuation (e.g., blowing trees, rain, airplanes, etc.) or when there's interference (e.g., multipath, power lines, etc.).
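Steps 1 through 6 can be sketched numerically. All of the values below are illustrative assumptions (they are not taken from the charts); the point is just the bookkeeping, since in dB every stage simply adds to or subtracts from the margin:

```python
# Illustrative sketch of steps 1-6 (ALL numbers here are made-up example
# values, not from the charts): strong line-of-sight signal, simple indoor
# antenna, working entirely in terms of Noise Margin (dB).

nm = 45.0          # 1) NM available "in the air" at the address (assumed)
nm -= 10.0         # 2) building penetration loss (assumed)
nm += 3.0          # 3) indoor antenna gain: signal rises, noise floor doesn't
nm -= 2.0          # 4) cable loss between antenna and TV (assumed)
nm -= 7.0          # 5) tuner noise figure (assumed)
print(nm)          # 6) residual NM: 29.0, well above the 5-10 dB buffer
```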

In this next example, we'll look at a more difficult situation. In this case, the transmitter might be far away or behind 1Edge or 2Edge diffraction obstacles. By the time the signal reaches the given address, the signal is very weak. A very good antenna and pre-amp are needed to recover a usable signal.

1) The TV transmission might start strong, but terrain and distance might bring the signal down to barely usable levels. In this example, the Noise Margin is actually a negative value because the desired signal is down at the level of the noise floor.

2) In this case, an outdoor high-gain antenna is needed. There is no building loss, so the first thing we encounter is the antenna gain, which pulls the signal back up to a level where we have a positive Noise Margin again.

3) There will be a short run of cable between the antenna and the pre-amp, so a tiny bit of loss is shown. If the amp is not mast-mounted, and is placed further down the chain (like in the attic or closer to the TV), this loss will increase, and will ultimately hurt the Noise Margin.

4) Then we hit the pre-amp. All pre-amps have a Noise Figure associated with them, which means they will degrade the signal a bit in the process of boosting it. At the pre-amp's output, both the signal and the noise floor have been boosted, so the Noise Margin remains about the same (minus the Noise Figure of the amp).

5) From this point on, the Noise Margin remains essentially constant. Cable losses, splitter losses, and receiver Noise Figures do not hurt the Noise Margin any more. Since the signal has been boosted to a high level, these downstream losses are no longer driving the signal into the thermal noise floor.

6) If the net downstream losses (and Noise Figures) happen to be greater than the net gain provided by the amp, you might be back into a situation where the Noise Margin starts to degrade again. For extremely long cable runs or other special circumstances, it may be required to install a secondary amp in the chain to minimize the loss of Noise Margin. There are several complications when doing this, so it's not recommended unless absolutely necessary.

Everyone should note that the antenna is the ONLY element in the system that helps you GAIN Noise Margin.

A pre-amp does not change the intrinsic gain of the antenna. Amps boost the signal and noise floor simultaneously, so a crappy signal going into an amp will yield a strong, but even crappier signal (because of the amp's Noise Figure) at its output.

Do not be fooled by lousy antennas that include a built-in amp and claim to be high-gain antennas. The amp's gain does NOT count toward the gain of the antenna itself.

Difference between SNR and NM

SNR is generally defined as the ratio between desired signal power and the power of the noise floor. The minimum required SNR for any communication system depends on the details of its design and signal structure. Modulation type, symbol rate, error correction codes, Turbo codes, Viterbi encoding, and dozens of other design considerations ultimately affect what SNR is needed to make the system work.

For example, ATSC requires a theoretical minimum of about 15 dB SNR in order to get a TV picture. NTSC requires about 27 dB SNR (with analog, it's a very subjective matter to decide what is "watchable", but this is roughly where you get a picture with "some snow").

There are other systems (like GPS) that can work even when the SNR values are negative. These signals have a very high "processing gain" that makes it possible to decode them even when they are buried well below the noise floor. The desired signal can actually have less power than the ambient thermal noise and still be used.

NM, on the other hand, is generally defined as the amount of signal relative to the minimum threshold for operation. On a dB scale, the 0 dB point is at the theoretical boundary between working and not working. Positive dB numbers mean the system should work with some margin for error. Negative dB numbers mean the system should not work because the signal level is deficient by that many dB.

If we used SNR to compare ATSC and NTSC, we'd have two different number scales to deal with. We'd have to mentally keep track of the minimum SNR thresholds for each signal type and do a lot of quick math in our heads.

If we use NM to compare ATSC and NTSC, it's a lot easier to tell how well we're doing relative to the minimum operating thresholds. It also reduces some of the confusion caused by the differences in power levels. Many people believed that digital coverage was going to be worse than analog coverage because a lot of transmitters were broadcasting with significantly less power. However, less power does not mean less coverage.
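The comparison above amounts to subtracting each signal type's own threshold from its SNR, which puts everything on one scale. A tiny sketch (the 15 dB and 27 dB thresholds are the approximate figures from this post):

```python
# NM as a unified scale: subtract each system's own minimum SNR.
# Thresholds are the approximate values discussed in this thread.

THRESHOLDS = {"ATSC": 15.0, "NTSC": 27.0}

def noise_margin(snr_db, system):
    """NM > 0: should work, with that much headroom; NM < 0: deficient."""
    return snr_db - THRESHOLDS[system]

print(noise_margin(20.0, "ATSC"))  # -> 5.0, decodes with headroom
print(noise_margin(20.0, "NTSC"))  # -> -7.0, the same SNR isn't enough analog
```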

The average person will not know the different operating thresholds for each signal type, so providing numbers for field strength, dBm, SNR, or ERP often leads to increased confusion for some people.

Best regards,
Andy

Hi Andy,

Thanks for the great explanation - it would be a great addition to your site for those interested in the details! I have a couple of questions:

- In your second example, why is the noise margin not degraded by the noise figure of the receiver (at the far right side of the chart)?
- In reading the entry on Tower Mounted Amplifiers on wikipedia, it seems that the system noise figure would be impacted by not only the noise figure but also the gain of the various amplification stages. In that case, would that imply that the noise figure of the receiver (which itself probably has an LNA) would be reduced because of the gain of the preamp?

Fascinating stuff - thanks for making this all a bit clearer!

Tom
It might be useful to have a third diagram like the second one, but without the pre-amp. The post-antenna signal continues to drop because of cable and splitter losses, and the NM decreases accordingly. Then the signal reaches the receiver and the NM decreases sharply (maybe even goes negative) because of the receiver's NF.

The pre-amp eliminates or reduces that final sharp decrease in NM.
Excellent diagrams and explanation, Andy! I know that I will be making links to that post.

You have given me a much better understanding of NM and what happens to the signal on its way to the tuner.

Thanks,

rabbit
Andy,
I think the propagation information available on your site is invaluable, so I thought I would try to give an explanation of antenna noise, preamp noise, and system noise. There is a lot of confusion about what a preamp usefully does and what other factors contribute to the successful reception of a DTV signal. It is correct that the DTV detector requires an SNR of about 15 dB to produce a picture with very few dropouts. The amount of signal power received at the antenna is determined by the transmit power, propagation loss, and antenna gain. You perform a phenomenal service doing the calculation for this value, getting the xmit power from the FCC database and modeling the prop loss. For the noise margin calculation, a level of system noise, the minimum required SNR, and an antenna gain must be assumed. The SNR value is 15 dB, and perhaps the antenna gain is 0 dBi. Determining the noise seen by the detector requires knowledge of the electronic noise, the noise received at the antenna, and the gains and losses that the signal and noise sources go through.
A simple and useful model for a receiver is a noise source added to the input signal, with the rest of the circuitry being perfect and causing no additional changes in SNR. This noise value is specified in a bandwidth-independent number called noise figure, often provided in dB. This number, being the log of a ratio rather than a power level, cannot be used directly in system power calculations. It could be converted to watts per unit of frequency, but this would yield very small numbers. A more tractable number with a physical interpretation is a conversion to degrees Kelvin. The formula for this is: noise temperature (T) = 290 * (10^(Noise Figure/10) - 1). This can be converted to watts per Hz by multiplying by Boltzmann's constant, 1.38 x 10^-23. The total noise at the detector is that value times the system bandwidth, which is about 6 MHz for a DTV signal. The FCC planning factor for DTV receiver NF is 10 dB, but I would expect many to be between 3 and 6 dB. Doing a few calculations, I get that a 10 dB NF (2610 deg. K) receiver sees a noise level of about -96.6 dBm at its front end, and a 1 dB NF receiver about -112.1 dBm. For a 0.4 dB NF receiver things get much better: the front-end noise power is about -116.3 dBm, which is about a 20 dB improvement over the 10 dB NF receiver.
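That conversion can be sketched in a few lines of Python, reproducing the dBm figures above to within a tenth of a dB or so (same 6 MHz bandwidth assumption):

```python
import math

# Noise figure -> equivalent noise temperature -> front-end noise power.
# Assumes kTB noise over a 6 MHz DTV channel, reference T0 = 290 K.

k = 1.38e-23   # Boltzmann's constant, J/K
B = 6e6        # DTV channel bandwidth, Hz

def noise_temp(nf_db):
    """Equivalent noise temperature in kelvin for a given noise figure."""
    return 290.0 * (10 ** (nf_db / 10.0) - 1.0)

def noise_power_dbm(temp_k):
    """Noise power kTB over the channel bandwidth, in dBm."""
    return 10 * math.log10(k * temp_k * B * 1000.0)

for nf in (10.0, 1.0, 0.4):
    t = noise_temp(nf)
    print(f"NF {nf} dB -> {t:.0f} K -> {noise_power_dbm(t):.1f} dBm")
```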
A real preamp that has no clipping or distortion is an amplifier with its own noise source at its input. The effect on total noise at the detector is that it is now the sum of the antenna noise temperature, the preamp noise temperature, and the receiver noise temperature attenuated by the preamp gain. For example, the 10 dB NF receiver connected to the antenna has a system noise temperature of 2760. If a 1 dB NF preamp with 20 dB of gain (a power gain of 100) is placed at the antenna, the new system noise temperature becomes 150 + 75 + 2610/100 = 251. This is about a 10 dB improvement. The same improvement would be had by having the transmitter increase power from 1 MW to 10 MW. The transmission cable attenuation makes the receiver without the preamp even worse: at high UHF channels, 100 feet of RG6 will decrease the detector SNR by an additional 6 dB. If the preamp gain is much more than the cable attenuation, the attenuation will have only a small effect on the system noise temperature. So in this system the preamp would provide up to 16 dB of detector SNR improvement. The bottom line for preamps is that any preamp with a lower NF than the receiver NF will help in increasing detector SNR, and even if it has the same NF, it will increase the detector SNR by almost the cable loss. Also, an overloaded preamp will have an apparent noise temperature of many thousands of degrees, so it will do more harm than good.
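The cascade arithmetic above can be sketched the same way (the 150 K antenna noise temperature is the figure assumed in the 150 + 75 + 2610/100 example):

```python
import math

# System noise temperature at the detector, per the cascade above:
# T_sys = T_antenna + T_preamp + T_receiver / preamp_gain.
# Assumed figures from the example: 150 K antenna, 1 dB NF / 20 dB gain
# preamp, 10 dB NF receiver.

def nf_to_temp(nf_db):
    """Equivalent noise temperature in kelvin for a given noise figure."""
    return 290.0 * (10 ** (nf_db / 10.0) - 1.0)

T_ANT = 150.0                     # assumed antenna noise temperature, K
T_RX = nf_to_temp(10.0)           # 10 dB NF receiver -> 2610 K

no_preamp = T_ANT + T_RX
with_preamp = T_ANT + nf_to_temp(1.0) + T_RX / 100.0   # 20 dB gain = x100

print(round(no_preamp))           # -> 2760
print(round(with_preamp))         # -> 251
print(round(10 * math.log10(no_preamp / with_preamp), 1))  # -> 10.4 dB better
```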
John
Quote:
Originally Posted by andy.s.lee

SNR is generally defined as the ratio between desired signal power and the power of the noise floor.

Let's see, how should I put this, as I certainly do not disagree with that ...

It's very true if we are using/thinking of SNR in terms of the RF signal and its signal strength above the noise .... And it would not be unusual, in that sense, to have an "SNR" in a strong signal area of 60 dB or more ....

However, my concern with using that definition for SNR with DTV involves those who have SNR meters of some sort on their receiver, in which case we are talking about something a bit different ...

SNR for 8-VSB (and the related MER - modulation error ratio - and EVM - error vector magnitude) is explained and defined quite well, I think, and probably in a much better way than I can say it, in the following document:

http://www.tek.com/Measurement/App_N...5W_13224_0.pdf

--------------

Now, while we could probably use either definition interchangeably and come up with more or less accurate numbers if we have 15 dB SNR (and no uncorrectable multipath/etc.) ... I think the problem is, we can't do so if we have an SNR much above 30 dB, because with the general definition we can have that, but we cannot actually have a 60 dB SNR (or MER or EVM) with the "specific" SNR definition that applies to 8VSB DTV ....

In other words, while I certainly have no problem with the general definition for "SNR" you provided, I think it might be confusing for those who do not know the difference and are wondering why they aren't seeing an SNR reading any higher than about 33~34 dB on, say, their Sony HDTV's signal diagnostic screen .... In the past, I've thought about trying to explain this in a simple fashion as involving the AGC circuit in the receiver, however that of course may not be entirely accurate, depending upon how any given "SNR meter" on any given DTV receiver actually works ...
Two simple words: "sampling noise".
8VSB is a digital system which only "needs" 15-20 dB to work.
The SNR due to sampling noise only needs to be a little more than this.

Imperfections in the transmitter will also limit the transmitted SNR...

Compare: SNR for 16-bit audio is limited to about 96+ dB and 24-bit is 120+ dB.
Jeff,
Your reference points out correctly that the SNR is the ratio of the power in an idealized digital signal to all sources of noise power. A prop loss model usually only addresses changes in the signal level, generally not any of the linear or non-linear errors mentioned in your reference, and these can be quite large, such as the ones induced by multipath. The noise I was attempting to quantify was the broadband noise, as it can be held to a minimum but cannot be eliminated, so at long distances it becomes a limiting factor in achieving the required 15 dB SNR. All of the noise source powers add to the total noise, and most of the forms of noise will increase to some degree every time the signal passes through an active electronic circuit. This includes the transmitter. I have seen that stations try to keep the noise in check so that the transmitted signal will have an SNR of 35 dB or better. This SNR will allow essentially perfect reception of the video and audio, so is sufficient. On its path to the antenna it will only get worse, since the signal shape will become less idealized and its level will get lower. The receiver should be designed so that, except for adding broadband noise, no additional loss in SNR occurs. An accurate measure of the ratio of the ideal signal to everything else will never be higher than the SNR at the transmitter.
John
Quote:
Originally Posted by ctdish

Andy,
The FCC planning factor for DTV receiver NF is 10 dB but I would expect many to be between 3 and 6 dB.

About 6~8 dB is probably a good rough estimate to use for current DTV receiver NF. FCC OET tested numerous DTV receivers in 2005; the median receiver NF for all of them was:

Channel 3 = 8.8 dB, channel 10 = 7.6 dB, channel 30 = 6.9 dB

Of course, a few of those tested performed worse than most of the others ...

Follows is a link to the FCC report "Tests of ATSC 8-VSB Reception Performance of Consumer Digital Television Receivers Available in 2005", PDF format:

http://www.fcc.gov/oet/info/document...on-testing.pdf

Receiver NF info is in chapter 5 ....

Readers may also find the FCC report at the following link of interest regarding DTV receiver performance; it is entitled "Interference Rejection Thresholds of Consumer Digital Television Receivers Available in 2005 and 2006":

http://www.fcc.gov/oet/info/document...s-03-30-07.pdf

Quote:

....There is a lot of confusion about what a preamp usefully does and what other factors contribute to the successful reception of a DTV signal. .... The bottom line for preamps is that any preamp with a lower NF than the receiver NF will help in increasing detector SNR, and even if it has the same NF, it will increase the detector SNR by almost the cable loss.

Yes .... There is also excellent article along the same lines of your comments Involving Noise Temp, preamp NF/etc here :

http://www.geocities.com/toddemslie/UHF-TV-DX.html

Quote:

Also an overloaded preamp will have an apparent noise temperature of many thousands of degrees so it will do more harm than good.
John

That is where it gets complicated, I think. Intermodulation distortion products created within an external amplifier or receiver front end are likely to involve strong signals on certain frequencies (likely very specific to each receive location involved) and are often likely to affect certain particular frequencies harmfully (likely involving some rather complex math if one were to attempt to predict which frequencies would suffer most), while other frequencies may seem relatively unaffected (such as with successful weak-signal reception) ....
Quote:
Originally Posted by holl_ands

Two simple words: "sampling noise".

That works, thanks ... I had on a previous occasion gotten it down to a probably mostly-decipherable paragraph or so, but those two words do just fine ....

Quote:

Imperfections in the transmitter will also limit the transmitted SNR...

For those interested, this is described in more detail in the link I provided previously .... The text of that document begins:

"This technical brief describes the effect various transmitter impairments have on the output signalâ€™s Signal to Noise ratio (S/N) in 8-VSB DTV transmission. .... "

Here's the link again(PDF) :

http://www.tek.com/Measurement/App_N...5W_13224_0.pdf

Quote:

Compare:

I've read the minimum usable SNR for A/153 (the ATSC M/H candidate standard) may be as low as 4 dB S/N ... I haven't completely delved into the "ATSC parades" and the like in the newly published A/153 docs yet ....
Quote:
Originally Posted by ctdish

Jeff,
Your reference points out correctly that the SNR is the ratio of the power in an idealized digital signal to the power of all noise sources. A propagation loss model usually only addresses changes in signal level, generally not any of the linear or non-linear errors mentioned in your reference, and these can be quite large, such as those induced by multipath. The noise I was attempting to quantify was the broadband noise, since it can be held to a minimum but cannot be eliminated, so at long distances it becomes a limiting factor in achieving the required 15 dB SNR.

All of the noise source powers add to the total noise, and most forms of noise will increase to some degree every time the signal passes through an active electronic circuit. This includes the transmitter. I have seen that stations try to keep the noise in check so that the transmitted signal will have an SNR of 35 dB or better. This SNR allows essentially perfect reception of the video and audio, so it is sufficient. On its path to the receive antenna the SNR will only get worse, since the signal shape will become less idealized and its level will get lower. The receiver should be designed so that, except for adding broadband noise, no additional loss in SNR occurs. An accurate measure of the ratio of the ideal signal to everything else will never be higher than the SNR at the transmitter.
John

Certainly + Well said ....

Apologies if I wasn't clear in my response to Andy's post. I'm merely addressing possible confusion that might arise for some from thinking of TVfool's NM in terms of "SNR", such as is suggested here:

Quote:
Originally Posted by andy.s.lee

Noise Margin is best thought of as a Signal to Noise Ratio (SNR).

Solely as it relates to what I said here :

Quote:
Originally Posted by Nitewatchman

I think it might be confusing for those who do not know the difference and are wondering why they aren't seeing an SNR reading any higher than about 33~34 dB on, say, their Sony HDTV's signal diagnostic screen ....

As an example for my location .... TV fool predicts 65 dB NM for one of the stronger signals ....

If I didn't know that the SNR reading on my receiver involves specifics different from the way I'm "thinking about NM as an SNR", I might wonder why the SNR reading on my Sony HDTV indicates "only" 34 dB in this case ... Of course, all I'd have to do is go to a bit of extra effort, in which case I would notice it actually takes more than 60 dB of attenuation added into the feedline to get "down" to threshold levels for decoding DTV ... But a lot of folks aren't going to go to that effort, or even have enough stuff lying around to add 60 dB or more of attenuation into the feedline ....

Ok, so why does that matter as long as we at least achieve the sufficient 15 dB SNR? Well, as one example, let's say we're (incorrectly) using our results in this particular case to "gauge" whether an improvement in our antenna system (such as moving the antenna outdoors) will allow us to receive weaker predicted signals we can't currently decode with our current setup .... We might think we're not going to have much luck with a signal with a 20 dB predicted NM if we're "only" getting 34 dB SNR on this one with a 65 dB predicted NM ...

I know it sounds silly, but my experience with these matters is that folks can easily become confused by them ....

Another example I previously mentioned, along the same lines: if the receiver meter is labeled "signal strength", it's difficult in some cases to convince folks that isn't what it's measuring, or that signal quality/BER is involved instead ...
The more I think about it, just my opinion, FWIW --- but for DTV, I think "SM" (Signal Margin) would be better terminology than NM (Noise Margin) ...

Subtract our losses (the noise added by receiver or amp NF, feedline losses, estimated attenuation from having the antenna indoors, etc.) -- and add our antenna gain ...

I think that concept is pretty accurate (TVFool is modelling signal levels, not "noise", after all) and at the same time relatively simple ....

OTOH -- for analog reception (worse than TASO Grade 1 reception, for example) --- the "NM" concept certainly has its perks
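The "subtract losses, add antenna gain" bookkeeping described above can be sketched in a few lines of Python. This is only an illustration of the arithmetic; the function name and all the dB figures are my own examples, not values from TVFool:

```python
# Illustrative margin bookkeeping: start from TV Fool's predicted noise
# margin (NM), add antenna gain, then subtract system losses.
def signal_margin(nm_db, antenna_gain_db, feedline_loss_db, receiver_nf_db):
    """Rough remaining margin at the receiver, in dB (all inputs illustrative)."""
    return nm_db + antenna_gain_db - feedline_loss_db - receiver_nf_db

# Example: 20 dB predicted NM, 10 dB antenna, 4 dB of coax, 7 dB receiver NF
margin = signal_margin(20, 10, 4, 7)
print(margin)  # 19 dB of headroom above the decoding threshold
```

Anything above 0 suggests the signal should still decode; a negative result means the losses have eaten the whole predicted margin.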
I like the term NM. I think of it as the margin of the signal above the existing noise floor.
And, it's a good match to my attenuator tests. Or, as our north-of-the-border friend Autofils calls it, the "cliff-effect" measurement.
When you add in the Antenna Gain, it's the "Fade Margin".
What should I call that tree, 20 feet away, that my antenna points through?
Other than some foul language, I'd call it a time-varying delay line and attenuator.

I didn't see anyone mention that Noise Figure is only defined into a 50 Ohm system.

As long as an undesired signal is uncorrelated to the desired signal, it can be treated as noise. If you have another signal on that frequency, and the receiver tries to lock to it, you have to treat it as interference (not noise). Then, you are dealing with S/(N+I). Perhaps that goes a bit OT for this discussion.

For the non-technical crowd, I'd use the term "Signal Headroom" instead of margin. It's much easier to visualize. Run out of headroom, you get whacked on the head. No headroom, no picture. Like the opening to the TV show Max Headroom, but I digress.
Quote:
Originally Posted by MikeBiker

What should I call that tree, 20 feet away, that my antenna points through?

firewood
Quote:
Originally Posted by tczernec

- In your second example, why is the noise margin not degraded by the noise figure of the receiver (at the far right side of the chart)?

The Noise Figure of the pre-amp DOES make the NM worse. If you look closely at the amplified noise floor, I tried to draw it showing that it's boosting the thermal noise plus Noise Figure of the amp. At the point just after the pre-amp, the Noise Margin has been decreased by the amount of the amp's Noise Figure.

However, the Noise Figure of the receiver is no longer that important. The boosted signal reaching the receiver hides any degradation caused by its Noise Figure.

Note that good pre-amp Noise Figures are typically around 3 dB while typical receivers have Noise Figures around 6 to 10 dB.

Best regards,
Andy
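The reason the receiver's NF stops mattering behind a preamp falls out of the Friis cascade formula: each stage's noise contribution is divided by the gain ahead of it. Here's a minimal sketch; the helper names and the specific 3/10/7 dB figures are my own illustration of the point, not Andy's numbers:

```python
import math

def db_to_lin(db):
    return 10 ** (db / 10)

def cascade_nf_db(stages):
    """Friis formula. `stages` is a list of (noise_figure_dB, gain_dB)
    tuples, in signal-path order; a passive loss is modeled as NF = loss,
    gain = -loss."""
    f_total = 0.0
    g_product = 1.0
    for i, (nf_db, gain_db) in enumerate(stages):
        f = db_to_lin(nf_db)
        if i == 0:
            f_total = f
        else:
            # Later stages' excess noise is divided by the gain ahead of them
            f_total += (f - 1) / g_product
        g_product *= db_to_lin(gain_db)
    return 10 * math.log10(f_total)

# 3 dB / 20 dB-gain preamp, then 10 dB of coax loss, then a 7 dB NF receiver
with_preamp = cascade_nf_db([(3, 20), (10, -10), (7, 0)])
# Same coax and receiver without the preamp
without = cascade_nf_db([(10, -10), (7, 0)])
print(round(with_preamp, 1), round(without, 1))  # ~4.0 dB vs ~17.0 dB
```

With the preamp first in line, the coax and receiver together add less than 1 dB to the system NF; without it, their contributions dominate.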
Quote:
Originally Posted by johnpost

firewood

:d
Andy,
In your NM calculation you have to establish some value for the noise. I assume this is what you are calling thermal noise in your graphs. Can you provide the value you are using? That would allow a clearer interpretation of NM.
John
Quote:
Originally Posted by Nitewatchman

If I didn't know the SNR reading on my receiver involves specifics which are different from the way I'm "thinking about NM as a SNR" , I might wonder why the SNR reading on my Sony HDTV indicates "only" 34dB in this case ...

There are no set rules about how signal meters work on all the different set-top boxes and TVs. A few do try to provide a number close to actual SNR values, but there are limits to what they can do.

In my earlier comments, I meant to indicate that NM is more akin to SNR than it is to signal power, since the original question was comparing NM to these other metrics for estimating signal quality. Although NM is like SNR, it is not the same.

For signal meters that try to estimate true SNR, you'd expect the picture to go away when the SNR drops down to about 15 dB. On a NM scale, the picture would go away at 0 dB. Most signal meters cannot provide any measurement at all once the signal falls below this decoding threshold.

For very high quality signals, most signal meters cannot go above a certain point. Once the bit error rate gets down to zero (error vector is effectively zero due to quantization limits), the signal estimator cannot tell if the SNR is 50, 100, or 200 dB. Even before you reach this point, other sources of error (like pilot lock and local oscillator phase noise) start to dominate, and the perceived SNR will not go any higher anyway.

There are certain limits to what the signal meter can measure, and this should be treated independently of the true SNR available in the airwaves.

Best regards,
Andy
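One way to picture the meter-versus-NM mismatch Andy describes: on a thermal-noise-only scale, true SNR is roughly NM plus the decode threshold, but a real meter tops out. The ~15.4 dB threshold and 34 dB ceiling below are illustrative figures drawn from the anecdotes in this thread, not a standard, and the function is mine:

```python
# Illustrative mapping from TV Fool's NM scale to what a set's meter might
# show. Assumes thermal noise only: true SNR ~= NM + decode threshold.
DECODE_SNR_DB = 15.4    # approximate 8-VSB decode threshold (per thread)
METER_CEILING_DB = 34.0  # illustrative meter saturation point (per thread)

def meter_snr_from_nm(nm_db):
    """Estimated meter reading: true SNR clamped at the meter's ceiling."""
    true_snr = nm_db + DECODE_SNR_DB
    return min(true_snr, METER_CEILING_DB)

print(meter_snr_from_nm(65))  # strong signal: meter pegs at 34.0
print(meter_snr_from_nm(0))   # at decode threshold: 15.4
```

This is why a 65 dB NM signal and a 25 dB NM signal can show the same meter reading even though one has 40 dB more margin in the air.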
Quote:
Originally Posted by ctdish

In your NM calculation you have to establish some value for the noise. I assume this is what you are calling thermal noise in your graphs. Can you provide the value you are using? That would allow a clearer interpretation of NM.

It's about -106 dBm, which is the thermal noise for a 6 MHz channel at room temperature.
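That figure can be checked directly from kTB. This small sketch (the function name is mine) just evaluates the thermal-noise formula for a 6 MHz channel at room temperature:

```python
import math

BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K

def thermal_noise_dbm(bandwidth_hz, temp_k=290):
    """kTB thermal noise power in dBm for the given bandwidth."""
    watts = BOLTZMANN * temp_k * bandwidth_hz
    return 10 * math.log10(watts * 1000)  # W -> mW, then to dBm

print(round(thermal_noise_dbm(6e6), 1))  # -106.2 dBm for a 6 MHz channel
```

Subtracting this floor from TV Fool's dBm prediction is what yields the NM number (give or take the ~15 dB decode threshold discussed earlier).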
Quote:
Originally Posted by andy.s.lee

However, the Noise Figure of the receiver is no longer that important. The boosted signal reaching the receiver hides any degradation caused by its Noise Figure.

Your graphs have a logarithmic vertical scale, right? In that case, a "large" bump near the bottom corresponds to a tiny "bump" further up. The scale becomes more and more compressed the further up the chart you go.
Andy,
Thanks for the info. That value equates to a system noise temperature of 290 K, or a noise figure of about 3 dB. Very few TV receiving systems will be better than that, but there is no physical reason one a little better could not be assembled. The noise in a receiver system always comes from someplace; at best it is the sum of the noise picked up by the antenna and the receiver system's noise temperature. It is not too hard to get a receiver including preamp that will have a noise figure of 2 dB, which corresponds to a noise temperature of about 170 K, and some 0.4 dB preamps, at about 28 K, can also be bought. The antenna mostly picks up noise from the Earth, which radiates at about 290 K, but some of the antenna beam pattern is aimed at quiet sky, so the total antenna noise temperature could be around 145 K. This added to a 28 K receiver would give a total noise temperature of 173 K, which would get the noise down to -108.5 dBm at the detector in a 6 MHz bandwidth. An antenna aimed at the sky, like a satellite dish, can pick up noise at a level that is nearly 10 dB lower.
John
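The temperature/noise-figure conversions John is doing follow the standard relation T = T0·(F − 1) with T0 = 290 K. A small sketch reproducing his numbers (helper names are mine):

```python
import math

T0 = 290.0  # reference temperature, K

def nf_db_to_temp(nf_db):
    """Noise figure (dB) -> equivalent noise temperature (K)."""
    return T0 * (10 ** (nf_db / 10) - 1)

def temp_to_nf_db(temp_k):
    """Equivalent noise temperature (K) -> noise figure (dB)."""
    return 10 * math.log10(1 + temp_k / T0)

print(round(nf_db_to_temp(2.0)))     # ~170 K for a 2 dB NF receiver+preamp
print(round(nf_db_to_temp(0.4)))     # ~28 K for a 0.4 dB NF preamp
print(round(temp_to_nf_db(173), 2))  # 145 K antenna + 28 K receiver system
```

The 173 K total works out to roughly a 2 dB effective system noise figure, consistent with the ~2 dB improvement over the -106 dBm floor that John quotes.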