It has been discussed in these forums that some meters indicate "Signal Quality" values, while others indicate "Signal Strength." This is why I prefer CECBs with both (i.e., the APEX DT502, Sunkey SK-801ATSC, and Zinwell ZAT-970A).
So do I.
I have the Zenith DTT900, Zinwell ZAT-970A, Sansonic FT-300A, and the Apex DT502. My favorite dual-bar box is the Apex.
I have made a side-by-side comparison of the Zenith and the Apex using a splitter. I made changes in the antenna to vary the signal strength and signal quality. The Zenith bar seems to be a combination of signal strength AND signal quality.
Even though signal strength and signal quality are reported in percent or some other arbitrary unit, it doesn't matter, because you are only looking for a comparison: more is better. You will soon get a feel for how much is needed to maintain a lock on a signal.
If you want to know the actual signal strength you will need a signal level meter (SLM), which is calibrated in dBmV (referenced to 1 millivolt across 75 ohms). Broadcast engineers use dBm (referenced to 1 milliwatt). The conversion factor between them is 48.8 dB: +1.4 dBmV equals -47.4 dBm.
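The dBmV-to-dBm conversion is just a fixed offset, so it is easy to script. Here is a minimal sketch, assuming a 75-ohm system (standard TV coax), where the exact offset works out to about 48.75 dB; the 48.8 figure above is the usual rounding.

```python
# dBmV <-> dBm conversion for a 75-ohm system (TV coax).
# The exact offset is about 48.75 dB; forum posts usually round to 48.8.

OFFSET_DB = 48.75

def dbmv_to_dbm(dbmv: float) -> float:
    """Convert a level in dBmV to dBm (75-ohm system assumed)."""
    return dbmv - OFFSET_DB

def dbm_to_dbmv(dbm: float) -> float:
    """Convert a level in dBm to dBmV (75-ohm system assumed)."""
    return dbm + OFFSET_DB

# The example from the post: +1.4 dBmV is about -47.4 dBm.
print(f"{dbmv_to_dbm(1.4):.1f} dBm")
```

Note this offset only holds for 75-ohm impedance; a 50-ohm system uses a different number.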
I have calibrated an Apex box using my signal level meter: http://www.avsforum.com/avs-vb/showt...6#post15414426
I don't know if this is technically correct, but I think of the quality value as something approximating the percent of correctly decoded packets. Can someone clarify what is being measured as "quality"?
The signal quality bar is related to BER (bit error rate). The tuner's FEC (forward error correction) is capable of correcting some errors, but its ability is limited. Once that limit is reached, you will start to see dropouts, tiling, and picture freezes.
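That "cliff" behavior of FEC can be illustrated with a toy model. The sketch below is loosely inspired by the Reed-Solomon coding used in ATSC (which can fix only a limited number of byte errors per packet), but the packet size and correction limit are illustrative assumptions, not a claim about the actual ATSC decoder.

```python
import random

# Toy FEC model: a packet decodes cleanly as long as the number of byte
# errors stays at or below a hard correction limit.  Parameters below are
# assumptions for illustration, not real ATSC values.
PACKET_BYTES = 207   # bytes per coded packet (assumed)
CORRECTABLE = 10     # max byte errors the decoder can fix (assumed)

def packet_survives(byte_error_rate: float) -> bool:
    """One packet is lost only when errors exceed the FEC limit."""
    errors = sum(random.random() < byte_error_rate for _ in range(PACKET_BYTES))
    return errors <= CORRECTABLE

def packet_loss(byte_error_rate: float, trials: int = 2000) -> float:
    """Estimate the fraction of packets the FEC cannot save."""
    lost = sum(not packet_survives(byte_error_rate) for _ in range(trials))
    return lost / trials

random.seed(1)
# Below the limit almost everything decodes; past it, loss climbs fast.
for ber in (0.01, 0.04, 0.08):
    print(f"byte error rate {ber:.2f}: packet loss {packet_loss(ber):.1%}")
```

Running it shows why reception degrades abruptly rather than gradually: small increases in error rate near the FEC limit produce large jumps in lost packets, which you see on screen as sudden tiling and freezes.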
I would be perplexed if a signal meter showed 100% with a pixellating picture.
It is possible to have a strong but poor quality signal.
The factors that reduce signal quality and cause a higher BER are:
1. Improper signal level: A weak signal will cause a poor signal-to-noise ratio; a signal that is too strong can overload a tuner or preamp. A nearby FM transmitter can also cause overload, which would require an FM trap.
2. Reflections from multipath problems, static or dynamic.
3. RF interference or impulse noise in the reception area.
When I was testing the DT502 with my CM4221 antenna I got (for 13.1 on RF41):
Signal Quality 60%
Signal Strength 55%
I had aimed the antenna with my SLM, but when I rotated the 4221 slightly to the right I got:
Signal Quality 100%
Signal Strength 56%

Note the BIG change in signal quality with only a slight change in signal strength.
The Zenith appears to combine signal quality and signal strength in one bar. It increased slightly when I rotated the antenna, but it's not as sensitive as the quality bar on a 2-bar box.
Anyone who is in a weak-signal situation (like worse than about -80 dBm on tvfool) and tries to aim his antenna between two different azimuths to get both stations is going to have a difficult time because off-axis aim causes an increase in BER.
It seems that the signal quality indication is a more sensitive aiming tool than signal strength, because it shows the increase in BER from multipath reflections. In my situation the BER is affected by the weak signal, the fixed multipath reflections, and the changing multipath reflections. My antenna is aimed across a well-traveled road, so I get reflections from cars. (This is an example of the need for the new ATSC M/H standard.) When the quality went up to 100%, the car reflections were less of a problem. My stronger signals maintain a good lock in spite of the cars.
Is SNR a measure of quality?
The signal-to-noise ratio is the difference in dB between the signal level and the ambient noise level.
An SNR of 15 to 16 dB is needed to maintain lock on an 8VSB signal. Anything more is gravy; anything less and you have problems.
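That 15-16 dB threshold can be turned into a quick margin check if you know your signal level. A minimal sketch, assuming an ideal thermal noise floor (kTB) over one 6 MHz channel and ignoring the tuner's own noise figure, which in practice adds several dB on top:

```python
import math

# SNR-margin sketch for an 8VSB signal.  Assumptions: ideal thermal
# noise floor over a 6 MHz channel and zero tuner noise figure; a real
# receiver is several dB worse than this.
BOLTZMANN = 1.380649e-23   # J/K
TEMP_K = 290               # standard reference temperature
BANDWIDTH_HZ = 6e6         # one ATSC channel
LOCK_THRESHOLD_DB = 15.5   # roughly the 15-16 dB figure cited above

# kTB in watts, converted to dBm (hence the factor of 1000).
noise_floor_dbm = 10 * math.log10(BOLTZMANN * TEMP_K * BANDWIDTH_HZ * 1000)

def snr_db(signal_dbm: float) -> float:
    """SNR relative to the ideal thermal noise floor."""
    return signal_dbm - noise_floor_dbm

def has_margin(signal_dbm: float) -> bool:
    """True when the signal clears the 8VSB lock threshold."""
    return snr_db(signal_dbm) >= LOCK_THRESHOLD_DB

print(f"noise floor: {noise_floor_dbm:.1f} dBm")
print(has_margin(-80.0))   # weak but workable -> True
print(has_margin(-95.0))   # below the lock threshold -> False
```

The ideal floor comes out near -106 dBm, which is why signals much below about -90 dBm at the tuner are hopeless once a realistic noise figure is included.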