Originally Posted by Bryan88
I'm still here.
Good, I'm glad that we have given you some ideas to try.
A few thoughts about the Bravia signal strength meter:
Besides, internal "signal strength meters" really are not true meters.
Yes, they are not true meters that give the signal level in dBmV or dBm, but they are still useful for comparisons. A signal strength meter with an arbitrary 0 to 100 scale is much better than nothing; that is why it is on the Signal Diagnostics screen. I must admit, however, that I would be lost without my signal level meter that reads in dBmV.
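For anyone curious how the two "real" units relate: dBmV is referenced to 1 mV RMS and dBm to 1 mW, so the conversion depends on the system impedance (75 ohms for TV coax). A quick sketch of the math (function name is just for illustration):

```python
import math

def dbmv_to_dbm(dbmv, impedance_ohms=75.0):
    """Convert a level in dBmV (re 1 mV RMS) to dBm (re 1 mW)
    for a given impedance -- 75 ohms for TV coax."""
    v_rms = 1e-3 * 10 ** (dbmv / 20)            # volts RMS
    p_mw = (v_rms ** 2 / impedance_ohms) * 1e3  # power in milliwatts
    return 10 * math.log10(p_mw)

# In a 75-ohm system, 0 dBmV works out to about -48.75 dBm
print(round(dbmv_to_dbm(0), 2))  # -48.75
```

That fixed offset of about 48.75 dB is why the two scales can be used interchangeably once you know the impedance.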
The Bravia signal strength reading is derived from the AGC (automatic gain control) circuit. The AGC circuit changes the sensitivity of the tuner to accommodate a wide range of signal strengths, from very strong to very weak. Without the AGC, the tuner's usable dynamic range would be much smaller.
When the signal is very strong, a larger AGC voltage is developed to reduce the sensitivity of the tuner (an inverse relationship). As the signal gets stronger, a point is reached where the AGC can no longer reduce the tuner's sensitivity any further; that corresponds to the maximum signal strength number (80), which you have experienced. Beyond that maximum, with even stronger signals, the tuner eventually reaches its overload point.
With very weak signals, the AGC voltage is smaller, increasing the sensitivity of the tuner. Below a certain point, where the tuner's sensitivity can no longer be increased, the minimum signal strength reading is shown (55 on my Bravia). With even weaker signals you reach the "digital cliff": SNR below 16 dB, picture freezes, and finally dropout.
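To make the clamping behavior concrete, here is a toy sketch of how an AGC-derived meter might map true signal level to a bounded reading. The 55/80 floor and ceiling mirror what I described above; the dBmV endpoints are purely hypothetical, since Sony doesn't publish them:

```python
def bravia_style_reading(signal_dbmv, min_reading=55, max_reading=80,
                         min_dbmv=-20.0, max_dbmv=10.0):
    """Illustrative (not actual Sony) mapping of signal level to a
    clamped meter reading.  min_dbmv/max_dbmv are made-up endpoints."""
    if signal_dbmv >= max_dbmv:
        return max_reading   # AGC can no longer reduce tuner gain
    if signal_dbmv <= min_dbmv:
        return min_reading   # AGC can no longer increase tuner gain
    frac = (signal_dbmv - min_dbmv) / (max_dbmv - min_dbmv)
    return round(min_reading + frac * (max_reading - min_reading))
```

The key point the sketch shows: once the signal moves outside the AGC's control range, the reading pegs at 80 (or 55) and tells you nothing more about how much stronger (or weaker) the signal actually is.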