Originally Posted by nobleach
well it's like saying you can be pretty darn sure: if you KNOW the length of the cable, its attenuation properties/impedance, how many splits, etc., you know what level you start with and can estimate how much it'll drop.
Consider that I can pull WCJB at 100%... what I do with it once it's down the pole, that's "signal strength". Even if WCJB ups their power, my signal strength will not change.
That would be relative to a meter. To me, if the transmitter doubles its power, then the signal strength goes up by 3 dB. But if the meter you are using is already at full scale, then the increase cannot be seen, I totally agree.
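For anyone who wants to check that 3 dB figure: decibels compare two power levels as ten times the log of their ratio, so doubling power works out to 10·log10(2) ≈ 3.01 dB. A quick Python sketch, function name invented just for illustration:

```python
import math

def power_change_db(p_new, p_old):
    """Return the dB change between two power levels."""
    return 10 * math.log10(p_new / p_old)

# Transmitter doubles its power:
print(power_change_db(2.0, 1.0))  # ~3.01 dB
```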
For my whole life, "signal strength" has always been defined against something. Probably the most scientific or engineering terms are dBu and dBm, which are referenced to a specific value of RMS volts or to milliwatts into a load, respectively.
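For the record, dBu is voltage relative to 0.7746 V RMS (the voltage that puts 1 mW into a 600 ohm load) and dBm is power relative to 1 mW into the load. A little sketch of both conversions:

```python
import math

V_REF_DBU = 0.7746  # volts RMS; gives ~1 mW into 600 ohms

def volts_to_dbu(v_rms):
    """Voltage level relative to 0.7746 V RMS."""
    return 20 * math.log10(v_rms / V_REF_DBU)

def milliwatts_to_dbm(p_mw):
    """Power level relative to 1 mW into the load."""
    return 10 * math.log10(p_mw)

print(volts_to_dbu(0.7746))   # 0.0 dBu
print(milliwatts_to_dbm(1))   # 0.0 dBm
print(milliwatts_to_dbm(2))   # ~3.01 dBm -- doubling power again
```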
But then in ham radio there is a system of two-digit reports to give another station some idea of how they are received. The first digit runs one to five, with five the best, for how they subjectively sound against background noise and static (analog transmissions). The second digit runs one to nine for RF carrier strength. So giving someone a "59" or "5 by 9" report means the quality is 5 and the signal is 9, or as good as it gets.
The military uses the same system except the numbers only go to 5. So in the military, if you are "5 by 5", that is perfect. But neither the ham nor the military report is quantitative; both are relative and somewhat subjective.
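To put the two conventions side by side, here's a toy formatter; the function and its range checks are my own illustration, not anything official:

```python
def rs_report(readability, strength, military=False):
    """Format a readability/strength report.

    Ham: readability 1-5, strength 1-9 ("5 by 9" is best).
    Military: both scales run 1-5 ("5 by 5" is best).
    """
    max_strength = 5 if military else 9
    if not (1 <= readability <= 5 and 1 <= strength <= max_strength):
        raise ValueError("report out of range")
    return f"{readability} by {strength}"

print(rs_report(5, 9))                 # "5 by 9" -- ham, as good as it gets
print(rs_report(5, 5, military=True))  # "5 by 5" -- military perfect
```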
Now we have this bar on our screens telling us signal strength. But no manufacturer I have seen yet says, well, 100% means at least -30 dBu. However, I have noticed that between receivers, while they won't match exactly, there seems to be a standard of some sort, as most people find that around 40 to 50% they start getting breakups and pixelization in the slightest fade.
This tells me most of our receivers are set so that, in very general terms, one needs 50% or higher to grab a signal with little to no drops. It seems to take 80% to eliminate all but the very worst fades or pixelization. Above 90% it never seems to happen.
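If a manufacturer ever did publish their calibration, my guess is the bar is something like a linear-in-dB map between a floor and a full-scale point. Here's a purely hypothetical sketch; the -80 and -30 dBu endpoints are invented for illustration, not anyone's actual spec:

```python
def bar_percent(level_dbu, floor_dbu=-80.0, full_dbu=-30.0):
    """Map a hypothetical receiver reading to a 0-100% bar.

    The -80/-30 dBu endpoints are pure guesses -- no manufacturer
    I know of publishes their calibration; this just shows the kind
    of linear-in-dB mapping the bars appear to use.
    """
    span = full_dbu - floor_dbu
    pct = (level_dbu - floor_dbu) / span * 100.0
    return max(0.0, min(100.0, pct))

print(bar_percent(-55.0))  # 50.0 -- right around the breakup threshold
```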
So there must be some relative standard for our bars for signal strength.
Now, the way I learned signal quality, with digital it's moot: it's either perfect or it doesn't exist. So like digital, we could assign 0 and 1 to signal quality the way I learned it :@).
Funny how terminology changes from one situation to another. People talk in acronyms and phrases, and I am very guilty of it, yet unless the reader is spot on in the exact field, they are lost.
So we could come up with the Gainesville OTA standard :@)
0 - Visit the forum or get cable
10 to 40 - Push the antenna higher, make it longer, stack 2 antennas
50 - Better than I had it but the darn fades. Work on the antenna
60 - I could live with this if it were not for airplane skip
70 - A no man's land between near perfect and those darn pixels
80 - Real DTV!
90 - Makes others jealous
100 - What do you expect? He is running an amp!
Of course, like anything, this is a living document, so any attempts at perfecting the humor are allowed!
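And since it's a living document, here it is as a lookup table anyone can tweak, same Python sketch style as above:

```python
GAINESVILLE_OTA = [
    (0,   "Visit the forum or get cable"),
    (10,  "Push the antenna higher, make it longer, stack 2 antennas"),
    (50,  "Better than I had it but the darn fades. Work on the antenna"),
    (60,  "I could live with this if it were not for airplane skip"),
    (70,  "A no man's land between near perfect and those darn pixels"),
    (80,  "Real DTV!"),
    (90,  "Makes others jealous"),
    (100, "What do you expect? He is running an amp!"),
]

def gainesville_rating(percent):
    """Return the highest tier at or below the given signal percent."""
    verdict = GAINESVILLE_OTA[0][1]
    for threshold, text in GAINESVILLE_OTA:
        if percent >= threshold:
            verdict = text
    return verdict

print(gainesville_rating(85))  # Real DTV!
```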