
What is the acceptable db range for the best cable (analog, DTV and HD) reception?

post #1 of 105
Thread Starter 
I've tried to search for info on this and can't find anything relevant to it. A few days ago I had Cox send a cable guy out to check my lines just to see what the db level was and if it was ok for the best DTV and HDTV reception. (We were also having problems with several analog channels. He replaced a splitter and that fixed the analog channel problem on most of the sets, but not all, so I'll be checking the attic splitters). I was not having any HDTV problems, but I wanted to be sure all was up-to-spec. SD DTV channels were not improved.

When he checked the main line in, he said it was "Ok". I asked what the db level was and he said it was "15db". While he was away, I glanced at his RF meter but not being familiar with them, I didn't know exactly where to look. I remember seeing 3 different db figures in 3 different areas. The first was 12.x db, the second was 15.7db and the third was.....I think, 24 or 28db. Since he said it was "15db" I guess I can assume it was really 15.7db. Unless that RF meter has several different screens and what I was seeing was something else.

So first, does anyone know what those 3 levels indicated on the RF meter's LCD screen, and if you do, are they acceptable?

If they are different data from the figure he told me, is the +15db figure he told me acceptable for all types of CATV reception?

Thanks.
post #2 of 105
You've got signal galore. In fact, 24dB is excessive from the cable company's perspective, since they are responsible for "cumulative leakage" that may interfere with aeronautical and other users of the cable TV bandwidth. I think that cable companies require zero dBmV for their analog and maybe -6dBmV for 256 QAM, measured at each wallplate.

You don't benefit from surplus strength. It just gives you a little margin to compensate for future signal strength reduction that may occur due to mechanical degradation of your distribution wiring.
post #3 of 105
0dBmV at the STB is ideal; it can vary by about +/-10dB. However, an excessive signal can actually overdrive the tuner in the STB.
post #4 of 105
Thread Starter 
Quote:
Originally Posted by AntAltMike View Post

You've got signal galore. In fact, 24dB is excessive from the cable company's perspective, since they are responsible for "cumulative leakage" that may interfere with aeronautical and other users of the cable TV bandwidth.

Thanks for replying. So I take it from your reply, that 15.7db figure I saw on his meter is the one he was talking about?

That may be a lot of signal, but don't forget about the multiple splitters used and the 7db loss per tap for more than 2-way. That 15db is already -7 and -7 so that's already only +1db. Is that a good signal at the TV? (I'm going to go over all the splitters and find out what's exactly needed and of course remove those that are not needed). But I believe that 1db figure at the box will probably stand.

Quote:
I think that cable companies require zero dBmV for their analog and maybe -6dBmV for 256 QAM, measured at each wallplate.

How does that equate to db? (Is there a formula?)
post #5 of 105
Thread Starter 
Quote:
Originally Posted by RCbridge View Post

0dBmV at the STB is ideal; it can vary by about +/-10dB. However, an excessive signal can actually overdrive the tuner in the STB.

Thanks to you as well. And I'll ask you too (since you brought up the figure), do you know how 0dbmV equates to db level?

I'd much rather have too much because it can easily be brought down. Whereas too little is a problem and would require trying to find a good correct type of amp.
post #6 of 105
On analog channels, the cable company is required to deliver 0dBmV to a television. 0dBmV is the point at which the "average" viewer will not see any additional improvement in PQ as signal levels increase. For more critical viewers, this level may be +3dBmV. Ideally, you would want to see ~ 6dBmV at the tuning device to allow for variations in signal level. Signal levels above +12dBmV can cause overload to the front end stages of many tuners. There are other factors involved such as difference between adjacent channels and maximum differential between highest and lowest carriers across the delivered spectrum.
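
Just to restate those thresholds in one place, here is a rough sketch of a level check in Python (the category wording is only my paraphrase of the numbers above, not an official spec):

Code:

def rate_analog_level(dbmv):
    # Thresholds paraphrased from the paragraph above; illustrative only.
    if dbmv > 12:
        return "risk of overloading the tuner's front end"
    if dbmv >= 6:
        return "comfortable, with margin for level variation"
    if dbmv >= 0:
        return "adequate for the average viewer"
    return "below the 0 dBmV analog delivery target"

print(rate_analog_level(15.7))   # risk of overloading the tuner's front end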

When the term dB is used without any reference, it typically refers to a loss or gain ratio across a device. There is no reference to an actual signal level.

When a reference is included, i.e., dBmV, then you are talking about a specific signal level. In the case of CATV, the reference is 1 milliVolt across 75 Ohms... hence the reference dBmV. When you say that you have +6dBmV reaching a device, what you are really saying is that you have 6dB more than 1 mV. If you have -3dBmV, it does not mean that you have "negative" signal, just that you have 3dB less than 1mV.

So... when you are discussing the actual signal level reaching a device, then you are talking about dBmV. However, if you are talking about what is lost or gained in a device, then you are talking dB of loss/gain... but you know nothing about what the actual signal level is.

The reason we use dBmV instead of just reading mV is that it makes the math much easier... you can include both signal levels (dBmV) and relative loss/gain (dB) in an equation to calculate useful information with simple addition/subtraction. For instance:

10 dBmV - 3.5 dB = 6.5 dBmV

would represent what happens when a 10dBmV signal passes through a typical 2-way splitter (3.5dB loss)... simple math.
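
If it helps to see that bookkeeping written out, here is a minimal Python sketch of the same arithmetic (the splitter losses below are just illustrative catalog-style values, not anything measured in the OP's house):

Code:

def level_through(start_dbmv, *gains_db):
    # Add relative gains/losses (dB) to an absolute level (dBmV).
    return start_dbmv + sum(gains_db)

# +15 dBmV at the drop, through a 2-way splitter (-3.5 dB)
# and then one leg of a 4-way splitter (-7 dB):
print(level_through(15.0, -3.5, -7.0))   # 4.5 dBmV at the wallplate

The whole reason for the logarithmic units is exactly this: levels (dBmV) and gains/losses (dB) combine by simple addition.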
post #7 of 105
Quote:
Originally Posted by Clint S. View Post

Thanks to you as well. And I'll ask you too (since you brought up the figure), do you know how 0dbmV equates to db level?

I'd much rather have too much because it can easily be brought down. Whereas too little is a problem and would require trying to find a good correct type of amp.

As I noted above, there is no such thing as "dB level" - ALL references to signal level in a CATV system are in dBmV - but even CATV techs and engineers who know better often get sloppy and omit the "mV" when measuring signal levels and call everything "dB", hence the confusion.

It is not necessarily good to have too much signal - there is nothing to be gained but distortion by going higher than what is required for a good picture.
post #8 of 105
Since you don't have a signal meter, you should temporarily take out one or more of the splitters and see if the quality of analog signals improves on any TV on which they are deemed by you to be substandard. If it improves the picture quality, then you need to boost the signal at that or those points. You can often economically boost the signal strength at a distant wallplate by reconfiguring your splits to favor it. You can also insert a small drop amplifier, which you can get cheap on eBay. If you do put in a drop amp, be sure to get one that passes return signal, because you will eventually need that capability even if you do not now.

Do you have cable TV internet? If you do, then ordinarily, the internet feed comes directly off an initial two-ways split, and the other half of the signal is for television viewing.
post #9 of 105
Quote:
Originally Posted by Clint S. View Post

... We were also having problems with several analog channels. He replaced a splitter and that fixed the analog channel problem on most of the sets, but not all, so I'll be checking the attic splitters ...

As everyone knows, analog is a continuum of quality while digital is mostly an all or nothing situation. Therefore, I would suggest the outside wire first go to a 2-way splitter: one for analog, the other digital. Then keep two different house cable distributions in the house. Use high quality low loss RG6. If interference is NOT a problem inside the house, then be aware that quad shielding usually creates a slightly more lossy cable. You can manage the analog side with amplifiers, if necessary. Even better is to explicitly ask the cable company to provide two taps to their outside distribution point. That is what I have done. At one time the cable company was required to provide two taps to the home. The cable company's external wiring, amplifiers, and splitter/distribution hardware is much better than that used inside the home. You can use one tap for analog and the other for digital needs.

iq100
delete my ideas by posting some of your own.
one of perhaps a hundred banned by J. River
post #10 of 105
Thread Starter 
Guys, thanks for all the replies. I'll read over these well tonight and reply then.

To answer one question I happened to see, there's no cable modem, I'm on DSL. But I did use a cable modem at one time. Could there be anything on the lines related to the cable modem that I should check for and remove?

One other thing: is there any way the level can be checked using some other meter than an RF signal meter? I have meters for inductance, capacitance, ohms, volts, and I'd like to find a way if possible to measure some "level" at each wall outlet so I'll know which if any lines or hardware on the line should be replaced. It's pointless to replace ALL the cable if only some or none of it is bad. The cable here is over 30 years old. I am going to replace the splitters I saw that were corroded. I can't tell by looking at the cable if any is RG59. If it is, that would of course have to be replaced with RG6. A way to check the signal level at each outlet would help that. I know the line puts out measurable voltage, so hopefully there may be a way for the mV to be compared or equated to a dBmV level? Or say, if I measured .001V, that would be 1 mV, which would be 1 dBmV, and then simply be called "1db" as the tech told me in db terms?

Thanks again.
post #11 of 105
The only way to measure RF signal is with an RF signal meter. You can buy a clunker on eBay for about $20, but you would have trouble using it to accurately measure digital signals. A used meter to measure digital, QAM signals on eBay will cost you at least $300.

But as I said above, go to any TV where an analog picture is substandard and boost its signal strength by temporarily taking out a splitter or two in its signal path.
post #12 of 105
Quote:
Originally Posted by Clint S. View Post

Guys, thanks for all the replies. I'll read over these well tonight and reply then.

To answer one question I happened to see, there's no cable modem, I'm on DSL. But I did use a cable modem at one time. Could there be anything on the lines related to the cable modem that I should check for and remove?

One other thing: is there any way the level can be checked using some other meter than an RF signal meter? I have meters for inductance, capacitance, ohms, volts, and I'd like to find a way if possible to measure some "level" at each wall outlet so I'll know which if any lines or hardware on the line should be replaced. It's pointless to replace ALL the cable if only some or none of it is bad. The cable here is over 30 years old. I am going to replace the splitters I saw that were corroded. I can't tell by looking at the cable if any is RG59. If it is, that would of course have to be replaced with RG6. A way to check the signal level at each outlet would help that. I know the line puts out measurable voltage, so hopefully there may be a way for the mV to be compared or equated to a dBmV level? Or say, if I measured .001V, that would be 1 mV, which would be 1 dBmV, and then simply be called "1db" as the tech told me in db terms?

Thanks again.

If there was a splitter installed at one time to feed the modem, then make sure that the splitter is removed... otherwise half of the signal entering the splitter is being totally wasted... and you have a source for potential signal ingress/egress at the unused splitter port.

There is no simple way to use the devices you have mentioned to create a poor man's FSM... the missing ingredient is the front end tuning to single out the particular carrier you are trying to measure. The aggregate voltage across the entire spectrum is meaningless.

BTW... .001V = 1mV = 0 dBmV (not 1 dBmV)
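
If you ever do get a genuine single-carrier reading in millivolts, the conversion to dBmV is just a logarithm. A minimal sketch (assuming an ideal isolated carrier, which, as noted above, a multimeter cannot actually give you):

Code:

import math

def mv_to_dbmv(mv):
    # dBmV is referenced to 1 mV across 75 ohms
    return 20 * math.log10(mv)

print(mv_to_dbmv(1.0))   # 0.0  -> 1 mV is the 0 dBmV reference point
print(mv_to_dbmv(2.0))   # ~6.0 -> doubling the voltage adds about 6 dB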
post #13 of 105
The signal level/strength chart on page one of this link doesn't always apply, but you might find it helpful:
http://www.sencore.com/uploads/files...veGoodHDTV.pdf
post #14 of 105
Thread Starter 
Quote:
Originally Posted by iq100 View Post

As everyone knows, analog is a continuum of quality while digital is mostly an all or nothing situation. Therefore, I would suggest the outside wire first go to a 2-way splitter: one for analog, the other digital. Then keep two different house cable distributions in the house.

How can digital be separated from analog when they are both being broadcast simultaneously on the same Cox co-ax cable line?


Quote:


Use high quality low loss RG6. If interference is NOT a problem inside the house, then be aware that quad shielding usually creates a slightly more lossy cable.

That's news to me, I thought the better shielded the cable, the less of a loss you'd get. Or I thought at the very least, it would be the same and you wouldn't lose any more just due to shielding.


Quote:


You can manage the analog side with amplifiers, if necessary.

See what I said above about separating D from A. Why couldn't the digital side be managed with amps? Is that because of the line "digital is all or nothing"? I had heard of that before, but I don't want to get to the "...or nothing" part. I want to be sure that after the splitting, the DTV/HDTV looks as good as possible and is not marginal hovering at that threshold of all "or nothing".


Quote:


Even better is to explicitly ask the cable company to provide two taps to their outside distribution point.

Again, see what I said above. So does the two taps still apply with one coax line I described above?


Quote:


At one time the cable company was required to provide two taps to the home. The cable company's external wiring, amplifiers, and splitter/distribution hardware is much better than that used inside the home.

The other day when the Cox cable guy was here, he asked why were there two lines coming in. I didn't know why. He said "that was illegal". That was done something like 25 years ago when Cox was Cablevision or TCI. He said one of the lines was "bad" anyway, like way too low of a level.
post #15 of 105
Thread Starter 
Quote:
Originally Posted by AntAltMike View Post

The only way to measure RF signal is with an RF signal meter. You can buy a clunker on eBay for about $20, but you would have trouble using it to accurately measure digital signals. A used meter to measure digital, QAM signals on eBay will cost you at least $300.

Oh well. Why would it have trouble measuring digital lines? The incoming line here is not "digital", it's both analog and DTV. So wouldn't a typical RF meter give me the level reading? Since "RCbridge" mentioned this:

Quote:
Originally Posted by RCbridge View Post

0dBmV at the STB is ideal; it can vary by about +/-10dB. However, an excessive signal can actually overdrive the tuner in the STB.

......I wanted to find a way to find out if I was getting 0dbmv at the TV.

Thanks.
post #16 of 105
Thread Starter 
Quote:
Originally Posted by jcalabria View Post

If there was a splitter installed at one time to feed the modem, then make sure that the splitter is removed... otherwise half of the signal entering the splitter is being totally wasted... and you have a source for potential signal ingress/egress at the unused splitter port.

Ok will do.

Quote:


BTW... .001V = 1mV = 0 dBmV (not 1 dBmV)

Thanks.
post #17 of 105
Thread Starter 
Quote:
Originally Posted by AntAltMike View Post

Since you don't have a signal meter, you should temporarily take out one or more of the splitters and see if the quality of analog signals improves on any TV on which they are deemed by you to be substandard. If it improves the picture quality, then you need to boost the signal at that or those points. [....] You can also insert a small drop amplifier, which you can get cheap on eBay. If you do put in a drop amp, be sure to get one that passes return signal, because you will eventually need that capability even if you do not now.

I notice you say "analog". Will amps harm DTV or HDTV even if they mention in the specs they are for HDTV? I know about the overload problem. But like I said, I'd rather have too much and attenuate it if need be, than not have enough or be *marginal. (*cont'd below).
post #18 of 105
Thread Starter 
Quote:
Originally Posted by rabbit73 View Post

The signal level/strength chart on page one of this link doesn't always apply, but you might find it helpful:
http://www.sencore.com/uploads/files...veGoodHDTV.pdf

Thanks, that's awesome. Just what I was looking for. Can you please explain what you mean by "doesn't always apply"?

*To continue on what I said above about "marginal", and comments about DTV being "all or nothing", this is what I don't understand when I see in the PDF file the actual term "marginal" (it uses "marg.") for DTV reception. That's exactly what I'm trying to avoid, "marginal" signal. I don't want to get to the point where I may lose a DTV picture from time to time.
post #19 of 105
Thread Starter 
Thanks again to all of you for all the feedback, I appreciate it.
post #20 of 105
Not that it matters much, but there is a difference between dBmV and dBmv (small v versus large V). One is simply volts whereas the other is volts needed to develop a certain amount of power across a certain, industry standard impedance. Zero dBmV is the voltage necessary to develop one watt of power when impressed upon a 75 ohm load. A dBmV in commercial audio uses 600 ohms as its reference load, in consumer audio, it is 10K ohms, etc, etc. Anyhoo...

A meter designed to measure analog signal strength incorporates a narrow bandpass filter that results in significantly under-reported digital signal strength readings, but the conversion factor generally varies from 8 to 12 dB, depending on the design of the meter.

As far as the twin cable wiring is concerned, two or three decades ago, most cable TV commercial line equipment only went up to 400 or 450 MHz and there was no digital compression (which today lets a dozen or so standard definition digital TV programs fit on one 6 MHz-wide channel), so cable companies were using two cables to carry the hundred or so 6 MHz-wide channels in their systems.

If a cable tech agreed to furnish a customer with two separate lines, each would carry both analog and digital and they would simply be coming off a splitter. A customer who wanted to separately play with his digital and analog signal reception could put whatever attenuators or amplifiers on each line that satisfied him that he had optimized his digital performance and his analog performance, but a technician would never bother to do that for his own purposes because he wouldn't need to.

I mention watching the analog picture because you can see the gradual degradation in the form of graininess that develops as the signal level at the TV drops below about -5dBmV, whereas you can't observe continuous digital signal level reduction because of the avalanche nature of digital signal processing. Until recently a lot of cable companies used to send out one analog channel at up around channel 115-120 (over 700 MHz) just so the customer and underequipped technicians could visually confirm adequate high frequency signal strength (coax gobbles up more signal at high frequency than at low frequency).

The cable company has optimally balanced the relative strength of digital and analog signals, so if an analog signal is adequate in strength, then digital signals near the same frequency are also adequate in strength.

Quad coax does tend to lose a tiny bit more signal per unit length than does dual shield, but that will be a decimal amount over the lengths of coax used in a residence's internal wiring. Cable companies favor quad shield because they are responsible for limiting "cumulative leakage" to a safe level so that it will not interfere with other broadcast signals at the midband (120-174 MHz) and superband and ultraband (216 to about 400+ MHz) frequencies.
post #21 of 105
The math is as follows.

0dBm = +48.75dBmV

The difference in either scale is measured in dB.

-10dBm = +38.75dBmV
+10dBm = +58.75dBmV

This relationship is linear (1 for 1).

Generally cable uses dBmV as a scale to measure power.
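
For anyone curious where the 48.75 comes from, it falls straight out of the 75-ohm impedance. A quick sketch of the derivation (assuming the standard CATV 75-ohm reference):

Code:

import math

Z = 75.0                               # CATV reference impedance, ohms
# 0 dBm is 1 mW; find the voltage that dissipates 1 mW in 75 ohms:
v_mv = math.sqrt(0.001 * Z) * 1000.0   # ~273.9 mV
offset = 20 * math.log10(v_mv)         # ~48.75
print(offset)                          # so dBmV = dBm + 48.75 in a 75-ohm system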
post #22 of 105
FWIW, nothing on Coax is digital. It's all modulated analog carriers. Just happens some of them are modulated in QAM 64 or 256. That means that good Analog practices need to be followed and that an Analog signal meter can be used just fine for relative loss readings (ie if you have 10 dB at the splitter but -5 dB at the end of the cable you still have too much loss in that piece of wire).
post #23 of 105
Quote:
Originally Posted by AntAltMike View Post

Not that it matters much, but there is a difference between dBmV and dBmv (small v versus large V). One is simply volts whereas the other is volts needed to develop a certain amount of power across a certain, industry standard impedance.

V is always capitalized when referring to volts... the International System of Units (SI) rule is that abbreviations for units named after people are always capitalized (e.g., B = Bel from A.G. Bell, V = volt from Volta, W = watt from James Watt, Hz = Hertz from Heinrich Hertz, etc.) and units not derived from a person's name are lowercase. The only exception is that uppercase L is allowed for liter to avoid confusion with the number 1 when handwritten.

Furthermore, prefixed abbreviations for unit dividers - milli (m), micro (u or μ), pico (p) - are always lowercase, and prefixes for large unit multipliers - mega (M), giga (G) - are uppercase (kilo, k, being the lowercase exception).

Interestingly, although the abbreviations are capitalized by these rules, when written out as words they are always lowercase unless they are the first word in a sentence.

Quote:
Originally Posted by AntAltMike View Post

Zero dBmV is the voltage necessary to develop one watt of power when impressed upon a 75 ohm load.

No... no... no... 0dBmV is one millivolt (.001V) across 75Ω - no reference to power. However, by Ohm's law (watts=volts²/ohms), the power dissipated by a single carrier with that voltage level across a 75Ω resistive load is only 13.3 nanowatts... not even remotely close to 1 watt.
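
For anyone who wants to check that nanowatt figure, the arithmetic is just:

Code:

v = 0.001          # 0 dBmV = 1 mV = 0.001 V
z = 75.0           # ohms
p = v ** 2 / z     # power dissipated in the load
print(p * 1e9)     # ~13.3 nanowatts, nowhere near 1 watt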


Quote:
Originally Posted by AntAltMike View Post

As far as the twin cable wiring is concerned, two or three decades ago, most cable TV commercial line equipment only went up to 400 or 450 MHz and there was no digital compression (which today lets a dozen or so standard definition digital TV programs fit on one 6 MHz-wide channel), so cable companies were using two cables to carry the hundred or so 6 MHz-wide channels in their systems.

If a cable tech agreed to furnish a customer with two separate lines, each would carry both analog and digital and they would simply be coming off a splitter. A customer who wanted to separately play with his digital and analog signal reception could put whatever attenuators or amplifiers on each line that satisfied him that he had optimized his digital performance and his analog performance, but a technician would never bother to do that for his own purposes because he wouldn't need to.

What you describe "from decades ago" would refer to a cable system that had dual independent cable systems (trunk/feeder/drops) from headend to subscriber that carried completely different programming on each of the parallel systems. What was being discussed here was running two individual drops from two individual tap ports on the same feeder line. Each cable would have identical content... the purpose of running dual cables would be to save the loss of one splitter at the home... essentially providing 3.5dB more signal than if a single drop was run to the house and immediately split to create the same two signal paths for further distribution within the house.


Quote:
Originally Posted by AntAltMike View Post

Quad coax does tend to lose a tiny bit more signal per unit length than does dual shield, but that will be a decimal amount over the lengths of coax used in a residence's internal wiring. Coax companies favor quad shield because they are responsive for limiting "cumulative leakage" to a safe level so that it will not interfere with other broadcast signals at the midband (120-174 MHz) and superband and ultraband (216 to about 400+ MHz) frequencies.

You are quite correct about cable ops being concerned with signal egress, but it would be relatively rare to find quad shield cable used for subscriber cabling. Quad shield is most often used within headends where very high signal levels (as much as +50dBmV) are encountered. The typical subscriber drop cable is tri-shield - usually bonded foil on the dielectric + 77% or higher braided shield + foil wrap.


Quote:
Originally Posted by RCbridge View Post

The math is as follows.

0dBm = +48.75dBmV
The difference in either scale is measured in dB.
-10dBm = +38.75dBmV
+10dBm = +58.75dBmV
This relationship is linear (1 for 1).
Generally cable uses dBmV as a scale to measure power.

Correct but useful only if you are measuring a system typically expressed in dBmV with an instrument calibrated in dBm. I don't believe this has any bearing on the OP's question. He was asking about converting between "dBmV" and a reference-less "dB"... which is really not possible.


Quote:
Originally Posted by olyteddy View Post

That means that good Analog practices need to be followed and that an Analog signal meter can be used just fine for relative loss readings (ie if you have 10 dB at the splitter but -5 dB at the end of the cable you still have too much loss in that piece of wire).

Conceptually correct but, as several of us have noted, you NEVER have "10dB at the splitter" or "-5dB at the end of the cable"... you have 10dBmV or -5dBmV. You are not alone in stating things that way, but such "shortcuts" are precisely the root of the OP's confusion on the subject.

Damn... now I feel like a crotchety old cable engineer... wait a minute... I AM a crotchety old cable engineer, lol.
post #24 of 105
To the original poster.
Do you have a STB or cable modem in your set up?

Quote:


Correct but useful only if you are measuring a system typically expressed in dBmV with an instrument calibrated in dBm. I don't believe this has any bearing on the OP's question. He was asking about converting between "dBmV" and a reference-less "dB"... which is really not possible.

From another one in the business, I concur without a point of reference it is not possible to do this.

If the original poster has a cable modem or STB we can look at a diag screen and perhaps work backwards (making a few assumptions).
post #25 of 105
Thread Starter 
Quote:
Originally Posted by olyteddy View Post

FWIW, nothing on Coax is digital. It's all modulated analog carriers. Just happens some of them are modulated in QAM 64 or 256. That means that good Analog practices need to be followed and that an Analog signal meter can be used just fine for relative loss readings (ie if you have 10 dB at the splitter but -5 dB at the end of the cable you still have too much loss in that piece of wire).

Then how do you get DTV and HDTV from your cable company through the coax line?
post #26 of 105
Thread Starter 
Quote:
Originally Posted by RCbridge View Post

To the original poster.
Do you have a STB or cable modem in your set up?

If the original poster has a cable modem or STB we can look at a diag screen and perhaps work backwards (making a few assumptions).

I mentioned earlier I'm on DSL and have the STB.
post #27 of 105
Thread Starter 
Thanks to you all again, and I'll again read over the new posts tonight.
post #28 of 105
What type of STB do you have?
post #29 of 105
Quote:
Originally Posted by Clint S. View Post

... How can digital be separated from analog when they are both being broadcast simultaneously on the same Cox co-ax cable line? ...

The cable is, of course, agnostic as to whether the devices attached to it use analog or digital modulation schemes. However, you can connect your devices that use analog tuner/demodulators to one 'analog cable segment' and those devices using digital tuner/demodulation schemes to the other 'digital cable' segment. Why? Because the rules for managing analog are different than the rules for managing digital.
Quote:
Originally Posted by Clint S. View Post

... That's news to me, I thought the better shielded the cable, the less of a loss you'd get. Or I thought at the very least, it would be the same and you wouldn't lose any more just due to shielding.
...

There are two distinctly separable concepts. One is insertion loss. The other is shielding. Designing for maximum shielding often creates additional insertion loss. The differences with modern RG6 are so small that you do not need to be terribly concerned. Nevertheless, "... Also, be aware that RG-6 quad-shield cable has slightly higher line losses than non-quad-shield cable."
http://www.highdefforum.com/local-hd...-question.html
http://archive.tivocommunity.com/tiv.../141743-1.html

Quote:
Originally Posted by Clint S. View Post

... See what I said above about separating D from A. Why couldn't the digital side be managed with amps? Is that because of the line "digital is all or nothing"? I had heard of that before, but I don't want to get to the "...or nothing" part. I want to be sure that after the splitting, the DTV/HDTV looks as good as possible and is not marginal hovering at that threshold of all "or nothing".

Sometimes the ideal signal level for analog devices is different than for digitally connected devices. By having separate analog/digital segments you can manage each separately. Don't get me wrong, unless you are running multiple devices all over a very large home, this is NOT going to matter much. But, if you had a very large home with two-way transmissions to, say, camera-equipped rooms, it might help.
Quote:
Originally Posted by Clint S. View Post

... Again, see what I said above. So does the two taps still apply with one coax line I described above?

Yes. If you need multiple taps it is usually better to use the outside commercial grade amplifiers/splitters than your own residential ones. I am NOT saying inside splitters/amplifiers will not work, only that if your local tariffs/practice allow you to get two taps from your cable company for no additional charge, then do so.

iq100
delete my ideas by posting some of your own.
one of perhaps 100 banned by J. River.
I use mediamonkey which fully supports the Apple iPhone.
post #30 of 105
Quote:
Originally Posted by Clint S. View Post

Then how do you get DTV and HDTV from your cable company through the coax line?

Quote:


It's all modulated analog carriers.

Even over the air 'digital TV' is modulated on an analog carrier. It's not at all like a USB cable or a printer cable that actually directly carries bits and bytes. The bits and bytes are just cleverly 'superimposed' on a plain old sine wave, only to be separated and decoded in your tuner.
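
If it helps to visualize the 'superimposing', here is a bare-bones 16-QAM mapping sketch in Python. This shows only the core idea of turning bit groups into carrier amplitude/phase points; a real cable plant uses 64/256-QAM with framing and forward error correction on top, none of which is shown here:

Code:

import numpy as np

# Toy 16-QAM mapper: each 4-bit group picks one of 16 amplitude/phase points,
# and those I/Q values are what get impressed onto the analog RF carrier.
levels = np.array([-3, -1, 1, 3])
rng = np.random.default_rng(0)
nibbles = rng.integers(0, 16, size=8)   # eight random 4-bit groups
i = levels[nibbles >> 2]                # upper two bits -> in-phase axis
q = levels[nibbles & 0b11]              # lower two bits -> quadrature axis
print(list(zip(nibbles.tolist(), i.tolist(), q.tolist())))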