I usually just want to shoot for enough boost to cancel out what I'm losing to the split.
A lossless splitter would drop the signal about 3.0103 dB at each of the two taps. People generally call this 3.5 dB, probably to account for the loss in the splitter itself (rather than the split).
That's for a 2-way split. A 4-way split should double that - figure 7 dB loss at each of the 4 taps.
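For anyone who wants to check the arithmetic, here's a quick sketch of where those numbers come from. Splitting the power n ways costs 10*log10(n) dB at each tap for a theoretically lossless splitter (the function name here is just mine):

```python
import math

def ideal_split_loss_db(n_ways):
    """Theoretical per-tap loss of a lossless n-way splitter.

    The power is divided n ways, so each tap gets 1/n of it,
    which in dB is 10*log10(n).
    """
    return 10 * math.log10(n_ways)

print(round(ideal_split_loss_db(2), 4))  # 3.0103 dB -- the 2-way figure above
print(round(ideal_split_loss_db(4), 4))  # 6.0206 dB -- close to the "7 dB" rule of thumb
```

The gap between the ideal 6.02 dB and the 7 dB rule of thumb is the same real-world splitter loss that bumps 3.01 up to 3.5 for the 2-way case.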
Assuming there isn't too much cable between the amp and where each tap terminates, I'd think that 10 dB should be plenty for a no-loss 4-way split. Naturally, each foot of cable adds to the loss. Figure out what kind of cable you're using and see if the manufacturer is kind enough to publish loss-per-meter figures at various frequencies (the higher the frequency, the higher the loss).
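To see why 10 dB is plenty, you can run the budget: amp gain minus split loss minus cable loss. The numbers below are hypothetical - the 6 dB per 100 ft is just a ballpark for ordinary RG-6 near the top of the cable band; check your own cable's spec sheet:

```python
def tap_level_change_db(amp_gain_db, split_loss_db, cable_ft, loss_db_per_100ft):
    """Net signal change at one tap, relative to the amp's input level.

    Positive means the tap sees a hotter signal than the amp was fed;
    zero means the amp exactly cancels the split and cable losses.
    """
    cable_loss_db = cable_ft / 100.0 * loss_db_per_100ft
    return amp_gain_db - split_loss_db - cable_loss_db

# Hypothetical run: 10 dB amp, 7 dB 4-way split loss, 50 ft of cable
# at an assumed 6 dB/100 ft.
print(tap_level_change_db(10, 7, 50, 6))  # 0.0 -- losses exactly canceled
```

Run the same numbers with a 35 dB amp and you end up +25 dB over where you started, which is the overload problem described below.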
The general rule of thumb is to put the amp as close to the original signal source (that is, your cable company's feed) as possible, and not to use any more gain than you need to make up the losses inherent in your distribution system.
Another obvious question is what the bandwidth of the amp is (that is, what frequencies it really amplifies), and how much gain you get at any given frequency.
Then there's the whole issue of tilt compensation, but that's more involved than I feel like getting into right now.
Basically, if the split signal doesn't look too bad to begin with, then too much gain will actually hurt. Personally, I can't see any reasonable use for a 35 dB amp unless you're dealing with a marginal signal to begin with and you have a small motel you're feeding.
Right now, I'm using the much-touted (by me) ChannelVision 4-way dist amp, which as far as I can tell provides 15 dB of gain, and it shows signs of having too much gain. I probably should have sprung for the 8-way 15 dB amp and just terminated the unused taps, simply to cut down on the gain.