If a user's system performance appears to have been "helped" by a "better" splitter, then he has some peculiar threshold-signal problem that needs to be remedied. The input window of DTV tuners is probably over 30dB wide; the input window of analog tuners is over 40dB wide. The likelihood that a fraction of a dB difference in a splitter's insertion loss, even if it is real, will make a performance difference in any real-world situation is very low.
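A quick back-of-envelope calculation illustrates the point. All of the numbers below are illustrative assumptions, not measurements, but they show how little a fraction of a dB matters against a 30dB-wide input window:

```python
# Back-of-envelope link budget. All figures are illustrative assumptions.
tuner_min_dbm = -80.0   # assumed weakest usable input for a DTV tuner
tuner_max_dbm = -50.0   # assumed overload point
window_db = tuner_max_dbm - tuner_min_dbm   # a ~30dB input window

signal_dbm = -65.0        # assumed signal level arriving at the splitter
splitter_a_loss = 3.5     # typical 2-way splitter insertion loss, dB
splitter_b_loss = 3.7     # a "worse" splitter losing 0.2dB more

# Margin above the tuner's weakest usable input, after each splitter:
margin_a = (signal_dbm - splitter_a_loss) - tuner_min_dbm
margin_b = (signal_dbm - splitter_b_loss) - tuner_min_dbm
print(f"Margin with splitter A: {margin_a:.1f}dB")   # 11.5dB
print(f"Margin with splitter B: {margin_b:.1f}dB")   # 11.3dB
# A 0.2dB splitter difference is negligible against an 11dB margin.
```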
I have no reason to believe that "better" splitters, if they even exist, have less loss than do "worse" splitters. A splitter might be considered better by one standard if it has greater EMI shielding, but that is of no concern to the end user. It is only important to cable companies, which are responsible for the "cumulative leakage" of their system's signal into the aeronautical and other broadcast bands.
A splitter might be considered better if it has a premium center-conductor seizure mechanism.
I design and install master antenna system reception headends for highrise buildings that balance and mix the digital and analog television signals from half a dozen or more antennas. I use Blonder Tongue distribution amplifiers that cost over $400 each, and I always use the cheapest splitters I can find. I almost never pay more than a dollar for a splitter; in fact, I sometimes buy a bulk quantity of splitters for commercial use for maybe twenty cents each. I evaluate every signal with a spectrum analyzer before it goes into each splitter and after it comes out, and I have never seen any non-linear attenuation that would degrade signal quality, provided the splitters were rated for the frequencies at which I was using them.
The only frequency problems I have ever experienced with any passive devices were these: when I tried to use "European-style" splitters rated for 450MHz to 1,750MHz for cable TV signal distribution, they choked out channel 4 and cable channel 17, which is roughly one harmonic interval above channel 4; and some of the old Jerrold wall taps rated for use only up to 400MHz (below the American domestic television UHF band) severely rolled off UHF channels 20 (507MHz) and 26 (543MHz). That's it for passive splitter problems in over 30 years of distributing broadcast-band (54MHz-806MHz) television signals.
For cable TV, the company should be sending you a signal level sufficient for your usage. If the signal they deliver is inadequate, they can either increase your input tap value slightly or furnish you with a small signal amplifier. I am strongly averse to feeding cable TV signals into satellite multiswitch inputs. The satellite multiswitches that don't amplify the broadcast TV signal typically lose 14 or more dB on that path in their 4-way models, which is worse than you would do with external splitting and diplexing if needed, and their amplifiers often are not designed to handle the 100+ channel load of a modern cable system and develop intermodulation byproducts that degrade signal quality.
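To put numbers on that comparison (the loss figures here are typical values I'm assuming for illustration, not specs for any particular model):

```python
# Cable-TV path loss: non-amplifying 4-way satellite multiswitch vs.
# external splitting plus diplexers. All loss figures are assumed
# typical values for illustration.
multiswitch_loss = 14.0   # assumed terrestrial-path loss, 4-way multiswitch, dB

splitter_4way_loss = 7.0  # typical 4-way splitter insertion loss, dB
diplexer_loss = 0.5       # assumed per-pass diplexer insertion loss, dB
external_loss = splitter_4way_loss + 2 * diplexer_loss  # diplex in and back out

print(f"Through multiswitch:        {multiswitch_loss:.1f}dB")
print(f"External split + diplexers: {external_loss:.1f}dB")
# External splitting comes out about 6dB ahead in this example.
```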
If one or more of anyone's broadcast DTV input signals is so weak that splitter loss really does degrade its performance, then he should put a low-noise, low-gain (under 20dB) preamplifier on the antenna lead. That will help far more than a splitter that might fortuitously lose a fraction of a dB less than some other splitter.
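The reason a mast-head preamp helps so much is the Friis cascade formula: noise contributed by stages after a gain stage is divided by that gain, so putting low-noise gain ahead of the downlead and splitter losses nearly removes their effect on system noise figure. A sketch with assumed numbers:

```python
# Friis cascade: system noise figure with and without a mast-head preamp.
# All noise figures, gains, and losses below are illustrative assumptions.
import math

def db_to_lin(db):
    return 10 ** (db / 10)

def lin_to_db(lin):
    return 10 * math.log10(lin)

def cascade_nf_db(stages):
    """stages: list of (noise_figure_dB, gain_dB) in signal order.
    Friis: F_total = F1 + (F2-1)/G1 + (F3-1)/(G1*G2) + ..."""
    total_f = 0.0
    gain = 1.0
    for i, (nf_db, g_db) in enumerate(stages):
        f = db_to_lin(nf_db)
        total_f = f if i == 0 else total_f + (f - 1) / gain
        gain *= db_to_lin(g_db)
    return lin_to_db(total_f)

# A passive loss stage has NF equal to its loss and negative gain.
line_and_splitter = (7.0, -7.0)   # assumed 7dB downlead + splitter loss
tuner = (8.0, 0.0)                # assumed tuner noise figure
preamp = (2.0, 18.0)              # low-noise, under-20dB-gain preamp

without = cascade_nf_db([line_and_splitter, tuner])
with_pre = cascade_nf_db([preamp, line_and_splitter, tuner])
print(f"System NF without preamp: {without:.1f}dB")   # 15.0dB
print(f"System NF with preamp:    {with_pre:.1f}dB")  # 3.2dB
```

With these assumed numbers the preamp improves the system noise figure by roughly 12dB, which dwarfs any fraction-of-a-dB difference between splitters.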