You have to first wrap your head around the logarithmic decibel scale and leave percentages behind (http://www.hdtvprimer.com/ANTENNAS/g....html#decibels). Also, do NOT confuse the "percentage" displayed on the "signal meter" with an actual power measurement. Ignore the number on the TV set screen until it gets down near your dropout number, whatever it happens to be for that particular set.
Under lab conditions, the dynamic range of digital tuners is about 80 dB; that's a difference of 100,000,000 times in signal power between the weakest and the strongest usable signals. You can cut that signal in half (3 dB, 50%) numerous times (almost 27) before running out of signal power.
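
If you want to sanity-check that arithmetic, here's a quick Python sketch (just the dB math, nothing antenna-specific assumed):

Code:
import math

def db_to_power_ratio(db):
    # Decibels to linear power ratio: 10^(dB/10)
    return 10 ** (db / 10)

def halvings(db):
    # Each halving of power costs 10*log10(2), about 3.01 dB
    return db / (10 * math.log10(2))

print(db_to_power_ratio(80))  # 100000000.0, the 80 dB dynamic range
print(halvings(80))           # ~26.6, the "almost 27" halvings
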
For planning, use a figure of 4 dB insertion loss per two-way split. Most of the splitters I've tested come in somewhere between 3.3 and 4.5 dB per port.
100' of RG6 and a balanced 3-port splitter will total about 10 dB of insertion loss, worst case. As long as the signal power available at the antenna plus the gain of the antenna exceeds those losses with some headroom above your tuner's dropout point, there's plenty of signal to go around.
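
To see how that plays out, here's a minimal link-budget sketch. Every dB and dBm figure in it is a placeholder I made up for illustration; substitute your own antenna gain, cable length, and dropout point:

Code:
# All numbers below are illustrative assumptions, not measurements.
signal_at_antenna_dbm = -60.0  # signal delivered by the antenna element
antenna_gain_db       = 8.0    # hypothetical antenna gain
coax_loss_db          = 5.6    # roughly 100' of RG6 at UHF frequencies
splitter_loss_db      = 4.0    # planning figure for one two-way split

tuner_input_dbm = (signal_at_antenna_dbm + antenna_gain_db
                   - coax_loss_db - splitter_loss_db)

dropout_dbm = -83.0            # example dropout point; find yours empirically
margin_db   = tuner_input_dbm - dropout_dbm
print("Tuner input: %.1f dBm, margin: %.1f dB" % (tuner_input_dbm, margin_db))

A comfortably positive margin_db means the split costs you nothing visible on screen.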
Quote:
For example, let's say I've got a tv currently getting an average of 85 percent signal strength from a given network station. If the antenna wire feeding that television is then split one time in order for the antenna to service two televisions, what would be a realistic signal strength drop?
Impossible to predict, since the displayed "percentage" has no known correlation to actual signal power. What matters is determining where the picture drops out, then staying above that point with some margin.
Quote:
the amount of degradation that takes place with each split?
It's not "degradation", it's attenuation. Degradation implies that distortion or damage has taken place but it hasn't, the signal is simply weaker.