10-Oct-2015, 5:41 AM   #1
scott784
Senior Member
 
Join Date: Dec 2010
Posts: 101
Help me understand signal loss when using a splitter

When using an antenna with a splitter, it was my understanding that even one split caused a 50 percent drop in signal strength at each of the two televisions. That has always sounded significant to me, and it is the main thing that has stopped me from ever considering a single antenna for multiple televisions.

But more recently I have read that it doesn't really work like that, and that the loss is only something like 3.4 dB for each split, which in practice makes a single split (off a single antenna) a fairly minimal loss. So now I am just trying to wrap my head around it and understand what the real story is.
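For what it's worth, here is the back-of-the-envelope math as I currently understand it (assuming the quoted splitter loss is a power loss in dB, so the fraction of power remaining is 10^(-dB/10)); please correct me if I've got this wrong. A short Python sketch:

def fraction_remaining(loss_db):
    # Convert a power loss in dB to the fraction of power that remains.
    return 10 ** (-loss_db / 10)

# Ideal 2-way split: the power is simply halved, which is about 3 dB.
print(f"3.0 dB loss -> {fraction_remaining(3.0):.0%} of the power remains")  # ~50%
# The ~3.4 dB figure I read, which presumably includes some insertion loss.
print(f"3.4 dB loss -> {fraction_remaining(3.4):.0%} of the power remains")  # ~46%

If that's right, then the "50 percent" figure and the "roughly 3.4 dB" figure may just be two ways of describing the same loss, which is part of what I'm trying to confirm.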

I recently ordered an Antennas Direct DB8, which I intend to install in my attic, starting out by running the line down to a single television. Ideally, though, I think it would be great if I could tie the antenna into my cable box, thereby giving me instant antenna access at each outlet for my 3 televisions.

However, I am not going to entertain this idea unless I can be reasonably sure that the signal loss is truly minimal (at least when using a booster). Can someone shed some light on this for someone who doesn't fully understand the amount of degradation that takes place with each split?

For example, let's say I've got a TV currently showing an average of 85 percent signal strength from a given network station. If the antenna wire feeding that television is then split once so the antenna can serve two televisions, what would be a realistic drop in signal strength? And if a booster is introduced at the cable box, are some far more effective than others at mitigating this drop?
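To make the numbers concrete, here is roughly how I picture it, though I honestly don't know whether the "percent signal strength" a TV reports maps directly onto received power (it may be a signal-quality or SNR reading instead). The 3.4 dB splitter loss and the 8 dB booster gain below are just assumed example values, not specs I've looked up:

def db_to_factor(db):
    # Convert a gain (positive) or loss (negative) in dB to a linear power factor.
    return 10 ** (db / 10)

splitter_loss_db = -3.4   # assumed loss for one 2-way split
booster_gain_db = 8.0     # assumed gain of a small distribution amplifier

# Power at the TV relative to a straight, unsplit run from the antenna.
after_split = db_to_factor(splitter_loss_db)
after_split_and_boost = db_to_factor(splitter_loss_db + booster_gain_db)

print(f"After one split:             {after_split:.0%} of the original power")
print(f"With the booster added back: {after_split_and_boost:.0%} of the original power")

If I've set that up correctly, a modest booster more than makes up the raw power lost to one split, but I've also read that an amplifier can't restore signal quality lost ahead of it, which is why I'm asking whether some boosters handle this better than others.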

Many thanks to anyone who can share their insight and help me better understand the consequences of a split, even if it's just for two televisions.