28-Jun-2012, 7:42 PM   #2
GroundUrMast
Moderator
The answer is, "It depends".

RG-6 type cable has roughly 6 dB of loss per 100 feet at UHF TV frequencies.

If the signal at the antenna is strong, let's say it has an NM value of 45.0 dB, you could theoretically run 500 feet of cable (30 dB of loss) and still have a 15 dB fade margin at the far end.
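
To put numbers on that, here is a rough Python sketch of the same arithmetic. The 6 dB per 100 feet figure is the approximation above, not a spec for any particular cable.

LOSS_PER_100_FT = 6.0   # dB, approximate RG-6 loss at UHF

def cable_loss(feet):
    # Approximate loss in dB for a run of the given length.
    return LOSS_PER_100_FT * feet / 100.0

def remaining_margin(nm_at_antenna, feet):
    # Fade margin left at the far end of the cable run.
    return nm_at_antenna - cable_loss(feet)

print(remaining_margin(45.0, 500))   # 45 - 30 = 15.0 dB, right at a 15 dB target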

On the other hand, starting with a weak signal such as your TVFR suggests, you can drive far less cable. Take KVCR, real channel 26, as an example: it's predicted to arrive in the air at an NM of 8.1 dB. If you used a high-gain antenna with 12 dB of gain, you would add that 12 dB to the 8.1 dB, giving you an NM of 20.1 dB at the antenna terminals. That is about 5 dB above what I consider a fairly conservative 15 dB fade margin. If you then add a 4-way splitter with 7 dB of loss, you'd leave the splitter with only about a 13 dB fade margin, and every dB of cable loss between there and the TV would drop you closer to the point of signal drop-out.
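
And the same arithmetic for the weak-signal case; the 12 dB antenna gain and 7 dB splitter loss are the illustrative round numbers used above, not the specs of any particular product.

LOSS_PER_100_FT = 6.0    # dB per 100 ft, same rough RG-6 figure as above

nm_in_air = 8.1          # dB, predicted NM for KVCR (real channel 26)
antenna_gain = 12.0      # dB, illustrative high-gain antenna
splitter_loss = 7.0      # dB, typical 4-way splitter

nm_at_terminals = nm_in_air + antenna_gain            # 20.1 dB
nm_after_splitter = nm_at_terminals - splitter_loss   # about 13.1 dB

# Every dB of coax after the splitter eats into what's left; at ~6 dB per
# 100 ft, roughly 218 more feet would push the NM to 0 dB and predicted drop-out.
feet_to_dropout = nm_after_splitter / LOSS_PER_100_FT * 100
print(round(nm_after_splitter, 1), round(feet_to_dropout))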

A preamp will help you if you have a weak but otherwise usable signal at the antenna. Preamplifiers and distribution amplifiers are used to 'push' the signal through lossy devices such as splitters and coax. Amplifiers do not and cannot 'pull' more signal from an antenna.
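
For illustration only, here is a very simplified sketch of that 'push vs. pull' idea. The 3 dB noise figure and 20 dB gain are made-up example numbers, and a real amplifier's effect on noise margin is more involved than this.

def margin_without_amp(nm_at_antenna, downstream_loss):
    # No amp: every dB of splitter/coax loss comes straight off the margin.
    return nm_at_antenna - downstream_loss

def margin_with_amp(nm_at_antenna, noise_figure, gain, downstream_loss):
    # An amp 'pushes': its gain can cover downstream loss, but its own noise
    # figure costs a little, and it never adds to the NM arriving at the antenna.
    residual_loss = max(downstream_loss - gain, 0.0)
    return nm_at_antenna - noise_figure - residual_loss

# 20.1 dB at the terminals, 7 dB splitter plus ~100 ft of coax = about 13 dB of loss.
print(round(margin_without_amp(20.1, 13.0), 1))          # 7.1 dB -- getting marginal
print(round(margin_with_amp(20.1, 3.0, 20.0, 13.0), 1))  # 17.1 dB -- loss absorbed, NF subtracted
print(round(margin_with_amp(5.0, 3.0, 20.0, 13.0), 1))   # 2.0 dB -- a weak signal stays weak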
__________________
If the well is dry and you don't see rain on the horizon, you'll need to dig the hole deeper. (If the antenna can't get the job done, an amp won't fix it.)
