28-Jun-2012, 8:08 PM   #3
ant
Member
 
Join Date: Feb 2012
Posts: 83
Quote:
Originally Posted by GroundUrMast
The answer is, "It depends".

RG-6 type cable has roughly 6 dB loss per 100 feet.

If the signal at the antenna is strong, let's say it has an NM value of 45.0 dB, you could theoretically connect 500 feet of cable (30 dB of loss) and still have a 15 dB fade margin at the far end.

On the other hand, starting with a weak signal such as your TVFR suggests, you can drive far less cable. Take, as an example, KVCR real channel 26, which is predicted to arrive in the air at an NM of 8.1 dB. If you used a high-gain antenna with 12 dB of gain, you would add the 12 dB antenna gain to the 8.1 dB, giving you an NM of 20.1 dB at the antenna terminals. That is about 5 dB above what I consider a fairly conservative 15 dB fade margin. If you then add a 4-way splitter with 7 dB of loss, you'd leave the splitter with only a 13 dB fade margin, and every dB of cable loss between there and the TV would drop you closer to the point of signal drop-out (the running total is sketched in the code below the quote).

A preamp will help you if you have a weak but otherwise usable signal at the antenna. Preamplifiers and distribution amplifiers are used to 'push' the signal through lossy devices such as splitters and coax. Amplifiers do not and cannot 'pull' more signal from an antenna.
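
For reference, the arithmetic in the quote boils down to a simple running total in dB. Here is a minimal Python sketch of that bookkeeping; the function and constant names are invented for illustration only, and the figures (6 dB per 100 feet of RG-6, the 15 dB "conservative" fade margin, the example gains and losses) are simply the ones from the post above:

Code:
# Minimal sketch of the fade-margin bookkeeping described in the quote.
# The numbers come from the post itself; the names are illustrative only.

RG6_LOSS_DB_PER_100FT = 6.0   # approximate RG-6 loss quoted above
FADE_MARGIN_TARGET_DB = 15.0  # the "fairly conservative" fade margin


def margin_at_tv(noise_margin_db, antenna_gain_db=0.0,
                 splitter_loss_db=0.0, cable_feet=0.0):
    """Fade margin (dB) left at the far end after gains and losses."""
    cable_loss_db = RG6_LOSS_DB_PER_100FT * cable_feet / 100.0
    return noise_margin_db + antenna_gain_db - splitter_loss_db - cable_loss_db


# Strong-signal example: NM 45.0 dB and 500 ft of cable (30 dB loss)
# leaves exactly the 15 dB fade margin mentioned above.
print(margin_at_tv(45.0, cable_feet=500))  # 15.0

# Weak-signal example: KVCR at NM 8.1 dB plus a 12 dB antenna and a
# 4-way splitter (7 dB) leaves roughly 13 dB, already under the 15 dB
# target, before any cable loss is counted.
weak = margin_at_tv(8.1, antenna_gain_db=12.0, splitter_loss_db=7.0)
print(round(weak, 1), weak >= FADE_MARGIN_TARGET_DB)  # 13.1 False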
Ah, thanks. I just realized I had the wrong zip code entered. Ugh. It should be http://www.tvfool.com/?option=com_wr...1349a677fb8dec but the results are still poor. So yeah, it was weird to see some channels vanish in some rooms compared to connecting directly to the antenna (short coax cable).