Locking on to a signal requires that the signal be strong enough to be separated from the background noise. There will always be electromagnetic noise from the environment. If you have a weak channel, the amplifier will amplify what is there, signal and noise alike.
When amplified, the ratio of signal to noise does not change, only the amplitude of both. It's that ratio (the SNR, signal-to-noise ratio) that matters. Most TVs have a very sensitive tuner that can lock on to tiny signals, as long as the SNR is good enough.
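To make that concrete, here's a quick toy calculation in Python. The numbers are made up purely for illustration; the point is that an amplifier multiplies signal and noise by the same gain, so the ratio comes out exactly where it started.

# Toy numbers, purely for illustration: an amplifier scales signal and noise
# by the same gain, so the signal-to-noise ratio (SNR) is unchanged.
signal_power = 0.002   # arbitrary units
noise_power = 0.001    # arbitrary units
gain = 100             # amplifier gain applied to everything coming in

snr_before = signal_power / noise_power                   # 2.0
snr_after = (signal_power * gain) / (noise_power * gain)  # still 2.0

print(snr_before, snr_after)   # same ratio before and after amplification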
So unless you have a very long cable run (i.e. 100' or more), you don't need an amplifier.
Edit: Also, you mentioned you get "crystal clear reception" when you get anything. This is part and parcel of digital TV. When the tuner can lock on to the carrier, the picture is perfect. When it can't, you get nothing. You'll never see the snow and banding of old analog TV; it just doesn't happen with digital. If you are on the cusp of adequate reception, you'll get the signal dropping out, or see pixelation in the picture (i.e. blocks of random scrambled pixels, a side effect of the data compression used). But you have to be right on the edge; the boundary is very sharp.
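If it helps, here's a rough Python sketch of what that "cliff" looks like. The 15 dB decode threshold and 1 dB margin are invented numbers, not a spec for any real tuner; they're just there to show how narrow the in-between zone is.

# Rough sketch of the digital "cliff effect": perfect picture above the
# threshold, dropouts/pixelation right at it, nothing at all below it.
# There is never analog-style snow. The dB figures here are made up.
def picture(snr_db, threshold_db=15.0, margin_db=1.0):
    if snr_db >= threshold_db + margin_db:
        return "perfect picture"
    elif snr_db >= threshold_db:
        return "dropouts and pixelation (on the cusp)"
    else:
        return "no picture at all"

for snr in (25.0, 15.5, 10.0):
    print(f"{snr} dB -> {picture(snr)}")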