Old 14-Aug-2020, 3:04 PM   #3
The voltage signal that the amplifier sees consists of "usable signal" (the part that the TV tuner needs to decode) and "noise". In broad terms, I think an amplifier increases both components equally, so the ratio of usable signal to noise (SNR) stays the same, except for a bit of added noise characterized by the amplifier's noise figure. On the way down your transmission line, the signal level drops while the thermal noise floor stays the same, so the SNR decreases with distance. For the tuner to decode properly, the SNR must be at least about 15 dB (roughly a 32x power ratio).
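To make this concrete, here's a small sketch of the arithmetic. The signal levels, cable loss, and noise floor below are hypothetical placeholder values (not from this post); only the 15 dB threshold comes from the discussion above.

```python
# Sketch: SNR falling with cable length while the thermal noise floor
# stays fixed. All dBm values and the per-100-ft loss are assumed,
# illustrative numbers, not measurements.

def db_to_ratio(db):
    """Convert a decibel value to a linear power ratio."""
    return 10 ** (db / 10)

signal_at_antenna_dbm = -60.0   # hypothetical signal at the antenna
noise_floor_dbm = -85.0         # thermal noise floor (does not change)
loss_db_per_100ft = 5.6         # rough coax loss figure (assumed)

for feet in (0, 50, 100, 200):
    signal_dbm = signal_at_antenna_dbm - loss_db_per_100ft * feet / 100
    snr_db = signal_dbm - noise_floor_dbm
    status = "decodes" if snr_db >= 15 else "fails"
    print(f"{feet:>3} ft: SNR = {snr_db:.1f} dB ({status})")

# The 15 dB threshold expressed as a linear power ratio:
print(round(db_to_ratio(15), 1))  # ~31.6, i.e. about 32x
```

With these made-up numbers, the run fails somewhere between 100 ft and 200 ft, which is the whole point: the noise floor never moved, only the signal did.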

So, amplifiers can't improve SNR between their input jack and their output jack, since they amplify the whole signal (usable signal + noise) indiscriminately.
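In dB terms that point reduces to one line: the gain applies to both signal and noise, so it cancels out of the ratio, and the amplifier's own noise figure (NF) is subtracted. The input SNR and NF values below are hypothetical examples, not from the post.

```python
# Sketch of why an amp never improves SNR: gain cancels out of the
# signal-to-noise ratio, and the amp's noise figure (NF) subtracts
# from it. Example numbers are assumed.

def snr_after_amp(snr_in_db, noise_figure_db):
    """SNR at the amplifier output, in dB: input SNR minus NF."""
    return snr_in_db - noise_figure_db

snr_in = 20.0   # dB at the amp input (hypothetical)
nf = 3.0        # dB noise figure (hypothetical)
print(snr_after_amp(snr_in, nf))  # 17.0 -- always <= the input SNR
```

The output SNR can only be less than or equal to the input SNR; the amp's job is to overcome loss *after* it, not to recover SNR already lost before it.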
jrgagne99