In the analog days, the most common signal meter measured the voltage at the output of some type of filter designed to limit the received signal to the channel being measured. This method provides an indication of raw power present, but it does not distinguish the desired signal from noise mixed in with it.
The automatic gain control (AGC) circuit in the tuner of most radios and TVs would provide an uncalibrated version of the signal strength measurement, since it produces a voltage that depends on the strength of the received signal.
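As a rough illustration of why such a reading is "raw power only": if you can measure the RMS voltage the signal develops across the tuner input (75 ohm coax impedance is assumed here), converting it to dBm is simple arithmetic, but the number says nothing about how much of that power is noise. This is a hedged sketch, not a description of any particular meter:

```python
import math

def dbm_from_vrms(v_rms, impedance_ohms=75.0):
    """Convert an RMS voltage across a known load to power in dBm.

    Assumes the standard 75-ohm coax impedance used for OTA TV;
    dBm is power relative to 1 milliwatt on a log scale.
    """
    power_watts = v_rms ** 2 / impedance_ohms
    return 10.0 * math.log10(power_watts / 1e-3)

# Example: 1 mV RMS across 75 ohms is roughly -48.8 dBm
reading = dbm_from_vrms(0.001)
```

The catch, as above: a strong reading can still be mostly noise, which a pure power measurement cannot reveal.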
In the analog days, you would see interference in the video. Ghosting, herring-bone pattern and snow would be common descriptions of interference.
Successful reception of digital video and audio depends on error-free reception. The various types of interference that affect analog signals are quite capable of causing errors, even when the received signal is strong in terms of raw received power. So, the ATSC standard used for OTA digital TV includes 'extra' data to make forward error correction possible. The receiving TV is able to receive a signal that has some errors, but if enough of the 'extra' data makes it through, the TV can determine what errors occurred and correct them before passing the video+audio data to the display and audio decoders.
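A toy example of the idea: a Hamming(7,4) code carries 4 data bits plus 3 'extra' parity bits, and those parity bits let the receiver locate and fix any single flipped bit. (ATSC actually uses much heavier machinery, Reed-Solomon plus trellis coding, but the principle of redundancy enabling correction is the same. This is only an illustrative sketch.)

```python
def hamming74_encode(d):
    # d: four data bits; three parity bits are computed so that
    # the decoder can pinpoint any single-bit error by position
    d3, d5, d6, d7 = d
    p1 = d3 ^ d5 ^ d7
    p2 = d3 ^ d6 ^ d7
    p3 = d5 ^ d6 ^ d7
    return [p1, p2, d3, p3, d5, d6, d7]

def hamming74_decode(c):
    c = list(c)
    # recompute the parity checks; the syndrome spells out the error position
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3  # 1-indexed bit position, 0 means no error seen
    if pos:
        c[pos - 1] ^= 1  # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
codeword = hamming74_encode(data)
codeword[4] ^= 1                      # simulate one bit corrupted by interference
recovered = hamming74_decode(codeword)
# recovered == [1, 0, 1, 1], the original data
```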
So, a 'good' ATSC signal meter will provide an indication of the signal power and also report the error rate. The problem remains that just because a meter can decode the signal error-free, your tuner may not... there are various generations of tuner components and variations from vendor to vendor.
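The error-rate half of such a meter boils down to comparing what was sent (or a known reference pattern) against what was decoded. A hedged sketch of that comparison, with a simulated channel that is strong but still flips a small fraction of bits (the function name and flip probability are illustrative assumptions):

```python
import random

def bit_error_rate(sent, received):
    # fraction of bits that differ between the reference and decoded streams
    errors = sum(s != r for s, r in zip(sent, received))
    return errors / len(sent)

random.seed(1)
sent = [random.randint(0, 1) for _ in range(10_000)]
# simulate a strong-but-noisy channel: about 0.5% of bits flipped
# despite plenty of raw received power
received = [b ^ (random.random() < 0.005) for b in sent]
ber = bit_error_rate(sent, received)
```

A meter showing both a healthy power number and a nonzero error rate like this is exactly the case the paragraph above describes: strong signal, yet interference still causing bit errors.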
The average consumer will have a hard time justifying hundreds to thousands of dollars for a signal meter... when what counts is a reliable picture on the screen.
__________________
If the well is dry and you don't see rain on the horizon, you'll need to dig the hole deeper. (If the antenna can't get the job done, an amp won't fix it.)