Quote:
Originally Posted by billg
Ah, so you're taking the Fear of God (tm) approach.
Well, I've seen enough nearby lightning strikes that that makes sense. My main problem is that my electrical service ground is about 70' away (as the crow flies) and probably more like 150' away from the most practical spot to put the antenna (our home is an odd design that consists of two buildings connected by a covered breezeway). I presume that at a certain distance the ground wire is going to be less effective, right? Should I try to split the difference, or run as short a ground wire as possible and have to deal with a much longer coax run?
Thanks again for all of the great information!
Maybe you don't appreciate the difference between the resistance of a copper wire and the resistance of the earth between the two ground rods, one at your tower and the other serving your house. The resistance of #6 copper wire is about 0.4 ohms per 1000 ft, so a 100 ft run is roughly 0.04 ohms. If the earth resistance between the rods is an optimistic 2 ohms, that's about 50 times the resistance of the copper. Use a larger wire and the ratio grows: two #6 wires in parallel makes it 100 times, and a single 1/0 wire has less than 1/4 the resistance of a #6, making it 200 times. Since the bonding wire and the earth path carry the same current, the voltage divides in proportion to resistance, which brings that 1000 volt potential difference between the tower and the house in GroundUrMast's example down to about 20, 10 and 5 volts, respectively. It's Ohm's law.
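If it helps to see the arithmetic laid out, here's a minimal sketch of that voltage-divider estimate. The 2-ohm earth path, 1000 V example and 100 ft run are the assumptions from above; the per-1000-ft wire resistances come from a standard AWG copper table.
Code:
# Voltage-divider estimate of the tower-to-house potential during a
# nearby strike, assuming a 2-ohm earth path and a 100 ft bonding run.
# Wire resistances are from a standard AWG copper table (ohms/1000 ft).

R_EARTH = 2.0      # assumed earth resistance between ground rods, ohms
V_EARTH = 1000.0   # example earth potential difference, volts
LENGTH_FT = 100.0  # bonding wire run, feet

wires = {
    "single #6":       0.395,        # ohms per 1000 ft
    "two #6 parallel": 0.395 / 2,
    "single 1/0":      0.098,
}

for name, ohms_per_kft in wires.items():
    r_wire = ohms_per_kft * LENGTH_FT / 1000.0
    # The bond and the earth path carry the same current, so the
    # voltage divides in proportion to resistance.
    v_across = V_EARTH * r_wire / (r_wire + R_EARTH)
    print(f"{name:16s} R = {r_wire:.4f} ohm, tower-to-house ~ {v_across:.1f} V")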
This is a big difference. Voltages above 600V across dry skin break down the skin barrier and conduct through the aqueous tissue below, causing a huge jump in conductivity, so the difference in electrocution risk is significant.
My advice: put the tower where you want it, and buy enough heavy wire to bond the tower to the building ground. The wire runs you're talking about are not far, especially weighed against the earth potential differences you'd face without the bonding wire.