Offgridkindaguy said:
Heck, trees eat my signal up. And there isn't any metal in them!
For another point of reference:
I have a small, battery-operated, portable radio that provides a readout in "dBu" referenced to the received field. It does not measure true field intensity, but it should show relative field strengths (comparisons) with tolerable accuracy, though I have not confirmed that yet. Anyway...
Earlier today in daylight conditions, I took it outside and recorded that relative field indication for two licensed AM broadcast stations -- first 20 feet from my house in the directions of those stations (which are about 180 degrees apart), and then 20 feet from the opposite side of my house in the directions of those two stations. One station was local, the other was not. In both cases the internal loopstick antenna in the receiver was oriented for maximum field indication, which was normal (90 degrees) to the directions toward the stations, as expected for clear locations both by theory and in practice.
The local station showed a field of about 80 dBu on both sides of my house. The distant station showed a field of about 30 dBu on both sides of my house.
That 50 dB voltage ratio indicates that the local station has about 316 times more field intensity near my house than the distant station has.
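For anyone who wants to check that arithmetic, here's a quick sketch using the standard dB conversion formulas (plain Python -- the 80 and 30 dBu figures are just my readings from above, nothing measured by this code):

```python
import math

def db_to_voltage_ratio(db: float) -> float:
    """Convert a decibel difference to a voltage (field-strength) ratio."""
    return 10 ** (db / 20)

def db_to_power_ratio(db: float) -> float:
    """Convert a decibel difference to a power ratio."""
    return 10 ** (db / 10)

delta_db = 80 - 30  # local reading minus distant reading, in dBu

# 50 dB as a field-strength (voltage) ratio: 10^(50/20) ~= 316
print(round(db_to_voltage_ratio(delta_db)))

# The same 50 dB as a power ratio would be 10^(50/10) = 100,000
print(round(db_to_power_ratio(delta_db)))
```

Since dBu here is referenced to field strength (a voltage-like quantity), the 20-log form is the right one, which is where the ~316:1 figure comes from.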
But the more important indication in this experiment is that the wires in residential houses along the propagation path cause very little loss to radiated signals in the AM broadcast band -- whether those signals are strong or weak. Non-metallic trees probably cause even less.
This finding also supports my original post in this thread rather well, inasmuch as my house contains a goodly number of vertical and horizontal wires used in a.c. power distribution, plus copper water supply/sewer pipes and an approved "ground connection" for the incoming a.c. service -- which no doubt measures much more than 2 ohms at its single-wire connection point.
RF