I've read discussions on that specific report. The best explanation thus far is that the microwaves heated thermally sensitive portions of the cells, which in turn caused the increased activity. Basically, it's a byproduct of the only known reaction of living tissue to microwaves: thermal. Entirely possible, but it's a long stretch from that to cancer.
30 kW is a fair number. In the US it ranges from 20 kW for most licensed stations to 50 kW for superstations. However, that's for individual broadcasters. Many antenna masts mount multiple stations, one on top of the other, with a cumulative output in the 100 kW range. Such stations operate at max power for almost all of their uptime, unlike a mobile, which only transmits at max power in severely adverse signal conditions (roughly -110 dBm). So while 0.8-1.6 W is the tested limit listed by the FCC, most cellphones will be operating in the 0.1-0.2 W range most of the time.
Regarding antenna placement, as I said previously, most modern cellphones put the antenna in the base of the phone, and this is taken into account in the FCC SAR ratings, which is why some phones come in at the 1.6 W/kg limit while others come in at half that.
Plugging all that into your calculator (nice find, I really didn't want to run the calculations manually), we still arrive at something close to what you reported previously: a cellphone irradiates tissue at a distance of 1-2 inches with a dosage similar to what a high-power unit delivers at 100-200 feet. Given that we largely agree on that point, what does it say about my earlier assertion? It pretty much confirms what I was saying: RF workers (and, in the case of office towers with broadcast facilities, possibly unassociated workers as well) are exposed to levels similar to cellphone users. Possibly a third group too: high-tension power line workers, who, out of operational necessity, often work on or right next to live lines (often from helicopters, to avoid grounding and therefore electrocuting themselves) at a proximity almost as close as a mobile's. However, I couldn't dig up the precise EMF emissions of a high-voltage power line, so it may or may not generate enough power to be relevant.
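For a rough sanity check on that comparison, here's a minimal sketch (not the calculator's exact model): it assumes the simple far-field free-space formula S = P / (4*pi*d^2), ignores antenna gain, ground reflections, and the fact that a handset an inch from your head is really in the near field, and just plugs in the ballpark figures above (a ~100 kW combined site versus a handset transmitting an assumed ~0.25 W).

```python
# Rough free-space power density comparison: broadcast site vs. handset.
# Far-field approximation S = P / (4*pi*d^2); ignores antenna gain, ground
# reflections, and near-field effects, so treat the results as order-of-magnitude.
import math

def power_density(power_w, distance_m):
    """Power density in W/m^2 at distance_m from an isotropic radiator of power_w."""
    return power_w / (4 * math.pi * distance_m ** 2)

FT = 0.3048   # meters per foot
IN = 0.0254   # meters per inch

# Combined broadcast site (~100 kW assumed) at 100-200 ft
for d_ft in (100, 200):
    print(f"100 kW site at {d_ft} ft: {power_density(100_000, d_ft * FT):.1f} W/m^2")

# Handset transmitting ~0.25 W (a conservative figure, above the typical
# 0.1-0.2 W but well under the 0.8-1.6 W max) at 1-2 inches
for d_in in (1, 2):
    print(f"0.25 W handset at {d_in} in: {power_density(0.25, d_in * IN):.1f} W/m^2")
```

Both end up in the same single-digit-to-tens of W/m^2 ballpark, which is the point.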
Of those three groups, the RF and line engineers would be the easiest to separate out for study. Those two groups have much higher exposure times than even the heaviest cellphone users (an 8+ hour shift versus the 3 or so hours reported as "heavy use" in most phone studies), yet show no statistical variance in cancer rates from the general population. That, combined with the fact that there is no conceivable physics that could bridge the gap from thermal heating to cancer, leaves one to conclude that it is highly unlikely there is any link, or that one will be uncovered in the future.
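To put the exposure-time point in numbers, here's a back-of-the-envelope sketch reusing the rough densities from the comparison above (the actual occupational figures obviously depend on where on the site someone is standing, so these are illustrative assumptions only):

```python
# Back-of-the-envelope daily "dose": hours of exposure x power density (W/m^2).
# Density figures are the rough, assumed values from the earlier sketch.
tower_worker_dose = 8 * 5    # 8-hour shift at ~5 W/m^2 near a broadcast site
heavy_user_dose   = 3 * 10   # 3 hours of calls at ~10 W/m^2 (handset 1-2 inches away)
print(f"tower worker:     ~{tower_worker_dose} W*h/m^2 per day")
print(f"heavy phone user: ~{heavy_user_dose} W*h/m^2 per day")
```

Comparable numbers, with the occupational group arguably higher, yet no excess cancer shows up there.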
I don't know; plug the numbers into the RF exposure calculator and find out. I'm not sure what power a typical TV station broadcasts at. I'm not claiming to know all the answers.